Author Topic: Looking for old program or help recreating it  (Read 27983 times)

0 Members and 1 Guest are viewing this topic.

Offline STxAxTIC

  • Library Staff
  • Forum Resident
  • Posts: 1091
  • he lives
    • View Profile
Re: Looking for old program or help recreating it
« Reply #90 on: December 28, 2021, 08:54:35 pm »
What if we change the output so that (0)=miss and (1)=hit and then run that string back through
the predictor to see what happens.

Ya know, this thought crossed my mind but I haven't had the nuts to try it. It's kind of a fascinating idea - for the model to study and find patterns in its own performance (it's a binary sequence after all), so that it may predict its own performance and improve upon it. There is so much to mess with there...!

This reminds me of another hunt I wanted to launch that looks for "eigenstrings". I just made that word up; it's a play on "eigenvector", which (from linear algebra) is a vector x satisfying Ax = λx for a matrix A - here I want the special case λ = 1, so Ax = x. In our case, A is the predictor and x is the sequence. This means that the sequence would be identical to the correctness record. Crazy stuff!
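To make the eigenstring idea concrete, here's a brute-force hunt, sketched in Python rather than QB64 just so it's compact and testable. The predictor is a toy stand-in for A (majority vote over the last 3 bits, defaulting to "1" on the first guess) - NOT the Analyze() function from this thread. A string x counts as an eigenstring when the hit/miss record of predicting x bit by bit reproduces x itself, i.e. A(x) = x.

```python
# Brute-force "eigenstring" hunt against a toy stand-in predictor.
from itertools import product

def predict_next(history):
    """Majority vote over the last 3 bits; first guess defaults to '1'."""
    tail = history[-3:]
    if not tail:
        return "1"
    return "1" if tail.count("1") * 2 > len(tail) else "0"

def correctness_record(x):
    """Bit n of the record is 1 iff the predictor called x[n] correctly."""
    return "".join(
        "1" if predict_next(x[:n]) == x[n] else "0" for n in range(len(x))
    )

def find_eigenstrings(length):
    """All binary strings x of the given length with A(x) = x."""
    strings = ("".join(bits) for bits in product("01", repeat=length))
    return [x for x in strings if correctness_record(x) == x]
```

With this particular toy predictor, the eigenstrings of length 3 turn out to be "110" and "111"; a different stand-in for A would give a different crop.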

Reeling things back in a second... this kind of thing gets dangerously close to alchemy, which is to say just peering into sacrificial goat entrails for patterns. We could do this for eternity and never notice when the mathematics stopped and the mad hattery began. Luckily we aren't there yet. I think the future of this work is less about the algorithms working blindly, and more about using any meta-knowledge you have about the input sequences themselves. This is to say, "where is this data coming from?"

Anyhow I'll keep writing up the interesting stuff - will let you know if anything new pops up.
You're not done when it works, you're done when it's right.

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #91 on: December 29, 2021, 01:59:45 am »
 STxAxTIC

Mad hattery, so many rabbit holes to explore, lol.  Just hope this does not cause you
to move to Lincoln, Montana and start making bombs.  My math skills ran out a ways
back and I am feeling a little parasitic feeding off your knowledge.

I think if it can be done, you are the one to do it.  It's already surpassed my expectations
so everything right, or should I say, left of here is a bonus.

Thanks

R1
 

   
   

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #92 on: December 29, 2021, 03:48:44 am »
Here is a simple Gap / Frequency tool.  I once used something like this as a secondary
predictor to kind of second-guess the main predictor's output.  Don't remember how
well it fared.  Notice that a gap of 35 shows a perfect string of (0's), 6 I think.  The
sample data string is very short and is not a real output string.  Just a simple code bit
to illustrate an idea.

Code: QB64: [Select]
  1. '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
  2. 'Frequency / Gap tool.  Cartoon code.
  3. 'This is a gap-analysis tool.  It is set to check every gap from 2 to 50 between any two points of the output data string.
  4. 'The goal is to find the best gap value between two consecutive 1s or 0s.
  5. 'The first count is offset by one to sync up with the next event.
  6. '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
  7. Title1$ = "Gap Analysis"
  8. _Title Title1$
  9. A& = _NewImage(600, 200, 32)
  10. Screen A& '<- display the image created above
  11.  
  12. Dim OutString As String '<- The output from the predictor
  13. 'small sample string
  14. OutString = "01010000100010000110100100000101100110000110101100110100100000110000000010010001101101110011100001000110000001100100000001111001011011000000001000000111010100001100001000010010001110100011000111111000110000000000100000111000010000100000010011110110000000001101"
  15. Prediction$ = ""
  16.  
  17. Gap = 2 '<- start gap value
  18. Do While Gap <= 50
  19.     NewStr$ = "" '<- build a result string here, for illustration only.  This string will be a condensed version of the one being processed
  20.  
  21.     hit = 0: mis = 0
  22.     L1 = Gap - 1 '<- offset counter by 1 so data is synced for prediction
  23.     Do While L1 <= Len(OutString) - Gap
  24.         If Mid$(OutString, L1, 1) = "1" Then hit = hit + 1 Else mis = mis + 1
  25.         If Mid$(OutString, L1, 1) = "1" Then NewStr$ = NewStr$ + "1" Else NewStr$ = NewStr$ + "0" '<- create a results string for multi processing if needed
  26.         L1 = L1 + Gap
  27.     Loop
  28.  
  29.     'simple analysis
  30.     'if results consist entirely of 0s or 1s then update predicted value
  31.     If InStr(NewStr$, "1") < 1 Then Prediction$ = Prediction$ + "0"
  32.     If InStr(NewStr$, "0") < 1 Then Prediction$ = Prediction$ + "1"
  33.  
  34.     'Print results for each gap
  35.     Color _RGB32(192, 192, 192), _RGB32(0, 0, 0)
  36.     _PrintString (2, 4), "Gap -> " + _Trim$(Str$(Gap))
  37.     _PrintString (2, 20), "Gap hit (1) -> " + _Trim$(Str$(hit))
  38.     _PrintString (2, 36), "Gap hit (0) -> " + _Trim$(Str$(mis))
  39.     _PrintString (2, 52), NewStr$
  40.  
  41.     Sleep '<- comment out to automate the process
  42.     Gap = Gap + 1 '<- Add 1 to gap
  43.     Color _RGB32(0, 0, 0), _RGB32(0, 0, 0)
  44.     If Gap < 50 Then Cls
  45. Loop
  46.  
  47. 'overall results
  48. Color _RGB32(192, 192, 192), _RGB32(0, 0, 0)
  49. _PrintString (2, 68), "Overall findings -> " + Prediction$
  50. _PrintString (2, 180), "Press any key to exit"
  51. Sleep

R1

Offline STxAxTIC

  • Library Staff
  • Forum Resident
  • Posts: 1091
  • he lives
    • View Profile
Re: Looking for old program or help recreating it
« Reply #93 on: January 01, 2022, 12:46:14 am »
Hey R1,

I have a happy little update and a little storyline to go with it. So apart from all of the eggheaded things I was planning to do with this code, I was quiet about *one* endeavor - just in case it failed - and it held up production until today - it's finally finished. What I've managed to do is use my unaltered Analyze function as a random number generator. I went through serious pain to make sure it gives "real" randomness, and almost quit looking several times thinking it may be impossible. Way too much to unpack here, but the conclusion is I used what was lying around to find a full-blown PRNG.

This came about by thinking along the lines of what you said, namely
Quote
What if we change the output so that (0)=miss and (1)=hit and then run that string back through
the predictor to see what happens.
... and thought about it a lot, and realized that's kind of what's going on when I build "pathological" sequences. To refresh: suppose you have a given model W. The "pathological" sequence is the sequence that W gets *wrong* every single turn. I detailed how to build these things on the web page. Anyway, to make a random number generator, I figured, "let's build a pathological string but find a way to randomize W while building it." The way I "randomize" W is to replace W with a sample of the output sequence itself.
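The pathological construction itself is tiny once W is fixed: at every turn, append the bit W did NOT predict, so W is wrong on every single guess by construction. A Python sketch, with an assumed toy stand-in for W (majority of the last 3 bits) rather than the real Analyze():

```python
# Build the "pathological" sequence for a fixed model W.

def predict_next(history):
    """Stand-in for W: majority vote over the last 3 bits, default '1'."""
    tail = history[-3:]
    if not tail:
        return "1"
    return "1" if tail.count("1") * 2 > len(tail) else "0"

def pathological(length):
    """The sequence this W gets wrong on every single turn."""
    s = ""
    for _ in range(length):
        s += "0" if predict_next(s) == "1" else "1"  # defeat the guess
    return s
```

Randomizing W mid-build - swapping it for a sample of the output so far, as ProtoRand$ below does - is the extra twist that jumbles the output.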

Just for comparison, here is how I use QB64's in-house RND function to generate a random sequence of a given length from a given seed:

Code: QB64: [Select]
  1. Function QBPRNG$ (TheSeed As Double, TheLength As Integer)
  2.     Dim TheReturn As String
  3.     Randomize TheSeed
  4.     Do
  5.         If (Rnd > .5) Then
  6.             TheReturn = TheReturn + "1"
  7.         Else
  8.             TheReturn = TheReturn + "0"
  9.         End If
  10.     Loop Until (Len(TheReturn) = TheLength)
  11.     QBPRNG$ = TheReturn
  12. End Function

And here is the new kid in town. Also makes random sequences but without RND:

Code: QB64: [Select]
  1. Function ProtoRand$ (TheSeed As String, TheLength As Integer)
  2.     Dim TheReturn As String
  3.     Dim WorkingString As String
  4.     TheReturn = ""
  5.     WorkingString = "0100100010" + TheSeed
  6.     Do
  7.         Call InitializeModelLiteral(Right$(WorkingString, 10))
  8.         If (Analyze(WorkingString, AlphaWeight(), 0) = 1) Then
  9.             TheReturn = TheReturn + "0"
  10.             WorkingString = WorkingString + "0"
  11.         Else
  12.             TheReturn = TheReturn + "1"
  13.             WorkingString = WorkingString + "1"
  14.         End If
  15.         If (Len(WorkingString) > 80) Then WorkingString = Right$(WorkingString, 80)
  16.     Loop Until Len(TheReturn) = TheLength
  17.     ProtoRand$ = TheReturn
  18. End Function

You can see that I "replace W with a sample of the output sequence itself" right after the DO statement. Pretty esoteric stuff if you don't remember exactly how we got to this point, but nonetheless there it is! That move puzzles things enough so that the output is just utterly jumbled. (I can explain why the number 80 appears in this code, it can really be any number bigger than a few dozen.) Now I can exhale a second time. (And write some of this stuff up. All of these notes will go up on the page soon enough.)
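For anyone wanting to kick the tires on a generator like this, here are two of the cheapest sanity checks, sketched in Python: overall bit balance and the longest run of identical bits. These are my assumed checks - necessary, but nowhere near sufficient to call a generator "really" random.

```python
# Cheap sanity checks for a would-be random bit string.

def bit_balance(s):
    """Fraction of 1s; should hover near 0.5 for a fair generator."""
    return s.count("1") / len(s)

def longest_run(s):
    """Length of the longest run of identical consecutive bits."""
    best = run = 1
    for prev, cur in zip(s, s[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best
```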

Alright happy new year!
« Last Edit: January 01, 2022, 01:58:32 am by STxAxTIC »
You're not done when it works, you're done when it's right.

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #94 on: January 02, 2022, 07:30:44 am »
STxAxTIC

Cool, can't wait for the update.  I finished the little 3-stage predictor tool I have been working
on for the last couple weeks.  It's more of an idea I decided to test to see what happens.

What I did was to combine three simple predictors, where each one outputs a probability;
these are then combined into the final prediction.

The calculations go like this: each predictor's end counts for 0 and 1 are converted into probabilities.
Next I add a small boost weight to the 1's and then apply p^2 to both the 0's and 1's.
It then moves to the next prediction tool and repeats the process.

After all active predictors have run, the last stage sums the probabilities and then divides by the
number of active predictors.  Remember I can run any configuration of the three: 1 & 2, or 1 & 3,
etc...  The value with the greatest probability is the end prediction.
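My reading of that pipeline, sketched in Python under assumed details: the boost on the 1's is additive, the p^2 sharpening comes after the boost, and the final vote goes to the larger averaged probability. The real tool may differ on any of these points.

```python
# Assumed sketch of the 3-stage combination rule described above.

def predictor_probs(count0, count1, boost=0.05):
    """Turn one predictor's 0/1 end counts into sharpened probabilities."""
    total = count0 + count1
    p0, p1 = count0 / total, count1 / total
    p1 += boost  # small boost weight added to the 1's
    return p0 ** 2, p1 ** 2

def combine(predictors, boost=0.05):
    """Average probabilities over the active predictors; pick the larger."""
    probs = [predictor_probs(c0, c1, boost) for c0, c1 in predictors]
    avg0 = sum(p0 for p0, _ in probs) / len(probs)
    avg1 = sum(p1 for _, p1 in probs) / len(probs)
    return 1 if avg1 > avg0 else 0
```

Worth noting in the sketch: the boost is added once per predictor before squaring, so averaging over more predictors never dilutes it - that alone can tilt the blend toward 1's.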

The config/settings menu allows adjusting certain things like the length of the string to process,
search sample size, number of iterations, weights, etc...  I was able to fine-tune the three predictors
to close to 70 percent hit rates when run one at a time.

The problem.
Tuning the predictors one at a time for maximum hit rates seemed fine, but when I run them
combined the results dropped like a rock: from around 70% down to the mid 40's to low 50's, mostly
because it predicts way too many ones, 60 to 80 on average.  Lowering the boost value
does not seem to help.  This tool is designed to only predict (1's); all other values are ignored,
and the data strings are non-standard, so to say.  There are around 112 strings used by this
tool, where around 30% repeat day to day.  The idea is to predict 20 to 25 1's out of the 112
strings with very high accuracy.

Anyway, that's where I am at.  Still thinking I might be able to improve it, maybe set one of the
predictors to boost (0) counts to decrease the overall 1's predicted.  I am still playing around with
the settings in the configuration menu but think I may have to go back into the code and change
a few things.

I check your links each day for updates, thanks again.  Hope you and yours have a great new year.
 
R1   
 

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #95 on: January 03, 2022, 04:33:32 am »
Hi all

I need some help with a scoring method for the above-mentioned prediction tool.
Below is a picture of the scoreboard, so to say, but what I want to do is combine
all the data into an overall score so that I can track the overall effects of my
fine-tuning.


https://i.postimg.cc/L6sKrkMs/SCORE.png


The results shown in the picture are based on a real-data back-test.  This allows
me to compare the predictor's results to the actual event.  The first item in the pic
shows the back-test number, the second shows the number of strings processed
by the predictor.  The third is the number of predictions made by the predictor.
The fourth value shows the actual number of strings that did repeat.  The fifth value
shows the number of correct predictions from the possible 42 that did repeat.
The sixth line is the number of incorrect predictions made.  The seventh is a
simple probability gotten from dividing the total correct guesses by the number
of predictions made: 36/67 = 0.537.

So out of 105 strings processed, there were a possible 42 correct choices for a (1)
to repeat.  Of these the predictor managed to predict 36 for 36/42=0.857 correct.

I need to combine everything in such a way that I can score the predictor's overall
performance.  Also remember the predictor only predicts (1's), ie repeats.

overall predictions = 67/105 = 0.638
correct predictions = 36/67 = 0.537, but we need to factor in the 105 strings processed -
at least I think this needs to be included.  etc..
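One standard way to fold these numbers into a single score - my suggestion, not something from the thread - is to treat the correct 1-predictions as true positives and combine precision (36/67) and recall (36/42) into an F1 score:

```python
# F1 score for a predictor that only calls 1's (repeats).

def f1_score(predictions_made, actual_repeats, correct):
    precision = correct / predictions_made  # 36/67, about 0.537
    recall = correct / actual_repeats       # 36/42, about 0.857
    return 2 * precision * recall / (precision + recall)
```

For the back-test shown, f1_score(67, 42, 36) comes out to about 0.661, and the single number moves whenever either the 36/67 side or the 36/42 side of the tuning changes.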

Thanks in advance.

PS.  The predictor guessed correctly 36 times from a pool of the 42 possible in 67 predictions
made from analyzing 105 strings.  Calculate a score given all the variables.

R1






« Last Edit: January 03, 2022, 04:45:28 am by random1 »

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #96 on: January 03, 2022, 05:22:56 am »
Hi all

I figured out my own question: since the predictor predicts both zeros and ones,
even though the ones are the only values tracked, all I had to do was count the
number of correct (0) guesses, add that to the correct ones, then divide the sum by
the number of strings processed.  This gives me an overall hit rate, which is really
all I needed.  Stupid monkey, get some sleep. LOL

R1
 

Offline STxAxTIC

  • Library Staff
  • Forum Resident
  • Posts: 1091
  • he lives
    • View Profile
Re: Looking for old program or help recreating it
« Reply #97 on: January 03, 2022, 06:40:55 am »
Good evening R1 -

Close call with that question you just pitched - I was about to roll up my sleeves before the alarm was suddenly silenced.

Just wanted to let you know that the page is now "fully" caught up to everything I was talking about. I'll probably notice little typos as time goes on, but the main flow is there. Doing a quick scroll up and down the page, it's a lot more dense than part 1. I know you have the link but for completeness: http://barnes.x10host.com/pages/Binary-Analyzer/Binary-Analyzer-Two.php

I've also done a very little bit of quality-of-life cleanup to the code. It's now set to simply process a file if you drag+drop one onto it, or if you don't, it can be hacked by hand. All of the delays and limits are commented out. Right now its default thing is to crank through the "troublesome" sequence you pasted on (I think) the 16th of Dec. I'm still running it right now, but that string is at least 63% predictable so far.

I'm basically happy with zipping up this project for long-term storage at this point. (The game I play is, pretend I forget all the details and rediscover my own project later on - will it still make sense? My answer right now is "prolly".) I'll attach the "final" code at the bottom. It can of course be butchered for its Analyze() function. Everything in the main loop can be deleted and the core will still do its job.

You said some interesting stuff above, particularly when combining models. Isn't it funny how the results drop when you do that? It reminds me of how hurricanes are modeled on the news. You always see this bundle of possible trajectories on the statewide map which represent competing models. You might say, "hey, why not bundle those lines together and just imagine one thick bold line?" Sure, but then you have to combine uncertainties too. The possible trajectory widens to a huge cone, and you're back to wondering at random where the hurricane's gonna go. It might be better to just pick one model from the many and not melt them all together.

I'm not sure how exactly you're optimizing your solution, but imagine a search in parameter space. Lemme freestyle this... Suppose your models are M1, M2, and M3. Then the nth digit in your sequence is predicted by each model, denoted as M1(n), M2(n), M3(n). Okay, so you talked about switching these on or off. Did you try blending them? Like this:

P(n) = w(1)*M1(n) + w(2)*M2(n) + w(3)*M3(n)

This means the total prediction of the nth bit is the weighted average of all three models, and the question becomes what the hell do we do with w(1), w(2), and w(3)? The most prudent thing possible is to imagine a sphere... actually ya know what? Let me do this in two dimensions first. Suppose for clarity you have only two models. Then we can skip the third everything:

P(n) = w(1)*M1(n) + w(2)*M2(n)

Now check this out. Suppose w(1) = cos(q), w(2)=sin(q). If you're trigonometrically inclined, I'm now imagining a circle of radius 1, and the weights w(1), w(2) are the x,y coordinates of a point on this circle at angle q. Then, if you want to cycle through all possible combinations of models M1 and M2, let the q-variable step from 0 to 2pi at a fine interval. (Adjust the interval to avoid negative weights but the gist is there.)
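That sweep is only a few lines in code. A Python sketch, where m1 and m2 are assumed toy models (each maps a bit-string history to a probability that the next bit is "1"), and q steps from 0 to pi/2 so both weights stay non-negative:

```python
# Sweep the blend angle q for two models weighted w1=cos(q), w2=sin(q).
import math

def blend_accuracy(seq, m1, m2, q):
    """Score the blended predictor P(n) = w1*M1(n) + w2*M2(n) on seq."""
    w1, w2 = math.cos(q), math.sin(q)
    hits = 0
    for n in range(1, len(seq)):
        p = (w1 * m1(seq[:n]) + w2 * m2(seq[:n])) / (w1 + w2)
        guess = "1" if p > 0.5 else "0"
        hits += guess == seq[n]
    return hits / (len(seq) - 1)

def best_angle(seq, m1, m2, steps=90):
    """Step q at a fine interval and keep the best-scoring angle."""
    angles = [k * (math.pi / 2) / steps for k in range(steps + 1)]
    return max(angles, key=lambda q: blend_accuracy(seq, m1, m2, q))
```

The same trick generalizes to three models with spherical coordinates: w1 = cos(q1)sin(q2), w2 = sin(q1)sin(q2), w3 = cos(q2), swept over the positive octant.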

Tell me if I'm out in right field. If this sounds appealing, we can develop the 3d case based on a sphere.


Code: QB64: [Select]
  1. ' Version: 2022-01-03
  2.  
  3.  
  4. _TITLE "Binary Analyzer"
  5.  
  6. SCREEN _NEWIMAGE(120, 40)
  7.  
  8. TYPE LetterBin
  9.     Signature AS STRING
  10.     Count AS INTEGER
  11. END TYPE
  12. DIM SHARED Alphabet1(2) AS LetterBin ' 0 1
  13. DIM SHARED Alphabet2(4) AS LetterBin ' 00 01 10 11
  14. DIM SHARED Alphabet3(8) AS LetterBin ' 000 001 010 011 100 101 110 111
  15. DIM SHARED Alphabet4(16) AS LetterBin ' etc.
  16. DIM SHARED Alphabet5(32) AS LetterBin
  17. DIM SHARED Alphabet6(64) AS LetterBin
  18. DIM SHARED Alphabet7(128) AS LetterBin
  19. DIM SHARED Alphabet8(256) AS LetterBin
  20. DIM SHARED Alphabet9(512) AS LetterBin
  21. DIM SHARED Alphabet10(1024) AS LetterBin
  22. DIM SHARED Alphabet11(2048) AS LetterBin
  23. DIM SHARED Alphabet12(4096) AS LetterBin
  24. DIM SHARED Alphabet13(8192) AS LetterBin
  25.  
  26. Alphabet1(1).Signature = "0"
  27. Alphabet1(2).Signature = "1"
  28. CALL NewAlphabet(Alphabet1(), Alphabet2())
  29. CALL NewAlphabet(Alphabet2(), Alphabet3())
  30. CALL NewAlphabet(Alphabet3(), Alphabet4())
  31. CALL NewAlphabet(Alphabet4(), Alphabet5())
  32. CALL NewAlphabet(Alphabet5(), Alphabet6())
  33. CALL NewAlphabet(Alphabet6(), Alphabet7())
  34. CALL NewAlphabet(Alphabet7(), Alphabet8())
  35. CALL NewAlphabet(Alphabet8(), Alphabet9())
  36. CALL NewAlphabet(Alphabet9(), Alphabet10())
  37. CALL NewAlphabet(Alphabet10(), Alphabet11())
  38. CALL NewAlphabet(Alphabet11(), Alphabet12())
  39. CALL NewAlphabet(Alphabet12(), Alphabet13())
  40.  
  41. ' Specification of weight model.
  42. DIM SHARED AlphaWeight(1 TO 13) AS DOUBLE
  43.  
  44. ' Array for test sequences.
  45. REDIM SHARED TestData(256 ^ 2) AS STRING
  46.  
  47. ' Statistics and metrics:
  48. DIM GuessPredicted AS INTEGER
  49. DIM GuessCorrect AS DOUBLE
  50. DIM GuessTotal AS DOUBLE
  51. DIM GuessRatioBest AS DOUBLE
  52. DIM GuessRatioWorst AS DOUBLE
  53. DIM GuessStreak AS INTEGER
  54. DIM GuessStreakMax AS INTEGER
  55. DIM GuessStreakBest AS INTEGER
  56. DIM Grade(256 ^ 2, 2) AS DOUBLE
  57. DIM BinaryModelIndex AS INTEGER
  58. DIM BestBinaryModel AS INTEGER
  59. DIM WorstBinaryModel AS INTEGER
  60.  
  61. ' Working variables:
  62. DIM TheString AS STRING
  63. DIM AS INTEGER k, m, n
  64.  
  65. ' Load test data from file or from sub.
  66.     REDIM _PRESERVE TestData(LoadTestFile(0, COMMAND$(1), 1))
  67.     '''
  68.     ' Load test cases at bottom of code:
  69.     'ReDim _Preserve TestData(LoadTestData(0))
  70.     '''
  71.  
  72.     '''
  73.     ' Play area:
  74.     REDIM _PRESERVE TestData(1)
  75.     'Call InitializeModelCustom("-1")
  76.     'Call InitializeModelLiteral("0010101000")
  77.     'Call InitializeModelIndexed(23)
  78.     TestData(1) = "00101000111101101011000010000100001100110110100001100001100000000010011001000101000010001000000111011100000100000000010111010110000000010001110010110111000001011010111100101101010000011101110011010000110010011101011111000001000011001000000010011110011000000010101010011100100000010100010010111100100100100001000010011001101101001101110100011001100000010100000010001000010110100010100011000010001010101011110000000000110000011001000000100001110000100100010000001110000010100000101010100010001101000010"
  79.     'TestData(1) = Pathological$("1", 25000)
  80.     'TestData(1) = ProtoRand$("", 1000)
  81.     'TestData(1) = QBPRNG$(1, 1000)
  82.     '''
  83.  
  84. GuessRatioBest = 0
  85. GuessRatioWorst = 1
  86. GuessStreakBest = 0
  87. WorstBinaryModel = 1
  88.  
  89. ' This outer loop is for cycling through models.
  90. BinaryModelIndex = -1
  91. DO WHILE (BinaryModelIndex < 1024)
  92.  
  93.     '''
  94.     ' Automatic increment of model index number.
  95.     BinaryModelIndex = BinaryModelIndex + 1
  96.     CALL InitializeModelIndexed(BinaryModelIndex)
  97.     '''
  98.  
  99.     ' This enclosed loop is for looping through test strings.
  100.     FOR m = 1 TO UBOUND(TestData)
  101.  
  102.         GuessPredicted = -1
  103.         GuessCorrect = 0
  104.         GuessTotal = 0
  105.         GuessStreak = 0
  106.         GuessStreakMax = 0
  107.  
  108.         ' This core loop goes through one (literal) bit of a test string per iteration.
  109.         ' Set upper limit to infinity for game mode.
  110.         FOR n = 1 TO LEN(TestData(m)) '9999
  111.  
  112.             '''
  113.             ' Auto-feed Mode:
  114.             TheString = LEFT$(TestData(m), n)
  115.             '''
  116.  
  117.             '''
  118.             ' Gaming Mode:
  119.             'Call InitializeModelIndexed(16)               ' This 5-gram case (16) works well for game mode.
  120.             'Call InitializeModelIndexed(18)               ' Learning algo discovered this is good against humans.
  121.             'Call InitializeModelIndexed(82)               ' Learning algo discovered this is good against humans.
  122.             'Call InitializeModelLiteral("0000101000")     ' Experimental default.
  123.             'Cls
  124.             'Locate 1, 1
  125.             'Print "Press LEFT or RIGHT."
  126.             'k = 0
  127.             'Do: k = _KeyHit: Loop Until ((k = 19200) Or (k = 19712))
  128.             'Select Case k
  129.             '    Case 19200
  130.             '        TheString = TheString + "0"
  131.             '    Case 19712
  132.             '        TheString = TheString + "1"
  133.             'End Select
  134.             '_KeyClear
  135.             '''
  136.  
  137.             CLS
  138.             COLOR 7
  139.             LOCATE 1, 1
  140.             FOR k = 1 TO _WIDTH
  141.                 PRINT "_";
  142.             NEXT
  143.             PRINT "Model";
  144.             PRINT " ("; _TRIM$(STR$(BinaryModelIndex)); ")";
  145.             PRINT ":";
  146.             FOR k = 1 TO 10 'UBound(AlphaWeight)
  147.                 PRINT AlphaWeight(k);
  148.             NEXT
  149.             PRINT
  150.             PRINT
  151.             PRINT "Sequence (index "; _TRIM$(STR$(m)); ") (length "; _TRIM$(STR$(LEN(TheString))); "):"
  152.             PRINT RIGHT$(TheString, 400);
  153.             COLOR 8
  154.             PRINT LEFT$(RIGHT$(TestData(m), LEN(TestData(m)) - n), 400);
  155.             COLOR 7
  156.             PRINT
  157.  
  158.             ' Reconciliation
  159.             IF (GuessPredicted <> -1) THEN
  160.                 PRINT
  161.                 PRINT "I predicted "; _TRIM$(STR$(GuessPredicted)); " and you typed "; RIGHT$(TheString, 1); "."
  162.                 IF (GuessPredicted = VAL(RIGHT$(TheString, 1))) THEN
  163.                     PRINT "I am RIGHT this round."
  164.                     GuessCorrect = GuessCorrect + 1
  165.                     GuessStreak = GuessStreak + 1
  166.                     IF (GuessStreak > GuessStreakMax) THEN GuessStreakMax = GuessStreak
  167.                     Grade(n, 2) = 1
  168.                 ELSE
  169.                     PRINT "I am WRONG this round."
  170.                     GuessStreak = 0
  171.                     Grade(n, 2) = 0
  172.                 END IF
  173.                 GuessTotal = GuessTotal + 1
  174.                 Grade(n, 1) = GuessCorrect / GuessTotal
  175.             END IF
  176.  
  177.             IF (GuessTotal > 0) THEN
  178.                 PRINT
  179.                 PRINT "I'm on a "; _TRIM$(STR$(GuessStreak)); "-round winning streak."
  180.                 PRINT "My best streak has been "; _TRIM$(STR$(GuessStreakMax)); "."
  181.                 IF (GuessTotal <> 0) THEN
  182.                     PRINT "My correctness rate is "; _TRIM$(STR$(INT(100 * GuessCorrect / GuessTotal))); "% in "; _TRIM$(STR$(GuessTotal)); " trials."
  183.                 END IF
  184.             END IF
  185.  
  186.             GuessPredicted = Analyze(TheString, AlphaWeight(), 0)
  187.  
  188.             '''
  189.             ' Reverse polarity if needed for any reason.
  190.             'If (GuessPredicted = 0) Then
  191.             '    GuessPredicted = 1
  192.             'Else
  193.             '    GuessPredicted = 0
  194.             'End If
  195.             '''
  196.  
  197.             PRINT
  198.             'Print "I have made a new prediction."
  199.             'Print "Press LEFT or RIGHT to test me."
  200.             PRINT "The best performer was model #"; _TRIM$(STR$(BestBinaryModel)); ", rated "; _TRIM$(STR$(INT(GuessRatioBest * 100))); "%, best streak of "; _TRIM$(STR$(GuessStreakBest)); "."
  201.             PRINT "The worst performer was model #"; _TRIM$(STR$(WorstBinaryModel)); ", rated "; _TRIM$(STR$(INT(GuessRatioWorst * 100))); "%."
  202.  
  203.             ' Draw bottom graph if there's enough room.
  204.             IF (CSRLIN <= 23) THEN
  205.                 IF (GuessTotal <> 0) THEN
  206.                     CALL PrintGraph(TheString, Grade())
  207.                 END IF
  208.             END IF
  209.  
  210.             _DISPLAY
  211.             '_Delay .02
  212.             '_Limit 240
  213.         NEXT
  214.  
  215.         IF (GuessTotal > 0) THEN
  216.             IF (GuessCorrect / GuessTotal >= GuessRatioBest) THEN
  217.                 BestBinaryModel = BinaryModelIndex
  218.                 GuessRatioBest = GuessCorrect / GuessTotal
  219.                 GuessStreakBest = GuessStreakMax
  220.             END IF
  221.             IF (GuessCorrect / GuessTotal <= GuessRatioWorst) THEN
  222.                 WorstBinaryModel = BinaryModelIndex
  223.                 GuessRatioWorst = GuessCorrect / GuessTotal
  224.             END IF
  225.         END IF
  226.  
  227.         '_Delay 3
  228.     NEXT
  229. LOOP
  230.  
  231.  
  232. FUNCTION Analyze (TheStringIn AS STRING, arrweight() AS DOUBLE, pswitch AS INTEGER)
  233.     DIM TheReturn AS INTEGER
  234.     DIM AS INTEGER n
  235.     DIM AS DOUBLE r, j, k
  236.     DIM StringPhase(UBOUND(arrweight)) AS STRING
  237.     DIM Partialguess(LBOUND(arrweight) TO UBOUND(arrweight), 2) AS DOUBLE
  238.  
  239.     StringPhase(1) = TheStringIn
  240.     'For n = 2 To UBound(StringPhase) ' Phase analysis.
  241.     '    StringPhase(n) = Right$(StringPhase(n - 1), Len(StringPhase(n - 1)) - 1) + Left$(StringPhase(n - 1), 1)
  242.     'Next
  243.  
  244.     IF (pswitch = 1) THEN
  245.         PRINT
  246.         FOR n = 1 TO _WIDTH
  247.             PRINT "-";
  248.         NEXT
  249.         PRINT
  250.     END IF
  251.  
  252.     IF (arrweight(1) <> 0) THEN CALL CreateHisto(StringPhase(), 1, Alphabet1())
  253.     IF (arrweight(2) <> 0) THEN CALL CreateHisto(StringPhase(), 2, Alphabet2())
  254.     IF (arrweight(3) <> 0) THEN CALL CreateHisto(StringPhase(), 3, Alphabet3())
  255.     IF (arrweight(4) <> 0) THEN CALL CreateHisto(StringPhase(), 4, Alphabet4())
  256.     IF (arrweight(5) <> 0) THEN CALL CreateHisto(StringPhase(), 5, Alphabet5())
  257.     IF (arrweight(6) <> 0) THEN CALL CreateHisto(StringPhase(), 6, Alphabet6())
  258.     IF (arrweight(7) <> 0) THEN CALL CreateHisto(StringPhase(), 7, Alphabet7())
  259.     IF (arrweight(8) <> 0) THEN CALL CreateHisto(StringPhase(), 8, Alphabet8())
  260.     IF (arrweight(9) <> 0) THEN CALL CreateHisto(StringPhase(), 9, Alphabet9())
  261.     IF (arrweight(10) <> 0) THEN CALL CreateHisto(StringPhase(), 10, Alphabet10())
  262.     IF (arrweight(11) <> 0) THEN CALL CreateHisto(StringPhase(), 11, Alphabet11())
  263.     IF (arrweight(12) <> 0) THEN CALL CreateHisto(StringPhase(), 12, Alphabet12())
  264.     IF (arrweight(13) <> 0) THEN CALL CreateHisto(StringPhase(), 13, Alphabet13())
  265.  
  266.     IF (pswitch = 1) THEN ' Set the last argument >=1 to print stats for that histogram.
  267.         IF ((LEN(TheStringIn) >= 1) AND (arrweight(1) <> 0)) THEN CALL PrintHisto(Alphabet1(), 0)
  268.         IF ((LEN(TheStringIn) >= 2) AND (arrweight(2) <> 0)) THEN CALL PrintHisto(Alphabet2(), 0)
  269.         IF ((LEN(TheStringIn) >= 3) AND (arrweight(3) <> 0)) THEN CALL PrintHisto(Alphabet3(), 0)
  270.         IF ((LEN(TheStringIn) >= 4) AND (arrweight(4) <> 0)) THEN CALL PrintHisto(Alphabet4(), 0)
  271.         IF ((LEN(TheStringIn) >= 5) AND (arrweight(5) <> 0)) THEN CALL PrintHisto(Alphabet5(), 4)
        IF ((LEN(TheStringIn) >= 6) AND (arrweight(6) <> 0)) THEN CALL PrintHisto(Alphabet6(), 0)
        IF ((LEN(TheStringIn) >= 7) AND (arrweight(7) <> 0)) THEN CALL PrintHisto(Alphabet7(), 0)
        IF ((LEN(TheStringIn) >= 8) AND (arrweight(8) <> 0)) THEN CALL PrintHisto(Alphabet8(), 0)
        IF ((LEN(TheStringIn) >= 9) AND (arrweight(9) <> 0)) THEN CALL PrintHisto(Alphabet9(), 0)
        IF ((LEN(TheStringIn) >= 10) AND (arrweight(10) <> 0)) THEN CALL PrintHisto(Alphabet10(), 0)
        IF ((LEN(TheStringIn) >= 11) AND (arrweight(11) <> 0)) THEN CALL PrintHisto(Alphabet11(), 0)
        IF ((LEN(TheStringIn) >= 12) AND (arrweight(12) <> 0)) THEN CALL PrintHisto(Alphabet12(), 0)
        IF ((LEN(TheStringIn) >= 13) AND (arrweight(13) <> 0)) THEN CALL PrintHisto(Alphabet13(), 0)
        PRINT
    END IF

    ' Set the last argument =1 to print the guess for that histogram.
    IF ((LEN(TheStringIn) >= 1) AND (arrweight(1) <> 0)) THEN CALL MakeGuess(TheStringIn, 1, Alphabet1(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 2) AND (arrweight(2) <> 0)) THEN CALL MakeGuess(TheStringIn, 2, Alphabet2(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 3) AND (arrweight(3) <> 0)) THEN CALL MakeGuess(TheStringIn, 3, Alphabet3(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 4) AND (arrweight(4) <> 0)) THEN CALL MakeGuess(TheStringIn, 4, Alphabet4(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 5) AND (arrweight(5) <> 0)) THEN CALL MakeGuess(TheStringIn, 5, Alphabet5(), Partialguess(), pswitch)
    IF ((LEN(TheStringIn) >= 6) AND (arrweight(6) <> 0)) THEN CALL MakeGuess(TheStringIn, 6, Alphabet6(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 7) AND (arrweight(7) <> 0)) THEN CALL MakeGuess(TheStringIn, 7, Alphabet7(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 8) AND (arrweight(8) <> 0)) THEN CALL MakeGuess(TheStringIn, 8, Alphabet8(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 9) AND (arrweight(9) <> 0)) THEN CALL MakeGuess(TheStringIn, 9, Alphabet9(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 10) AND (arrweight(10) <> 0)) THEN CALL MakeGuess(TheStringIn, 10, Alphabet10(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 11) AND (arrweight(11) <> 0)) THEN CALL MakeGuess(TheStringIn, 11, Alphabet11(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 12) AND (arrweight(12) <> 0)) THEN CALL MakeGuess(TheStringIn, 12, Alphabet12(), Partialguess(), 0)
    IF ((LEN(TheStringIn) >= 13) AND (arrweight(13) <> 0)) THEN CALL MakeGuess(TheStringIn, 13, Alphabet13(), Partialguess(), 0)
    IF (pswitch = 1) THEN PRINT

    IF (pswitch = 1) THEN
        PRINT "Thinking:   ";
        FOR k = LBOUND(Partialguess) TO UBOUND(Partialguess)
            IF ((LEN(TheStringIn) >= k) AND (arrweight(k) <> 0)) THEN
                PRINT Partialguess(k, 1);
            ELSE
                PRINT "_ ";
            END IF
        NEXT
        PRINT
    END IF

    j = 0
    r = 0

    ' Weighted average of the per-order guesses.
    FOR k = LBOUND(Partialguess) TO UBOUND(Partialguess)
        IF ((LEN(TheStringIn) >= k) AND (arrweight(k) <> 0)) THEN
            r = r + arrweight(k) * Partialguess(k, 1)
            j = j + arrweight(k)
        END IF
    NEXT
    IF (j <> 0) THEN
        r = r / j
    END IF

    IF (pswitch = 1) THEN PRINT "Predicting:  "; _TRIM$(STR$(r))

    IF (r > .5) THEN
        r = 1
    ELSE
        r = 0
    END IF

    IF (pswitch = 1) THEN PRINT "Rounding to: "; _TRIM$(STR$(r))

    IF (pswitch = 1) THEN
        FOR n = 1 TO _WIDTH
            PRINT "-";
        NEXT
        PRINT
    END IF

    TheReturn = r
    Analyze = TheReturn
END FUNCTION

SUB MakeGuess (TheStringIn AS STRING, wid AS INTEGER, arralpha() AS LetterBin, arrguess() AS DOUBLE, pswitch AS INTEGER)
    ' Average the final digits of the most frequent wid-letter patterns
    ' whose first (wid - 1) letters match the tail of the input string.
    DIM TheReturn AS DOUBLE
    DIM AS INTEGER j, k, n
    TheReturn = 0
    j = 1
    k = 0
    FOR n = 1 TO UBOUND(arralpha)
        IF (LEFT$(arralpha(n).Signature, wid - 1) = RIGHT$(TheStringIn, wid - 1)) THEN
            IF (arralpha(n).Count >= j) THEN
                IF (pswitch = 1) THEN PRINT "Order-"; RIGHT$("0" + _TRIM$(STR$(wid)), 2); " guess: "; arralpha(n).Signature; " . "; _TRIM$(STR$(arralpha(n).Count))
                TheReturn = TheReturn + VAL(RIGHT$(arralpha(n).Signature, 1))
                k = k + 1
                j = arralpha(n).Count
            END IF
        END IF
    NEXT
    IF (k <> 0) THEN
        TheReturn = TheReturn / k
    ELSE
        TheReturn = .5 ' No matching history yet: call it a coin flip.
    END IF
    arrguess(wid, 1) = TheReturn
    arrguess(wid, 2) = j
END SUB

SUB InitializeModelIndexed (TheIndexIn AS INTEGER)
    ' Index runs 0 to 1023, i.e. every possible 10-bit weight string.
    CALL InitializeModelLiteral(BIN$(TheIndexIn))
END SUB

SUB InitializeModelLiteral (Weights AS STRING)
    DIM AS INTEGER k
    FOR k = 1 TO 10
        AlphaWeight(k) = VAL(MID$(Weights, k, 1))
    NEXT
    AlphaWeight(11) = 0
    AlphaWeight(12) = 0
    AlphaWeight(13) = 0
END SUB

SUB InitializeModelCustom (Weights AS STRING)
    DIM AS INTEGER k
    IF (Weights = "-1") THEN
        FOR k = LBOUND(AlphaWeight) TO UBOUND(AlphaWeight)
            AlphaWeight(k) = k ^ 2
        NEXT
    END IF
    AlphaWeight(11) = 0
    AlphaWeight(12) = 0
    AlphaWeight(13) = 0
END SUB

FUNCTION BIN$ (n%)
    ' Butchered from the Wiki. Ugliest function ever.
    DIM AS INTEGER max, i, msb
    DIM AS STRING b
    max% = 8 * LEN(n%)
    FOR i = max% - 1 TO 0 STEP -1
        IF (n% AND 2 ^ i) THEN msb% = 1: b$ = "1" + b$ ELSE IF msb% THEN b$ = "0" + b$
    NEXT
    b$ = LEFT$(b$ + "0000000000", 10)
    BIN$ = b$
END FUNCTION

SUB CreateHisto (arrseqphase() AS STRING, wid AS INTEGER, arralpha() AS LetterBin)
    ' Tally non-overlapping wid-letter chunks of the sequence into arralpha.
    DIM AS INTEGER j, k, n
    FOR n = 1 TO UBOUND(arralpha)
        arralpha(n).Count = 0
    NEXT
    ' Hack j=1 to use the base string only.
    ' Let the loop's upper limit =wid for phase analysis.
    FOR j = 1 TO 1 'wid
        FOR k = 1 TO LEN(arrseqphase(j)) - (LEN(arrseqphase(j)) MOD wid) STEP wid
            FOR n = 1 TO UBOUND(arralpha)
                IF (MID$(arrseqphase(j), k, wid) = arralpha(n).Signature) THEN
                    arralpha(n).Count = arralpha(n).Count + 1
                END IF
            NEXT
        NEXT
    NEXT
    CALL QuickSort(arralpha(), 1, UBOUND(arralpha))
END SUB

SUB PrintHisto (arr() AS LetterBin, wid AS INTEGER)
    DIM AS INTEGER j, n
    IF (wid > 0) THEN
        IF (wid > UBOUND(arr)) THEN
            j = UBOUND(arr)
        ELSE
            j = wid
        END IF
        PRINT "Histogram: "; _TRIM$(STR$(UBOUND(arr))); "-letter regroup, showing top "; _TRIM$(STR$(wid))
        FOR n = 1 TO j
            PRINT arr(n).Signature; arr(n).Count
        NEXT
    END IF
END SUB

SUB PrintGraph (TheString AS STRING, arrgrade() AS DOUBLE)
    ' Plot per-digit grades between the 0% and 100% rules;
    ' CHR$(251) marks a hit, "x" marks a miss.
    DIM AS INTEGER j, k
    DIM AS DOUBLE f, g
    FOR k = 1 TO _WIDTH
        LOCATE _HEIGHT - 5, k: PRINT "_"
        LOCATE _HEIGHT - 5 - 10, k: PRINT "_"
    NEXT
    LOCATE _HEIGHT - 5 + 1, 1: PRINT "0%"
    LOCATE _HEIGHT - 5 - 10 - 1, 1: PRINT "100%"
    f = (_WIDTH) / LEN(TheString)
    IF (f > 1) THEN f = 1
    FOR j = 2 TO LEN(TheString)
        g = INT(j * f)
        IF (g = 0) THEN g = 1
        LOCATE _HEIGHT - 5 - INT(10 * arrgrade(j, 1)), g
        IF (arrgrade(j, 2) = 1) THEN
            PRINT CHR$(251)
        ELSE
            PRINT "x"
        END IF
    NEXT
END SUB

SUB NewAlphabet (arrold() AS LetterBin, arrnew() AS LetterBin)
    ' Build the next alphabet by prefixing "0" and "1" to every old signature.
    DIM AS INTEGER j, k, n
    n = 0
    FOR k = 1 TO 2
        FOR j = 1 TO UBOUND(arrold)
            n = n + 1
            arrnew(n).Signature = arrold(j).Signature
        NEXT
    NEXT
    FOR j = 1 TO UBOUND(arrnew)
        IF (j <= UBOUND(arrnew) / 2) THEN
            arrnew(j).Signature = "0" + arrnew(j).Signature
        ELSE
            arrnew(j).Signature = "1" + arrnew(j).Signature
        END IF
    NEXT
END SUB

SUB QuickSort (arr() AS LetterBin, LowLimit AS LONG, HighLimit AS LONG)
    DIM AS LONG piv
    IF (LowLimit < HighLimit) THEN
        piv = Partition(arr(), LowLimit, HighLimit)
        CALL QuickSort(arr(), LowLimit, piv - 1)
        CALL QuickSort(arr(), piv + 1, HighLimit)
    END IF
END SUB

FUNCTION Partition (arr() AS LetterBin, LowLimit AS LONG, HighLimit AS LONG)
    ' Lomuto partition, ordering by Count in descending order.
    DIM AS LONG i, j
    DIM AS DOUBLE pivot, tmp
    pivot = arr(HighLimit).Count
    i = LowLimit - 1
    FOR j = LowLimit TO HighLimit - 1
        tmp = arr(j).Count - pivot
        IF (tmp >= 0) THEN
            i = i + 1
            SWAP arr(i), arr(j)
        END IF
    NEXT
    SWAP arr(i + 1), arr(HighLimit)
    Partition = i + 1
END FUNCTION

FUNCTION Pathological$ (TheSeed AS STRING, TheLength AS INTEGER)
    ' Grow a string that contradicts the predictor at every step.
    DIM TheReturn AS STRING
    TheReturn = TheSeed
    DO
        IF (Analyze(TheReturn, AlphaWeight(), 0) = 1) THEN
            TheReturn = TheReturn + "0"
        ELSE
            TheReturn = TheReturn + "1"
        END IF
    LOOP UNTIL LEN(TheReturn) = TheLength
    Pathological$ = TheReturn
END FUNCTION

FUNCTION ProtoRand$ (TheSeed AS STRING, TheLength AS INTEGER)
    DIM TheReturn AS STRING ' Actual output.
    DIM WorkingString AS STRING ' So-called working string.
    TheReturn = ""
    WorkingString = "0100100010" + TheSeed ' ...in case user sends blank seed.
    DO
        ' Change prediction model based on working string.
        CALL InitializeModelLiteral(RIGHT$(WorkingString, 10))
        ' Make prediction based on working string.
        ' Store the opposite result.
        IF (Analyze(WorkingString, AlphaWeight(), 0) = 1) THEN
            TheReturn = TheReturn + "0" ' Analyze returned a 1.
            WorkingString = WorkingString + "0"
        ELSE
            TheReturn = TheReturn + "1" ' Analyze returned a 0.
            WorkingString = WorkingString + "1"
        END IF
        ' Keep working string reasonably short so runtime remains O(L).
        IF (LEN(WorkingString) > 80) THEN WorkingString = RIGHT$(WorkingString, 80)
    LOOP UNTIL LEN(TheReturn) = TheLength
    ProtoRand$ = TheReturn
END FUNCTION

FUNCTION QBPRNG$ (TheSeed AS DOUBLE, TheLength AS INTEGER)
    ' Bit string straight from QB64's built-in RND generator.
    DIM TheReturn AS STRING
    RANDOMIZE TheSeed
    DO
        IF (RND > .5) THEN
            TheReturn = TheReturn + "1"
        ELSE
            TheReturn = TheReturn + "0"
        END IF
    LOOP UNTIL (LEN(TheReturn) = TheLength)
    QBPRNG$ = TheReturn
END FUNCTION

FUNCTION LoadTestFile (alwayszero AS INTEGER, TheFile AS STRING, ReversalToggle AS INTEGER)
    ' Read one test sequence per line; optionally reverse each string.
    DIM AS INTEGER j, k
    DIM n AS INTEGER
    DIM a AS STRING
    n = alwayszero
    OPEN TheFile FOR INPUT AS #1
    DO WHILE NOT EOF(1)
        n = n + 1
        LINE INPUT #1, a
        TestData(n) = a
    LOOP
    CLOSE #1
    IF (ReversalToggle = -1) THEN
        FOR k = 1 TO n
            a = TestData(k)
            TestData(k) = ""
            FOR j = LEN(a) TO 1 STEP -1
                TestData(k) = TestData(k) + MID$(a, j, 1)
            NEXT
        NEXT
    END IF
    LoadTestFile = n
END FUNCTION

FUNCTION LoadTestData (alwayszero AS INTEGER)
    DIM n AS INTEGER
    n = alwayszero

    '''
    ' Percussive cases:
    '''
    n = n + 1: TestData(n) = "1111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111"
    n = n + 1: TestData(n) = "0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
    n = n + 1: TestData(n) = "0101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101"
    n = n + 1: TestData(n) = "1010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010"
    n = n + 1: TestData(n) = "0001110011000111001100011100110001110011000111001100011100110001110011000111001100011100110001110011000111001100011100110001110011000111001100011100110001110011"
    n = n + 1: TestData(n) = "0100010111010001011101000101110100010111010001011101000101110100010111010001011101000101110100010111010001011101000101110100010111010001011101000101110100010111"

    '''
    ' Human samples:
    '''
    ' (from Keybone)
    n = n + 1: TestData(n) = "101010101010101010101001010101010101001010111001010101010101010101010100010101010101010100101001010101001100101010001010100101010100101010100101010101010101010011011110010101010100100101010110010011001010011001010100010100101010010101010101010010101010101010010101001010101010100110010101010100101010101010011001001010100101010010101010100101010010101001010100101001010010101010111010100110011001010101010100110101001010101010100101001010111010101010101010100101001010101010010101010101001010101001010101001010100101010100101010010101010101001010101001010101010101001010101001010100101010101010010101010010010101010101010101010010100101010101001010100101001010101001111101010101010100101010110011001010101010101010110101010101101010101010100101010010101010010101010101101110010101001010101010110010100101010101001011010101010100110101010100101010010101010100101010101001010101010101001010101010011010101010101110110100101010111010101011011001011001010101001010101010101010101010011001010101010100101010101010101010010100101"
    ' (from Keybone)
    n = n + 1: TestData(n) = "0101110101100011010100101011001110001011001010001110101111010100111011100100101001010011110101101000101010001010101111001010111010101010100001010101000101101100101111101010010101110110111001000101000011010101010001001001001111101011101010100010110101110101100000101010101110111010100100100001110111100101011110101010001010001110010110111110110010101001001011101000101001011100011101000010101010101101010010110100101101000101111010101110111001010011101111010101000010101111100010101011110101011011110100001010110"
    ' (from Loudar)
    n = n + 1: TestData(n) = "1011001010010100100100110010101010101001010101010101011010010101001010101001010010100110101011010101010101011010101101010101010101010010110101010101100101010101010110101101011010010101010010100110101101001010110101011010010101101010110100101111010101010011011011010010110101010010110100101101010100101011010010101001010101010001011101011010010101011100111010010001101011110010011010001011100110101010010011010101001001010010000101010110001"
    ' (from Luke)
    n = n + 1: TestData(n) = "01100101001010001100001101101111011010010101010110110101001000001111001111110101000101111011010101111101010101101010101001010101011000010101010101001011010100110100110100110011010101010101110101010111111101011010100000001101111000010111000110111001000010100001101010110100000111101011111100001011001010110010110"
    ' (from Sarafromct)
    n = n + 1: TestData(n) = "10101010101011101000011101010111010101010101100111001010100111100001011011110101000001111010101101010000001111110011111110111101110111001110110010000100010101010101010100101011010110101010101010101001000000001111110000011110101010101010100010101110101010101101111111111111111111101010101010101000000"
    ' (from Spriggs)
    n = n + 1: TestData(n) = "10111010101010101010101001010101010101001010101001010101010101010101010101010101010101010101010101010101001010100100100101010101010101001010100101010101010100101010100101010101010101010101001010010110010101010010101010101010101010101010100101001001001010101010101010101010101001010101001001101010010"
    ' (from Spriggs)
    n = n + 1: TestData(n) = "11111011110100101011111111110100000011011110101100111100111111110111101110100111100110011111110101111111010111101111100111110111111111111011100111110111111110010000101011111001110101101010110111110"
    ' (from Hotpants)
    n = n + 1: TestData(n) = "01010100011001010010101010101010101000110101010111101010100100011010101010100100101110010010010100001010101001010101010110010001001011000100100110101001001001010000000001010101101111101001010100010101001001010101000100101001100100010011010101010101010111010010101011101011011010110100100010010100100100010010001001"
    ' (from [banned user])
    n = n + 1: TestData(n) = "11011011011101011010001011001110001101011001001111110000110111011101100100101110110001110011001100111110011011001110100000101001011010001100011100011100011100010011100101110100011011001101000111001111100110111011101110110000111011101010010010011000101100011010001111100100100011100111001100011110001"

    LoadTestData = n
END FUNCTION
You're not done when it works, you're done when it's right.

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #98 on: January 03, 2022, 03:09:25 pm »
 STxAxTIC

Thanks, it will take me a while to digest everything and decide what best fits
what I am doing right now.  You mentioned my combining the tools, but last night I
got to thinking and had the thought: what if I added a negative value as a weight?
I set M1 positive but M2 and M3 both to negative values and, guess what, I saw an
improvement right off.  After toying around I was able to get above 70%, and the
big thing was that the overall number of predicted (1's) dropped while the hit ratio improved.
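To make the negative-weight idea concrete, here is a minimal sketch in Python (not code from the program above; `blend_predict` and its inputs are invented for illustration). Each model's probability is mapped to a vote in [-1, 1] and then weighted, so a negative weight flips that model's evidence: a model that leans toward 1 now argues for 0.

```python
# Hedged sketch, not random1's actual code: blending three model
# probabilities with signed weights. Giving M2/M3 negative weights
# turns their "1" leanings into votes for 0, which is one plausible
# reading of why the predicted 1's dropped while precision rose.
def blend_predict(p1, p2, p3, w1, w2, w3, threshold=0.0):
    # Map each probability to a vote in [-1, 1], then weight it.
    score = w1 * (2 * p1 - 1) + w2 * (2 * p2 - 1) + w3 * (2 * p3 - 1)
    return 1 if score > threshold else 0

# All-positive weights: every model's lean toward 1 counts for a 1.
print(blend_predict(0.6, 0.7, 0.6, 1, 1, 1))    # 1
# Negative weights on M2/M3: the same leans now argue for 0.
print(blend_predict(0.6, 0.7, 0.6, 1, -1, -1))  # 0
```

Whether this matches the program's internals is a guess; the point is only that sign-flipped weights make the blend far more conservative about calling a 1.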

Most days after a data update I check how many values repeat from the previous
update.  I have several sets of data, but the one I use most often normally has around
35 to 40 repeats out of the 105 strings I track.  Setting the M2 and M3 weights to negative
values gave the thing a big push.  I am seeing around 35 (1's) in the output with a hit rate
close to 70%.  In one test the program came up with 38 predictions and 32 were correct.
Wow, never saw that coming.  Anyway, still tinkering with the settings; there must be millions
of ways to adjust things.  Each run takes a few minutes, so it's a slow, slow process, and I am
already thinking about how to automate it.  I have six or seven old Dell 6000s that were configured
for max performance just sitting in my closet collecting dust.  I may have to dig them out and
put them to work.  Come to think of it, a couple of them are in like-new condition and were
never used.

Anyway, mucho gracias and will let you know how it all works out.

R1

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #99 on: January 06, 2022, 12:53:57 am »
I think I got everything put together; the picture shows the results before any fine tuning
and without the weights set.  Looking good so far.
R1


https://i.postimg.cc/6q3V8W0F/RESULTS.png

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #100 on: January 06, 2022, 10:51:50 am »
STxAxTIC


Quote
I'm not sure how exactly you're optimizing your solution, but imagine a search in parameter space. Lemme freestyle this... Suppose your models are M1, M2, and M3. Then the nth digit in your sequence is predicted by each model, denoted as M1(n), M2(n), M3(n). Okay, so you talked about switching these on or off. Did you try blending them? Like this:

P(n) = w(1)*M1(n) + w(2)*M2(n) + w(3)*M3(n)

This means the total prediction of the nth bit is the weighted average of all three models, and the question becomes what the hell do we do with w(1), w(2), and w(3)? The most prudent thing possible is to imagine a sphere... actually ya know what? Let me do this in two dimensions first. Suppose for clarity you have only two models. Then we can skip the third everything:

P(n) = w(1)*M1(n) + w(2)*M2(n)

I'm still working on several methods.  I added a setting to the configuration so that I can have
more than one option; right now it allows up to 4 different methods to be used.  What I did was
add the raw data output, i.e. the probabilities from each predictor, to an array which can be
passed directly to the method set in the config utility.  This way I can test them against each
other.  I also automated the test option so that it's possible to run hundreds of tests
unsupervised.  The results are written to a text file which can then be analyzed by hand.

If you're still tinkering around with this and have some suggestions, I will try them out.  I am
in the process of adding the methods mentioned above.

R1

Picture of the config setting for selecting the blending method.  I suppose it's only limited
by the imagination, but right now it's set to allow 4 different methods.  Who knows, maybe
some kind of crazy strange code will take it over the top.  I am open to anything.
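For what it's worth, the selectable-method idea can be sketched like this in Python (all names invented; this is only a guess at the structure, not the actual config utility). The key point is that every blending method consumes the same raw probability array, so a new method just means a new entry behind a config number:

```python
# Hedged sketch of a config-selected blending method. Each method maps
# the raw per-predictor probabilities to a single 0/1 prediction.
def weighted_mean(probs, weights):
    # The forum's P(n) = w(1)*M1(n) + w(2)*M2(n) + ... idea, normalized.
    s = sum(w * p for w, p in zip(weights, probs))
    t = sum(weights)
    return 1 if t != 0 and s / t > 0.5 else 0

def majority_vote(probs, weights):
    # Ignores weights: each model just casts a +1/-1 vote.
    votes = sum(1 if p > 0.5 else -1 for p in probs)
    return 1 if votes > 0 else 0

METHODS = {1: weighted_mean, 2: majority_vote}  # config picks the key

def predict(method_id, probs, weights=(1, 1, 1)):
    return METHODS[method_id](probs, weights)

print(predict(1, [0.9, 0.4, 0.4]))  # mean 0.566... > 0.5 -> 1
print(predict(2, [0.9, 0.4, 0.4]))  # votes +1 -1 -1 -> 0
```

Two methods on the same input can disagree, which is exactly why running them against each other on logged test results is a sensible way to choose.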


https://i.postimg.cc/DZ8Yst1N/BLEND.png

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #101 on: January 07, 2022, 12:11:57 pm »
I have been tinkering with fine tuning the prediction tool and wanted to post the test results
of my last run.  So far the best hit rate I have achieved is 38 of 40 correct, a score of 95%
across 40 strings.  Most tests show mid 70's with a few 80's in the results, but I am just
getting started with the fine tuning.  Below I attached a file showing the results of 50
back-tests that I use to evaluate my configuration settings.  This is not the best, but one of
the many I have tried so far.  I also attached a picture that shows how to read the results.

Anyway, anyone interested in how the project is going can take a peek.

R1

Picture instructions on reading the results

https://i.postimg.cc/Pq4BR3Kj/TEST.png
   

The results file
« Last Edit: January 07, 2022, 12:19:04 pm by random1 »

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #102 on: January 07, 2022, 04:33:11 pm »
Best score today

Test # 76
Total Strings = 40
[P1 =10]  [Hits = 8]  [Misses = 2]
[P0 =30]  [Hits =29]  [Misses = 1]
Actual 1's = 9   H-Ratio =.8888
Actual 0's =31   H-Ratio =.9354
Overall hits=37 of 40
Predictors score =.925
1's Missed on string =24,32
0's Missed on string =23
A=0000-0001-0001-0011-0001-0010-0001-0000-0000-0011
P=0000-0001-0001-0011-0001-0001-0001-0001-0000-0011
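For anyone wanting to check the arithmetic, the posted scores can be recomputed from the A (actual) and P (predicted) rows with the dashes removed. A small Python sketch (the truncation helper mimics the 4-digit ratios in the printout; this is my reconstruction of the scoring, not random1's code):

```python
# A = actual bits, P = predicted bits from Test # 76, dashes removed.
A = "0000000100010011000100100001000000000011"
P = "0000000100010011000100010001000100000011"

def trunc4(x):
    # The printout truncates ratios to 4 digits (e.g. 8/9 -> .8888).
    return int(x * 10000) / 10000

p1 = P.count("1"); p0 = P.count("0")
hits1 = sum(1 for a, b in zip(A, P) if a == b == "1")
hits0 = sum(1 for a, b in zip(A, P) if a == b == "0")

print(p1, hits1, p1 - hits1)           # 10 8 2   -> [P1 =10] [Hits = 8] [Misses = 2]
print(p0, hits0, p0 - hits0)           # 30 29 1  -> [P0 =30] [Hits =29] [Misses = 1]
print(trunc4(hits1 / A.count("1")))    # 0.8888   (8 of 9 actual 1's)
print(trunc4(hits0 / A.count("0")))    # 0.9354   (29 of 31 actual 0's)
print((hits1 + hits0) / len(A))        # 0.925    (37 of 40 overall)
```

The numbers line up with the posted block, including the two missed 1's (strings 24 and 32) and the one missed 0 (string 23).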

Still tinkering

R1

Offline random1

  • Newbie
  • Posts: 86
    • View Profile
Re: Looking for old program or help recreating it
« Reply #103 on: January 10, 2022, 08:48:31 am »
All set up to do some serious fine tuning.  My wife walked in and laughed.
Ever seen an old nerd's man cave?  On the bright side, all seven of the
Dell Inspiron 6000s booted without a hitch.  Some had not been booted
for five or six years, yet even the batteries seem to have taken a full charge.
Anyway, I am ready to do some serious testing.

https://i.postimg.cc/prLkPy01/posted.png 
 

R1

Offline STxAxTIC

  • Library Staff
  • Forum Resident
  • Posts: 1091
  • he lives
    • View Profile
Re: Looking for old program or help recreating it
« Reply #104 on: January 10, 2022, 09:37:20 am »
Dude, that is a serious man cave. You should pull up a Matrix screensaver on each machine and recreate the setup from the movies!

Happy tuning btw!
You're not done when it works, you're done when it's right.