Author Topic: Artificial neural network and back propagation


Offline Dimster

  • Forum Resident
  • Posts: 500
Re: Artificial neural network and back propagation
« Reply #30 on: April 27, 2021, 11:36:29 am »
Thanks for raising this topic - working with neural nets in layman's language is hard when calculus isn't a strong suit in the arsenal of knowledge. I have spent more time reading and sketching out a neural net than writing code, but that said, I have over 25,000 lines of code in my present AI program. I have been working on Back Propagation, and in doing so I found that my layers (or Grids, as you have them in your program) needed to retain a lot more than just a value. To Back Propagate, I need the layers to remember every element that was used to compose the value. Likely overkill. I was also looking at the layers as sequentially triggered, like an RNN, but once again I found that in Back Propagation I need to simply turn off a layer, or somehow skip right over it, if my error code is strong enough to isolate only the problematic layer(s).
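In QB64 terms, "remember every element" usually amounts to keeping the pre-activation sum alongside the activation for each neuron. A minimal sketch of that bookkeeping (the names HSUM, HOUT, W and INVAL are mine, not from either of our programs):

    DIM HSUM(1 TO 10, 1 TO 10) ' weighted sums, saved before activation
    DIM HOUT(1 TO 10, 1 TO 10) ' activations, saved after the sigmoid
    DIM W(1 TO 10, 1 TO 10) ' one weight per neuron (simplified)
    DIM INVAL(1 TO 10, 1 TO 10) ' incoming values

    FOR x = 1 TO 10
        FOR y = 1 TO 10
            HSUM(x, y) = INVAL(x, y) * W(x, y) ' keep the raw sum for the backward pass
            HOUT(x, y) = 1 / (1 + EXP(-HSUM(x, y))) ' keep the activation as well
        NEXT
    NEXT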

But where I have struggled with Back Propagation is that damn Sigmoid Function. Or more to the point, trying to back it out of my backward passes through the network. At the moment my plan is to back propagate from just before the application of the Sigmoid Function (by the way, I'm using "Sigmoid Function" to mean the activation function). That way I don't need to deal with it. My thinking is: if the Sigmoid is the ultimate activation (i.e. it triggers error, triggers learning, triggers the final answer), then why not just avoid it altogether rather than try to reverse engineer its effect on the data that feeds it?

If I'm reading your program correctly, HGRID2(x, y) = 1 / (1 + EXP(-HGRID2(x, y))) is the Sigmoid, and it is being backed out by HGRID2E(y, x) = HGRID2E(y, x) * (1 - HGRID2E(y, x)) as the inverse. I was looking at a mighty complex formula for the Logit Function, which I thought was the way to back out a Sigmoid Function. Another calculus function to "layman it". As soon as I read that the Logit Function was the inverse of the Sigmoid and scanned its formula, I grabbed a few beers and watched some baseball. Is the inverse you are using in fact this Logit Function I have been reading about?
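For what it's worth, those two formulas do different jobs: s * (1 - s) is the Sigmoid's derivative evaluated at its own output, which is what the backward pass multiplies by, while the Logit, LOG(s / (1 - s)), is the true inverse that recovers the pre-activation value. A side-by-side sketch (the function names are mine, not code from brain.bas):

    z = 2
    s = Sigmoid(z)
    PRINT "sigmoid:"; s ' squashes z into the range (0, 1)
    PRINT "derivative:"; SigDeriv(s) ' s * (1 - s), what back propagation uses
    PRINT "logit:"; Logit(s) ' the true inverse, recovers z = 2

    FUNCTION Sigmoid (z)
        Sigmoid = 1 / (1 + EXP(-z))
    END FUNCTION

    FUNCTION SigDeriv (s) ' takes the Sigmoid's OUTPUT, not its input
        SigDeriv = s * (1 - s)
    END FUNCTION

    FUNCTION Logit (s) ' LOG is the natural log in QB64
        Logit = LOG(s / (1 - s))
    END FUNCTION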

I know the Sigmoid is just a method to smooth out results, squashing whatever value feeds it into the range 0 to 1 along a smooth curve, but it is my Achilles heel to picture the math behind the formula.


Offline Dimster

  • Forum Resident
  • Posts: 500
Re: Artificial neural network and back propagation
« Reply #32 on: April 30, 2021, 10:37:33 am »
Jazz Man, first let me apologize in advance for long gaps in any future replies. I am involved in a project which will take me out of town for various periods of time. Your project here is a great one to work on, as the input data is solid, unlike my AI, which takes in data whose accuracy needs to be constantly checked. Also, it sometimes takes me a while to research the calculus formulas, conceptualize their meaning and then Laymanize them into QB64 jargon. I find I often need to tinker with my formulas.

I have been trying to run your program but can't get it going yet. It will not locate and load the supporting files, which seems to come down to the way I have written the path to those files. I will sort that out.

Is the relationship between your two grids TIME? For example, is Grid 1 producing the relevant material data before the race is run, and Grid 2 projecting the outcome at the end of the race? In which case I am imagining that the Sigmoid on Grid 1 is refining the weights for input into Grid 2? Actually, I "think" it's more complex than that - more like neuron (X1, Y1) in Grid 1 producing relevant and material data for every neuron (X1...Xn, Y1...Yn) in Grid 2. And is it that every one of those outputs to Grid 2 is evaluated for activation by the Sigmoid function, or will the Sigmoid function determine that neuron (X1, Y1) in Grid 1 has a weight of zero and therefore pass nothing?
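In the usual fully connected layout, each Grid 2 neuron sums every weighted Grid 1 output first, and the Sigmoid then fires once on that sum rather than once per link. A sketch under that assumption (all names are made up, not taken from brain.bas):

    CONST N = 4 ' each grid is N x N
    DIM GRID1(1 TO N, 1 TO N), GRID2(1 TO N, 1 TO N)
    DIM W12(1 TO N, 1 TO N, 1 TO N, 1 TO N) ' weight from (i, j) in Grid 1 to (x, y) in Grid 2

    FOR x = 1 TO N
        FOR y = 1 TO N
            s = 0
            FOR i = 1 TO N
                FOR j = 1 TO N
                    s = s + GRID1(i, j) * W12(i, j, x, y) ' a zero weight contributes nothing
                NEXT
            NEXT
            GRID2(x, y) = 1 / (1 + EXP(-s)) ' sigmoid applied to the whole sum
        NEXT
    NEXT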

In my approach to back propagation, the backward adjustment of a neuron (or the neuron's weight) is made to those neurons which have the largest impact (similar to the Delta Rule). So neuron (X1, Y1) in Grid 1 offering zero would not be adjusted. I do recognize that learning rates and error codes need to be contended with in back propagation, but is this basically the approach you are taking?
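A bare-bones Delta Rule update for one output neuron looks like this sketch (toy numbers, names my own): the (target - predicted) error is scaled by each input, so a link whose input was zero gets a zero adjustment, exactly as described.

    CONST N = 5
    CONST LEARNRATE = .1
    DIM xin(1 TO N), w(1 TO N)
    FOR i = 1 TO N: xin(i) = i / N: w(i) = .5: NEXT ' toy inputs and starting weights

    predicted = 0
    FOR i = 1 TO N: predicted = predicted + xin(i) * w(i): NEXT ' simple forward pass
    target = 1

    FOR i = 1 TO N
        w(i) = w(i) + LEARNRATE * (target - predicted) * xin(i) ' zero input => no change
    NEXT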



Offline Dimster

  • Forum Resident
  • Posts: 500
Re: Artificial neural network and back propagation
« Reply #34 on: May 01, 2021, 11:00:10 am »
I'm hung up on line 125: Open "C:\HORSE\TESTDATA\2.TXT" For Input As #21.

I can't seem to find this text file. Are you able to send it to me again? Or could it be that I'm not looking hard enough and it is within one of the three text files you have already provided?


Offline SMcNeill

  • QB64 Developer
  • Forum Resident
  • Posts: 3972
Re: Artificial neural network and back propagation
« Reply #35 on: May 01, 2021, 02:04:37 pm »
Do you have a directory "C:\HORSE\..."??

Seems to me that you might need to change that path to point to a folder on your own drive.
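For instance, guarding the OPEN with QB64's _FILEEXISTS makes the failure obvious (the path below is the original one; point it at wherever the file really lives):

    file$ = "C:\HORSE\TESTDATA\2.TXT"
    IF _FILEEXISTS(file$) THEN
        OPEN file$ FOR INPUT AS #21
    ELSE
        PRINT "Cannot find "; file$; " - check the path."
        END
    END IF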
https://github.com/SteveMcNeill/Steve64 — A github collection of all things Steve!

Offline Dimster

  • Forum Resident
  • Posts: 500
Re: Artificial neural network and back propagation
« Reply #36 on: May 01, 2021, 04:10:54 pm »
Hi Steve - ya, the brain.bas and the TXT files are on my D drive, and I have altered the path in Jazz's code to load from my D drive. They all seem to load OK, but when I run the program it can't find the file on line 125. I have been messing with backslashes and forward slashes and the spaces within the names in the paths, with no luck. The only thing that comes to me is that I'm missing the test data somehow, which could mean I need to search line by line through the files I have to see if I've done something to the data itself, like downloading only part of a file instead of the whole thing.


Offline Dimster

  • Forum Resident
  • Posts: 500
Re: Artificial neural network and back propagation
« Reply #38 on: May 04, 2021, 04:42:10 pm »
Thanks Jazz - I'm up and running. Just to clarify some of the terminology (a skeleton of how I picture these counters interacting follows the list):
- TRAINING COUNT: does this just keep count of the number of times the training data has gone through the network, or will it count until Predicted = Actual?
- ERROR LEVEL: is this a measure of the impact the input errors had on the output? For example, 10 input errors contributed to the output being off by 60% of the expected output for that Grid?
- TOTAL NETWORK ERRORS FOR THE CYCLE: would this be an integer counting errors by each neuron, or by each grid? And is a "Cycle = 1 Race Run", which is probably the same as 1 Prediction?
- ACTUAL POSITION: this is the target value being solved for - meaning the output after the network training (the Predicted value) is expected to equal Actual Position. Is this value also used in the error and learning/training (i.e. if Predicted Position does NOT equal Actual Position then ...)?
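The skeleton (my guesses at the meanings, with RND standing in for the real forward pass; none of it is code from brain.bas):

    CONST MAXEPOCHS = 10000
    numRaces = 10
    trainingCount = 0
    DO
        trainingCount = trainingCount + 1 ' one pass of the training data through the network
        totalNetworkError = 0
        FOR race = 1 TO numRaces ' one cycle = one race = one prediction
            predicted = RND: actual = 1 ' stand-ins for the real forward pass
            errorLevel = actual - predicted ' per-race error against the target
            totalNetworkError = totalNetworkError + ABS(errorLevel)
        NEXT
    LOOP UNTIL totalNetworkError < .01 OR trainingCount = MAXEPOCHS
    PRINT "stopped after"; trainingCount; "passes"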

There is also the term "Bias Weights". In discussion with the gang on this forum about bias and weights, the feeling was that these were interchangeable terms. That does seem to work in many AI algorithms, but I'm not sure how you mean it here in brain.bas (i.e. a bias applied to the weight, or bias and/or weight).
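The common reading (my assumption, not something I can confirm from brain.bas) is that a bias weight is just one more weight whose input is fixed at 1, so it shifts the neuron's sum without depending on the data:

    CONST N = 3
    DIM xin(1 TO N), w(1 TO N)
    xin(1) = .2: xin(2) = .5: xin(3) = .9
    w(1) = .4: w(2) = -.3: w(3) = .7
    bias = -.5 ' learned like any other weight; its "input" is always 1

    s = bias
    FOR i = 1 TO N: s = s + xin(i) * w(i): NEXT
    activation = 1 / (1 + EXP(-s))
    PRINT activation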

What a fascinating program you have here. I've filled my whiteboard about a dozen times now trying to follow the flow of the data, to see where the errors are being caught and which weights are the culprits.

I did want to ask you about the beginning values of each acceptor. In the AI program I'm writing, I track 50 Events on a weekly basis, numbered 1 to 50. The beginning value I use makes each Event equal (1/50 = .02). To this beginning value I then apply a weight, so the value going into the neuron is .02 * weight. Are your acceptors starting with an equal value (meaning that if they are equal at the start of the race, any acceptor could be in the money), or is it a calculated value (meaning a number of factors form the beginning value before the weight is applied, because we already know whether the horse was in the money or not)?
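My scheme, as a sketch (the names are mine): every Event starts from the same base value, and only the weight differentiates them.

    CONST NEVENTS = 50
    DIM eventVal(1 TO NEVENTS), weight(1 TO NEVENTS)
    startVal = 1 / NEVENTS ' = .02, every Event equal before weighting
    FOR i = 1 TO NEVENTS
        weight(i) = 1 ' stand-in for the learned weight
        eventVal(i) = startVal * weight(i)
    NEXT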


