
Updating Weights in Neural Networks

PTho9305

Junior Member
I have been using the weight update function from Russell and Norvig's AI book:

Wj <- Wj + alpha x Err x g'(in) x Xj

Where g' is the derivative of the activation function (it can be dropped when g is a hard threshold, since the rule then reduces to the perceptron update, but it matters for differentiable activations like the sigmoid), and alpha is a learning rate that scales the size of each change. Xj is the j'th input bit. Err is the difference between the expected output and the output produced by the current weights.

(All the j's are subscripts, and alpha is the Greek letter α.)

This is for updating a single node connected to all inputs, with one output. It will converge to correct weights if the function it is learning is linearly separable.
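Under the threshold interpretation above (g' dropped), the rule can be sketched as follows. The AND training set, the bias input, and alpha = 1.0 are illustrative assumptions, not from the post:

```python
def g(x):
    # hard-threshold activation; its derivative is dropped from the update
    return 1 if x >= 0 else 0

def update(weights, inputs, expected, alpha):
    # Wj <- Wj + alpha * Err * Xj  (perceptron form of the book's rule)
    in_ = sum(w * x for w, x in zip(weights, inputs))
    err = expected - g(in_)
    return [w + alpha * err * x for w, x in zip(weights, inputs)]

# Learn logical AND (linearly separable); x[0] = 1 acts as a bias input.
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
weights = [0.0, 0.0, 0.0]
for _ in range(20):
    for x, y in data:
        weights = update(weights, x, y, alpha=1.0)

outputs = [g(sum(w * xi for w, xi in zip(weights, x))) for x, _ in data]
```

After training, `outputs` matches the AND column of the data set.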

If I have many feed-forward nodes, how do I update the weights for all of them?
 
Just update all of them in order from the output nodes back.
But you only know whether the output node is wrong. How do you tell which of the nodes feeding its inputs contributed useful values and which did not?
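The standard answer to this credit-assignment question is back-propagation: compute a delta = Err × g'(in) at the output node, then give each hidden node a share of that delta weighted by the strength of the connection it feeds, times its own g'(in). A minimal sketch with one hidden layer of sigmoid units; the XOR data set, the 2-2-1 network size, and the learning rate are illustrative assumptions:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w_hidden, w_out, x):
    # index 0 of each weight list is a bias weight (an assumption of this sketch)
    h = [sigmoid(wh[0] + sum(wi * xi for wi, xi in zip(wh[1:], x)))
         for wh in w_hidden]
    out = sigmoid(w_out[0] + sum(wo * hj for wo, hj in zip(w_out[1:], h)))
    return h, out

def train_step(w_hidden, w_out, x, y, alpha):
    h, out = forward(w_hidden, w_out, x)
    # output delta: Err * g'(in); for the sigmoid, g'(in) = out * (1 - out)
    d_out = (y - out) * out * (1 - out)
    # each hidden node's blame: the output delta, weighted by the
    # connection it feeds, times its own g'(in)
    d_h = [d_out * w_out[j + 1] * h[j] * (1 - h[j]) for j in range(len(h))]
    # every weight then gets the familiar Wj <- Wj + alpha * delta * Xj
    w_out[0] += alpha * d_out
    for j, hj in enumerate(h):
        w_out[j + 1] += alpha * d_out * hj
    for j, wh in enumerate(w_hidden):
        wh[0] += alpha * d_h[j]
        for i, xi in enumerate(x):
            wh[i + 1] += alpha * d_h[j] * xi

def total_error(w_hidden, w_out, data):
    return sum((y - forward(w_hidden, w_out, x)[1]) ** 2 for x, y in data)

random.seed(1)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

err_before = total_error(w_hidden, w_out, data)
for _ in range(5000):
    for x, y in data:
        train_step(w_hidden, w_out, x, y, alpha=0.5)
err_after = total_error(w_hidden, w_out, data)
```

XOR is not linearly separable, so the single-node rule alone cannot learn it; the hidden deltas are what let the inner weights move in a useful direction.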
 