"/Users/sw/Dropbox/GeneralBox/Blogs/MachineLearning/FromRichard/RuleArrayBackpropagation/Backprop_1011_RuleArray.nb"
In[]:=
With[{raHeight=3,raWidth=4,rules={8,6},opts=Sequence["Radius"->1/2,"Orientation"->"Both","Neighborhood"->"Alternating","Darkness"->0.75],ruleArray=#},ICAEvolutionPlot[raHeight,raWidth,ruleArray,rules,{0,0,1,0},opts]]&@{{1,1,1,1},{1,1,1,1},{1,1,1,1}}
Out[]=
In[]:=
With[{raHeight=3,raWidth=5,rules={8,6},opts=Sequence["Radius"->1/2,"Orientation"->"Both","Neighborhood"->"Alternating","Darkness"->0.75],ruleArray=#},ICAEvolutionPlot[raHeight,raWidth,ruleArray,rules,{0,0,1,0,0},opts]]&@{{1,2,2,1,1},{1,2,2,2,1},{1,1,1,1,1}}
Out[]=
In[]:=
With[{raHeight=3,raWidth=5,rules={8,6},opts=Sequence["Radius"->1/2,"Orientation"->"Both","Neighborhood"->"Alternating","Darkness"->0.75],ruleArray=#},ICAEvolutionPlot[raHeight,raWidth,ruleArray,rules,{0,0,1,0,0},opts]]&@{{1,2,2,1,1},{1,2,2,2,1},{1,1,1,1,2}}
Out[]=
In[]:=
With[{raHeight=3,raWidth=5,rules={8,6},opts=Sequence["Radius"->1/2,"Orientation"->"Both","Neighborhood"->"Alternating","Darkness"->0.75],ruleArray=#},ICAEvolutionPlot[raHeight,raWidth,ruleArray,rules,{0,0,1,0,0},opts]]&@{{1,2,2,1,1},{1,2,2,2,1},{1,1,1,2,1}}
Out[]=
In[]:=
With[{raHeight=3,raWidth=5,rules={8,6},opts=Sequence["Radius"->1/2,"Orientation"->"Both","Neighborhood"->"Alternating","Darkness"->0.75],ruleArray=#},ICAEvolutionPlot[raHeight,raWidth,ruleArray,rules,{0,0,1,0,0},opts]]&@{{1,2,2,1,1},{1,2,2,2,1},{1,1,2,1,1}}
Out[]=
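The ICAEvolutionPlot and BitPatternLoss definitions are private to this notebook, so here is a minimal Python sketch of one plausible reading of the setup, with the assumptions stated in the comments: the rule array picks, per cell and per step, which two-neighbor (k = 2, r = 1/2) rule to apply, entry 1 selecting rule 8 (AND) and entry 2 selecting rule 6 (XOR), with the neighbor pairing alternating row by row (my guess for "Neighborhood" -> "Alternating") and cyclic boundaries. The real notebook semantics may differ.

```python
# Hypothetical toy model of the notebook's interleaved rule arrays.
# Assumptions (NOT taken from the notebook's hidden definitions):
#   rule-array entry 1 -> two-neighbor rule 8 (AND), entry 2 -> rule 6 (XOR);
#   "Alternating" neighborhood = even rows pair cell i with its right
#   neighbor, odd rows with its left neighbor; cyclic boundary.

RULES = {1: lambda p, q: p & q,   # rule 8: output 1 only on input (1,1) -> AND
         2: lambda p, q: p ^ q}   # rule 6: output 1 on (0,1) and (1,0)  -> XOR

def evolve(rule_array, init):
    """Apply one row of the rule array per step; return the final row of bits."""
    state, w = list(init), len(init)
    for t, row in enumerate(rule_array):
        if t % 2 == 0:   # even step: neighborhood (i, i+1)
            state = [RULES[row[i]](state[i], state[(i + 1) % w]) for i in range(w)]
        else:            # odd step: neighborhood (i-1, i)
            state = [RULES[row[i]](state[(i - 1) % w], state[i]) for i in range(w)]
    return state

def bit_pattern_loss(target):
    """Hamming distance of the final row to a target pattern (cf. BitPatternLoss)."""
    return lambda final: sum(a != b for a, b in zip(final, target))

# The last rule array tried above, run from the {0,0,1,0,0} initial condition:
ra = [[1, 2, 2, 1, 1], [1, 2, 2, 2, 1], [1, 1, 2, 1, 1]]
final = evolve(ra, [0, 0, 1, 0, 0])
print(final, bit_pattern_loss([0, 0, 1, 0, 0])(final))
```

Under these assumed semantics, that last array maps {0,0,1,0,0} back to {0,0,1,0,0} with zero loss; whether that matches the notebook's plots depends on whether the assumptions above are right.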
What will it take to make the lowest cell black?
In[]:=
QuietEcho[Table[
  SeedRandom[667 + i];
  With[{raHeight = 3, raWidth = 5,
    ra = {{1, 2, 2, 1, 1}, {1, 2, 2, 2, 1}, {1, 1, 2, 1, 1}},
    rules = {8, 6},
    loss = BitPatternLoss[{0, 0, 1, 0, 0}],
    opts = Sequence["Radius" -> 1/2, "Orientation" -> "Both",
      "Neighborhood" -> "Alternating", "Appearance" -> "Hexagonal",
      "MeshStyle" -> Directive[Black, Opacity[0.05]]]},
   SensitivityPlot[raHeight, raWidth, ra, rules, loss, opts,
    Mesh -> True,
    ColorFunction -> (Blend[{Blue, GrayLevel[0.95], Red}, #] &),
    "LossNormalization" -> "Polarities", "Grad" -> True, "RowShift" -> 1]],
  {i, 1}]]
Out[]=
If a bit in the array gets smaller, what does that mean for the loss?
A gradient is pinned to a particular configuration of the system: f’[x] means the derivative evaluated at the point x, and it would be different at a different point.
Λ’[a] : the derivative of the loss Λ (viewed as a function of the array), evaluated at the particular array a
When you flip a cell at position p, you get a new array a, and with it a new loss Λ.
Given a flip from XXX->XXX at position p, does this make Λ go up or down?
All possible one-cell-change transitions:
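Those transitions can be enumerated mechanically: flip each entry a[t][i] between its two rule choices and record the resulting change in Λ, a discrete analogue of the gradient Λ’[a]. A standalone Python sketch, again under assumed toy semantics (entry 1 -> two-neighbor rule 8 = AND, entry 2 -> rule 6 = XOR, neighbor pairing alternating row by row, cyclic boundary), which may differ from the notebook's hidden definitions:

```python
# Discrete "gradient": for each position p = (t, i) of the rule array,
# flip that one cell's rule choice and record the change in loss.
# Assumed toy semantics (NOT the notebook's hidden definitions): entry 1 ->
# two-neighbor rule 8 (AND), entry 2 -> rule 6 (XOR); even rows pair cell i
# with its right neighbor, odd rows with its left; cyclic boundary.

RULES = {1: lambda p, q: p & q, 2: lambda p, q: p ^ q}  # rule 8 = AND, rule 6 = XOR

def evolve(ra, init):
    state, w = list(init), len(init)
    for t, row in enumerate(ra):
        if t % 2 == 0:
            state = [RULES[row[i]](state[i], state[(i + 1) % w]) for i in range(w)]
        else:
            state = [RULES[row[i]](state[(i - 1) % w], state[i]) for i in range(w)]
    return state

def loss(ra, init, target):
    return sum(a != b for a, b in zip(evolve(ra, init), target))

def flip_deltas(ra, init, target):
    """Change in loss for every one-cell transition: flip a[t][i] 1 <-> 2."""
    base = loss(ra, init, target)
    deltas = {}
    for t in range(len(ra)):
        for i in range(len(ra[t])):
            flipped = [row[:] for row in ra]
            flipped[t][i] = 3 - flipped[t][i]   # swap 1 <-> 2
            deltas[(t, i)] = loss(flipped, init, target) - base
    return deltas

ra = [[1, 2, 2, 1, 1], [1, 2, 2, 2, 1], [1, 1, 2, 1, 1]]
deltas = flip_deltas(ra, [0, 0, 1, 0, 0], [0, 0, 1, 0, 0])
# A flip "helps" if its delta is < 0, "hurts" if > 0, and is neutral if 0.
print({p: d for p, d in deltas.items() if d != 0})
```

Since this particular array already has zero loss under the toy semantics, every one-cell flip here is neutral or loss-increasing; on an unconverged array the sign of each delta is exactly the "does Λ go up or down" question asked above.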