General MLP NN has different structure on each layer
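As a minimal sketch of this, a NetChain MLP where every layer has a different size and a different activation function (the sizes and activations here are illustrative, not from the notebook):

```wolfram
(* An ordinary MLP: each layer has its own structure *)
net = NetChain[{
    LinearLayer[32], ElementwiseLayer[Ramp],
    LinearLayer[16], ElementwiseLayer[Tanh],
    LinearLayer[1]},
   "Input" -> 8]
```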
Can we make a CA-like NN?
ConvolutionLayer[]
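A convolution applies the same small kernel at every position, which is exactly the CA pattern of a shared local rule acting on a neighborhood. A sketch, assuming a 1D single-channel signal of length 16:

```wolfram
(* One CA-like update: a size-3 neighborhood rule shared across
   all positions, followed by a pointwise nonlinearity *)
layer = NetChain[{
    ConvolutionLayer[1, {3}, "PaddingSize" -> 1],
    ElementwiseLayer[Ramp]},
   "Input" -> {1, 16}]
```

The "PaddingSize" -> 1 option keeps the output the same length as the input, so the layer can be applied again to its own output.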
Want a neural net layer where the data is repeatedly passed through the same layer...
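One way to get this repeated application is simply to Nest the net as a function, since a shape-preserving layer can be fed its own output. A sketch (untrained, randomly initialized weights):

```wolfram
(* Feed the data back through one fixed layer;
   NestList collects the state after each pass *)
step = NetInitialize@NetChain[{
     ConvolutionLayer[1, {3}, "PaddingSize" -> 1],
     ElementwiseLayer[Ramp]},
    "Input" -> {1, 16}];
states = NestList[step, RandomReal[1, {1, 16}], 5];
```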
ICCA: inhomogeneous continuous CA
E.g. ramp + antiramp [i.e. change the activation function from cell to cell]
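A sketch of one ICCA step outside the neural-net framework: a shared linear neighborhood rule, but a per-cell activation alternating between ramp and antiramp. Here "antiramp" is assumed to mean Min[x, 0], the reflection of Ramp, and the kernel weights are illustrative:

```wolfram
(* Inhomogeneous continuous CA: same neighborhood rule everywhere,
   different activation at each cell (antiramp assumed = Min[x, 0]) *)
ramp[x_] := Max[x, 0];
antiramp[x_] := Min[x, 0];
acts = Table[If[EvenQ[i], ramp, antiramp], {i, 16}];
iccaStep[v_] := MapThread[#1[#2] &,
   {acts, ListConvolve[{0.5, -1., 0.5}, v, {2, 2}]}];
history = NestList[iccaStep, RandomReal[{-1, 1}, 16], 20];
```

ListConvolve with the {2, 2} alignment gives a cyclic (periodic-boundary) update of the same length, so the step can be nested indefinitely.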