Connection
Represents a single connection between two neurons in a CyclicNetwork.
CyclicNetwork
A neural network class that represents a network with recurrent (cyclic) connections. Recurrent connections are handled by each neuron storing two values: a pre-activation value (InputValue) and a post-activation value (OutputValue). This allows the output value for the current iteration/timestep to be calculated without modifying the output values from the previous iteration; that is, all of this timestep's state is calculated from the previous timestep's state. When a network of this class is activated, its state is updated for a fixed number of timesteps, specified by the maxIterations parameter on the constructor. See RelaxingCyclicNetwork for an alternative activation scheme.

Note that this class is provided for debugging and educational purposes: FastCyclicNetwork is functionally equivalent and much faster, and should therefore be used instead of CyclicNetwork in most circumstances.
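To make the two-phase update concrete, here is a minimal sketch of the scheme, including plausible shapes for the Neuron and Connection classes described elsewhere in this list. All names and details are illustrative assumptions rather than SharpNEAT's actual API; in particular, the real class keeps input and bias neuron outputs fixed, which this sketch omits for brevity.

```csharp
using System;
using System.Collections.Generic;

class Neuron
{
    public double InputValue;   // pre-activation sum, accumulated this timestep
    public double OutputValue;  // post-activation value from the previous timestep
}

class Connection
{
    public Neuron Source;
    public Neuron Target;
    public double Weight;
}

class CyclicNetworkSketch
{
    readonly List<Neuron> _neurons;
    readonly List<Connection> _connections;
    readonly int _maxIterations;

    public CyclicNetworkSketch(List<Neuron> neurons, List<Connection> connections, int maxIterations)
    {
        _neurons = neurons;
        _connections = connections;
        _maxIterations = maxIterations;
    }

    public void Activate()
    {
        for (int t = 0; t < _maxIterations; t++)
        {
            // Collect this timestep's pre-activation sums, reading only the
            // output values left over from the previous timestep.
            foreach (Connection c in _connections)
                c.Target.InputValue += c.Source.OutputValue * c.Weight;

            // Only now overwrite the outputs, so two timesteps never mix.
            foreach (Neuron n in _neurons)
            {
                n.OutputValue = 1.0 / (1.0 + Math.Exp(-n.InputValue)); // logistic
                n.InputValue = 0.0; // reset the accumulator for the next timestep
            }
        }
    }
}
```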
FastAcyclicNetwork
A neural network implementation for acyclic networks. Activation of an acyclic network can be far more efficient than that of a cyclic network, because the network can be activated by propagating a signal 'wave' from the input nodes through each depth layer to the output nodes. Each node therefore needs to be activated at most once, whereas in a cyclic network each node must be activated multiple times and a scheme is needed for deciding when to stop activating.

Algorithm overview:
1) Each node is assigned a depth based on how many connection hops it is from an input node. Where multiple paths to a node exist, the longest path determines the node's depth.
2) Each connection is similarly assigned a depth, defined as the depth of its source node. (Steps 1 and 2 are actually performed by FastAcyclicNetworkFactory.)
3) All node activation values are reset to zero, clearing any state from a previous activation.
4) Each layer of the network is then activated in turn to propagate the signals on the input nodes through the network. Input nodes do not apply an activation function, so activation starts with the connections at the first layer (depth == 0); this accumulates pre-activation signals on their target nodes, which can lie anywhere from depth 1 to the deepest layer. The node activation function is then applied to all nodes at depth 1, since no further signals can arrive at those nodes. This is repeated for each remaining layer in turn.
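A hedged sketch of this layered pass, assuming flat arrays sorted by depth and per-layer end indexes of the kind the factory would compute; all names here are illustrative, not the actual implementation.

```csharp
using System;

class FastAcyclicNetworkSketch
{
    // Nodes are ordered by depth; connections are ordered by source-node depth.
    double[] _preActivation;   // accumulated input signal per node
    double[] _postActivation;  // output signal per node
    int[] _srcIdx, _tgtIdx;    // per-connection source/target node indexes
    double[] _weights;         // per-connection weights
    // Per depth level d: one past the last connection sourced at depth d, and
    // one past the last node whose inputs are complete once those connections
    // (and all earlier ones) have been processed.
    int[] _layerEndConnIdx, _layerEndNodeIdx;
    int _inputCount;

    public void Activate(double[] inputs)
    {
        // Reset state from any previous activation, then load the inputs
        // (input nodes pass their values through unchanged).
        Array.Clear(_preActivation, 0, _preActivation.Length);
        Array.Copy(inputs, _postActivation, _inputCount);

        int c = 0, n = _inputCount;
        for (int d = 0; d < _layerEndConnIdx.Length; d++)
        {
            // Propagate signals over the connections sourced in this layer.
            for (; c < _layerEndConnIdx[d]; c++)
                _preActivation[_tgtIdx[c]] += _postActivation[_srcIdx[c]] * _weights[c];

            // These nodes can receive no further signals; apply the activation
            // function (logistic, for illustration) to them.
            for (; n < _layerEndNodeIdx[d]; n++)
                _postActivation[n] = 1.0 / (1.0 + Math.Exp(-_preActivation[n]));
        }
    }
}
```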
FastConnection
Working data struct for use in FastCyclicNetwork and sub-classes. Represents a single connection: its weight and source/target neurons.
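A plausible shape for the struct, with assumed field names; the OutputValue field corresponds to the per-connection output value described under FastCyclicNetwork below.

```csharp
struct FastConnection
{
    public int SrcNeuronIdx;    // index of the connection's source neuron
    public int TgtNeuronIdx;    // index of the connection's target neuron
    public double Weight;       // connection weight
    public double OutputValue;  // weighted signal computed in the current pass
}
```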
FastCyclicNetwork
A neural network class that represents a network with recurrent (cyclic) connections. This is a much faster implementation than CyclicNetwork; the speedup is approximately 5x, depending on hardware and CLR platform (see http://sharpneat.sourceforge.net/network_optimization.html for detailed info).

The speedup is achieved by compactly storing all required data in arrays, laid out to maximize in-order memory accesses and thus make the best use of CPU caches. In contrast, the CyclicNetwork class represents the network directly, as a graph of neuron/node objects. This carries additional overhead, such as the standard per-object data associated with each object in .NET, which results in less efficient packing of the true neural net data in memory and, in turn, less efficient use of CPU memory caches. Finally, representing the network directly as a graph of connected nodes is not conducive to writing code with in-order memory accesses.

Algorithm overview:
1) Loop over the connections. Each connection reads its input signal from its source neuron, applies its weight, and stores its output value. Connections are ordered by source neuron index, so all memory accesses in this loop are sequential/in-order.
2) Loop over the connections again. Each connection adds its output value to its target neuron, so each neuron accumulates, or 'collects', its input signal in its pre-activation variable. Because connections are sorted by source neuron index rather than target index, this loop generates out-of-order memory accesses, but it is the only loop that does.
3) Loop over the neurons. Pass each neuron's pre-activation signal through the activation function and set its post-activation signal value. The activation loop is now complete, and we can either go back to (1) or stop.
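A sketch of the three loops under assumed names; the real class additionally maps inputs and outputs and keeps bias and input neuron values fixed, which is omitted here.

```csharp
using System;

class FastCyclicNetworkSketch
{
    // Connection data, sorted by source neuron index.
    int[] _srcIdx, _tgtIdx;
    double[] _weights, _conOutputs;
    // Neuron data.
    double[] _preActivation, _postActivation;
    int _timestepsPerActivation;

    public void Activate()
    {
        for (int t = 0; t < _timestepsPerActivation; t++)
        {
            // 1) Each connection reads its source neuron's output and applies
            //    its weight. Sorted by source index, so reads are in-order.
            for (int i = 0; i < _weights.Length; i++)
                _conOutputs[i] = _postActivation[_srcIdx[i]] * _weights[i];

            // 2) Each connection adds its output to its target neuron's
            //    pre-activation accumulator. Target indexes are unsorted, so
            //    these writes are the one source of out-of-order accesses.
            for (int i = 0; i < _weights.Length; i++)
                _preActivation[_tgtIdx[i]] += _conOutputs[i];

            // 3) Pass each pre-activation sum through the activation function
            //    (logistic, for illustration) and reset the accumulator.
            for (int j = 0; j < _preActivation.Length; j++)
            {
                _postActivation[j] = 1.0 / (1.0 + Math.Exp(-_preActivation[j]));
                _preActivation[j] = 0.0;
            }
        }
    }
}
```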
FastRelaxingCyclicNetwork
A version of FastCyclicNetwork that activates a network until it becomes 'relaxed', rather than for some fixed number of iterations. This class is exactly the same as FastCyclicNetwork in all other respects; see that class for more detailed info. A network is defined as being relaxed when the change in output signal value between two successive update iterations is less than some threshold value (defined by the maxAllowedSignalDelta parameter on the constructor) for all hidden and output neurons (input and bias neurons have a fixed output value).
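The relaxation test itself is straightforward. The sketch below (illustrative names, logistic activation assumed) runs one timestep's neuron-update loop and reports whether every neuron stayed within the threshold; the same test applies to RelaxingCyclicNetwork further down. In practice a fixed upper bound on iterations would typically still be applied, in case the network never relaxes.

```csharp
using System;

static class RelaxationSketch
{
    // Applies one timestep's activation-function pass over the non-input
    // neurons and returns true if no output moved by more than the threshold.
    public static bool ActivateOneTimestepAndTestRelaxed(
        double[] preActivation, double[] postActivation,
        int firstNonInputIdx, double maxAllowedSignalDelta)
    {
        bool isRelaxed = true;
        for (int j = firstNonInputIdx; j < postActivation.Length; j++)
        {
            double newOutput = 1.0 / (1.0 + Math.Exp(-preActivation[j]));
            if (Math.Abs(newOutput - postActivation[j]) > maxAllowedSignalDelta)
                isRelaxed = false; // this neuron is still changing
            postActivation[j] = newOutput;
        }
        return isRelaxed;
    }
}
```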
LayerInfo
Stores a node index and a connection index that together represent a layer within the network, i.e. the nodes and connections at a given depth.
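One plausible shape for this struct (field names are assumptions), matching the per-layer end indexes used in the FastAcyclicNetwork sketch above; the factory would populate one such entry per depth level.

```csharp
// Both indexes are exclusive end positions into the network's depth-sorted
// node and connection arrays.
struct LayerInfo
{
    public int EndNodeIdx;       // one past the last node in this layer
    public int EndConnectionIdx; // one past the last connection sourced in this layer
}
```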
Neuron
Represents a single neuron in a CyclicNetwork.
RelaxingCyclicNetwork
A version of CyclicNetwork that activates a network until it becomes 'relaxed', rather than for some fixed number of iterations. This class is exactly the same as CyclicNetwork in all other respects; see that class for more detailed info. A network is defined as being relaxed when the change in output signal value between two successive update iterations is less than some threshold value (defined by the maxAllowedSignalDelta parameter on the constructor) for all hidden and output neurons (input and bias neurons have a fixed output value).