Suppose that the output of a network is a linear transformation of its input plus some output noise, i.e., $y = Wx + n$, and both the input $x$ and the noise $n$ are Gaussian signals. Without loss of generality, suppose that the network capacity is 1.
Optimal information transmission requires choosing the transformation which maximizes the information of the output about the input while keeping the total output variance equal to the capacity. This is easily solved by the method of Lagrange multipliers, through maximizing
$$E = I(x; y) - \lambda\left(\mathrm{tr}\,\langle y y^T\rangle - 1\right),$$
in which the first term is the information of the output about the input and $\lambda$ in the second term is the Lagrange multiplier. The derivative of $E$ with respect to the output covariance $\langle y y^T\rangle$ at the maximum is
$$\frac{1}{2}\langle y y^T\rangle^{-1} - \lambda\,\mathbf{1} = 0.$$
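The step from the information term to this stationarity condition can be spelled out as follows. This is a reconstruction under the standard Gaussian-channel assumptions; the shorthand $Q = \langle y y^T\rangle$ for the output covariance is ours:

```latex
% For jointly Gaussian input and noise, the mutual information is a
% log-determinant (the noise term does not depend on the transformation):
I(x;y) \;=\; \tfrac{1}{2}\ln\det Q \;-\; \tfrac{1}{2}\ln\det\langle n n^T\rangle,
\qquad Q = \langle y y^T\rangle .

% The objective with the capacity constraint is then
E \;=\; \tfrac{1}{2}\ln\det Q \;-\; \lambda\left(\mathrm{tr}\,Q - 1\right) + \text{const}.

% Using  d(\ln\det Q)/dQ = Q^{-1}  and  d(\mathrm{tr}\,Q)/dQ = \mathbf{1}:
\frac{\partial E}{\partial Q}
 \;=\; \tfrac{1}{2} Q^{-1} - \lambda\,\mathbf{1} \;=\; 0
\quad\Longrightarrow\quad
Q \;=\; \frac{1}{2\lambda}\,\mathbf{1}.
```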
The solution is $\langle y y^T\rangle = \frac{1}{2\lambda}\mathbf{1}$ and, with the capacity constraint, $\langle y_i y_j\rangle = \delta_{ij}/N$ for $N$ output units, i.e., the output activities are decorrelated.
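This optimality of the decorrelated solution can be checked numerically. The sketch below is our own illustration (not code from the paper): for Gaussian outputs the information term grows with $\ln\det$ of the output covariance, and among all covariances with fixed total variance $\mathrm{tr}\,Q = 1$, the decorrelated (isotropic) one maximizes it.

```python
import numpy as np

# Among Gaussian output covariances Q with fixed total variance tr(Q) = 1,
# the decorrelated Q = I/N maximizes ln det Q (the information term).
rng = np.random.default_rng(0)
N = 4

Q_decorr = np.eye(N) / N          # decorrelated outputs, tr(Q) = 1


def random_cov_unit_trace(rng, N):
    """A random correlated covariance, normalized so that tr(Q) = 1."""
    A = rng.standard_normal((N, N))
    Q = A @ A.T
    return Q / np.trace(Q)


# ln det of the decorrelated covariance vs. the best of 1000 correlated ones
logdet_decorr = np.linalg.slogdet(Q_decorr)[1]
logdet_best_random = max(np.linalg.slogdet(random_cov_unit_trace(rng, N))[1]
                         for _ in range(1000))
print(logdet_decorr >= logdet_best_random)  # → True
```

By concavity of $\ln\det$ (or the AM-GM inequality on the eigenvalues), equality holds only when $Q$ is isotropic, so the random correlated covariances always lose.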
The dynamics of the feedforward connections is very similar to the feedback case. We will not go into the details, since our emphasis is on the feedback path. The dynamical stability and information optimization still hold. It is not difficult to show that the two functions shown earlier are still the Lyapunov function and the Lagrangian function, respectively, for the joint dynamics of the feedforward and feedback connections. The decorrelation of the output activities is still the stable solution.
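To illustrate how such connection dynamics can settle on the decorrelated solution, here is a minimal sketch of a generic decorrelating weight update. This is our own construction, not the paper's actual feedforward or feedback dynamics; its stable fixed point has $\langle y_i y_j\rangle = \delta_{ij}/N$, matching the optimum derived above.

```python
import numpy as np

# A generic decorrelating update for y = W x:
#     W <- W + eta * (I/N - <y y^T>) W
# Its fixed point satisfies W C W^T = I/N, i.e. decorrelated outputs with
# total variance 1 (the capacity).  Not the paper's rule; an illustration.
rng = np.random.default_rng(1)
N = 4
A = rng.standard_normal((N, N))
C = A @ A.T + N * np.eye(N)       # a correlated input covariance <x x^T>

W = 0.1 * rng.standard_normal((N, N))
eta = 0.05
for _ in range(5000):
    Q = W @ C @ W.T               # output covariance <y y^T>
    W += eta * (np.eye(N) / N - Q) @ W

print(np.allclose(W @ C @ W.T, np.eye(N) / N, atol=1e-6))  # → True
```

For small learning rates the eigenvalues $q$ of the output covariance follow $\Delta q \approx 2\eta\, q\,(1/N - q)$, which converges monotonically to $1/N$ from any positive start, so the decorrelated covariance is indeed the stable solution of this flow.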