Abstract:
In most neural network applications, the network outputs are required to be binary, i.e. to correspond to a vertex of the output hypercube. It is shown how, in the presence of positive self-feedback, binary outputs can be guaranteed even with a finite sigmoid slope or with an asymmetric connection matrix. An expression is derived that gives a lower bound on the sigmoid slope ensuring that equilibrium points corresponding to nonbinary solutions are unstable.
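The mechanism can be illustrated with a minimal simulation sketch. The choices below are assumptions for illustration only: tanh is used as the sigmoid, the connection matrix is a single-pattern Hebbian outer product, and the self-feedback gain `alpha` and slope `beta` are arbitrary values chosen so that `beta` lies well above `1/alpha`, the slope at which an isolated unit's nonbinary equilibrium at v = 0 becomes linearly unstable; the paper's exact lower-bound expression is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
p = rng.choice([-1.0, 1.0], size=n)   # a stored binary (+/-1) pattern
W = np.outer(p, p) / n                # illustrative Hebbian connection matrix
np.fill_diagonal(W, 0.0)              # self-coupling is modeled separately below

alpha = 1.0   # positive self-feedback gain (assumed value)
beta = 5.0    # sigmoid slope, chosen well above 1/alpha so that the
              # nonbinary equilibrium at v = 0 is linearly unstable

u = rng.normal(size=n)                # random initial internal state
dt = 0.05
for _ in range(2000):
    v = np.tanh(beta * u)             # sigmoid output in (-1, 1)
    # Continuous-time dynamics with explicit positive self-feedback:
    #   du/dt = -u + W v + alpha * v
    u += dt * (-u + W @ v + alpha * v)

v = np.tanh(beta * u)
print(np.all(np.abs(v) > 0.9))        # outputs have settled near a hypercube vertex
```

With `beta` below `1/alpha` and no pattern input, the same dynamics would admit a stable equilibrium at v = 0, i.e. a nonbinary output; the self-feedback term combined with a sufficiently steep sigmoid is what drives every component to saturation.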