Issue No. 02 - March/April (2001 vol. 13)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/69.917555
<p><b>Abstract</b>—Recently, a number of authors have explored the use of recursive neural nets (RNN) for the adaptive processing of trees or tree-like structures. One of the most important language-theoretical formalizations of the processing of tree-structured data is that of deterministic finite-state tree automata (DFSTA). DFSTA may easily be realized as RNN using discrete-state units, such as the threshold linear unit. A recent result by Šíma (<it>Neural Network World</it> <b>7</b> (1997), pp. 679–686) shows that any threshold linear unit operating on binary inputs can be implemented in an analog unit using a continuous activation function and bounded real inputs. The constructive proof finds a scaling factor for the weights and reestimates the bias accordingly. In this paper, we explore the application of this result to simulate DFSTA in sigmoid RNN (that is, analog RNN using monotonically growing activation functions) and also present an alternative scheme for <it>one-hot</it> encoding of the input that yields smaller weight values and, therefore, works at a lower saturation level.</p>
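<p>The weight-scaling idea summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: it assumes a logistic sigmoid as the "monotonically growing activation function", a hypothetical scaling factor <it>H</it>, and a bias chosen so the weighted sum of any binary input is bounded away from zero (here by 0.5). Under those assumptions, scaling the weights and bias by a sufficiently large <it>H</it> drives the analog output arbitrarily close to the threshold unit's binary output.</p>

```python
import math

def threshold_unit(w, b, x):
    # Discrete-state unit on binary inputs: 1 if the weighted sum is positive, else 0.
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s > 0 else 0

def sigmoid_unit(w, b, x, H=20.0):
    # Analog unit with the same weights and bias scaled by H (an illustrative
    # value, not the factor derived in the paper). Because |w.x + b| >= 0.5
    # for every binary x with this bias, the output saturates toward 0 or 1.
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-H * s))

# Example: an AND-like threshold unit. The bias -1.5 keeps the weighted sum
# at least 0.5 away from zero for all four binary input pairs.
w, b = [1.0, 1.0], -1.5
```

<p>For every binary input pair, the sigmoid unit's output then differs from the threshold unit's 0/1 output by less than 0.01; shrinking that gap further only requires a larger <it>H</it>, which is the trade-off against saturation that motivates the paper's lower-weight <it>one-hot</it> encoding.</p>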
Tree automata, recursive neural networks, neural computation, analog neural networks.
R. C. Carrasco and M. L. Forcada, "Simple Strategies to Encode Tree Automata in Sigmoid Recursive Neural Networks," in IEEE Transactions on Knowledge and Data Engineering, vol. 13, no. 2, pp. 148-156, 2001.