Encoded Neural Networks (ENNs) combine a low-complexity algorithm with a storage capacity much larger than that of Hopfield Neural Networks (HNNs) for the same number of nodes. Moreover, their connection density is lower than that of HNNs, allowing low-complexity circuit integration. Implementing such a network requires low-complexity elements to take full advantage of the assets of the model. This paper proposes an analog implementation of ENNs and shows that this type of implementation is suitable for building networks of thousands of nodes. To validate the proposed implementation, a prototype ENN of 30 computation nodes is designed, fabricated, and tested on-chip in the ST 65-nm, 1-V supply complementary metal-oxide-semiconductor (CMOS) process. The circuit shows decoding performance similar to that of the theoretical model and decodes one message in 58 ns. Moreover, the entire network occupies a silicon area of 16470 µm² and consumes 145 µW, yielding a measured energy consumption per synaptic event per computation node of 68 fJ.