Using Multi-Layer Recurrent Neural Network for language models

Here is another example of how to use the Multi-Layer Recurrent Neural Network (RNN package), designed for character-level language models. The network was trained on more than 165,000 real titles of acts submitted to Congress, taken from CONGRESS.GOV, with training performed on a GPU. The trained RNN was then used to generate "fake" titles. Use this link to guess which bill titles are real and which were generated. This example of the RNN package is provided by Jahred Adelman.
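To illustrate the mechanics behind character-level generation, here is a minimal sketch of a single-layer character RNN sampling loop in NumPy. It is hypothetical and not the RNN package used above: the weights are random and untrained, the toy vocabulary is invented for the demo, and the point is only to show how each sampled character is fed back in as the next input.

```python
import numpy as np

# Hypothetical minimal character-level RNN (NOT the trained RNN package
# from the example): one recurrent layer with random, untrained weights,
# shown only to illustrate step-by-step character sampling.

np.random.seed(0)

chars = sorted(set("a bill to amend the act"))  # toy vocabulary for the demo
vocab = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 16  # vocabulary size and hidden-state size

Wxh = np.random.randn(H, V) * 0.1  # input-to-hidden weights
Whh = np.random.randn(H, H) * 0.1  # hidden-to-hidden (recurrent) weights
Why = np.random.randn(V, H) * 0.1  # hidden-to-output weights

def sample(seed_char, n):
    """Generate n characters after seed_char, feeding each sampled
    character back in as the next input (the core of char-RNN sampling)."""
    h = np.zeros(H)                 # hidden state, carried across steps
    x = np.zeros(V)
    x[vocab[seed_char]] = 1.0       # one-hot encoding of the seed character
    out = [seed_char]
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h)   # recurrent state update
        logits = Why @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                     # softmax over the character vocabulary
        idx = np.random.choice(V, p=p)   # sample the next character
        x = np.zeros(V)
        x[idx] = 1.0
        out.append(chars[idx])
    return "".join(out)

print(sample("a", 20))
```

A trained model would learn `Wxh`, `Whh`, and `Why` from the corpus of bill titles (typically with several stacked layers and LSTM cells rather than this plain tanh cell), so its samples would resemble real titles instead of the random strings this sketch produces.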