Using Multi-Layer Recurrent Neural Network for language models

Here is another example of using a multi-layer recurrent neural network (the RNN package) designed for character-level language models. The network was trained on more than 165,000 real titles of acts submitted to Congress, collected from CONGRESS.GOV; training was performed on a GPU. The trained RNN was then used to generate "fake" titles. Use this link to test whether you can tell which bill titles are real and which were generated. This example of the RNN package was provided by Jahred Adelman.
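To make the idea concrete, here is a minimal character-level RNN sketch in NumPy, in the spirit of Karpathy's min-char-rnn: a one-hot character input, a single tanh hidden layer trained with backpropagation through time, and a sampling loop that emits one character at a time. This is not the RNN package or the model from the post; the two sample titles, the hyperparameters, and plain SGD are illustrative assumptions standing in for the 165,000-title CONGRESS.GOV dataset and GPU training.

```python
import numpy as np

# Stand-in training data: a couple of bill-title-like strings, NOT the
# real CONGRESS.GOV dataset used in the post.
titles = [
    "A bill to amend title 18, United States Code.",
    "A bill to provide for the establishment of a commission.",
]
data = "\n".join(titles)
chars = sorted(set(data))
vocab_size = len(chars)
char_to_ix = {c: i for i, c in enumerate(chars)}
ix_to_char = {i: c for i, c in enumerate(chars)}

hidden_size = 32   # toy size; a real model would be much larger
seq_length = 16
lr = 1e-2
rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.01, (hidden_size, vocab_size))
Whh = rng.normal(0, 0.01, (hidden_size, hidden_size))
Why = rng.normal(0, 0.01, (vocab_size, hidden_size))
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

def loss_fn(inputs, targets, hprev):
    """Forward and backward pass over one sequence; returns loss, grads, last h."""
    xs, hs, ps = {}, {-1: hprev}, {}
    loss = 0.0
    for t, ix in enumerate(inputs):
        xs[t] = np.zeros((vocab_size, 1)); xs[t][ix] = 1        # one-hot input
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)     # hidden state
        y = Why @ hs[t] + by
        ps[t] = np.exp(y - y.max()); ps[t] /= ps[t].sum()       # softmax
        loss += -np.log(ps[t][targets[t], 0])                   # cross-entropy
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dhnext = np.zeros_like(hprev)
    for t in reversed(range(len(inputs))):                      # BPTT
        dy = ps[t].copy(); dy[targets[t]] -= 1
        dWhy += dy @ hs[t].T; dby += dy
        dh = Why.T @ dy + dhnext
        dhraw = (1 - hs[t] ** 2) * dh                           # tanh derivative
        dbh += dhraw; dWxh += dhraw @ xs[t].T; dWhh += dhraw @ hs[t - 1].T
        dhnext = Whh.T @ dhraw
    for d in (dWxh, dWhh, dWhy, dbh, dby):
        np.clip(d, -5, 5, out=d)                                # clip exploding grads
    return loss, dWxh, dWhh, dWhy, dbh, dby, hs[len(inputs) - 1]

def sample(h, seed_ix, n):
    """Generate n characters by feeding each sampled character back in."""
    x = np.zeros((vocab_size, 1)); x[seed_ix] = 1
    out = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        y = Why @ h + by
        p = np.exp(y - y.max()); p /= p.sum()
        ix = rng.choice(vocab_size, p=p.ravel())
        x = np.zeros((vocab_size, 1)); x[ix] = 1
        out.append(ix)
    return "".join(ix_to_char[i] for i in out)

# Brief training loop (plain SGD here; min-char-rnn uses Adagrad).
hprev = np.zeros((hidden_size, 1))
p = 0
for step in range(200):
    if p + seq_length + 1 >= len(data):
        p, hprev = 0, np.zeros((hidden_size, 1))
    inputs = [char_to_ix[c] for c in data[p:p + seq_length]]
    targets = [char_to_ix[c] for c in data[p + 1:p + seq_length + 1]]
    loss, dWxh, dWhh, dWhy, dbh, dby, hprev = loss_fn(inputs, targets, hprev)
    for param, grad in zip((Wxh, Whh, Why, bh, by), (dWxh, dWhh, dWhy, dbh, dby)):
        param -= lr * grad
    p += seq_length

# Sample a "fake" title starting from the character 'A'.
fake = sample(np.zeros((hidden_size, 1)), char_to_ix["A"], 60)
print(fake)
```

With this tiny corpus the samples are mostly gibberish; the post's point is that with 165,000 titles and a GPU-sized model, the same mechanism produces titles plausible enough that telling real from fake becomes a game.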



© 2018 Data Science Central