Text Generation with LSTM

This project implements a text generation model built on Long Short-Term Memory (LSTM) networks that produces coherent, contextually relevant text. The model is trained on a sample of text from the classic novel "Moby Dick" by Herman Melville. The project is available at https://github.com/Yossranour1996/Text-Generation.

Data Preparation:

Download and install the spaCy library and the large English language model (en_core_web_lg) to leverage advanced natural language processing capabilities.
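The setup can be done from the command line; these are the standard spaCy installation commands (version pinning is left out and is up to you):

```shell
# Install spaCy, then download the large English model used by the project.
pip install spacy
python -m spacy download en_core_web_lg
```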

Text Preprocessing:

Tokenize and preprocess the text using spaCy to extract meaningful tokens while excluding unnecessary punctuation and whitespace.
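The filtering step above can be sketched as follows. A minimal sketch: it uses a blank English pipeline so it runs without the large model download, whereas the project itself loads `en_core_web_lg`; the `tokenize` helper name is illustrative.

```python
import spacy

# A blank English pipeline is enough to demonstrate tokenization;
# the project itself loads the full "en_core_web_lg" model.
nlp = spacy.blank("en")

def tokenize(text):
    """Return lowercase tokens, dropping punctuation and whitespace."""
    doc = nlp(text)
    return [t.text.lower() for t in doc if not t.is_punct and not t.is_space]

tokens = tokenize("Call me Ishmael. Some years ago, never mind how long.")
```

`is_punct` and `is_space` are lexical attributes, so they are available even without a trained pipeline component.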

Sequence Generation:

Organize the preprocessed tokens into sequences, each containing 25 training words followed by one target word.
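A minimal sketch of this windowing step, sliding one token at a time (a common choice; the exact stride is an assumption here):

```python
SEQ_LEN = 25 + 1  # 25 input words followed by 1 target word

def make_sequences(tokens, seq_len=SEQ_LEN):
    """Return every overlapping run of seq_len consecutive tokens."""
    return [tokens[i - seq_len:i] for i in range(seq_len, len(tokens) + 1)]

tokens = [f"w{i}" for i in range(30)]  # toy token stream
seqs = make_sequences(tokens)
```

Each resulting sequence holds 26 tokens: the first 25 are the model inputs and the last is the word to predict.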

Model Architecture:

Define a deep learning model using Keras with an Embedding layer, two LSTM layers, and a Dense layer for text generation.
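A sketch of that architecture in Keras; the layer sizes below (embedding dimension 50, 150 LSTM units) are illustrative assumptions, not values confirmed by the project:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 100  # number of distinct tokens (assumption for the demo)
seq_len = 25      # input length from the sequence-generation step

model = Sequential([
    Embedding(vocab_size, 50),          # map token ids to dense vectors
    LSTM(150, return_sequences=True),   # first LSTM feeds the second
    LSTM(150),                          # second LSTM summarizes the sequence
    Dense(vocab_size, activation="softmax"),  # next-word distribution
])
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])

preds = model.predict(np.zeros((2, seq_len), dtype="int32"), verbose=0)
```

The softmax output assigns a probability to every word in the vocabulary, so each prediction row sums to 1.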

Training the Model:

Train the model on the prepared sequences so that it learns to predict the target word from the 25 words that precede it.
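The training step can be sketched as below, assuming the sequences have already been integer-encoded; the toy data, layer sizes, and epoch count are illustrative assumptions:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
from tensorflow.keras.utils import to_categorical

# Toy integer-encoded sequences: 26 tokens each (25 inputs + 1 target).
vocab_size, seq_len = 50, 25
rng = np.random.default_rng(0)
sequences = rng.integers(1, vocab_size, size=(40, seq_len + 1))

X = sequences[:, :-1]                                          # 25 inputs
y = to_categorical(sequences[:, -1], num_classes=vocab_size)   # one-hot target

model = Sequential([
    Embedding(vocab_size, 16),
    LSTM(32, return_sequences=True),
    LSTM(32),
    Dense(vocab_size, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
history = model.fit(X, y, batch_size=16, epochs=2, verbose=0)
```

Splitting each sequence into its first 25 tokens (inputs) and last token (one-hot target) turns text generation into an ordinary classification problem over the vocabulary.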

Outcome:

The model is capable of generating text that reflects the style and context of the training data, making it a valuable tool for creative text generation applications.
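Generation itself is typically an iterative loop: feed the last 25 token ids to the model, pick the next word, and repeat. A hedged sketch of greedy decoding, with a stand-in model so the function is self-contained (the real project would pass the trained Keras model and its index-to-word mapping; all names here are illustrative):

```python
import numpy as np

def generate_text(model, seed_ids, id_to_word, seq_len=25, n_words=10):
    """Greedily generate n_words tokens: at each step, feed the last
    seq_len token ids to the model and append the most likely next id."""
    ids = list(seed_ids)
    for _ in range(n_words):
        window = ids[-seq_len:]
        window = [0] * (seq_len - len(window)) + window  # left-pad if short
        probs = model.predict(np.array([window]), verbose=0)[0]
        ids.append(int(np.argmax(probs)))
    return " ".join(id_to_word[i] for i in ids)

# Stand-in model for demonstration only: always predicts token 2.
class _ConstantModel:
    def predict(self, x, verbose=0):
        return np.array([[0.1, 0.2, 0.7]])

demo = generate_text(_ConstantModel(), [1],
                     {0: "<pad>", 1: "the", 2: "whale"},
                     seq_len=3, n_words=2)
```

Sampling from the predicted distribution instead of taking the argmax usually yields more varied, less repetitive output.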

Skills:

#DeepLearning #LSTM #Keras