Sequence-to-Sequence Models
Explore Seq2Seq models for translation and text generation.
75 min • By Priygop Team • Last updated: Feb 2026
Applications
- Machine Translation
- Chatbots
- Text Summarization
Simple Seq2Seq with Keras
Example
from tensorflow import keras

# Encoder: embed the source tokens and keep only the final LSTM states
encoder_inputs = keras.layers.Input(shape=(None,))
x = keras.layers.Embedding(1000, 64)(encoder_inputs)
encoder_outputs, state_h, state_c = keras.layers.LSTM(64, return_state=True)(x)
encoder_states = [state_h, state_c]

# Decoder: initialized with the encoder's states; at training time it reads
# the target sequence shifted right (teacher forcing)
decoder_inputs = keras.layers.Input(shape=(None,))
x = keras.layers.Embedding(1000, 64)(decoder_inputs)
decoder_lstm = keras.layers.LSTM(64, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(x, initial_state=encoder_states)

# Project each decoder timestep to a distribution over the 1000-token vocabulary
decoder_dense = keras.layers.Dense(1000, activation='softmax')
decoder_outputs = decoder_dense(decoder_outputs)

model = keras.models.Model([encoder_inputs, decoder_inputs], decoder_outputs)
# categorical_crossentropy expects one-hot targets; use
# sparse_categorical_crossentropy if your targets are integer token ids
model.compile(optimizer='adam', loss='categorical_crossentropy')
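To see the model end to end, the sketch below rebuilds the same architecture and runs one training step on random placeholder token ids (not real text). It uses sparse_categorical_crossentropy so integer targets can be passed directly; the vocabulary size of 1000 and the toy shapes are assumptions for illustration only. In a real pipeline, the decoder input would be the target sequence shifted right behind a start token.

```python
import numpy as np
from tensorflow import keras

VOCAB = 1000  # vocabulary size, matching the tutorial's Embedding layers

# Same architecture as above: encoder states seed the decoder
encoder_inputs = keras.layers.Input(shape=(None,))
x = keras.layers.Embedding(VOCAB, 64)(encoder_inputs)
_, state_h, state_c = keras.layers.LSTM(64, return_state=True)(x)

decoder_inputs = keras.layers.Input(shape=(None,))
x = keras.layers.Embedding(VOCAB, 64)(decoder_inputs)
decoder_outputs, _, _ = keras.layers.LSTM(
    64, return_sequences=True, return_state=True
)(x, initial_state=[state_h, state_c])
decoder_outputs = keras.layers.Dense(VOCAB, activation='softmax')(decoder_outputs)

model = keras.models.Model([encoder_inputs, decoder_inputs], decoder_outputs)
# sparse variant accepts integer token ids as targets (no one-hot needed)
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

# Toy batch: 8 "sentences" of 10 random token ids each (placeholder data)
src = np.random.randint(1, VOCAB, size=(8, 10))
tgt_in = np.random.randint(1, VOCAB, size=(8, 10))   # decoder input
tgt_out = np.random.randint(1, VOCAB, size=(8, 10))  # decoder target

model.fit([src, tgt_in], tgt_out, epochs=1, verbose=0)
preds = model.predict([src, tgt_in], verbose=0)
print(preds.shape)  # one softmax distribution per target position: (8, 10, 1000)
```

Each output position is a probability distribution over the vocabulary; at inference time you would decode step by step, feeding each predicted token back in as the next decoder input.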