Natural Language Processing (NLP)
RNNs are extensively used in Natural Language Processing (NLP) for tasks such as language translation, sentiment analysis, and text generation. By processing sequences of words, RNNs can learn contextual relationships and generate coherent sentences.
Example:
Let’s build a simple RNN for text generation using an LSTM:
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
# Sample text data
text = "Deep learning is a subset of machine learning in artificial intelligence."
# Tokenize the text
tokenizer = Tokenizer()
tokenizer.fit_on_texts([text])
total_words = len(tokenizer.word_index) + 1
# Create n-gram sequences: each sequence is the first i+1 tokens of the text
token_list = tokenizer.texts_to_sequences([text])[0]
input_sequences = []
for i in range(1, len(token_list)):
    input_sequences.append(token_list[:i+1])
# Pad sequences
max_sequence_len = max([len(x) for x in input_sequences])
input_sequences = np.array(pad_sequences(input_sequences, maxlen=max_sequence_len, padding='pre'))
# Split data into inputs and labels
X = input_sequences[:, :-1]
y = input_sequences[:, -1]
y = tf.keras.utils.to_categorical(y, num_classes=total_words)
# Define the model: the Embedding layer maps word indices to dense
# vectors, which the LSTM reads to predict the next word
model = Sequential([
    Embedding(total_words, 10),
    LSTM(50),
    Dense(total_words, activation='softmax')
])
# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy')
# Train the model
model.fit(X, y, epochs=100, verbose=2)
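Once trained, the model can generate text one word at a time: tokenize a seed phrase, pad it, predict the most likely next word, append that word, and repeat. Here is a minimal sketch; the seed phrase and the number of generated words are arbitrary choices for illustration:
# Generate new words from a seed phrase (illustrative values)
seed_text = "deep learning"
for _ in range(5):
    token_list = tokenizer.texts_to_sequences([seed_text])[0]
    padded = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
    predicted_index = np.argmax(model.predict(padded, verbose=0), axis=-1)[0]
    # Map the predicted index back to its word
    for word, index in tokenizer.word_index.items():
        if index == predicted_index:
            seed_text += " " + word
            break
print(seed_text)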
Time Series Forecasting
Another significant application of RNNs is in time series forecasting, where the goal is to predict future values based on historical data. RNNs can capture temporal dependencies and patterns, making them suitable for tasks such as stock price prediction and weather forecasting.
Example:
Let’s build a simple RNN for time series forecasting using an LSTM:
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
# Generate sample time series data: the sum of two sine waves plus noise
def generate_time_series(n_steps):
    freq1, freq2, offsets1, offsets2 = np.random.rand(4)
    time = np.linspace(0, 1, n_steps)
    series = 0.5 * np.sin((time - offsets1) * (freq1 * 10 + 10))   # wave 1
    series += 0.2 * np.sin((time - offsets2) * (freq2 * 20 + 20))  # + wave 2
    series += 0.1 * (np.random.rand(n_steps) - 0.5)                # + noise
    return series[..., np.newaxis].astype(np.float32)
# Generate dataset
n_steps = 50
series = generate_time_series(n_steps + 1)
# Inputs are the series; targets are the same series shifted one step ahead
X_train, y_train = series[:-1], series[1:]
# Reshape to (batch_size, time_steps, features), as the LSTM expects
X_train = X_train.reshape((1, n_steps, 1))
y_train = y_train.reshape((1, n_steps, 1))
# Define the model
model = Sequential([
    LSTM(50, input_shape=(n_steps, 1), return_sequences=True),
    Dense(1)
])
# Compile the model
model.compile(optimizer='adam', loss='mse')
# Train the model
model.fit(X_train, y_train, epochs=200, verbose=2)
This code snippet defines and trains a simple LSTM model for time series forecasting.
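To produce a forecast, feed the trained model the most recent window of observations; because the LSTM returns a prediction at every time step, the last output corresponds to the step immediately after the window. A minimal sketch, reusing the training window for illustration:
# One-step-ahead forecast from the last observed window
y_pred = model.predict(X_train, verbose=0)
next_value = y_pred[0, -1, 0]  # prediction for the step after the window
print("Forecast for the next time step:", next_value)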