In this example, we will use a simple Recurrent Neural Network (RNN) to predict daily temperatures based on past records. RNNs are useful for time-series forecasting, such as weather prediction, stock market analysis, and trend detection.
1. Importing Required Libraries
First, we import the necessary libraries:
- NumPy: For numerical operations.
- TensorFlow/Keras: To build and train the RNN.
- Matplotlib: To visualize the data and predictions.
- Scikit-learn: To scale the temperature values.
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN, Dense
from sklearn.preprocessing import MinMaxScaler
2. Generating Synthetic Temperature Data
We simulate a year's worth of daily temperatures using a sinusoidal function to create seasonal variation, plus a small amount of random noise to make the data more realistic.
# Generate synthetic temperature data
days = np.arange(365)
temperature = 10 + 10 * np.sin(2 * np.pi * days / 365) + np.random.normal(0, 1, 365) # Seasonal pattern with noise
# Visualizing the data
plt.figure(figsize=(10,4))
plt.plot(days, temperature, label='Temperature')
plt.xlabel('Days')
plt.ylabel('Temperature (°C)')
plt.title('Synthetic Temperature Data')
plt.legend()
plt.show()
3. Preparing the Data for the RNN
We format the dataset so that each input consists of the last 7 days' temperatures, and the model learns to predict the temperature of the next day.
# Normalize the data to the [0, 1] range for more stable training
scaler = MinMaxScaler(feature_range=(0,1))
temperature_scaled = scaler.fit_transform(temperature.reshape(-1,1))
# Create sequences of 7 days for input and 1 day for output
sequence_length = 7
X, y = [], []
for i in range(len(temperature_scaled) - sequence_length):
    X.append(temperature_scaled[i:i+sequence_length])
    y.append(temperature_scaled[i+sequence_length])
X, y = np.array(X), np.array(y)
# Split data into training and testing sets (80% train, 20% test)
split = int(len(X) * 0.8)
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]
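Because temperature_scaled has shape (365, 1), each 7-day slice already carries the trailing feature dimension that Keras recurrent layers expect. As an optional sanity check (not part of the original workflow), you can print the array shapes:
# Optional sanity check: inputs are (samples, 7, 1), targets are (samples, 1)
print(X_train.shape, y_train.shape)  # expected: (286, 7, 1) (286, 1)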
4. Building and Training the RNN Model
We create a simple RNN model with:
- A SimpleRNN layer with 10 neurons.
- A Dense layer to output the predicted temperature.
We use the Adam optimizer and mean squared error (MSE) as the loss function.
# Building the RNN model
model = Sequential([
    SimpleRNN(10, activation='relu', input_shape=(sequence_length, 1)),  # RNN layer with 10 neurons
    Dense(1)  # Output layer
])
# Compiling the model
model.compile(optimizer='adam', loss='mse')
# Training the model
model.fit(X_train, y_train, epochs=50, verbose=1, validation_data=(X_test, y_test))
5. Making Predictions
After training, we test the model on unseen data and compare the predicted temperatures with the actual values.
# Making predictions
predictions = model.predict(X_test)
# Rescale predictions back to original temperature range
predictions_rescaled = scaler.inverse_transform(predictions)
y_test_rescaled = scaler.inverse_transform(y_test.reshape(-1,1))
# Plot the results
plt.figure(figsize=(10,4))
plt.plot(y_test_rescaled, label='Actual Temperature')
plt.plot(predictions_rescaled, label='Predicted Temperature', linestyle='dashed')
plt.xlabel('Days')
plt.ylabel('Temperature (°C)')
plt.title('Temperature Prediction Using RNN')
plt.legend()
plt.show()
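The plot gives a visual comparison; if you also want a single number summarizing the fit, an error metric such as the mean absolute error (MAE) on the rescaled values works well. This is an optional addition to the workflow above:
# Optional: quantify the prediction error in degrees Celsius
mae = np.mean(np.abs(y_test_rescaled - predictions_rescaled))
print(f"Mean absolute error: {mae:.2f} °C")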
Conclusion
In this exercise, we built an RNN to predict daily temperatures from the previous 7 days of data. The same approach carries over to weather forecasting and other time-series applications.
Possible improvements:
- Increasing the number of neurons in the RNN layer.
- Training with more data.
- Experimenting with more complex architectures, such as LSTMs or GRUs (see the sketch below).
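For example, swapping the SimpleRNN layer for an LSTM is a small change. The sketch below reuses the same data and hyperparameters rather than a tuned configuration, so treat it as a starting point:
# Variant of the model using an LSTM layer instead of SimpleRNN
from tensorflow.keras.layers import LSTM

lstm_model = Sequential([
    LSTM(10, input_shape=(sequence_length, 1)),  # LSTM layer with 10 units (default tanh activation)
    Dense(1)  # Output layer
])
lstm_model.compile(optimizer='adam', loss='mse')
lstm_model.fit(X_train, y_train, epochs=50, verbose=0, validation_data=(X_test, y_test))
lstm_predictions = scaler.inverse_transform(lstm_model.predict(X_test))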
Try modifying the code and see how the predictions change!