Model Used:
This code uses a Recurrent Neural Network (RNN) model with Long Short-Term Memory (LSTM) layers in TensorFlow's Keras API.
Reason for Using the Model:
- The LSTM model is chosen because it is effective at handling sequential data and time series forecasting.
- The code aims to predict future values from past data, making LSTM well suited for capturing temporal patterns and dependencies.
- RNNs are neural networks designed for sequence prediction tasks.
- They use loops to allow information to persist over time, meaning they can learn temporal patterns in sequential data.
- However, standard RNNs have a limitation: they struggle to retain long-term dependencies due to issues like the "vanishing gradient" problem.
- LSTMs are a type of RNN specifically designed to handle long-term dependencies.
- LSTMs use a special structure—cell state and gating mechanisms (input, forget, and output gates)—that allows them to retain or forget information as needed, making them more effective than standard RNNs for long sequences.
- This gating structure lets LSTMs remember long sequences without losing context, which makes them the stronger choice for tasks with extended temporal dependencies, such as time series forecasting.
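The gating mechanisms described above can be sketched with a single scalar LSTM cell in pure Python. This is illustrative only (the actual model uses TensorFlow's `keras.layers.LSTM`, which operates on vectors); the toy weights below saturate the forget gate open and the input gate closed, so the cell state persists almost unchanged across steps:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # Scalar LSTM cell: each gate has weights (w_x, w_h, bias).
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])  # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])  # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate value
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])  # output gate
    c = f * c_prev + i * g   # cell state: gated mix of old memory and new input
    h = o * math.tanh(c)     # hidden state exposed to the next layer
    return h, c

# Toy weights (hypothetical): forget gate ~1, input gate ~0 -> memory persists.
w = {"f": (0.0, 0.0, 10.0), "i": (0.0, 0.0, -10.0),
     "g": (1.0, 0.0, 0.0), "o": (0.0, 0.0, 10.0)}
h, c = 0.0, 1.0
for x in [0.5, -0.3, 0.1]:
    h, c = lstm_step(x, h, c, w)
# c stays close to its initial value of 1.0 regardless of the inputs
```

With the opposite gate settings (forget gate ~0), the old cell state would be discarded each step; it is this learned control over retention that standard RNNs lack.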
- 1-fabricator.py: Simulates and generates CPU usage, disk usage, and the number of nodes based on the time of day for a given number of days, and saves the data to a CSV file.
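A minimal sketch of what such a fabricator might look like, using only the standard library. The column names, file name, noise levels, and the sinusoidal daily-load shape are all assumptions for illustration, not the script's actual parameters:

```python
import csv
import math
import random

random.seed(0)
rows = []
for day in range(2):                 # assumed: number of days to simulate
    for hour in range(24):
        # Daily sinusoidal load peaking mid-day, plus Gaussian noise.
        load = 50 + 30 * math.sin(math.pi * hour / 24)
        cpu = min(100, max(0, load + random.gauss(0, 5)))
        disk = min(100, max(0, 40 + 0.2 * hour + random.gauss(0, 3)))
        nodes = max(1, round(load / 10))
        rows.append({"day": day, "hour": hour,
                     "cpu_usage": round(cpu, 2),
                     "disk_usage": round(disk, 2),
                     "num_nodes": nodes})

with open("fabricated_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```

Clamping CPU and disk usage into the 0-100 range keeps the fabricated data physically plausible even when the noise term overshoots.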
- 2-predictor.py: Trains an LSTM model to predict future values of CPU usage, disk usage, and number of nodes based on historical data, then outputs and saves the predictions as a CSV file.
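The core preprocessing step for any such predictor is turning the historical series into supervised (window, next value) pairs. A sketch of that windowing in pure Python, with a `lookback` length chosen here for illustration:

```python
def make_windows(series, lookback):
    # Slide a fixed-length window over the series: each window of `lookback`
    # past values becomes one input, and the value right after it the target.
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return X, y

X, y = make_windows([1, 2, 3, 4, 5], lookback=3)
# X == [[1, 2, 3], [2, 3, 4]], y == [4, 5]
```

In the Keras pipeline, these windows would then be reshaped to the `(samples, timesteps, features)` tensor that `keras.layers.LSTM` expects before calling `model.fit`.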
- 3-visualizer.py: Creates an interactive dashboard using Streamlit to visualize the fabricated and predicted data for CPU usage, disk usage, and the number of nodes, with an option to display the top predicted nodes.
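The "top predicted nodes" option presumably reduces to selecting the highest-load nodes from the predictions. A sketch of that selection logic in plain Python (the helper name, node names, and values are hypothetical):

```python
def top_nodes(predictions, n=3):
    # predictions: mapping of node name -> predicted CPU usage.
    # Returns the n node names with the highest predicted load, which could
    # back a dashboard toggle that shows only the busiest nodes.
    return sorted(predictions, key=predictions.get, reverse=True)[:n]

preds = {"node-a": 71.2, "node-b": 88.9, "node-c": 64.0, "node-d": 90.5}
top_nodes(preds, 2)  # -> ["node-d", "node-b"]
```

In the dashboard itself, the filtered data would typically be rendered with standard Streamlit calls such as `st.line_chart` or `st.dataframe`.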