🔮 How Machines Predict Tomorrow: Autoregressive Models Explained
"It’s not about individual data points. It’s about the conversation between yesterday and tomorrow." — Emma, a weather researcher in our case study
What if we could train a machine to understand this idea?
That’s exactly what Autoregressive (AR) models do. They take the past few values in a sequence and try to predict the next one.
In this blog, you’ll explore how AR models convert raw sequences into future insight — and why they matter across industries.
🧠 What You’ll Learn
What AR(1), AR(2), AR(p) models are and how they work
How time series becomes training data
How AR compares with Markov models
Hands-on practice in a guided Colab notebook
🔢 What Is Autoregression?
Autoregressive models predict a current value using a fixed number of past values.
If today’s temperature depends on yesterday and the day before, an AR(2) model says:
T_today = b + w1 * T_yesterday + w2 * T_day_before + noise
Here w1 and w2 are the weights, b is the bias, and the noise term captures whatever the past values cannot explain.
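In code, this is just a weighted sum. The weights and bias below are illustrative placeholders, not fitted values:

```python
# AR(2) prediction: a weighted sum of the two most recent values plus a bias.
# w1, w2, and b here are made-up example numbers, not estimates from data.
def ar2_predict(t_yesterday, t_day_before, w1=0.6, w2=0.3, b=2.0):
    return b + w1 * t_yesterday + w2 * t_day_before

# Predict today's temperature from yesterday (31°C) and the day before (33°C).
print(ar2_predict(31, 33))  # 2.0 + 0.6*31 + 0.3*33 = 30.5
```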
🧊 Emma’s Temperature Table
Emma logs temperatures every day:
| Day | Temp (°C) |
|---|---|
| 1 | 30 |
| 2 | 32 |
| 3 | 33 |
| 4 | 31 |
| 5 | ? |
To predict Day 5, she uses the values from Day 4 and Day 3.
📦 Sequence → Table: Sliding Window
A time series is a continuous stream, but AR models need a fixed-shape dataset. So we slice it:
| Inputs (X1, X2) | Target (y) |
|---|---|
| 30, 32 | 33 |
| 32, 33 | 31 |
| 33, 31 | 34 |
This is known as the sliding window technique.
🎯 Try this: Write down your last 7 days of steps or water intake. Turn it into sliding windows of size 3. Congratulations — you’ve built your first AR dataset!
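The slicing above takes only a few lines of Python. The helper name `sliding_windows` is ours, not a library function:

```python
def sliding_windows(series, p):
    """Turn a 1-D sequence into (X, y) rows using a window of size p."""
    X, y = [], []
    for i in range(len(series) - p):
        X.append(series[i:i + p])   # p past values become the inputs
        y.append(series[i + p])     # the next value becomes the target
    return X, y

temps = [30, 32, 33, 31, 34]
X, y = sliding_windows(temps, p=2)
print(X)  # [[30, 32], [32, 33], [33, 31]]
print(y)  # [33, 31, 34]
```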
⚖️ AR vs Markov: Are They the Same?
| AR(1) | Markov Chain |
|---|---|
| Predicts numeric values | Predicts the next state (discrete) |
| Uses regression on past values | Uses transition probabilities |
| Output: continuous | Output: category/state |
AR(1): “Given yesterday’s temp, predict today’s temp.” Markov: “Given it was rainy yesterday, will it be sunny today?”
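The contrast is easy to see side by side. Both the transition probabilities and the AR(1) weights below are made-up numbers for illustration:

```python
# A tiny Markov chain over discrete weather states (probabilities are made up).
transitions = {
    "rainy": {"rainy": 0.6, "sunny": 0.4},
    "sunny": {"rainy": 0.2, "sunny": 0.8},
}

def most_likely_next(state):
    # Markov step: discrete state in, most probable next state out.
    return max(transitions[state], key=transitions[state].get)

def ar1_predict(x_prev, w=0.9, b=3.0):
    # AR(1) step: continuous value in, continuous prediction out (weights illustrative).
    return b + w * x_prev

print(most_likely_next("rainy"))  # 'rainy' (0.6 beats 0.4)
print(ar1_predict(31.0))          # 3.0 + 0.9 * 31.0 = 30.9
```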
🧪 Try It in Colab (with Visuals!)
We’ve created a complete Colab notebook where you:
Generate synthetic temperature data
Use AR(p) models to predict future values
Visualize predictions vs actuals
Try different values of p
👉 Open Colab: Autoregression in Action ➜
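As a rough preview of what the notebook walks through, here is a hand-rolled sketch: generate a synthetic AR(2) series, then recover the weights with ordinary least squares. (The notebook uses a proper library fit; the true weights and noise level below are our made-up simulation settings.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "temperature" series: a stable AR(2) process plus Gaussian noise.
true_w = [0.5, 0.3]  # weights on yesterday and the day before (made up)
temps = [30.0, 32.0]
for _ in range(200):
    nxt = 5.0 + true_w[0] * temps[-1] + true_w[1] * temps[-2] + rng.normal(0, 0.5)
    temps.append(nxt)

# Fit AR(2) by ordinary least squares on sliding windows.
p = 2
X = np.array([temps[i:i + p] for i in range(len(temps) - p)])
X = np.hstack([np.ones((len(X), 1)), X])      # bias column first
y = np.array(temps[p:])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # [bias, w_day_before, w_yesterday]
print(coef)

# One-step forecast from the last two observed values.
forecast = coef[0] + coef[1] * temps[-2] + coef[2] * temps[-1]
print(forecast)
```

The recovered coefficients should land close to the simulation's true weights, which is the whole point of the exercise: AR fitting is just linear regression on windowed data.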
📌 Action Checklist
✅ Convert a sequence into tabular form (X → y) using a sliding window
✅ Predict temperatures or stock prices using AR(2)
✅ Try increasing p and see how predictions change
✅ Compare your model's predictions against the actual graph
📊 Visual: How Sequences Become Models
Sequence: [30, 32, 33, 31, 34]
Sliding Window (p=2)
Input (X) → Output (y)
[30, 32] → 33
[32, 33] → 31
[33, 31] → 34
This turns a time series into rows that fit traditional ML models.
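Once the series is in rows, any off-the-shelf regressor can consume it. A minimal sketch with scikit-learn's `LinearRegression` (note: with only three rows the model fits them exactly, so treat this as an illustration, not a real forecast):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# The windowed rows from the visual above (p = 2).
X = np.array([[30, 32], [32, 33], [33, 31]])
y = np.array([33, 31, 34])

# A plain linear regressor plays the role of the AR(2) model.
model = LinearRegression().fit(X, y)

# One-step forecast from the last window [31, 34].
print(model.predict([[31, 34]]))
```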
📚 Recommended Read
Forecasting: Principles and Practice by Rob J. Hyndman and George Athanasopoulos
Statsmodels ARIMA Docs
🚀 What’s Next?
AR models give us linear memory. But what if we want deeper, dynamic memory?
In the next post, we’ll explore RNNs, LSTMs, and Transformers — models that learn from entire sequences like sentences, videos, or heartbeat signals.
