Beyond Weather Correlation: A Comparative Study of Static and Temporal Neural Architectures for Fine-Grained Residential Energy Consumption Forecasting in Melbourne, Australia
arXiv cs.LG / 4/15/2026
Key Points
- The study compares a static weather-feature MLP against a temporal sequence-modeling LSTM for 5-minute residential energy forecasting in Melbourne households.
- Using 14 months of 5-minute smart meter data (117,000+ samples per household) merged with BOM daily weather observations, the LSTM substantially outperforms weather-driven MLPs, highlighting that temporal autocorrelation is dominant at fine granularity.
- For House 3 (grid-connected), the LSTM reaches R^2 = 0.883 versus the weather-only MLP at R^2 = -0.055 (worse than predicting the mean), and for House 4 (PV-integrated) the LSTM achieves R^2 = 0.865 versus the MLP at R^2 = 0.410.
- The results suggest an asymmetry under solar generation: the MLP’s improved performance for the PV household indicates it may implicitly leverage solar-related patterns from weather-time correlations.
- The paper contextualizes performance with persistence baselines and seasonal stratification, and proposes future directions including hybrid weather-augmented LSTMs and federated learning.
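The summary highlights the R^2 metric and persistence baselines but gives no code, so here is a minimal, hedged sketch of both: a one-step persistence forecast (each 5-minute reading predicted by the previous one) scored with R^2. The synthetic autocorrelated load series is a stand-in; the paper's actual households, feature sets, and models are not reproduced here.

```python
# Sketch only: persistence baseline + R^2 scoring on synthetic 5-minute load data.
# The data below is a synthetic random walk, not the Melbourne smart meter data.
import numpy as np

def r2_score(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot.
    Can be negative when predictions are worse than the mean predictor."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def persistence_forecast(series: np.ndarray):
    """Predict each 5-minute reading with the immediately preceding reading."""
    return series[1:], series[:-1]  # (targets, predictions)

# Synthetic autocorrelated load series (kW) standing in for smart meter readings.
rng = np.random.default_rng(0)
load = np.cumsum(rng.normal(0.0, 0.05, 2000)) + 1.5

y_true, y_pred = persistence_forecast(load)
print(round(r2_score(y_true, y_pred), 3))
```

Because fine-grained load is strongly autocorrelated, even this trivial baseline scores highly, which is exactly why the paper contextualizes model R^2 against persistence rather than against zero.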