AI for Energy Building Monitoring, Management, and Forecasting

Abstract
Conventional forecasting models often rely only on historical weather and electricity data, which limits their ability to explain the complexity and dynamic changes of electricity consumption. This research aimed to improve the classical prediction model of the GreEN-ER building by incorporating new exogenous data from academic activities and other external events. FNN and LSTM models were built and then compared using standard evaluation metrics. The results showed that incorporating the new exogenous variables, specifically the number of rooms, events, and occupants, alongside the classical variables improved prediction performance and reduced the error.

Purpose
This research aims to build a high-quality local forecasting model that is readily available and able to adapt to the continuous structural evolution of demand, using heterogeneous data, flexible model structures, and continuous learning that incorporates new exogenous information. It also focuses on improving the previous classical prediction model of the GreEN-ER building, which is based on weather data, by incorporating new exogenous data from academic activities and other external events, in order to better explain the gap between the forecast and the actual electricity demand. Finally, FNN and LSTM models will be built and their results compared, to determine which model better captures the sequential and temporal patterns in the data that are crucial for accurate predictions.

Data Preparation
Several variables are needed to build a prediction model of GreEN-ER electricity consumption. They can be grouped into the dependent variable, the one we want to predict (electricity consumption), and the independent variables, those that influence it (classical: temperature, cloud cover, visibility, solar radiation, and day index; new exogenous data: occupants, number of rooms, and events in the building). These variables were acquired from various reliable sources (GreEN-ER electricity data, Grenoble INP - Ense3 academic schedule, Visual Crossing, UGA academic calendar, Grenoble INP mailing list, SNES-FSU-Grenoble) in .csv format, then cleaned, merged, and filtered to cover three academic years, from August 24th, 2020 to April 25th, 2023.
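The clean/merge/filter pipeline described above can be sketched with pandas. This is a minimal illustration using synthetic frames; the column and variable names here are placeholders, not the project's actual schema.

```python
import pandas as pd

# Synthetic stand-ins for the cleaned .csv sources (consumption,
# weather, occupancy), all indexed by a shared daily timestamp.
idx = pd.date_range("2020-08-01", "2023-05-01", freq="D")
consumption = pd.DataFrame({"timestamp": idx, "kwh": range(len(idx))})
weather = pd.DataFrame({"timestamp": idx, "temperature": 15.0})
occupancy = pd.DataFrame({"timestamp": idx, "occupants": 100})

# Combine all sources on the shared timestamp.
df = (consumption
      .merge(weather, on="timestamp")
      .merge(occupancy, on="timestamp"))

# Filter to the study period: August 24th, 2020 to April 25th, 2023.
mask = (df["timestamp"] >= "2020-08-24") & (df["timestamp"] <= "2023-04-25")
df = df.loc[mask].reset_index(drop=True)
```

The same pattern extends to the remaining sources (day index, events, rooms) by chaining further `merge` calls on the timestamp.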


Feature Importance
Feature Importance is a crucial aspect of machine learning that helps us understand the relevance and impact of each feature in a dataset when making predictions. It provides insight into the underlying patterns in the data and directly affects model performance. Gradient Boosting, Decision Trees, and Random Forests are popular methods that expose feature importance scores, which can be used to identify influential features and uncover patterns in the data.
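As an illustration of how such scores are obtained, the sketch below fits a Random Forest on synthetic data shaped like this project's feature set and reads its `feature_importances_` attribute. The feature names mirror the variables listed in Data Preparation, but the data and target are fabricated for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
features = ["temperature", "cloud_cover", "visibility", "solar_radiation",
            "day_index", "occupants", "num_rooms", "events"]

X = rng.normal(size=(500, len(features)))
# Synthetic target driven mostly by temperature and occupants.
y = 3.0 * X[:, 0] + 2.0 * X[:, 5] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Importances sum to 1; higher means more influence on the prediction.
ranked = sorted(zip(features, model.feature_importances_),
                key=lambda pair: -pair[1])
for name, imp in ranked:
    print(f"{name:16s} {imp:.3f}")
```

On this synthetic target, temperature and occupants dominate the ranking, matching how they were constructed.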


Machine Learning Models
Feed-Forward Neural Network (FNN)

Electricity Forecast of the next 24 hours
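A feed-forward network for this task can be sketched in PyTorch as below. The architecture (one hidden layer of 64 units) and the dimensions (a 24-hour lookback window of 8 features, a 24-hour forecast horizon) are illustrative assumptions, not the model actually trained in the study.

```python
import torch
from torch import nn

lookback, n_features, horizon = 24, 8, 24  # assumed dimensions

# FNN: flatten the input window, then map it to the next 24 hours.
fnn = nn.Sequential(
    nn.Flatten(),                           # (batch, 24, 8) -> (batch, 192)
    nn.Linear(lookback * n_features, 64),
    nn.ReLU(),
    nn.Linear(64, horizon),                 # 24 hourly consumption values
)

x = torch.randn(32, lookback, n_features)   # dummy batch of input windows
forecast = fnn(x)                           # shape: (32, 24)
```

Because the window is flattened, the FNN treats every (hour, feature) pair as an independent input and has no built-in notion of temporal order.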

Long Short-Term Memory (LSTM)

Electricity Forecast of the next 24 hours
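The LSTM counterpart processes the input window sequentially and forecasts the next 24 hours from its final hidden state. As with the FNN sketch, the hidden size and dimensions are assumptions for illustration, not the study's actual configuration.

```python
import torch
from torch import nn

class LSTMForecaster(nn.Module):
    """Sketch: encode the last 24 hours, decode the next 24 in one shot."""

    def __init__(self, n_features=8, hidden=64, horizon=24):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                   # x: (batch, 24, n_features)
        out, _ = self.lstm(x)               # out: (batch, 24, hidden)
        return self.head(out[:, -1, :])     # last hidden state -> 24 values

model = LSTMForecaster()
x = torch.randn(32, 24, 8)                  # dummy batch of input windows
forecast = model(x)                         # shape: (32, 24)
```

Unlike the flattened FNN input, the LSTM's recurrent state carries information across time steps, which is what makes it a natural candidate for capturing sequential patterns in consumption.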

Conclusion
In this research, we have successfully incorporated new exogenous data, such as the number of rooms, events, day index, and occupants, alongside classical variables such as temperature, cloud cover, visibility, and solar radiation, to improve the performance of the electricity consumption prediction model. The comparison between two ML models, the Feed-forward Neural Network (FNN) and Long Short-Term Memory (LSTM), has provided valuable insights into their respective abilities to capture temporal patterns and handle sequential data. Several directions remain for future work: exploring other advanced ML models, such as Convolutional Neural Networks (CNNs) or Transformer models, could provide alternative approaches to tackling the complexity of the dataset.

Web Application

Disclaimer
This project was partially supported by MIAI (Multidisciplinary Institute for Artificial Intelligence) Grenoble and was carried out at G2ELab (Laboratoire de génie électrique) Grenoble.