CONCLUSION AND FUTURE WORK
In this paper, we proposed a controller mind (CM) framework to manage multiple controllers automatically and intelligently in SDEI, so as to maintain high accuracy in the real-time monitoring of the smart grid. Specifically, we solved the QoS-enabled load scheduling problem with reinforcement learning: we defined the learning agent, action space, state space, and reward function, and leveraged historical data to learn the load scheduling scheme offline and ahead of time, thereby realizing automatic management among multiple controllers. We simulated the performance of the CM framework against three traditional schemes. The simulation results showed that the reinforcement learning based scheme achieved the best load balancing and time efficiency, overcoming the problems of traditional load balancing schemes. However, because the QoS-enabled load scheduling scheme learns from historical data, it is less robust to burst traffic. When burst traffic occurs, the state space in our scheme fails to describe all situations, and the scheme needs a longer time to learn a new allocation scheme; during this period, the load variation and time efficiency are severely affected. Future work is in progress to address these challenges.
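The reinforcement learning formulation summarized above (learning agent, action space, state space, reward function, and offline training on historical data) can be sketched as a minimal tabular Q-learning example. The state encoding, the load-imbalance reward, and all hyperparameters below are illustrative assumptions for demonstration, not the actual design used in the paper.

```python
import random
from statistics import pstdev

# Illustrative sketch only: tabular Q-learning that assigns incoming
# requests to one of several controllers so as to balance their loads.

N_CONTROLLERS = 3     # action space: which controller receives the request
LEVELS = 8            # per-controller load is capped at LEVELS - 1 in the state
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = {}                # Q-table: (state, action) -> estimated value

def discretize(loads):
    # state space: tuple of per-controller loads, capped to keep the table finite
    return tuple(min(l, LEVELS - 1) for l in loads)

def reward(loads):
    # reward function: negative spread across controllers, so balance is rewarded
    return -pstdev(loads)

def q(state, action):
    return Q.get((state, action), 0.0)

def choose(state):
    # epsilon-greedy exploration during offline training
    if random.random() < EPS:
        return random.randrange(N_CONTROLLERS)
    return max(range(N_CONTROLLERS), key=lambda a: q(state, a))

def train_offline(traces, episodes=200):
    # learn the scheduling scheme ahead of time from historical request traces
    for _ in range(episodes):
        for trace in traces:
            loads = [0] * N_CONTROLLERS
            for size in trace:
                s = discretize(loads)
                a = choose(s)
                loads[a] += size
                s2, r = discretize(loads), reward(loads)
                best = max(q(s2, b) for b in range(N_CONTROLLERS))
                Q[(s, a)] = q(s, a) + ALPHA * (r + GAMMA * best - q(s, a))

random.seed(0)
train_offline([[1, 2, 1, 3, 2, 1, 2]] * 5)
```

Online, the learned table is applied greedily (always pick the action with the highest Q-value). Burst traffic that drives the system into states poorly covered by the historical traces is exactly the robustness limitation discussed in this section.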