In some complex systems, internal and external factors cause the system to switch periodically among stages at runtime, and the dynamics of each stage differ from the others. When data-driven parameterized methods are used to model and predict such systems, a single unified model struggles to capture both the per-stage dynamics and the transitions between stages. To address this challenge, inspired by the Ordinary Differential Equations Network (ODENet), this paper proposes a novel predictive simulation framework, the Deterministic Finite Automaton Ordinary Differential Equation Net (DFA-ODENets), a continuous-time deep learning framework for modeling periodic multi-stage systems from irregularly sampled historical system trajectories. The model performs two principal prediction tasks: predicting the system dynamics and predicting the stage transitions. To learn the system dynamics, the model comprises several ODENets, one per stage of the modeled system, and each ODENet individually learns the continuous-time nonlinear dynamics of its stage. To learn stage transitions, a stage transition predictor learns the duration of each stage from observation data, which is labeled in advance using prior knowledge of the system. During prediction, the stage transition predictor serves as a switch among the ODENets, assigning the appropriate one to predict the system outputs. Moreover, a dedicated encoder-decoder structure is integrated into the framework: the encoder solves for the initial state from historical system inputs and outputs, and the decoder predicts future system outputs from the inputs over the prediction range, given the solved initial state.
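The switching mechanism described above can be illustrated with a minimal sketch. This is not the paper's implementation: the per-stage derivative functions stand in for trained ODENets, the fixed durations stand in for the learned stage transition predictor, and the forward-Euler rollout stands in for a proper ODE solver.

```python
# Hypothetical stand-ins for trained per-stage ODENets: each maps the
# current state y and exogenous input u to a time derivative dy/dt.
stage_dynamics = {
    0: lambda y, u: -0.5 * y + u,   # e.g. a "cooling active" stage
    1: lambda y, u: 0.2 * y + u,    # e.g. a "cooling paused" stage
}

def predicted_duration(stage):
    """Hypothetical stage-transition predictor: how long the system
    stays in the given stage (learned from labeled data in the paper)."""
    return {0: 3.0, 1: 2.0}[stage]

def simulate(y0, u_fn, t_end, dt=0.01, stage0=0):
    """Forward-Euler rollout that switches ODENets when the predicted
    stage duration elapses, mimicking the DFA-style switching."""
    y, t, stage = y0, 0.0, stage0
    t_switch = predicted_duration(stage)
    traj = [(t, y, stage)]
    while t < t_end:
        if t >= t_switch:                        # transition predictor fires
            stage = (stage + 1) % len(stage_dynamics)
            t_switch = t + predicted_duration(stage)
        y = y + dt * stage_dynamics[stage](y, u_fn(t))  # one Euler step
        t += dt
        traj.append((t, y, stage))
    return traj

traj = simulate(y0=1.0, u_fn=lambda t: 0.1, t_end=10.0)
```

A separate encoder (omitted here) would estimate `y0` from historical inputs and outputs before such a rollout is started.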
To evaluate the feasibility and effectiveness of the proposed framework, the encoder-decoder model is applied to the cooling system of a real data center to simulate key dynamical variables during operation. Given multivariate operational data, including server power and environmental temperature, the model simulates the system under the expected working patterns and predicts open-loop output variables, including power consumption and inlet air temperature. Even when the prediction horizon exceeds 30 minutes, the Mean Absolute Percentage Error (MAPE) of the predicted energy consumption remains within 5%. Furthermore, the cooling temperature setting, which determines when the cooling compressor pauses, is optimized using the learned simulation model. Simulation experiments indicate that lower thresholds are preferable as the heat load of the servers increases. By adopting the inferred optimal temperature settings, up to 18% of energy is saved.
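For reference, the reported error metric can be stated concretely. Below is a minimal MAPE computation; the numbers are purely illustrative and are not taken from the paper's dataset.

```python
def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent."""
    assert len(actual) == len(predicted) and all(a != 0 for a in actual)
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Illustrative values only: measured vs. predicted energy consumption.
actual = [10.0, 12.0, 11.0, 9.5]
predicted = [10.4, 11.5, 11.3, 9.2]
err = mape(actual, predicted)   # approx. 3.51, i.e. within the 5% band
```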