Integrating DVFS and Task Scheduling to Improve Energy Efficiency for Heterogeneous Edge Devices: A Reinforcement Learning Approach
Abstract
Energy efficiency is a primary design objective for embedded and edge computing platforms, which operate under tight power and thermal constraints while serving latency-sensitive workloads. In this thesis, we focus on CPU power management on heterogeneous big.LITTLE systems for single-threaded, periodic tasks that operate under a soft Target Execution Time (TET) constraint. Specifically, we design and implement a user-space, reinforcement-learning-based controller that jointly performs dynamic voltage and frequency scaling (DVFS) and task scheduling on heterogeneous edge devices. The controller uses a learned policy to select both the CPU cluster and operating performance point in each execution window so as to minimize per-window energy consumption while satisfying TET constraints. A compact, time-aware state representation makes the policy explicitly TET-conditioned, enabling it to adapt to different TET values at runtime without retraining. Using a hardware-in-the-loop evaluation on an ODROID-N2+ platform, we compare the learned policy against the standard Linux "ondemand" governor on keyword spotting (KWS) and YOLO-lite object detection workloads. With TETs randomly drawn from the 3.5–4.5 s range, the proposed controller reduces per-cycle energy by up to 10.6% for KWS and 7.3% for YOLO-lite while maintaining high TET satisfaction rates.
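To make the abstract's mechanism concrete, the following is a minimal, hypothetical sketch of a TET-conditioned controller: a compact time-aware state derived from the remaining TET budget, and an epsilon-greedy policy choosing a (cluster, performance point) pair for the next execution window. The function names, state discretization, and tabular Q-learning formulation are illustrative assumptions, not the thesis's actual implementation.

```python
import random

# Assumed action space: big.LITTLE cluster choice plus an index into that
# cluster's frequency/voltage table (operating performance points).
CLUSTERS = ["little", "big"]
OPP_INDICES = [0, 1, 2, 3]

def discretize_state(remaining_tet_s, elapsed_s, n_bins=8, tet_max_s=4.5):
    """Map the remaining TET slack into a small discrete state, so the
    policy is explicitly conditioned on the current TET budget and can
    adapt to different TET values without retraining (hypothetical scheme)."""
    slack = max(0.0, remaining_tet_s - elapsed_s)
    return min(n_bins - 1, int(n_bins * slack / tet_max_s))

def select_action(q_table, state, epsilon=0.1, rng=random):
    """Epsilon-greedy selection of (cluster, OPP index) for the next
    execution window; q_table maps (state, action) to a learned value
    reflecting per-window energy and TET satisfaction."""
    actions = [(c, f) for c in CLUSTERS for f in OPP_INDICES]
    if rng.random() < epsilon:
        return rng.choice(actions)
    return max(actions, key=lambda a: q_table.get((state, a), 0.0))
```

In a deployment like the one the abstract describes, the chosen action would be applied through user-space interfaces (e.g. CPU affinity and cpufreq), and the Q-values updated from measured per-window energy; those details are outside this sketch.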
Keywords
DVFS, Energy Efficiency, Edge Computing, Task Placement, Reinforcement Learning, Heterogeneous Computing
