Paper ID: 2401.15767

A Centralized Reinforcement Learning Framework for Adaptive Clustering with Low Control Overhead in IoT Networks

F. Fernando Jurado-Lasso, J. F. Jurado, Xenofon Fafoutis

Wireless Sensor Networks (WSNs) play a pivotal role in providing Internet of Things (IoT) devices with sensing and actuation capabilities. Operating in remote and resource-constrained environments, these IoT devices face challenges related to energy consumption, which is crucial for network longevity. Clustering protocols have emerged as an effective solution to alleviate energy burdens on IoT devices. This paper introduces the Low-Energy Adaptive Clustering Hierarchy with Reinforcement Learning-based Controller (LEACH-RLC), a novel clustering protocol that employs Mixed Integer Linear Programming (MILP) for strategic selection of cluster heads (CHs) and node-to-cluster assignments. Additionally, it integrates a Reinforcement Learning (RL) agent that minimizes control overhead by learning the optimal timing for generating new clusters. Addressing key research questions, LEACH-RLC seeks to reduce control overhead without compromising overall network performance. Through extensive simulations, this paper investigates the frequency and opportune moments for generating new clustering solutions. Results demonstrate the superior performance of LEACH-RLC over conventional LEACH and LEACH-C, showcasing longer network lifetime, lower average energy consumption, and reduced control overhead. The proposed protocol advances the efficiency and adaptability of WSNs, addressing critical challenges in IoT deployments.
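To illustrate the kind of controller the abstract describes, the sketch below shows a minimal tabular Q-learning agent that decides, round by round, whether to keep the current clusters or request a new clustering solution (which costs control overhead). The state features (rounds since the last reclustering, a normalized energy spread), the reward shape, the hyperparameters, and the toy environment are illustrative assumptions for this sketch; they are not the exact state, reward, or learning design used by LEACH-RLC.

```python
import random
from collections import defaultdict

# Hypothetical action set: 0 = keep the current clustering,
# 1 = trigger a new clustering solution (incurs control overhead).
ACTIONS = (0, 1)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # illustrative hyperparameters

# state -> Q-values for both actions
q_table = defaultdict(lambda: [0.0, 0.0])


def encode_state(rounds_since_recluster: int, energy_spread: float) -> tuple:
    """Discretize two plausible state features into a small table key."""
    return (min(rounds_since_recluster // 5, 10), int(energy_spread * 10))


def choose_action(state: tuple) -> int:
    """Epsilon-greedy action selection over the tabular Q-values."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    q = q_table[state]
    return 0 if q[0] >= q[1] else 1


def update(state, action, reward, next_state):
    """One tabular Q-learning update step."""
    best_next = max(q_table[next_state])
    q_table[state][action] += ALPHA * (
        reward + GAMMA * best_next - q_table[state][action]
    )


def simulate(num_rounds: int = 2000):
    """Toy rollout: reclustering balances energy but costs overhead."""
    rounds_since, energy_spread = 0, 0.0
    for _ in range(num_rounds):
        state = encode_state(rounds_since, energy_spread)
        action = choose_action(state)
        if action == 1:
            # New clustering solution: pay a control-overhead penalty,
            # reset the energy imbalance.
            reward = -1.0
            rounds_since, energy_spread = 0, 0.0
        else:
            # Keep clusters: imbalance grows, degrading network lifetime.
            energy_spread = min(energy_spread + random.uniform(0.0, 0.05), 1.0)
            rounds_since += 1
            reward = -energy_spread
        next_state = encode_state(rounds_since, energy_spread)
        update(state, action, reward, next_state)


if __name__ == "__main__":
    simulate()
    print("Sample learned Q-values:", dict(list(q_table.items())[:5]))
```

Under these assumptions, the agent learns how long it can defer reclustering before the accumulated energy imbalance outweighs the control-message cost of computing and disseminating a new MILP clustering solution, which mirrors the overhead/performance trade-off the abstract highlights.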

Submitted: Jan 28, 2024