A sensing policy based on confidence bounds and a restless multi-armed bandit model

Author(s): Oksanen, Jan; Koivunen, Visa; Poor, H. Vincent

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1x77k
Abstract: A sensing policy for the restless multi-armed bandit problem with stationary but unknown reward distributions is proposed. The work is presented in the context of cognitive radios, in which the bandit problem arises when deciding which parts of the spectrum to sense and exploit. It is shown that the proposed policy attains an asymptotically logarithmic weak regret rate when the rewards are bounded and either independent and identically distributed or finite-state Markovian. Simulation results verifying uniformly logarithmic weak regret are also presented. The proposed policy is a centrally coordinated index policy in which the index of a frequency band consists of a sample-mean term and a confidence term. The sample-mean term promotes spectrum exploitation, whereas the confidence term encourages exploration. The confidence term is designed such that the time interval between consecutive sensing instances of any suboptimal band grows exponentially, and this exponential growth leads to logarithmically growing weak regret. Simulation results demonstrate that the proposed policy performs better than other similar methods in the literature.
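To make the index structure described in the abstract concrete, below is a minimal sketch of a generic index policy of the form "sample mean + confidence term" applied to band selection. It is not the paper's policy: the paper designs its own confidence term so that the interval between sensings of a suboptimal band grows exponentially, whereas this sketch substitutes the classical UCB1 confidence term sqrt(2 ln t / n_i) as a stand-in. The function names (select_band, run_policy) and the Bernoulli idle/busy reward model are illustrative assumptions.

import math
import random

def select_band(sample_means, counts, t):
    """Pick the band with the largest index = sample mean + confidence term.

    The confidence term here is the classical UCB1 bound sqrt(2 ln t / n_i);
    the paper uses a differently designed confidence term.
    """
    # Sense every band once before relying on the index.
    for i, n in enumerate(counts):
        if n == 0:
            return i
    indices = [m + math.sqrt(2.0 * math.log(t) / n)
               for m, n in zip(sample_means, counts)]
    return max(range(len(indices)), key=indices.__getitem__)

def run_policy(band_means, horizon):
    """Simulate the index policy on i.i.d. Bernoulli rewards (band idle/busy)."""
    k = len(band_means)
    counts = [0] * k          # number of times each band has been sensed
    sample_means = [0.0] * k  # exploitation term: empirical mean reward
    for t in range(1, horizon + 1):
        band = select_band(sample_means, counts, t)
        reward = 1.0 if random.random() < band_means[band] else 0.0
        counts[band] += 1
        # Incremental update of the sample-mean term.
        sample_means[band] += (reward - sample_means[band]) / counts[band]
    return counts, sample_means

if __name__ == "__main__":
    counts, means = run_policy([0.9, 0.6, 0.3], horizon=10_000)
    print("sensing counts per band:", counts)
    print("estimated idle probabilities:", [round(m, 3) for m in means])

Under this kind of index, suboptimal bands are still sensed occasionally (the confidence term grows while a band is ignored), but ever more rarely, which is the mechanism behind the logarithmic weak regret discussed above.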
Publication Date: Nov-2012
Electronic Publication Date: 28-Mar-2013
Citation: Oksanen, Jan, Visa Koivunen, and H. Vincent Poor. "A sensing policy based on confidence bounds and a restless multi-armed bandit model." In 2012 Conference Record of the Forty Sixth Asilomar Conference on Signals, Systems and Computers (ASILOMAR), (2012): 318-323. doi:10.1109/ACSSC.2012.6489015
DOI: 10.1109/ACSSC.2012.6489015
ISSN: 1058-6393
EISSN: 1058-6393
Pages: 318 - 323
Type of Material: Conference Article
Journal/Proceeding Title: 2012 Conference Record of the Forty Sixth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.