The IEEE 802.11 standard supports multiple transmission bit rates by using different modulation and coding schemes. Since the rates differ in their bit error characteristics and transmission efficiencies, stations may benefit from selecting among them adaptively as the channel condition varies, a practice called rate adaptation. The accuracy of rate adaptation is expected to be strongly affected by the time-varying nature of typical radio channels, which stems from multipath fading. This paper presents an analytic model of the IEEE 802.11 distributed coordination function (DCF) combined with the automatic rate fallback (ARF) rate adaptation algorithm, the most widely deployed algorithm in the 802.11 market, under time-correlated Rayleigh fading. The key idea behind the approach is to exploit a first-order Markovian approximation of Rayleigh fading channels, based on which transmission failure probabilities depending on the current and previous transmission status are obtained. Using these probabilities, the ARF process of a station is modeled as a Markov chain, and the rate distribution obtained by solving this chain is then fed into a DCF model. The proposed DCF model is formulated on a per-station basis and thus enables the analysis of heterogeneous channel conditions and medium access control (MAC) configurations among stations.
The physical layer (PHY) of IEEE 802.11 supports multiple transmission rates by using different modulation and coding schemes, e.g., the four rates of 802.11b (1, 2, 5.5 and 11 Mbps) [1]. Since each PHY rate has its own bit error characteristics, there usually exists a single best rate for a given channel condition, namely the one that maximizes throughput. Therefore, wireless stations are encouraged to perform rate adaptation, by which each station adaptively selects the best PHY rate according to its channel quality. To achieve this goal, a rate adaptation algorithm needs to specify two basic mechanisms: (1) how to estimate the current channel quality, and (2) when and how to change the rate [2]. Although rate adaptation is left unspecified by the 802.11 standards, it plays a critical role in the system performance of 802.11 WLANs [3].
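To make the notion of a single best rate concrete, the following sketch (not taken from the paper; the per-rate packet error rates are assumed values chosen purely for illustration) selects, among the 802.11b rates, the rate that maximizes the expected goodput, rate * (1 - PER), ignoring MAC/PHY overheads.

# Hypothetical packet error rates (PER) for the 802.11b rates (in Mbps) at one
# fixed channel quality; the numbers are assumptions for illustration only.
per = {1.0: 0.01, 2.0: 0.02, 5.5: 0.10, 11.0: 0.60}

def best_rate(per_by_rate):
    # Expected goodput of a rate is rate * (1 - PER); MAC/PHY overheads are ignored.
    return max(per_by_rate, key=lambda r: r * (1.0 - per_by_rate[r]))

print(best_rate(per))  # 5.5 Mbps: 11 Mbps loses too many frames here, 1 and 2 Mbps are too slow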
The automatic rate fallback (ARF) algorithm [4] is a simple rate adaptation algorithm, originally developed for Lucent Technologies’ WaveLAN-II WLAN devices. ARF estimates the channel quality from the outcomes of past transmission attempts, i.e., a certain number of consecutive transmission successes (failures) is taken to indicate an improved (degraded) channel quality. According to the estimated channel quality, it changes the rate to the next higher or next lower one. Owing to its simple behavior and wide acceptance in the market, ARF has become the basis of many other rate adaptation proposals [2], [5], [6], [7], [8], [9], [10], [11] and [12].
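A minimal sketch of this counting logic is given below; the thresholds (10 consecutive successes to step up, 2 consecutive failures to fall back) are commonly cited values and should be read as assumptions, and the timer and probing transmission used by the original ARF are omitted for brevity.

RATES = [1.0, 2.0, 5.5, 11.0]   # 802.11b PHY rates in Mbps
N_UP, N_DOWN = 10, 2            # assumed success/failure thresholds

class ArfState:
    def __init__(self):
        self.idx = 0            # start at the lowest rate
        self.succ = 0           # consecutive successes
        self.fail = 0           # consecutive failures

    def on_tx_result(self, success):
        if success:
            self.succ += 1
            self.fail = 0
            if self.succ >= N_UP and self.idx < len(RATES) - 1:
                self.idx += 1   # step up to the next higher rate
                self.succ = 0
        else:
            self.fail += 1
            self.succ = 0
            if self.fail >= N_DOWN and self.idx > 0:
                self.idx -= 1   # fall back to the next lower rate
                self.fail = 0
        return RATES[self.idx]  # rate to use for the next transmission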
It is well known that the channel quality may fluctuate as the received signal strength varies over a short period of time or a short travel distance, a phenomenon called small-scale fading [13]. Small-scale fading is caused by interference between two or more versions of the transmitted signal, called multipath waves, which traverse different paths. If the transmitting or receiving station, or surrounding objects, are in motion, a Doppler shift is induced on the multipath waves, and consequently the received signal strength becomes time varying. The time correlation of such a time-varying channel characterizes how fast the channel response changes and is directly determined by the Doppler shift (a large Doppler shift corresponds to low correlation). The Rayleigh distribution is commonly used to describe the statistical time-varying nature of the received signal envelope under small-scale fading [13].
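As an illustration of such time correlation, the sketch below generates a time-correlated Rayleigh envelope from a first-order autoregressive complex Gaussian process whose one-step correlation is set to the Jakes autocorrelation J0(2*pi*fd*Ts); this is an illustrative construction under the stated assumptions, not the paper's channel model, and the Doppler shift fd and sample period Ts are arbitrary example values.

import numpy as np
from scipy.special import j0

fd, Ts, n = 10.0, 1e-3, 10000             # Doppler shift (Hz), sample period (s), number of samples
rho = j0(2 * np.pi * fd * Ts)             # correlation between consecutive channel samples

rng = np.random.default_rng(0)
g = np.empty(n, dtype=complex)
g[0] = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
for k in range(1, n):
    w = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
    g[k] = rho * g[k - 1] + np.sqrt(1 - rho**2) * w   # unit-power AR(1) complex Gaussian

envelope = np.abs(g)   # Rayleigh-distributed received envelope
# A larger fd lowers rho, i.e., the channel decorrelates faster between transmissions.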
This time-varying nature of small-scale fading is closely related to the reactive behavior of rate adaptation algorithms. In particular, when a rate adaptation algorithm relies on the statistics of past channel estimation results (as ARF does), the accuracy of its decisions is affected by the time correlation of the fading channel. It is therefore essential to take small-scale fading into account when evaluating the performance of rate adaptation in WLANs.
This paper has presented an elaborate mathematical model of the ARF rate adaptation algorithm combined with the IEEE 802.11 DCF under time-correlated Rayleigh fading. Under a first-order Markovian approximation of Rayleigh fading channels, the ARF algorithm was modeled as a Markov chain that accounts for correlated channel errors. The rate distribution obtained from this Markov chain was then fed into the proposed per-station DCF model, which accommodates heterogeneous conditions among stations. Validation through ns-2 simulation has shown that the proposed model accurately predicts system performance, and that Rayleigh fading strongly affects ARF performance.
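For concreteness, the sketch below shows the generic final step of such an approach, solving a Markov chain for its stationary distribution; the 3-state transition matrix is a hypothetical stand-in and not the paper's actual ARF chain.

import numpy as np

# Hypothetical transition matrix P of a 3-state chain, rows summing to 1.
P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.4, 0.6]])

# Solve pi P = pi with sum(pi) = 1 by replacing one balance equation
# with the normalization constraint.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)   # stationary probabilities; in the model these yield the per-rate usage shares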
An ARF model under non-saturated traffic conditions and a new rate adaptation algorithm that opportunistically exploits time-varying channel quality are topics for further research.