Linux kernel 2.6.22.15, HostAP driver, a Prism3-chipset PCMCIA adapter, and a really old laptop (P3 something or other, doesn't matter). Implemented ARF, AARF and SampleRate and compared their power consumption.
Results:
This proved the claim Bicket makes in his thesis: lower bit rates don't necessarily give better delivery probabilities. Lower bit rates in fact consume more power. Intuitively, they take longer to transmit, so they keep the Tx circuit alive for longer. Given a flaky medium, these lower bit rates are major power hogs because of the packet retries involved.
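To make the airtime intuition concrete, here's a rough back-of-the-envelope sketch. All the numbers in it (Tx power draw, delivery probabilities) are made up for illustration, not taken from my measurements:

```c
/* Rough energy-per-delivered-frame estimate. Assumes a fixed Tx power
 * draw and geometric retries, i.e. expected attempts = 1/P(delivery).
 * Every constant below is illustrative, not measured. */
#include <stdio.h>

int main(void)
{
    const double frame_bits = 1500 * 8;   /* 1500-byte frame */
    const double p_tx_watts = 1.5;        /* assumed radio Tx power draw */

    const double rates_mbps[] = { 1.0, 2.0, 5.5, 11.0 };    /* 802.11b rates */
    const double p_delivery[] = { 0.90, 0.85, 0.80, 0.70 }; /* assumed */

    for (int i = 0; i < 4; i++) {
        double airtime_s = frame_bits / (rates_mbps[i] * 1e6);
        double attempts  = 1.0 / p_delivery[i];
        double energy_mj = p_tx_watts * airtime_s * attempts * 1e3;
        printf("%5.1f Mbit/s: %6.2f ms airtime, %5.2f mJ/delivered frame\n",
               rates_mbps[i], airtime_s * 1e3, energy_mj);
    }
    return 0;
}
```

Even with the worst delivery probability, 11 Mbit/s comes out far cheaper per delivered frame than 1 Mbit/s, simply because it holds the Tx circuit on for a tenth of the time. The graphs below show the measured results.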

This is with WiFi off, just for comparison: ~0.7V drop.

This is AARF: ~1.5V drop (ARF looks similar).

This is SampleRate: ~1.0V drop.
Basically, SampleRate looks at a 10-second history and picks the bit rate whose average per-packet transmission time is lowest, switching away from the current rate whenever another rate beats it on that metric. In short, it has an intelligent way of switching. (Read the thesis if you're really interested, or ask me :p)
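For the curious, here's a minimal sketch of the selection step. This is my own simplified C, not the actual driver code, and it assumes the per-rate averages over the 10-second window have already been computed elsewhere:

```c
/* SampleRate-style rate pick: choose the bit rate with the lowest
 * average per-packet transmission time (retries included) over the
 * last 10 seconds. Struct layout and names are mine, for illustration. */
#include <stddef.h>

struct rate_stats {
    double rate_mbps;       /* the bit rate itself */
    double avg_tx_time_us;  /* avg airtime per delivered packet, incl. retries */
    int    tried;           /* nonzero if we have samples in the window */
};

static size_t pick_rate(const struct rate_stats *stats, size_t n, size_t current)
{
    /* Assume the current rate has been tried, so it is a valid baseline. */
    size_t best = current;

    for (size_t i = 0; i < n; i++) {
        if (!stats[i].tried)
            continue;   /* real SampleRate probes these separately */
        if (stats[i].avg_tx_time_us < stats[best].avg_tx_time_us)
            best = i;
    }
    return best;
}
```

The clever bit is the probing: roughly every 10th packet goes out at a candidate rate, but only candidates whose best-case (lossless) transmission time could beat the current average, so no airtime is wasted on rates that can't possibly win.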
This actually matters quite a bit. Think embedded systems: every device out there has WiFi now, and power-saving features directly affect how usable those devices are. My guess is that the differences in voltage drops will be even more significant on that kind of hardware.
More later ..