What happens when using "iw dev <device> set bitrates ..." to set wireless transmit bitrates?
Hello,
I have a question about what happens when trying to set a wireless transmit bitrate with "iw dev <device> set bitrates".
Reading the "iw" documentation, it states that it works "by masking in the allowed bitrates, and also lets you clear the mask."
How does this bitrate masking happen? Does the iw tool tell the driver, via nl80211, which bitrates are allowed for transmission?
Are drivers required to implement this, or does it really depend on the driver, so that some of them may not support it?
I'm asking because, for testing purposes, I was able to set a bitrate of 6 Mbit/s while connected to a 5 GHz 802.11a access point, with "sudo iw dev wlan0 set bitrates legacy-5 6". The problem is that "iwconfig" still reported the bitrate as "54 Mbit/s", even though I could clearly see the effect of a much lower transmit bitrate (for instance when testing the goodput with iPerf, or when using ping with a large payload size to measure the latency), and this confuses me.
How can I be sure that the bitrate I set was really accepted by the kernel and by the driver?
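For reference, this is how I am currently trying to cross-check the applied rate (the interface name wlan0 is just my setup; I am assuming "iw dev <dev> link" reports the rate the kernel is actually using for the association, rather than a nominal maximum like "iwconfig" seems to):

```shell
# Current tx bitrate for the active link, as reported by nl80211
# (assumes wlan0 exists and is associated)
iw dev wlan0 link | grep 'tx bitrate'

# Per-peer view of the same information
iw dev wlan0 station dump | grep 'tx bitrate'
```

Is this the right way to verify it, or is there a more direct way to read back the configured bitrate mask?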
Moreover, trying to force a single bitrate of 11 Mbit/s with "legacy-2.4" over an 802.11n connection ("sudo iw dev wlan0 set bitrates legacy-2.4 11"), in order to emulate a basic-rate 802.11b connection, was completely unsuccessful. Even though the command produced no output (and no error), my request seemed to be silently ignored: I could still reach over 40 Mbit/s. Why was this happening? Was I doing something wrong with "iw"?
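The only workaround I can think of (a guess based on the "set bitrates" syntax shown by "iw help", not something I have verified) is that on an HT link the HT MCS mask may also need to be cleared explicitly, so the card cannot keep falling back to HT rates:

```shell
# Guess: restrict legacy rates to 11 Mbit/s AND pass ht-mcs-2.4 with an
# empty MCS index list, which (if I read the iw help correctly) clears
# the HT MCS mask. Unverified; the driver may reject or ignore this.
sudo iw dev wlan0 set bitrates legacy-2.4 11 ht-mcs-2.4
```

Is something along these lines required on 802.11n, or should restricting the legacy rates alone be enough?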
Thank you very much in advance.