Does 802.11a use up more battery than g on a laptop?

miston

Member
Dec 13, 2004
53
0
0
Since 802.11a transmits at a higher power, does it use significantly more battery than g?
 

amdskip

Lifer
Jan 6, 2001
22,530
13
81
You mean G has a faster speed? I really don't think the difference would be that noticeable. Not like you should be transferring gigs of data wirelessly all the time.
 

ktwebb

Platinum Member
Nov 20, 1999
2,488
1
0
Originally posted by: miston
Since 802.11a transmits at a higher power, does it use significantly more battery than g?


Where are you getting this information?
 

miston

Member
Dec 13, 2004
53
0
0
From a Tom's Hardware article
The "G" vs. "A" WLAN Conundrum
http://www.tomsnetworking.com/Sections-article101.php

There also is an ongoing debate about the transmission ranges of 802.11a and 802.11g. Because of physics, higher frequency RF signals (5GHz) don't transmit as far as lower frequency ones (2.4GHz). This assumes, however, that everything else is equal, which is not the case with 802.11 wireless LANs. 802.11a, for instance, operates at a higher transmit power and in the presence of less noise than 802.11g. In practice, this enables 802.11a solutions to often have greater range than 802.11g.
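To put a rough number on the "because of physics" part: free-space path loss grows with frequency, so at equal transmit power and antenna gain a 5GHz signal arrives weaker than a 2.4GHz one. Here's a quick back-of-the-envelope sketch; the 10m distance and the plain free-space formula are just illustrative assumptions, and real indoor propagation through walls is worse:

    import math

    def fspl_db(distance_m, freq_hz):
        # Friis free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    d = 10  # metres; a made-up apartment-scale distance
    for label, f in (("802.11g @ 2.4 GHz", 2.4e9), ("802.11a @ 5.2 GHz", 5.2e9)):
        print(f"{label}: {fspl_db(d, f):.1f} dB")
    # 5 GHz comes out roughly 6-7 dB worse at the same distance, which is why
    # "a" gear tends to need extra transmit power or antenna gain to match "g" range.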

 

SaintTigurius

Senior member
Apr 3, 2003
332
0
0
The article says nothing about battery life; I skimmed through it, unless I missed something.

"It transmits at higher power" does not mean it takes more battery power.


Even if it did, you would never notice it.
 

miston

Member
Dec 13, 2004
53
0
0
I didn't say it said anything about battery life, that's why I'm asking.

If it transmits at a higher power AND the frequency is higher, I would expect that it uses a bit more power.

The question is HOW much?

If you have a short battery life and you're considering whether to use 802.11a or g, then that SMALL amount, if it's even small, could mean a difference of 30 minutes.

I mean, I don't know, that's why I'm asking. I have an internal a/b/g card in my laptop and I'm deciding whether to use a or g. Since I have a small apartment, any range disadvantage "a" might have isn't relevant, so I would probably get better performance with "a".

 

JackMDS

Elite Member
Super Moderator
Oct 25, 1999
29,545
422
126
In this case the dominant part that consumes the power (i.e. the battery) is the RF output amplifier; it has nothing to do with the frequency (2.4GHz vs. 5GHz).

The typical entry-level transmitter in a wireless unit is rated at 30mW RF output. It does not matter if it is b, g, or whatever; 30mW takes about the same "juice" no matter what it transmits.

If you have a unit that puts out 100mW, it will take more battery power.
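To put rough numbers on that, here is a sketch where every figure is an assumed ballpark rather than a spec: ~15% amplifier efficiency, a ~50Wh battery, and a laptop averaging ~20W:

    # All numbers below are assumptions for illustration, not measured values.
    PA_EFFICIENCY = 0.15   # assumed RF power-amplifier efficiency (10-20% ballpark)
    LAPTOP_DRAW_W = 20.0   # assumed average total laptop draw while in use
    BATTERY_WH = 50.0      # assumed battery capacity

    for rf_out_mw in (30, 100):
        dc_draw_w = (rf_out_mw / 1000.0) / PA_EFFICIENCY   # DC power the amplifier pulls
        runtime_h = BATTERY_WH / (LAPTOP_DRAW_W + dc_draw_w)
        print(f"{rf_out_mw} mW out -> ~{dc_draw_w:.2f} W drawn, ~{runtime_h:.2f} h runtime")
    # The 30mW vs. 100mW difference works out to only a few minutes of runtime,
    # and the amplifier only draws that while the card is actually transmitting.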

:sun:
 

ktwebb

Platinum Member
Nov 20, 1999
2,488
1
0
The reason I asked is that, as a blanket statement, the paragraph you referenced is false. Both will push anywhere from 30-50 mW, typical for a budget client device, up to 200 mW. So one .11a card may transmit at a higher power than a .11g card, but that is not a set-in-stone fact. The opposite may be true if you bring other vendors into play. For instance, Proxim's A/B/G card maxes out at 60 mW for both A and G, while allowing up to 83 mW if you run 802.11b modulation.

Mileage is going to vary from brand to brand, but by default .11a gear does not necessarily run higher transmit power than .11b or g. I'd recommend double-checking Tom's "facts" in his stories. Any website, frankly, but Tom's Hardware seems to do less fact-finding (accurate fact-finding, anyway) than most of the big-name hardware sites.