Frequency and maximum concurrent data throughput

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
First I want to admit that I have absolutely no clue about this. I tried google but that wasn't really helpful.

Is there a way to determine the maximum possible data throughput for a certain frequency?

E.g. 2,100MHz (cell phone 3G).

What is the maximum total data throughput possible for 1 user on 2100MHz?
What is the maximum number of concurrent users (e.g. for one cell tower) and at what data rate?
What is the impact of changing frequencies on the above? (e.g. 500MHz vs. 5000MHz)

I have no clue and want to learn ...
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Look up Shannon's information theory for some background.

There are quite a few factors that come into play when transmitting information over the radio. The limiting factors are, as best I can remember right now:
1. Bandwidth used
2. Signal to noise ratio

Number 2 is affected by the absolute signal strength and the noise at the receiver.

Imagine data being transmitted using two variables to encode: amplitude and phase. If you plot amplitude and phase on a pair of axes, you can pick a number of points within that plane. The more points you can use, the more data can be transmitted in a given amount of bandwidth. However, there is error on each point based on the signal-to-noise ratio, so the points have to be spaced far enough apart that the error around each point does not overlap with its neighbors. There are also practicalities, such as the non-linearity of the radio amplifier, that prevent using points in the corners of the phase-amplitude plane, so the constellation must be restricted to a circular area.

Since the amount of information that can be transmitted increases with the signal quality (signal-to-noise ratio), increasing signal power is a good way to transmit more data. WiMax, for example, uses beamforming (Wiki it) to increase signal strength at the receiver without transmitting more power, by pointing the radio signal at the receiver.

Anyway, that's what I've got. My boss has a master's in RF and communications and worked at Motorola for 20 years, so he's my source for most of this; I haven't studied it much myself, so I might be a little off on some of the details and terminology.
 

esun

Platinum Member
Nov 12, 2001
2,214
0
0
This cannot be summarized well in a single post. If you want the real answers to those questions, you'd probably need a graduate degree in electrical engineering with a focus on digital communications.

Having said that, bobsmith's explanation is a good qualitative summary of the issues. In general, a communications system works as a chain of blocks that perform specific functions. You typically have some type of data encoding (usually involving forward error correction, i.e., you transmit extra symbols so that the receiver can use them to correct errors in the data), then a modulation scheme, after which you send the data over a channel. The channel will add some noise (what type of noise depends on the channel). On the receiving end, you demodulate the signal and decode the original bits from the symbols coming out of the demodulator. Note that here we're talking about what's usually called channel encoding/decoding (i.e., coding used for error correction), which is distinct from source encoding/decoding, a term which typically refers to compression schemes (so you have to send less data over the channel).

Now, if you know the properties of just the channel, you can actually define something called the channel capacity. I'll link the Wikipedia article here:

http://en.wikipedia.org/wiki/Channel_capacity

The idea behind this is that given certain noise characteristics of a channel, you can still transmit data with arbitrarily small error at a non-zero rate. This result isn't obvious: with a noisy channel, there's no apparent reason you should be able to guarantee arbitrarily small errors at a non-zero rate (if you could tolerate rates approaching zero it would be trivial, since you could just repeat yourself all the time, but then the channel would be useless). Conversely, from a theoretical perspective, you can't beat the channel capacity, and this is true for any modulation/coding scheme. The idea is that given an ideal modulation/coding scheme, you could theoretically achieve the channel capacity; with modern schemes we can get pretty close.
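The "just repeat yourself" idea can be sketched as a toy repetition code over a binary symmetric channel (the 20% flip probability and trial counts are arbitrary illustrative figures, not from any real system):

```python
import random

random.seed(1)

def transmit(bit, flip_prob):
    """Binary symmetric channel: the sent bit flips with flip_prob."""
    return bit ^ (random.random() < flip_prob)

def repetition_decode(bit, n, flip_prob):
    """Send the bit n times, decode by majority vote."""
    received = [transmit(bit, flip_prob) for _ in range(n)]
    return sum(received) > n // 2

def error_rate(n, flip_prob=0.2, trials=20_000):
    """Fraction of trials where the majority vote decodes the bit wrong."""
    errors = sum(repetition_decode(1, n, flip_prob) != 1
                 for _ in range(trials))
    return errors / trials

# More repetition drives the error probability down, but the code rate
# (1/n) heads toward zero; Shannon says you can avoid that collapse.
for n in (1, 3, 9):
    print(n, error_rate(n))
```

The repetition code makes errors rarer but wastes the channel; the whole point of capacity is that smarter codes get small error rates without the rate going to zero.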

What bobsmith is talking about is mostly modulation schemes. So when you send symbols over a channel, you have to choose some modulation scheme. A very simple modulation scheme is BPSK. The Wikipedia article for phase shift keying (PSK) is

http://en.wikipedia.org/wiki/Phase-shift_keying

If you scroll down to BPSK, you can see a constellation of two points. If you imagine a sequence of bits, you can modulate it into an analog signal by taking a sinusoid and shifting its phase by 180 degrees whenever the bit value changes from 0 to 1 or vice versa. This scheme is relatively robust because you only have to pick between two points. One way to think of it: at the output you have some voltage. If that voltage is +1, you can assume the bit is a 1; if it's -1, the bit is a 0. What can happen, though, is that noise gets added when the modulated signal is transmitted. This means that when you demodulate, rather than just having some value x (which should be +1 or -1), you have x plus some noise n, or x + n.

Now we could devise some simple decision scheme given that we have noise: if x + n > 0 we decide the bit is a 1, and if x + n < 0 we decide it's a 0. However, let's say x = -1 but n = +1.5. Since x = -1 the bit should be a 0, but since x + n = 0.5 > 0, we actually decide the bit is a 1. This is an error. Very often you'll hear about something called BER, or bit error rate. Basically this is a number indicating the rate at which bit errors occur (it's often something like 1/10,000 or 1/100,000).
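That decision rule is easy to simulate. This sketch (a toy model, not any real system) sends random BPSK symbols through additive Gaussian noise and counts threshold-decision errors:

```python
import math
import random

random.seed(0)

def bpsk_ber(snr_db, n_bits=100_000):
    """Monte Carlo bit error rate for BPSK over an AWGN channel."""
    # Noise standard deviation for unit-energy symbols at the given SNR.
    sigma = math.sqrt(1 / (2 * 10 ** (snr_db / 10)))
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        x = 1.0 if bit else -1.0           # modulate: 1 -> +1, 0 -> -1
        y = x + random.gauss(0.0, sigma)   # the channel adds noise n
        decided = 1 if y > 0 else 0        # threshold decision at zero
        errors += decided != bit
    return errors / n_bits

# More noise (lower SNR) means fatter Gaussians and a higher error rate.
print(bpsk_ber(0))   # around 0.08
print(bpsk_ber(8))   # much rarer errors at higher SNR
```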

If you take another look at the BPSK constellation, think about placing a 2D Gaussian at each of the constellation points. If you're having trouble imagining a 2D Gaussian, it's something like this:

http://en.wikipedia.org/wiki/File:Gaussian_2d.png

So basically a little hill around each constellation point. This represents the ideal received value plus some noise. The more noise we add, the fatter these Gaussians are; the less noise we have, the sharper they are. Note that the more these Gaussians overlap, the higher our probability of making an error. So you can see that noise, by spreading the Gaussians out, causes more errors. Using a higher-order modulation scheme (such as QPSK, shown on the same Wikipedia page as BPSK) allows us to send more bits per symbol (2 bits per symbol versus 1 bit per symbol for BPSK), but the constellation points are also closer together. Thus, we incur more errors as a trade-off for a higher data rate. Take a look at 16-QAM:

http://en.wikipedia.org/wiki/File:16QAM_Gray_Coded.svg

You can see there are lots of constellation points and they are much closer together than in BPSK. Again you get an increase in data rate but less robustness to noise. Now, one way to combat noise is to increase the power of the signal. What this effectively does is shift all of the constellation points radially away from the origin, spacing them further apart. Naturally this means that the Gaussians at each point will overlap less, and thus our error probability goes down.
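One way to see the trade-off concretely: normalize each constellation to the same average power and compare the minimum spacing between points. This sketch uses the standard square BPSK/QPSK/16-QAM layouts:

```python
import itertools
import math

def min_distance_at_unit_power(points):
    """Scale a constellation to unit average power, then return the
    minimum distance between any two of its points."""
    avg_power = sum(abs(p) ** 2 for p in points) / len(points)
    scale = 1 / math.sqrt(avg_power)
    scaled = [p * scale for p in points]
    return min(abs(a - b) for a, b in itertools.combinations(scaled, 2))

bpsk = [complex(-1, 0), complex(1, 0)]                    # 1 bit/symbol
qpsk = [complex(i, q) for i in (-1, 1) for q in (-1, 1)]  # 2 bits/symbol
qam16 = [complex(i, q) for i in (-3, -1, 1, 3)
         for q in (-3, -1, 1, 3)]                         # 4 bits/symbol

# At equal average power, more bits per symbol means tighter spacing,
# so the same amount of noise causes more decision errors.
for name, c in (("BPSK", bpsk), ("QPSK", qpsk), ("16-QAM", qam16)):
    print(name, round(min_distance_at_unit_power(c), 3))
```

At unit power, BPSK points sit 2.0 apart while 16-QAM points sit only about 0.63 apart, which is exactly the robustness penalty described above.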

So that covers the idea behind modulation. On top of that you have channel coding, which I mentioned previously as a form of error correction. There are many channel coding schemes. Some very popular ones include convolutional coding and Viterbi decoding, Turbo codes, and LDPC codes. Describing these would take tons of time, so just look them up on Wikipedia if you're interested.

The gist of it, though, is that you combine channel coding and modulation to achieve the best data rate you can given your channel characteristics (noise, bandwidth, etc.). This means your questions cannot be easily answered. 1 user on 2100 MHz could have infinite throughput on a noiseless channel. The maximum number of concurrent users for a communications scheme with 0 data rate is infinite. Changing frequencies doesn't necessarily have an impact on the throughput.
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?



 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

For the non-readers:

You can't say for sure. There are too many variables. As technology gets better, more users and higher speed are possible. Read the above for more.
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
Originally posted by: bobsmith1492
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

For the non-readers:

You can't say for sure. There are too many variables. As technology gets better, more users and higher speed are possible. Read the above for more.

And yet ... a company like at&t would not pay billions for frequency licenses without having some basic concept like:

20MHz in NYC = 200,000 concurrent active connections at 5Mbps
10% of subscribers are active
=> 2,000,000 average subscribers possible

2mm x $20/month x 12 months = $480mm/year

10 years use = $4.8bn in revenues
less: infrastructure and operating cost (e.g. $4bn)

=> $800mm gross profit ... so they are willing to pay $500mm for it for a $300mm net.
 

esun

Platinum Member
Nov 12, 2001
2,214
0
0
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

Here's the issue. If you were talking about WiMax versus WiFi versus GSM versus CDMA versus [name another wireless scheme], it would be different. Different power requirements, different coding schemes, different interference specs, etc. Those all affect the answer to your question.
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
Originally posted by: esun
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

Here's the issue. If you were talking about WiMax versus WiFi versus GSM versus CDMA versus [name another wireless scheme], it would be different. Different power requirements, different coding schemes, different interference specs, etc. Those all affect the answer to your question.

Let's assume WiMax, GSM or CDMA ... whichever provides the highest possible number of "concurrent" connections/throughput.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: coolVariable
Originally posted by: esun
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

Here's the issue. If you were talking about WiMax versus WiFi versus GSM versus CDMA versus [name another wireless scheme], it would be different. Different power requirements, different coding schemes, different interference specs, etc. Those all affect the answer to your question.

Let's assume WiMax, GSM or CDMA ... whichever provides the highest possible number of "concurrent" connections/throughput.

My uber Google skills show that for WiMax, each channel is 20 MHz wide, with each channel offering a theoretical maximum of 96 Mbit/sec.
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
Originally posted by: TuxDave
Originally posted by: coolVariable
Originally posted by: esun
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

Here's the issue. If you were talking about WiMax versus WiFi versus GSM versus CDMA versus [name another wireless scheme], it would be different. Different power requirements, different coding schemes, different interference specs, etc. Those all affect the answer to your question.

Let's assume WiMax, GSM or CDMA ... whichever provides the highest possible number of "concurrent" connections/throughput.

My uber Google skills show that for WiMax, each channel is 20 MHz wide, with each channel offering a theoretical maximum of 96 Mbit/sec.

I don't think that is correct in the way you describe it ... unless you think that a wireless co. is really satisfied with a tech that can provide approx 10 Mbit/s to 10 customers.
I would assume a single WiMax "cell" covers between 1 and 25 square miles ... NYC is 305 square miles, which would only allow 3,050 concurrent WiMax connections.
NO WAY!
 

Jedi2155

Member
Sep 16, 2003
47
0
0
That's assuming it only has a single 20 MHz slice and that there is only one WiMax cell in that area. I would presume there would be multiple "cells", probably channel-hopping across multiple frequencies, which essentially allows multiple sets of 96 Mbit of bandwidth. My guess is that a WiMax cell wouldn't be feasible unless it had at least 1000+ Mbit of bandwidth.

http://en.wikipedia.org/wiki/Frequency_hopping
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
bobsmith1492 pointed you in the right direction earlier. Data rate depends on channel bandwidth and SNR. The Shannon-Hartley theorem states that R = BW*log2(1 + SNR), where R is the data rate (bits/s), BW is the bandwidth (in hertz), and SNR is the signal-to-noise ratio (note that the log is base 2, not base 10).

So if you were able to keep pumping the broadcast power up toward infinity, there'd be no limit to how much data you could send. If you can find some typical numbers for the SNR of cellular data networks, just plug them in along with your channel bandwidth (20 MHz) and you'll get the theoretical maximum throughput. Of course, there's overhead from error correction and other factors that won't let you get anywhere near channel capacity in practice.
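As a sketch of that plug-in, here's the formula with an assumed 15 dB SNR (an illustrative figure, not a measured cellular value):

```python
import math

# The plug-in suggested above. The 15 dB SNR is an assumed figure for
# illustration, not a measured cellular value.
bandwidth_hz = 20e6                  # the 20 MHz channel in question
snr = 10 ** (15 / 10)                # 15 dB as a linear power ratio (~31.6)

capacity_bps = bandwidth_hz * math.log2(1 + snr)
print(capacity_bps / 1e6)            # ~100 Mbit/s theoretical ceiling
```

Real systems land well below this ceiling once coding overhead, interference, and multiple users are accounted for.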

edit: Looks like esun linked you directly to Shannon-Hartley, and provides a ton of good info in his post.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: coolVariable
I don't think that is correct in the way you describe it ... unless you think that a wireless co. is really satisfied with a tech that can provide approx 10Mbit/s to 10 customers.
I would assume a single wimax "cell" is between 1 and 25 square miles ... NYC is 305 squ miles which would only allow 3050 concurrent wimax connections.
NO WAY!

No, what I said was correct. Of course common sense would need to kick in, and a wireless company would in no way buy 20 MHz of frequency space just to stick in 1 channel of WiMAX. I would assume at minimum they would buy a 400 MHz frequency band and thus get to stick in 20 channels of WiMAX for 1,920 Mbit/sec per cell.

 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Originally posted by: TuxDave
Originally posted by: coolVariable
I don't think that is correct in the way you describe it ... unless you think that a wireless co. is really satisfied with a tech that can provide approx 10Mbit/s to 10 customers.
I would assume a single wimax "cell" is between 1 and 25 square miles ... NYC is 305 squ miles which would only allow 3050 concurrent wimax connections.
NO WAY!

No, what I said was correct. Of course common sense would need to kick in, and a wireless company would in no way buy 20 MHz of frequency space just to stick in 1 channel of WiMAX. I would assume at minimum they would buy a 400 MHz frequency band and thus get to stick in 20 channels of WiMAX for 1,920 Mbit/sec per cell.

Since it is directional, theoretically you could have multiple transmitters overlapping in an area and using the same channel. All you need is for the desired signal on the frequency to be sufficiently higher in amplitude than the undesired signals.

Anyway, at 2.4 GHz there is only about 100 MHz of bandwidth to use... I'd guess that 5.8 GHz would be more useful, but then wall and building penetration would be horrible. I'm not really sure how WiMax is supposed to work out!
 
Jul 18, 2009
122
0
0
Originally posted by: bobsmith1492
All you need is for the desired signal on the frequency to be sufficiently higher in amplitude than the undesired signals.

Actually, you don't even need that. You can transmit information with a signal that is significantly less powerful than the noise on the same channel, assuming your error correction is sufficiently good.

GPS is a good example of this. The transmission power of each antenna in orbit is a paltry 14 watts, which gets spread over nearly half the total surface of the Earth. I'm not positive how wide the transmission channels are, but I know the current transmission rate is 50 bits per second (yes, you read that right: 50 bits per second!), which is enough information to start plugging things into the Shannon-Hartley equation:

Effective transmission rate = channel bandwidth (in Hz) * log2(1 + signal power / noise power (assuming the noise is normally distributed))

Which is:

50 bits per second = bandwidth * log2(1 + 14 watts / total noise)

According to Wikipedia, GPS uses L-band spectrum, and satellite L-band allocations are usually about 1700 kHz "wide," so:

50 bits per second = 1700 kHz * log2(1 + 14 watts / total noise)

50 bps / 1,700,000 cycles per second = 1/34,000 = log2(1 + 14 watts / total noise)

Raise 2 to the power of both sides:

2^(1/34000) = 2^log2(1 + 14 watts / total noise) = 1 + 14 watts / total noise

Do a little more algebra:

total noise = 14 watts / (2^(1/34000) - 1) = ~690,000 watts

So if you assume that the GPS error correction is "efficient" (meaning that it approaches the Shannon limit), then GPS is able to withstand SNRs as low as about 1/49,000, or roughly -47 dB.
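Here's the same link-budget arithmetic in code, using the rough figures quoted above (14 W, ~1700 kHz, 50 bit/s):

```python
import math

# Shannon-Hartley solved for the tolerable noise power, using the rough
# GPS figures from the post: 14 W signal, 1700 kHz bandwidth, 50 bit/s.
rate_bps = 50.0
bandwidth_hz = 1.7e6
signal_w = 14.0

#   rate = bw * log2(1 + S/N)  =>  N = S / (2**(rate/bw) - 1)
noise_w = signal_w / (2 ** (rate_bps / bandwidth_hz) - 1)
snr_db = 10 * math.log10(signal_w / noise_w)

print(round(noise_w))      # roughly 690,000 W of tolerable noise power
print(round(snr_db, 1))    # about -46.9 dB
```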

But then, that's nothing compared to the signals you can detect with radio telescopes or the Hubble.
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
I was playing around with the -174 dBm/Hz thermal noise floor and Shannon's theorem and decided to graph the effects of signal strength and bandwidth on theoretical throughput. I came up with this plot. Notice it is on a log/log scale.

BandwidthandThroughput.png
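For anyone who wants to reproduce a curve like this, here's a sketch of the underlying computation at the roughly -174 dBm/Hz thermal noise floor (the 1 nW received power is an arbitrary illustrative figure, not from the plot):

```python
import math

# -174 dBm/Hz thermal noise floor converted to watts per hertz.
N0_W_PER_HZ = 10 ** (-174 / 10) / 1000

def capacity_at_noise_floor(bw_hz, received_w=1e-9):
    """Shannon capacity when the total noise power is N0 * bandwidth."""
    snr = received_w / (N0_W_PER_HZ * bw_hz)
    return bw_hz * math.log2(1 + snr)

# Widening the channel raises the total noise power too, so capacity
# grows slower than linearly in bandwidth (hence the log/log plot).
for bw in (1e3, 1e6, 1e9):
    print(f"{bw:.0e} Hz -> {capacity_at_noise_floor(bw):.3e} bit/s")
```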
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Originally posted by: coolVariable
I probably asked this all wrong.

I want to know the actual "usefulness" of a frequency.
E.g. the gov't auctioned off all these analog TV frequencies and telcos bought various slices.
In a given area (let's say New York, Los Angeles, or any big city), how many concurrent connections are actually (and sensibly) possible and what's their bandwidth?
It obviously is not possible to have a million people at the same time streaming different HD videos.

e.g. some old broadcast TV was on 52.000 MHZ - 72.000 MHz as far as I know.

That sliver of 20 MHz frequency ... how many "concurrent internet connections" are possible/sensible and at what speed?

The frequency is just the center carrier of the modulated signal. Just because you are at 2.4 GHz does not mean you have 2.4 GHz of bandwidth. You need a lot more information; it depends on how much bandwidth, or space, is taken up on either side of the carrier.

To give an example, though: cable modems, using 256-QAM, provide a shared ~40 Mbit/sec with time-division access to a whole neighborhood over a single 6 MHz TV channel slot. That is over coax cable, though, which has a very high SNR.
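A rough sanity check of that ~40 Mbit/s figure (the 5.36 Msym/s symbol rate for a 6 MHz downstream channel is an assumed DOCSIS-era value, not from the post):

```python
# Rough sanity check of the ~40 Mbit/s cable-modem figure. The symbol
# rate for a 6 MHz downstream channel is an assumed DOCSIS-era value.
bits_per_symbol = 8          # 256-QAM: 2**8 = 256 constellation points
symbol_rate = 5.36e6         # symbols per second in a 6 MHz channel

raw_rate = bits_per_symbol * symbol_rate
print(raw_rate / 1e6)        # ~42.9 Mbit/s before framing/FEC overhead
```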
 