Saw this question wasn't answered while I was searching the forums
Originally Posted by chris.troutner
Can anyone point me to some good resources for understanding the differences between CDMA and GSM? What cell phone providers use which protocols?
If you find one, I'd like to know so I can just post a URL instead of having to write up my own explanation.
CDMA is a protocol. GSM is a standard.
GSM voice uses TDMA as a protocol. Most GSM carriers use CDMA for their 3G data protocol.
Most implementations of the new 4G LTE service use OFDMA.
TDMA is time-division multiple access. Each phone making a call via a tower is given a timeslice. The phone compresses your conversation data and transmits it to the tower as a data burst during its timeslice. This has two important consequences: (1) Your range is limited by the duration of the timeslice and the speed of light. If you're too far away, by the time your signal reaches the tower, it's drifted into the next timeslice. The tower sees you haven't responded during your timeslice and drops your call. (2) It sucks for transmitting data because bandwidth is divided equally between everyone transmitting/receiving data, regardless of how much bandwidth they actually need. It's like taking your 8 Mbps home internet connection and dividing it up: 2 Mbps for you, 2 Mbps for the spouse, and 2 Mbps each for the two kids. Now imagine it with 100 kids, and you can see why TDMA sucks as a data protocol.
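To put a rough number on that range limit: GSM compensates for propagation delay with a "timing advance" of up to 63 bit periods, and a quick back-of-the-envelope calculation shows where its range ceiling comes from (the bit period and max advance are from the GSM standard; the helper name is my own):

```python
# Rough sketch: why GSM voice range tops out around 35 km.
# The bit period and maximum timing advance come from the GSM
# standard; the rest is just speed-of-light arithmetic.

C = 299_792_458          # speed of light, m/s
BIT_PERIOD = 48 / 13e6   # GSM bit period, ~3.69 microseconds
MAX_TIMING_ADVANCE = 63  # largest advance the protocol can signal, in bit periods

def max_gsm_range_km():
    """Max one-way distance the timing advance can compensate for."""
    round_trip_seconds = MAX_TIMING_ADVANCE * BIT_PERIOD
    one_way_meters = round_trip_seconds * C / 2  # signal travels out and back
    return one_way_meters / 1000

print(f"{max_gsm_range_km():.1f} km")  # ~34.9 km, about 22 miles
```

Past that distance, no amount of signal strength helps - the protocol simply can't tell the phone to transmit early enough.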
CDMA is code-division multiple access. Each phone is assigned an orthogonal set of codes which allows the tower to uniquely identify which transmissions are coming from that phone. The simplest analogy I can think of is how you can write two messages on the same sheet of paper by writing one horizontally, and the other vertically. Even though the two messages overwrite each other, because you can tell which letters are horizontal and which are vertical (they're orthogonal codes), you can read both messages. It allows every phone to transmit simultaneously. Bandwidth is "automatically" distributed as needed - the more phones are transmitting, the more "noise" each phone sees, thus reducing its bandwidth. This makes it scale very well, making it ideal for data. Most 3G and 3.5G implementations use CDMA, even the ones used by GSM. (Incidentally, this is why GSM phones can talk and use data at the same time. They have a TDMA radio for voice, and a CDMA radio for data. CDMA phones only have the one CDMA radio, so can only do voice or data, not both simultaneously.)
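The paper analogy can be made concrete with Walsh codes, the kind of orthogonal codes real CDMA systems use. Here's a toy sketch (my own illustration - real systems use much longer codes and analog signals, not this 4-chip demo):

```python
# Toy CDMA: two phones transmit at once using orthogonal Walsh codes.
# The tower recovers each phone's bits by correlating the combined
# signal against that phone's code -- the other phone cancels out.

PHONE_A_CODE = [1, 1, 1, 1]    # rows of a 4x4 Walsh-Hadamard matrix
PHONE_B_CODE = [1, -1, 1, -1]  # orthogonal: their dot product is 0

def spread(bits, code):
    """Map each bit to +/-1 and repeat it across the code (chips)."""
    return [(1 if b else -1) * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the combined signal with one phone's code per symbol."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else 0)
    return bits

a_bits, b_bits = [1, 0, 1], [0, 0, 1]
# Both phones transmit simultaneously; the tower hears the sum.
airwaves = [x + y for x, y in zip(spread(a_bits, PHONE_A_CODE),
                                  spread(b_bits, PHONE_B_CODE))]
print(despread(airwaves, PHONE_A_CODE))  # [1, 0, 1]
print(despread(airwaves, PHONE_B_CODE))  # [0, 0, 1]
```

Both messages come out intact even though they overlapped on the air, just like the horizontal and vertical writing on the same sheet of paper.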
OFDMA is orthogonal frequency-division multiple access. It's similar in concept to CDMA, except instead of using orthogonal codes it uses orthogonal frequency assignments (subcarriers). It's more efficient than CDMA, but requires more math to figure out which signal is coming from which phone. Until recently, mobile processors weren't powerful enough to do this math without draining your battery in 30 minutes. Now that they're fast enough while drawing little power, we're transitioning to OFDMA for 4G.
I have a Virgin Mobile Android phone that I've hacked to rebroadcast the 3G over the WiFi for getting internet on my boat. It uses Sprint's network to the best of my knowledge.
Sprint is CDMA voice/data.
What role does CDMA and GSM play in terms of data over 3G and 4G?
Right now I'm focused on 3G connection, but I'd like to get something that would also work for the expanding 4G network.
GSM is just a standard. They've standardized on TDMA for voice, usually CDMA for 3G data, and a SIM card for your identity.
Most GSM and CDMA carriers have settled on LTE for 4G.
In terms of an amplifier, none of this really matters except for the time/distance limitation of TDMA. The amplifier will just boost whatever signal the bigger antenna picks up, and retransmit it to your phone. As long as the amplifier works at the frequency of your service, it'll work. If you're using GSM and are beyond the timeslice distance (GSM's timing-advance limit works out to about 35 km, or roughly 22 miles), you won't be able to get voice service. I'm not sure how GSM negotiates a data connection, but as long as it doesn't do anything like try to verify you through TDMA, you should still be able to get data service beyond that range (assuming you can get a strong enough signal).
I should point out though that you're also going to be fighting the curvature of the earth. Water absorbs nearly all RF signals, so you need a straight line between your phone antenna and the tower that stays above the water. This is the reason the receiving towers are so tall - a 100 ft tower can see a phone at sea level 12 miles away; a 200 ft tower can see one 17 miles away. The calc changes a bit for a phone above sea level: range in km ≈ 3.57 * (sqrt(h1) + sqrt(h2)), with h1 and h2 in meters (I'm not gonna derive it for miles and feet).
So with a tower 100 ft above sea level and your phone antenna 10 feet above sea level, you'll get a range of about 16 miles. With a 200 ft tower this becomes 21 miles. (Note that the base of the tower may be above sea level - add that to the tower's height.)
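That rule of thumb is easy to plug numbers into - the helper function here is just my wrapper around the formula above, with feet-to-meters and km-to-miles conversions baked in:

```python
# Radio horizon between two antennas over water, using
# range_km ~= 3.57 * (sqrt(h1) + sqrt(h2)) with heights in meters.
from math import sqrt

FT_TO_M = 0.3048
KM_TO_MI = 0.621371

def horizon_miles(tower_ft, phone_ft):
    """Line-of-sight range for the given antenna heights (in feet)."""
    km = 3.57 * (sqrt(tower_ft * FT_TO_M) + sqrt(phone_ft * FT_TO_M))
    return km * KM_TO_MI

print(round(horizon_miles(100, 10)))  # 16 miles
print(round(horizon_miles(200, 10)))  # 21 miles
print(round(horizon_miles(200, 50)))  # 26 miles (antenna up the mast)
```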
A simple solution may seem to be to put the antenna high up on your mast. At 50 ft up and a 200 ft tower, you'd have 26 miles of range. But the length of the cable coming from the antenna, down the mast, to the amplifier will reduce signal. You'd have to put the amplifier up on the mast where the antenna is if you want to do this with minimal signal loss.