PCS Technologies
Last Updated: 15-Apr-2004
So, you are new to cellular and PCS and wonder what all the hype is about. As a consumer, how can you know what all the claims mean if you don't even have the foggiest idea how the technologies work? Well, fret no longer: I will attempt to explain how each of the technologies works, and what providers do to implement their systems.
Let's begin with the basics, and discuss what analog cellular is all about. In North America, we use a system known as AMPS. This system has its beginnings in 1975, when AT&T began to look into a replacement for their old mobile telephone system. The trouble with the earlier system was a distinct lack of capacity. Within a metropolitan area only about 20 or 30 subscribers could use the system at any one time. Back in those days, however, few people could afford a mobile phone, and those that had them didn't use them all that frequently.
The system used one centrally located transmitter site, usually mounted very high up to achieve the greatest possible range. In Toronto the antennas were at the top of the CN Tower, and so coverage of the city was pretty good. Unfortunately, for the mobiles to transmit that far, they needed very high power, and it was not unusual for mobile units to put out upwards of 30 watts. Because of this power requirement, handheld devices were out of the question, and portable ones filled a briefcase. Most of that space was dedicated to the enormous batteries.
Some systems were automatic, and presented the user with a dial tone when they connected. Others were manual, and required the user to make all their calls with the assistance of a mobile operator. For people to call the mobile, they too had to go through the mobile operator.
The goals of the new system were fairly simple: it had to be completely automatic, and it had to support a lot more users. The automatic part meant using computer technology, which during the mid-to-late 1970s was rather expensive, and not particularly compact. The capacity issue could be addressed by increasing the allotted bandwidth, but spectrum was a scarce resource, and you could only go so far with that approach.
The answer to the capacity issue was to use much lower powered transmitters that only covered a fraction of the city each. Providers could then re-use frequencies and effectively multiply their capacity without chewing up more bandwidth. Making it all work called for some rather powerful computer technology in the phones. The phone had to be "handed off" from one transmitter to another as the unit moved in and out of range of the towers. These short range transmitters became known as "cells", and thus the name Cellular was born.
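To get a feel for why re-use multiplies capacity, here is a rough back-of-the-envelope sketch. The channel counts, reuse pattern, and number of sites are invented purely for illustration, and are not actual AMPS figures:

    # Illustrative only: all numbers are made up to show the effect of frequency re-use.
    total_channels = 400        # channels available to one carrier (hypothetical)
    reuse_pattern = 7           # each channel repeats once per 7-cell cluster (hypothetical)
    cells_in_city = 70          # cell sites covering the metro area (hypothetical)

    channels_per_cell = total_channels // reuse_pattern     # 57 channels at each site
    simultaneous_calls = channels_per_cell * cells_in_city  # 3,990 calls at once
    # Compare that with the old single-transmitter system, which topped out at
    # total_channels (400) calls for the entire city.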
The new cellular telephones used conventional analog transmission techniques that had been around for most of the 20th century. This was the same Frequency Modulation (FM) technique used on our FM radios. FM had much greater resistance to noise than Amplitude Modulation (AM), and it could better reject interference when two signals landed on the same frequency.
In the early days very few sites were needed due to the low number of subscribers. This meant coverage was available in almost every nook and cranny in the city, and users were pleased. As the subscriber base grew, cellular providers added new cell sites and divided the city into smaller and smaller cells. This began to create new problems, such as coverage holes caused by the limited range of the smaller cells.
Another problem that started to plague the system was "co-channel interference". Since frequencies were being reused at much shorter distances, stray signals from one site carrying a specific channel would interfere with another site using that same channel. AMPS was beginning to crack under the stress.
Carriers could not just go on adding sites, though they certainly gave it a try. One of the new ideas centered on the "microcell". These were placed along busy highways, and they had an exceptionally short range. They helped system capacity because they took many of the highway users off of the main sites, and left those sites free for users not on the highways. Microcells were also deployed at busy street corners to help fill in weak spots and to aid capacity.
All of these short-range sites had their downsides, however. For a call to remain clean, the system had to be able to hand it off from one of these microcells to a neighboring site as the phone rapidly moved out of range. Failure to do so would surely result in a dropped call once the maximum range of the microcell was exceeded. Sometimes users on roads close to the highways would end up on one of these cells, and because the sites weren't designed to work there, signals faded much more rapidly.
Microcells were both good news and bad news for cellular subscribers. Their extremely low output meant they did not penetrate buildings very well, unless the building you went in had the cell right outside. Heavy reliance on microcells therefore meant a higher risk of dropped calls all around.
As an answer to this problem engineers devised a way to digitally encode signals and cram three callers onto the same channel that once held only one. This system became known as "Digital Cellular", and it was hyped as the answer to everyone's dreams. It would provide improved sound quality, and it wouldn't suffer from interference like analog signals. Needless to say, this didn't happen, and Digital Cellular failed in almost every market in which it was introduced.
Digital Cellular used a technology known as TDMA, which stands for Time Division Multiple Access. Users shared the same frequency by transmitting and receiving only during short "time slots". Conventional wisdom would dictate that if you were only transmitting one third of the time, your audio would be choppy. That would be true if the audio were transmitted using analog techniques. Because the audio is turned into a stream of bits, however, those bits can be played back at a constant rate so long as they are transmitted fast enough.
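Here is a minimal sketch of that idea, with made-up slot counts and burst sizes rather than the real Digital Cellular framing: each phone sends its compressed audio in a burst during its own slot, and the receiving end queues those bits so the vocoder can play them out at a steady rate.

    # Toy TDMA framing sketch; slot count and burst size are illustrative.
    SLOTS_PER_FRAME = 3          # the channel is shared three ways
    BITS_PER_BURST = 260         # hypothetical payload carried in one slot

    def build_frame(user_bits):
        """Interleave one burst from each of the three users into a single frame."""
        assert len(user_bits) == SLOTS_PER_FRAME
        return [bits[:BITS_PER_BURST] for bits in user_bits]

    def receive(frame, my_slot, playout_buffer):
        """The phone listens only during its own slot, then buffers the bits
        so they can be played back at a constant rate."""
        playout_buffer.extend(frame[my_slot])
        return playout_buffer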
The technology worked, sort of. The first big disappointment to users was the sound quality. While some carriers tried to compare their audio with that of a CD, they failed to mention one rather glaring difference. CDs sample sound 44,100 times per second, and thus generate 705,600 bits per second (16 bits per sample). To transmit over 700 thousand bits per second would require a very wide channel. The best they could achieve on Digital Cellular at the time was a mere 8,000 bits per second.
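The arithmetic behind those figures is simple enough:

    # CD audio versus the Digital Cellular voice channel (figures from the text above).
    cd_bits_per_second = 44_100 * 16        # 44,100 samples/sec x 16 bits/sample = 705,600
    cell_bits_per_second = 8_000

    ratio = cd_bits_per_second / cell_bits_per_second
    print(ratio)                            # roughly 88 times fewer bits to work with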
It doesn't take a genius to realize that with 88 times fewer bits to work with, the audio isn't going to sound very good. In fact, uncompressed digitized voice at only 8 kb (kilobits) per second is so bad that no one would ever consider using it seriously. So how did they do it?
Enter a technology known as Lossy Data Compression. By compressing the stream of bits you can actually take a much higher sample rate and send it at a lower speed. However, this isn't the same type of compression you are accustomed to with something like PKZIP. When we compress a computer file we expect the uncompressed version to be identical to the original. Not even a single bit can be wrong.
When we get back precisely what we put in, this is called "lossless compression". However, examine the typical compression rates you get when using PKZIP, and you can see that we gain very little advantage using it to compress digitized audio. Enter a new idea called "lossy compression". In this case we do not get back precisely what we put in. Instead we use an understanding of the dynamics of human voice, and we literally throw away the parts that are not crucial to the reproduction of the voice at the far end. This technique can yield very impressive compression figures, but it does so at the expense of clarity.
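Here is a toy illustration of the difference, which assumes nothing about the real cellular CODECs: a lossless scheme (zlib here, the same family of algorithm as PKZIP) returns the exact bytes it was given, while a deliberately crude lossy scheme simply discards detail that can never be recovered.

    import zlib

    samples = bytes(range(256)) * 4              # stand-in for a chunk of digitized audio

    # Lossless: decompress(compress(x)) == x, bit for bit.
    assert zlib.decompress(zlib.compress(samples)) == samples

    # Lossy (deliberately crude): keep only the top 4 bits of each sample.
    # The result takes fewer bits to describe, but the fine detail is gone for good.
    lossy = bytes(s & 0xF0 for s in samples)
    assert lossy != samples                      # we do NOT get back what we put in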
Sadly, the compression algorithm used in the original digital cellular systems took so many liberties with the data that the results were less than stellar. Because of this, early Digital Cellular sounded so robotic that many people chose to put up with the interference problems of analog rather than switch to digital. If that were the only problem facing Digital Cellular, it might still have been a success.
Although digital signals are not prone to the noise problems of analog, they still have to deal with co-channel interference and other maladies that threaten to corrupt the stream of bits flowing to and from the phone. When an error occurs in the data stream it is known as a "bit error". The rate at which these errors occur is known as the "bit error rate". When the bit error rate gets beyond a certain threshold, the already questionable sound quality becomes even more distorted. If the bit error rate becomes too high, no discernible audio can be reproduced, and the phone generally blanks the resulting noise to protect the user's ears from an aural assault.
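The bit error rate itself is just the fraction of received bits that differ from what was sent. A minimal sketch, with a made-up muting threshold rather than anything from a real specification (a real phone estimates the error rate from error-detection bits, since it cannot know what was actually sent):

    def bit_error_rate(sent, received):
        """Fraction of bits that arrived flipped."""
        errors = sum(1 for a, b in zip(sent, received) if a != b)
        return errors / len(sent)

    MUTE_THRESHOLD = 0.10        # hypothetical cut-off, not a real spec value

    def handle_frame(sent, received):
        if bit_error_rate(sent, received) > MUTE_THRESHOLD:
            return None          # blank the frame rather than play garbage
        return received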
At this point you might feel like writing off the entire digital concept, especially TDMA. Fortunately, however, the problems inherent in digital communications can be overcome by putting enough effort into them. For the North American market at least, digital was dead for a number of years. In Europe, however, digital was not dead, and a fair bit of money was poured into a digital system known as GSM (Global System for Mobile communications). To ensure the survival of GSM, all the European governments agreed to make it the required standard for mobile systems.
GSM engineers decided to stick with the concept of TDMA, but they also realized it had weaknesses that had to be overcome. While Digital Cellular was forced to work within narrow 30 kHz channels, GSM decided to use wider 200 kHz channels. Instead of having only 3 slots, GSM channels had 8 slots. This resulted in an effective bandwidth of 25 kHz per slot, instead of just 10 kHz per slot. It allowed for faster bit rates, which meant more natural-sounding voice compression algorithms could be utilized.
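The per-slot figures work out like this:

    # Channel width divided by slots per channel (figures from the text above).
    digital_cellular_per_slot = 30_000 / 3     # 10,000 Hz, i.e. 10 kHz per slot
    gsm_per_slot = 200_000 / 8                 # 25,000 Hz, i.e. 25 kHz per slot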
Other concepts, such as frequency hopping (in which the call actually changes frequency quite often during a transmission), randomized the effects of co-channel interference and reduced multipath interference. All of these improvements to TDMA paid off, as the GSM system proved to be very effective. Eventually a new compression algorithm was added, called the "Enhanced Full Rate CODEC". This CODEC could produce voice quality rivaling that of wireline telephones, and it produced very little (if any) robotic effect.
Now let's return to North America, where it was becoming increasingly obvious that something new was required if cellular phones were going to move into the next century. The FCC started formulating plans for a whole new set of frequencies for a system to be called "Personal Communications Service", or PCS for short. It would be completely digital, and it would accommodate not only voice transmission, but data transmission as well.
Confusion has surrounded the use of the term PCS. Some people claim that for a service to truly be called PCS it must work at 1900 MHz (the new set of frequencies allocated for this purpose). They say that services using identical technology on 800 MHz (the old cellular frequencies) are not really PCS at all. As far as I am concerned this is a worthless argument, since it means nothing to the subscriber. The technologies work just as well at 800 MHz as they do at 1900 MHz. In fact, there are now many providers that operate on both 800 MHz and 1900 MHz. So does that mean your call is PCS one second, and not the next? Don't let yourself get roped into this argument, unless you like pointless controversy.
At around this time work had begun in earnest on a digital system that would compete with TDMA. This new technology was called CDMA, for Code Division Multiple Access. Instead of dividing up the users of a channel by time slots, it had everyone transmit at the same time and separated them by their encoding scheme. Some people thought this was pure science fiction, and claimed that CDMA was nothing but a scam.
They might have been taken more seriously if the idea hadn't already been around for almost 40 years. The military had been using the root concept of this technology for their communications, and most satellites communicated this way as well. Instead of using a narrow channel and modulating signals onto a fixed carrier, "Spread Spectrum" uses a very wide channel and spreads the bits out using a "spreading algorithm". If you were to listen to such a transmission, it would sound just like the background noise, only slightly louder.
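A minimal direct-sequence example, which assumes nothing about the codes any real system uses: each data bit is multiplied by a fast pseudo-random "chip" sequence before transmission, and the receiver recovers the bit by correlating against that same sequence.

    # Toy direct-sequence spreading sketch; the chip sequence is invented.
    CHIPS = [1, -1, 1, 1, -1, 1, -1, -1]       # hypothetical spreading code, 8 chips per bit

    def spread(bits):
        """Each data bit (+1 or -1) becomes a run of chips."""
        return [b * c for b in bits for c in CHIPS]

    def despread(chips):
        """Correlate against the same code to recover each bit."""
        bits = []
        for i in range(0, len(chips), len(CHIPS)):
            corr = sum(x * c for x, c in zip(chips[i:i + len(CHIPS)], CHIPS))
            bits.append(1 if corr > 0 else -1)
        return bits

    assert despread(spread([1, -1, 1])) == [1, -1, 1]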
The military loved this type of communication, since it had two properties that made it perfect for their application. First off, it was damned near impossible to detect. Secondly, it was practically immune to narrow-band interference, so you couldn't really jam it. So what stopped it from being used for mobile phones until now?
The answer lies in something known as the "near-far problem". When two or more CDMA transmitters work on the same frequency, their signals must arrive at the receiving antenna with more or less the same strength. If they don't, the radically stronger signals from nearby transmitters would obliterate the weak signals from transmitters far away. In a mobile environment, you could have cars driving within 100 feet of a receiving antenna, while simultaneously using the channel with a mobile 10 or 20 miles from the receiving antenna.
The answer to this problem lies in a sophisticated power control scheme. Power control on CDMA phones is critical to the success of the system. It also has tremendous side-benefits. If the phone only transmits at the lowest power level necessary, then the batteries last longer, and the user is exposed to far less radiated energy.
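A rough sketch of the closed-loop idea, with an invented target level and step size: the cell site compares each phone's received strength against a target and tells the phone to nudge its transmit power up or down, so that everyone arrives at roughly the same level.

    # Illustrative closed-loop power control; target and step size are invented.
    TARGET_DBM = -100.0      # desired received level at the cell site (hypothetical)
    STEP_DB = 1.0            # size of each up/down nudge (hypothetical)

    def power_command(received_dbm):
        """Cell site tells the phone which way to step its power."""
        return -STEP_DB if received_dbm > TARGET_DBM else +STEP_DB

    def adjust(phone_tx_dbm, received_dbm):
        return phone_tx_dbm + power_command(received_dbm)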
Another benefit comes from the fact that everyone transmits on the same frequency. It means that your phone can decode signals from more than one cell site at a time. When the signal from the site it is currently using for your call becomes weak, it just picks one of the others and uses it instead. This switch to a different site is mostly inaudible, unlike the hard change of channel needed in TDMA systems. In fact, the phone can combine signals from multiple sources to ease the transition.
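A much-simplified sketch of the selection side of that idea, with invented site names and pilot strengths (a real CDMA phone actually combines energy from several sites at once, rather than merely picking one):

    # The phone tracks several sites at once and leans on whichever is strongest.
    # Site names and pilot strengths (in dBm) are made up for illustration.
    active_set = {"site_A": -95.0, "site_B": -102.0, "site_C": -99.0}

    def best_site(pilots):
        return max(pilots, key=pilots.get)

    print(best_site(active_set))    # "site_A"; moving the call there is inaudible to the caller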
From a provider's point of view CDMA is great, since it maximizes the utilization of the spectrum and allows them to cram as many people as possible into their system. Sounds like a win-win situation that should blow TDMA out of the water, right? In theory it should, but in practice this hasn't happened. CDMA does work, and it works well, but TDMA systems are still compelling for a number of reasons.
GSM is used in more countries around the world than any other technology, and it currently claims to have in excess of 1 billion subscribers worldwide. Roaming with a GSM phone is markedly easier than with other technologies. iDEN provides services not found on other technologies, such as "Direct Connect", though recently an attempt to reproduce this walkie-talkie system has shown up on CDMA networks.
In addition to this, CDMA has a few weaknesses. For example, each user of a CDMA channel adds to the background noise. To compensate for this the system asks each user to raise their power slightly to overcome the higher noise level. At some point, however, the phones run out of "head room" and can no longer raise their output power. The amount of noise a single user adds to the mix is determined by the total number of bits he transmits. If the CODEC can choose to use a lower bit rate when possible, the noise added by each caller will be reduced.
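A back-of-the-envelope way to see the benefit of a variable-rate CODEC, with invented rates and an invented talk-time fraction: a caller who is silent part of the time can drop to a much lower frame rate, and so contributes proportionally less noise to everyone else on the channel.

    # Illustrative only: rates and activity factor are invented.
    FULL_RATE = 9_600        # bits per second while the caller is talking (hypothetical)
    LOW_RATE = 1_200         # bits per second during pauses and silence (hypothetical)
    talking_fraction = 0.4   # a typical caller only talks part of the time (assumed)

    average_rate = talking_fraction * FULL_RATE + (1 - talking_fraction) * LOW_RATE
    print(average_rate / FULL_RATE)   # about 0.48: roughly half the noise per caller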
All is not perfect in the GSM camp either, as its audio has its own peculiar problem. When your call is handed off to another site, a short break in the audio occurs. This break can chop up a word or two, and force you to ask your caller to repeat themselves. When driving in a very open area with no sites in the immediate vicinity you may experience audio dropouts. This happens because of co-channel interference (as I mentioned earlier). CDMA cannot suffer from co-channel interference per se, since everyone IS on the same channel to begin with. However, it degrades when there are too many signals present.
Besides GSM there are actually two other flavors of TDMA-based systems. The first is called IS-136, and many people refer to this generically as TDMA. I won't say much about IS-136 because it is now a dead technology. All of the North American providers who previously used it (including Rogers, Cingular, and AT&T) are abandoning it in favor of GSM. The same is happening in other countries where IS-136 is common.
The second flavor of TDMA is called iDEN, which is a Motorola-specific technology used in Canada by Telus for the Mike network, and by Lakeshore Electronics for the Harmony network. In the US it is used primarily by Nextel, but also by a number of smaller regional carriers.
The biggest claim-to-fame for iDEN is its Direct Connect feature, which allows users to treat it like a long-range walkie-talkie. Business users love this system. Because of the great success of Direct Connect, other technologies have attempted to adopt similar functionality.
Verizon Wireless in the US implemented the first push-to-talk (PTT for short) concept on their CDMA network. The problem with it has thus far been long latency periods, in which you would have to wait upwards of 5 seconds between the time you pressed the PTT button and the time you could actually begin to talk. iDEN still wins hands-down in this regard, with latency of only a quarter to half a second in most cases. Future development of PTT technologies on other networks will surely close this gap.
However, iDEN isn't just a walkie-talkie network, even though Direct Connect is its major selling point. iDEN also supports standard telephone interconnect (it's indistinguishable from any other type of cell phone in this regard), as well as a packet data feature for web browsing and for hooking up your laptop to the Internet.