  1.    #1  
    and will charge ya to urge you along...

    NEW YORK (Reuters)—Cingular Wireless, a venture of AT&T Inc. and BellSouth Corp., said on Monday it would start charging customers with older phones an extra $4.99 monthly fee as early as September unless they upgrade their phones as it moves toward using a single network technology......

    ....It is planning to shut down its TDMA network in early 2008 and under Federal Communications Commission rules it must keep its analog network in place until February 2008, Cohen said.
  2. #2  
    I thought GSM used TDMA by definition?

  3.    #3  
    Here's more:

    It is part of Cingular's plan to phase out phones based on older TDMA and analog technology, the technical standard for the first cellphones produced more than 20 years ago.

    Cingular has been working for years to phase out these technologies in favor of GSM (Global System for Mobile Communications), a newer technology that is the world's most popular wireless standard.
  4. #4  
    Err.. I always thought TDMA was the protocol implemented in the GSM technology? TDMA is a multiplexing scheme which separates stations in the time domain... it was adopted as the narrowband technology implemented in GSM. Thinking about it now, I am assuming they have decided to move to FDMA, which gives stations their own subdivision of a frequency... kind of like CDMA... only not. CDMA uses constructive interference to create a unique signal for each station; FDMA uses a checking mechanism to make sure no two stations are transmitting on the same subdivision...


    Just as a 4AM side note, I think whoever wrote that article needs to take a course in mobile communications...
    Last edited by FrozenCode; 08/10/2006 at 03:54 AM.

  5. #5  
    I remember reading an article 3 years ago saying they were phasing out their TDMA network. Guess they're serious this time.

    GSM is based on TDMA but there are differences - enough so that they need the separate networks. It must have gotten to the point where there aren't enough subscribers on the old TDMA network to justify keeping it up. I don't remember the differences between GSM and TDMA, but I do have a wireless comm book that describes them. I'll have to look it up later.

    Oh well...hopefully they phase that out quickly and devote more resources to better UMTS/HSDPA rollout/coverage.
  6. #6  
    I would be really interested in seeing what your book says. I don't have one of my own, and I would love to see how this stuff really works. I spent some time thinking about this, and came to the conclusion that we are both wrong. There is no way on this Earth that a company as large as Cingular could allocate a unique carrier frequency to every station in a cell, nor could they entirely use TDMA (analogous to Token Ring). After thinking about it, I have drawn a new conclusion.

    It is important to remember that time division multiplexing and frequency division multiplexing are just that: multiplexing schemes. They occur on a logical level, while the actual signal modulation (GMSK in this case) occurs on a physical level. Also, for those who are OSI-minded, remember that GSM layers do not completely correlate with the OSI model.

    This post will attempt to explain the technical details of the GSM standard. It is useful to have a basic knowledge of signal processing and GSM network structure, though not required. Almost all GSM networks use a combination of FDMA and TDMA; this will become immediately apparent within the next few sentences, and more so by the end of this post. This post defines uplink to be MS-to-BTS transmissions and downlink to be BTS-to-MS transmissions. It will also use the uplink band of the GSM 900 MHz standard as an example.

    The uplink band actually ranges from 890,000 kHz to 915,000 kHz, resulting in a total displacement, within the spectral envelope, of 25,000 kHz. That 25,000 kHz is then divided into 124 carrier frequencies, each with a bandwidth of approximately 201.613 kHz. You can already see the FDMA design incorporated in this structure. Each 201.613 kHz carrier frequency is then divided up into eight timeslots, each of which occupies the carrier frequency for 0.576875 ms every 4.615 ms. The TDMA structure is now easily recognizable.
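    The arithmetic above is easy to check in a few lines (Python here, purely as a scratchpad; the band edges and slot counts are the ones quoted in this post):

```python
# Toy recomputation of the GSM 900 uplink channelization described above.
# Band edges and slot counts come from the post; variable names are mine.

UPLINK_LOW_KHZ = 890_000
UPLINK_HIGH_KHZ = 915_000
CARRIERS = 124
SLOTS_PER_CARRIER = 8
FRAME_MS = 4.615

band_khz = UPLINK_HIGH_KHZ - UPLINK_LOW_KHZ  # 25,000 kHz total
carrier_bw_khz = band_khz / CARRIERS         # ~201.613 kHz per carrier (FDMA)
slot_ms = FRAME_MS / SLOTS_PER_CARRIER       # 0.576875 ms per timeslot (TDMA)

print(f"carrier bandwidth: {carrier_bw_khz:.3f} kHz")
print(f"timeslot length:   {slot_ms:.6f} ms")
```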

    Frequency hopping, which incorporates both TDMA and FDMA, is done to eliminate narrowband interference on certain frequencies. As devices authenticate with the BTS they are assigned a timeslot on a given carrier frequency. The BSC provides the BTS with the hopping scheme, and the BTS broadcasts that scheme on a reserved channel to the MSs.
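    As a toy illustration of how hopping ties the two schemes together: the real GSM hopping sequence is derived from an HSN/MAIO algorithm in the standard, but a simple cyclic version already shows the slot-to-carrier interplay. The carrier list here is invented.

```python
# Deliberately simplified hopping model: each TDMA frame, the station's
# timeslot moves to the next carrier in its allocated list (cyclic hopping).

def hop_sequence(carriers, frame_number, maio=0):
    """Return the carrier used in a given TDMA frame (cyclic hopping)."""
    return carriers[(frame_number + maio) % len(carriers)]

allocated = [890.2, 890.4, 890.6, 890.8]  # MHz, hypothetical allocation
for fn in range(6):
    print(fn, hop_sequence(allocated, fn))
```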

    To elaborate further on the aforementioned carrier frequencies: the entire carrier frequency is allocated 4.615 ms to transmit frames at a throughput of 270.855904659 kbps, which ultimately means each mobile station receives 0.576875 ms to transmit frames at an average rate of 33.8569880823 kbps before the timeslot is closed and reallocated. (I will attempt to go over the origins of these numbers later.)

    Since time is expended in the actual travel of the RF signal, a timing advance must be calculated based on the station's distance, to offset the time lost in transit. Without this correction, two stations could have overlapping transmissions. As a further precaution, a 30.5 microsecond guard space is allocated.
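    A rough sketch of the timing-advance idea: the further the handset is from the tower, the earlier it must transmit so its burst lands inside its timeslot. The 48/13 µs bit period and the 63-step cap are my own assumptions about the standard's numbers, not taken from the article.

```python
# Timing advance expressed as whole bit periods of round-trip delay.

C = 299_792_458          # speed of light, m/s
BIT_PERIOD_US = 48 / 13  # ~3.692 us per bit (assumed GSM value)

def timing_advance(distance_m):
    """Round-trip propagation delay in whole bit periods, capped at 63."""
    round_trip_us = 2 * distance_m / C * 1e6
    return min(63, round(round_trip_us / BIT_PERIOD_US))

print(timing_advance(550))     # roughly one bit period of advance
print(timing_advance(100_000)) # far beyond the cap, clamps to 63
```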

    Frames are transmitted in bursts; for clarification, refer to the throughput paragraph above. The structure of a generic frame is similar to this: the first and last three bits ("tail" fields) are set to 0. Immediately after the first and before the last tail field are the user data fields, each 57 bits in length. Adjacent to both of those data fields are 1-bit flags which specify whether the data fields contain user data or network control data. Lastly, the middle 26 bits form the "training" field, which is used to adjust the station's parameters to the current path propagation characteristics and to select the strongest signal in the event of multipath propagation. The most data that a station can transmit in a given timeslot is 156.25 bits, which is what all of the numbers are based on. The 8.25-bit guard space is counted in that figure even though it carries no user data.
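    The field widths above can be tallied to confirm the 156.25-bit timeslot figure (field names and widths transcribed from the layout just described):

```python
# Normal burst layout as described above: tails, user data, flags, training.

NORMAL_BURST = [
    ("tail",      3),
    ("user data", 57),
    ("flag",      1),
    ("training",  26),
    ("flag",      1),
    ("user data", 57),
    ("tail",      3),
]
GUARD_BITS = 8.25  # the ~30.5 us guard space, expressed in bit periods

payload_bits = sum(width for _, width in NORMAL_BURST)  # 148 bits of burst
print(payload_bits + GUARD_BITS)  # 156.25 bits per timeslot
```

    Note that 8.25 bit periods at ~3.69 µs per bit is where the ~30.5 µs guard space comes from.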

    In addition to those frames, the GSM standard also defines additional bursts: frequency correction bursts and synchronization bursts to improve the reliability of a transmission, an access burst used to emulate a SYN/ACK conversation, and a dummy burst for when no data is being transmitted.

    The life of a frame is as such: an SDMA application is implemented to determine the best BTS. The BTS (Base Transceiver Station) is, in essence, the cell tower we are all familiar with. When it is time to send data, the mobile station within the RSS (Radio Subsystem) sends a frame (GSM specifications define LPC as the method of translating analog data (voice) into an encapsulated digital render) to the BTS via the Um interface in the BSS (Base Station Subsystem). The BSS overlaps with the RSS; however, the RSS is a more collective term. The BSS refers to a cell and the BSC (Base Station Controller) associated with it. The frame goes through the Abis interface and is sent to the BSC. The BSC is, in a nutshell, the brains behind the BTS.

    From here, the frame leaves the RSS and enters the NSS (Network and Switching Subsystem) into the MSC (Mobile Switching Center) through the A interface. The MSCs are essentially the backbone network of the system. They connect to other MSCs and BSCs (through the A interface). If a frame is designated for an attached BSC, then it will be sent out to that BSC. However, if the frame is destined for an "external" network, it will be handed off to a GMSC (Gateway MSC), where it will eventually be sent to a PSTN network or another carrier's GMSC. The signalling method used in this segment is SS7 (Signalling System No. 7). This is the system responsible for number portability and other nifty services.
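    The hop-by-hop path in the last two paragraphs, sketched as a toy routing table. The subsystem and interface names are from the post; the code layout is purely my illustration.

```python
# Each hop: (source element, interface crossed, next element).
HOPS = [
    ("MS",  "Um",   "BTS"),   # radio interface
    ("BTS", "Abis", "BSC"),   # tower to its controller
    ("BSC", "A",    "MSC"),   # into the switching subsystem
    ("MSC", "A",    "GMSC"),  # only when leaving the home network
]

def trace(external=False):
    """Render the frame's path; external frames continue to the GMSC."""
    hops = HOPS if external else HOPS[:-1]
    path = " -> ".join(f"{src} ({iface})" for src, iface, _ in hops)
    return path + f" -> {hops[-1][2]}"

print(trace())               # frame terminating inside the network
print(trace(external=True))  # frame handed off toward the PSTN
```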

    There are several aspects here which have not been discussed. Several databases have been skipped in the process; I will briefly describe their functions. The HLR (Home Location Register) stores information about a station. Such information includes: the MSISDN (Mobile Subscriber ISDN number), the IMSI (International Mobile Subscriber Identity), and the current LA (Location Area). The VLR (Visitor Location Register) is similar to the HLR, except that it is an entirely dynamic database. Its sole purpose is to avoid frequent HLR updates.
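    A minimal sketch of that HLR/VLR caching relationship: the VLR holds a local copy of the subscriber record so the (possibly distant) HLR is not queried on every transaction. All field values here are invented.

```python
# HLR: authoritative subscriber database. VLR: per-MSC dynamic cache.

HLR = {
    "310150123456789": {  # keyed by IMSI (made-up value)
        "msisdn": "+15551234567",
        "location_area": "LA-42",
    }
}
VLR = {}  # rebuilt as subscribers roam into this MSC's area

def lookup(imsi):
    if imsi not in VLR:              # miss: one HLR query, then cache
        VLR[imsi] = dict(HLR[imsi])
    return VLR[imsi]                 # hit: no HLR traffic at all

rec = lookup("310150123456789")
print(rec["msisdn"], rec["location_area"])
```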

    Lastly, I will describe the third part of a GSM system, the OSS (Operation Subsystem). The OMC (Operation and Maintenance Center) manages all other network devices through the O interface. Basically, this appliance monitors the status of everything else. The AuC (Authentication Center) contains authentication algorithms and encryption keys. And of course, the EIR (Equipment Identity Register). This database is what stops people from activating stolen phones. It's really just a large black and white list. EIRs are not cross-carrier synchronized, and therefore a stolen unlocked phone will work flawlessly on a carrier other than that of its rightful owner.
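    The EIR as a "large black and white list", per the description above, keyed on the handset's IMEI. The sample IMEIs are invented.

```python
# Minimal EIR sketch: blacklist wins, otherwise equipment must be known-good.

EIR_WHITE = {"490154203237518"}  # known-good equipment
EIR_BLACK = {"490154203237519"}  # reported stolen

def equipment_allowed(imei):
    if imei in EIR_BLACK:
        return False          # stolen: refuse service
    return imei in EIR_WHITE  # unknown equipment is also refused here

print(equipment_allowed("490154203237518"))  # True
print(equipment_allowed("490154203237519"))  # False
```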

    Now to explain some of the math. The numbers you saw before were based on very basic math. I'll account for all of the numbers since the beginning of this post. The frequency allocation is regulated by the higher-ups; we have no control over that. In this post, we've only concerned ourselves with the uplink end of the 900 MHz spectral envelope, which accounts for 124 channels. However, in reality there are 248 channels (124 uplink and 124 downlink). If you attempt to account for all of the carrier frequencies with the bandwidth determined earlier, you will notice a 20 MHz gap. This gap separates the uplink and downlink carrier frequencies and is allocated for duplex separation.

    Our end of the spectral envelope is allocated a frequency range from 890 MHz (±0.1 MHz deviance) to 915 MHz (±0.1 MHz deviance). The ~25,000 kHz frequency allocation is then divided up into the 124 separate carrier frequencies, resulting in the gross bandwidth of 201.613 kHz per carrier frequency.

    In 4.615 ms, 8 stations are allocated equal amounts of time to transmit 156.25 bits each. Therefore, each station will be allocated 0.576875 ms to transmit its 156.25 bits of data at a gross throughput of 33.8569880823 kbps. When all of the stations' resources are added up, the total resource allocation for the carrier frequency as stated above is met.
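    Putting the recap together numerically. One assumption to flag: the post only gives the uplink band, so the 935 to 960 MHz downlink edges below are the standard GSM 900 values as I know them, not figures from the post.

```python
# Recap of the full GSM 900 allocation: 124 + 124 carriers, 20 MHz duplex gap,
# and the per-station throughput implied by 156.25 bits every 4.615 ms frame.

UPLINK = (890.0, 915.0)    # MHz, from the post
DOWNLINK = (935.0, 960.0)  # MHz, assumed standard values

gap_mhz = DOWNLINK[0] - UPLINK[1]        # duplex separation
channels = 124 * 2                       # uplink + downlink carriers
carrier_kbps = 8 * 156.25 / 4.615        # ~270.856 kbps gross per carrier
per_station_kbps = carrier_kbps / 8      # ~33.857 kbps per mobile station

print(f"duplex gap: {gap_mhz} MHz, total channels: {channels}")
print(f"per-station throughput: {per_station_kbps:.4f} kbps")
```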

    Now that you have a general idea of the multiplexing schemes used by GSM, as well as typical higher level procedure, I will discuss the physical transmission of the bits.

    The facts are that GSM uses GMSK (Gaussian Minimum Shift Keying), which is a form of continuous-phase frequency shift signalling (frequency modulation with no phase discontinuity in the modulated signal). The process of GMSK modulation involves transforming the binary representation of the data (a square wave) into a structure which uses -1 and 1 values. That data is then shaped with a Gaussian curve, and lastly frequency modulation is used to complete the GMSK signal.
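    A numerical sketch of that chain: NRZ mapping, Gaussian pulse shaping, then integration of the shaped signal into a continuous phase. This is heavily simplified and makes no claim to the real GSM filter parameters (the filter width formula here is a toy choice).

```python
import math

def gmsk_phase(bits, samples_per_bit=8, bt=0.3):
    """Return the continuous phase trajectory of a toy GMSK modulator."""
    # 1) map bits to a -1/+1 NRZ waveform
    nrz = [1.0 if b else -1.0 for b in bits for _ in range(samples_per_bit)]
    # 2) shape with a Gaussian filter spanning ~3 bit periods
    sigma = samples_per_bit / (2 * math.pi * bt)  # toy width, assumption
    half = 3 * samples_per_bit // 2
    taps = [math.exp(-0.5 * (t / sigma) ** 2) for t in range(-half, half + 1)]
    norm = sum(taps)
    shaped = [
        sum(taps[k] * nrz[min(max(i + k - half, 0), len(nrz) - 1)]
            for k in range(len(taps))) / norm
        for i in range(len(nrz))
    ]
    # 3) frequency modulation: phase is the integral of the shaped signal
    phase, out = 0.0, []
    step = math.pi / (2 * samples_per_bit)  # MSK: pi/2 phase shift per bit
    for s in shaped:
        phase += step * s
        out.append(phase)
    return out

phase = gmsk_phase([1, 0, 1, 1, 0])
# no phase discontinuity: adjacent samples never jump
assert all(abs(a - b) < math.pi / 4 for a, b in zip(phase, phase[1:]))
```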

    The following is all theoretical, and has not (to my knowledge) been confirmed. I have developed this theory independently and honestly, wouldn't be surprised if it's not 100% accurate.

    So we know that GSM uses GMSK, which ultimately utilizes a Gaussian curve. I have decided to venture into that a little more and am claiming that the Gaussian function being used is an integration of a more complicated model, whose parent function is a*e^((-(x-b)^2)/c^2) where a, b, and c are constants such that a is greater than zero (an elementary Gaussian function), in relation to x.
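    For reference, here is the elementary Gaussian named above and its integral, which closes over the error function. This is just a numerical check of the parent function, not an endorsement of the theory that follows.

```python
import math

def gaussian(x, a=1.0, b=0.0, c=1.0):
    """The elementary Gaussian a*exp(-(x-b)^2 / c^2)."""
    return a * math.exp(-((x - b) ** 2) / c ** 2)

def gaussian_integral(lo, hi, a=1.0, b=0.0, c=1.0):
    # integral of a*exp(-((x-b)/c)^2) dx = a*c*(sqrt(pi)/2)*erf((x-b)/c)
    scale = a * c * math.sqrt(math.pi) / 2
    return scale * (math.erf((hi - b) / c) - math.erf((lo - b) / c))

# total area under the default curve is c*sqrt(pi), not 1
print(gaussian_integral(-50, 50))  # ~1.7725, i.e. sqrt(pi)
```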

    I am also claiming that the result of that integration is an outline of a procedure which describes the change in frequency at a certain instant. That certain instant is not synonymous with any given point; rather it describes the aforementioned change when no two consecutive binary bits are the same (e.g. 10 or 01 NOT 11 or 00).

    This waveform is very useful, because with it, I think it is possible to derive a Gaussian distribution to model a group of functions which in turn model lengthened or shortened times spent in [frequency] transition. However, its physical application does not go without repercussion.

    I propose that by lengthening this time, the signal requires less bandwidth, however, it also requires a greater signal to noise ratio in order to be properly demodulated. By shortening the time, the signal requires more bandwidth, but a lower signal to noise ratio for error free demodulation.

    Further, I propose that calculating the standard normal distribution of the Gaussian integral curve will result in the optimal balance between bandwidth and noise tolerance, with a *possibly negligible* (but not zero) amount of deviance at any given point within the domain mentioned above.

    So, I agree that, in theory, it is not outside the realm of possibility to change the multiplexing technology; it's just not very practical.... at all... It's not cost effective, it can't possibly be stable after its immediate release, and it's certainly not in Cingular's or their customers' best interests.


    Wow, I wrote more than I intended to. Perhaps I'll put this in the wiki...
    Last edited by FrozenCode; 08/11/2006 at 10:24 AM.
