At A Fever Pitch: Spectral Efficiency Enhancements
By Karl Fuchs, Senior Contributor

It is a fact of life for those in the satellite industry that satellite transponder space is a precious commodity. Even the introduction of the new high throughput satellite (HTS) constellations will not change this reality.

As such, the spectral efficiency of satellite transmission equipment is of paramount concern. In its most basic form, spectral efficiency refers to the rate of information that can be conveyed over a given bandwidth; it is also called bandwidth efficiency.
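
As a rough illustration, with hypothetical figures not drawn from any particular modem, the calculation is simply the delivered information rate divided by the occupied bandwidth:

    # Illustrative only: hypothetical carrier figures, not from any specific modem.
    info_rate_bps = 45_000_000      # delivered information rate: 45 Mbps
    occupied_bw_hz = 36_000_000     # occupied bandwidth: a full 36 MHz transponder

    spectral_efficiency = info_rate_bps / occupied_bw_hz   # bits per second per Hertz
    print(f"Spectral efficiency: {spectral_efficiency:.2f} bit/s/Hz")   # -> 1.25 bit/s/Hz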

The quest for greater spectral efficiency has driven virtually all development since the beginning of satellite communications. This is particularly true for digital communications. 

Spectral efficiency enhancements began in the 1960s with the development of forward error correction (FEC) codes and the advent of high order modulation techniques. The intervening decades have brought newer, more powerful FECs that operate remarkably close to the Shannon limit, the maximum rate at which data can be sent over a channel without error.
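
For reference, the Shannon limit for a band-limited, noisy channel is C = B x log2(1 + S/N). A small sketch of that bound, using assumed example numbers rather than any real link:

    import math

    # Assumed example values; a real link depends on the full link budget.
    bandwidth_hz = 36e6        # 36 MHz transponder
    snr_db = 10.0              # carrier-to-noise ratio of 10 dB
    snr_linear = 10 ** (snr_db / 10)

    # Shannon capacity: the maximum error-free information rate for this channel.
    capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
    print(f"Shannon limit: {capacity_bps / 1e6:.1f} Mbps")   # ~124.5 Mbps

    # A modern FEC and modulation pair typically operates within a dB or two of this bound.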

Demodulators have also become more capable, supporting higher order modulations at lower Eb/N0. However, these approaches have their limits: antenna size, available power and other link budget constraints cap the usable MODCODs.
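
That trade-off follows from the standard relationship Es/N0 (dB) = Eb/N0 (dB) + 10 x log10(information bits per symbol). A brief sketch, using approximate, ideal-channel threshold figures purely for illustration:

    import math

    def ebn0_from_esn0(esn0_db, info_bits_per_symbol):
        # Es/N0 (dB) = Eb/N0 (dB) + 10*log10(information bits per symbol)
        return esn0_db - 10 * math.log10(info_bits_per_symbol)

    # Approximate, ideal-channel figures used purely for illustration.
    # QPSK 1/2 carries 1 information bit per symbol; 16APSK 3/4 carries 3.
    print(f"QPSK 1/2   needs roughly {ebn0_from_esn0(1.0, 1.0):.1f} dB Eb/N0")
    print(f"16APSK 3/4 needs roughly {ebn0_from_esn0(10.2, 3.0):.1f} dB Eb/N0")
    # Tripling the spectral efficiency costs several dB that the antenna and
    # amplifier may simply not have, which is why link budgets cap the usable MODCOD.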

A great increase in spectral efficiency came with the introduction of Time Division Multiple Access (TDMA) systems. TDMA allows multiple users to transmit on the same frequency at different times, allowing network designers to take advantage of the statistical nature of bursty traffic. For the first time, the precious satellite resource could be shared across multiple users. No longer would single channel per carrier (SCPC) circuits, dedicated to one user, need to be provisioned.

The move to TDMA typically allows a network designer to build a network using 40 percent less bandwidth than a traditional SCPC network. The overhead a TDMA system requires does eat into the statistical gains, but the trade-off has proven more than worthwhile.
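
The savings come from sizing the shared carrier for the aggregate of bursty demand rather than for every site's peak. A simplified sketch, with hypothetical traffic figures chosen only to illustrate the statistical multiplexing effect:

    # Hypothetical network: figures chosen only to illustrate statistical multiplexing.
    sites = 20
    peak_rate_bps = 2_000_000       # each site bursts to 2 Mbps
    average_duty_cycle = 0.40       # but transmits only about 40% of the time

    # SCPC: every site gets a dedicated carrier sized for its peak.
    scpc_capacity_bps = sites * peak_rate_bps

    # TDMA: the shared carrier is sized for the expected aggregate load plus headroom
    # for simultaneous bursts and TDMA overhead (guard times, preambles, signaling).
    headroom = 1.5
    tdma_capacity_bps = sites * peak_rate_bps * average_duty_cycle * headroom

    savings = 1 - tdma_capacity_bps / scpc_capacity_bps
    print(f"Approximate bandwidth savings: {savings:.0%}")   # ~40% in this toy example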

The true challenge in developing a TDMA system is delivering the required network characteristics: low jitter, so that high-quality voice and video calls are possible, and mechanisms that guarantee sufficient network resources for high-priority applications, all while maintaining spectral efficiency. Early TDMA systems suffered when trying to support real-time applications such as Voice over Internet Protocol (VoIP) and IP video; the systems simply were not agile enough to keep jitter low across the network.

Greater processing power has since allowed for systems that can reallocate bandwidth on the order of milliseconds. Application-aware systems can preempt low-priority data in favor of high-priority or flash override traffic.
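
Conceptually, the de-queuing step resembles a strict-priority scheduler in which flash override traffic is always served first and low-priority traffic only fills whatever capacity remains in the current allocation interval. A minimal sketch; the priority names and packet sizes are assumptions, not any vendor's implementation:

    import heapq

    # Priority levels, lowest number served first (names are illustrative).
    FLASH_OVERRIDE, VOICE, VIDEO, BULK_DATA = 0, 1, 2, 3

    def dequeue_for_slot(queue, slot_capacity_bytes):
        """Fill one TDMA allocation interval, highest priority first.

        Lower-priority packets are effectively preempted: they stay queued
        until capacity is left over after all higher-priority traffic is sent.
        """
        sent = []
        while queue and slot_capacity_bytes > 0:
            priority, size, label = heapq.heappop(queue)
            if size > slot_capacity_bytes:
                heapq.heappush(queue, (priority, size, label))  # retry next interval
                break
            slot_capacity_bytes -= size
            sent.append(label)
        return sent

    backlog = []
    heapq.heappush(backlog, (BULK_DATA, 1200, "file transfer"))
    heapq.heappush(backlog, (FLASH_OVERRIDE, 200, "flash override msg"))
    heapq.heappush(backlog, (VOICE, 160, "VoIP frame"))

    print(dequeue_for_slot(backlog, 400))   # -> ['flash override msg', 'VoIP frame']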

The next important advancement in spectral efficiency came with the introduction of adaptive coding and modulation (ACM). The most common ACM implementation is found in the DVB-S2 standard, which was ratified in 2005 and quickly adopted by industry for two-way communications on the outbound channel. ACM allows a satellite engineer to develop a link budget based not on worst case rain fade but, rather, allows the use of higher order modulations during clear sky conditions. 
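
In practice, ACM amounts to selecting the most efficient MODCOD whose required Es/N0 is still cleared by the currently reported Es/N0 plus a safety margin. A sketch, using a small table of approximate thresholds for illustration rather than the full DVB-S2 MODCOD set:

    # Approximate ideal-channel thresholds for a few DVB-S2 MODCODs, in dB.
    # Figures are illustrative; real systems use implementation-specific tables.
    MODCODS = [
        # (name, spectral efficiency in bit/s/Hz, required Es/N0 in dB)
        ("QPSK 1/2",   0.99,  1.0),
        ("QPSK 3/4",   1.49,  4.0),
        ("8PSK 3/4",   2.23,  7.9),
        ("16APSK 3/4", 2.97, 10.2),
        ("16APSK 5/6", 3.30, 11.6),
    ]

    def select_modcod(reported_esn0_db, margin_db=0.5):
        """Return the most efficient MODCOD that still closes with the given margin."""
        usable = [m for m in MODCODS if m[2] + margin_db <= reported_esn0_db]
        return max(usable, key=lambda m: m[1]) if usable else MODCODS[0]

    print(select_modcod(12.5))   # clear sky -> ('16APSK 5/6', 3.3, 11.6)
    print(select_modcod(5.0))    # rain fade -> ('QPSK 3/4', 1.49, 4.0)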

Today’s advancements in spectral efficiency are focused on the inbound channels. The same basic ACM concept is employed, but due to the complexities of TDMA, the implementation is much more intricate. Just as in DVB-S2, a receiver must measure the Eb/N0 and report it back to the transmitter so the appropriate transmit modulation and coding can be determined. However, as mentioned above, the key to a TDMA system is the ability to deliver the network characteristics needed to support VoIP and video and to remain application aware in order to support multilevel precedence and preemption (MLPP) traffic.

The integration of Adaptive TDMA (A-TDMA) return channels, feature-rich quality of service and de-queuing mechanisms presented a great challenge during development. An A-TDMA network must be able to maintain fairness as the network adjusts to atmospheric conditions ranging from clear sky through widespread rain fade to hub-side rain fade.
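
One way to picture the fairness problem: a remote in a deep fade drops to a more robust, less efficient MODCOD, so carrying the same information rate costs it more of the shared symbol pool. A simplified allocation sketch, with assumed efficiencies and demands, not any product's actual burst-time-plan algorithm:

    # Assumed per-remote state after the latest round of Es/N0 reports.
    # (name, current MODCOD efficiency in bit/s/Hz, requested information rate in bps)
    remotes = [
        ("clear-sky site",  3.0, 2_000_000),
        ("light-rain site", 2.0, 2_000_000),
        ("heavy-fade site", 1.0, 2_000_000),
    ]

    pool_sps = 3_000_000   # shared return-channel symbol pool (symbols per second)

    # Convert each request into its symbol cost at the remote's current MODCOD.
    costs = [(name, demand / eff) for name, eff, demand in remotes]
    total = sum(c for _, c in costs)

    # Fair-share scaling: if the pool cannot cover everyone, every remote's
    # information rate is scaled by the same factor, so a faded remote is not starved.
    scale = min(1.0, pool_sps / total)
    for (name, eff, demand), (_, cost) in zip(remotes, costs):
        print(f"{name}: granted {demand * scale / 1e6:.2f} Mbps "
              f"using {cost * scale / 1e6:.2f} Msps")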

Future spectral efficiencies will be realized with the adoption of the new HTS satellites. The power and antenna gain-to-noise-temperature (G/T) available on these new satellites will enable higher order modulations than could previously be achieved with a terminal of a given aperture. The homogeneous equivalent isotropically radiated power (EIRP) distribution of the spot beams also reduces the need to build in margin for edge-of-beam remotes.
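
The benefit shows up directly in the carrier-to-noise-density calculation, C/N0 = EIRP - path loss + G/T - k, where k is Boltzmann's constant at -228.6 dBW/K/Hz. A sketch with assumed, illustrative numbers:

    # Simplified return-link C/N0 (ignores atmospheric and implementation losses).
    # All figures are assumed for illustration only.
    BOLTZMANN_DBW_PER_K_HZ = -228.6

    def cn0_dbhz(eirp_dbw, path_loss_db, g_over_t_dbk):
        # C/N0 = EIRP - path loss + G/T - k   (all in decibel terms)
        return eirp_dbw - path_loss_db + g_over_t_dbk - BOLTZMANN_DBW_PER_K_HZ

    terminal_eirp_dbw = 45.0   # small-aperture remote terminal
    path_loss_db = 207.0       # roughly the free-space loss of a GEO path at Ku-band

    print(cn0_dbhz(terminal_eirp_dbw, path_loss_db, g_over_t_dbk=5.0))    # conventional beam
    print(cn0_dbhz(terminal_eirp_dbw, path_loss_db, g_over_t_dbk=13.0))   # HTS spot beam
    # The extra satellite G/T translates dB-for-dB into C/N0, i.e., room for a
    # higher order MODCOD from the same terminal aperture.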

Some of the new HTS satellites will use Ka-band, which is particularly susceptible to rain fade, a problem that makes ACM on both the outbound and inbound channels more important than ever. Hub-side rain fade can be so pronounced at Ka-band that geographically redundant hubs are often required.

Further off in the future is the prospect of using Low Earth Orbit (LEO) satellites for two-way communications. Currently, there are LEO constellations for voice traffic. However, a number of providers, some of whom are not currently in the satellite industry, have expressed interest in developing their own LEO data services. The power and data rates available on these LEO constellations are yet to be seen, but they certainly have an advantage, given their much lower orbits. 
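
Their main physical advantage is path loss, which scales with 20 x log10 of distance; an 800 km LEO path is roughly 33 dB shorter, in decibel terms, than a 36,000 km GEO path. A quick check of that arithmetic, with rounded altitudes and slant-range effects ignored:

    import math

    def free_space_loss_db(distance_m, frequency_hz):
        # Standard free-space path loss: FSL = 20*log10(4*pi*d*f / c)
        c = 3e8
        return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

    f_hz = 14e9                                       # illustrative Ku-band frequency
    geo_loss = free_space_loss_db(35_786_000, f_hz)   # GEO altitude
    leo_loss = free_space_loss_db(800_000, f_hz)      # a rounded LEO altitude

    print(f"GEO: {geo_loss:.1f} dB, LEO: {leo_loss:.1f} dB, "
          f"difference: {geo_loss - leo_loss:.1f} dB")   # ~33 dB in LEO's favor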

Spectral efficiency enhancements continue at a fever pitch. Designers are working to improve the roll-off of waveforms to decrease the required guard band; signal cancellation techniques, which allow one carrier to reside under another, have gained popularity; and mathematicians are working on better FECs. Developments in spectral efficiency will continue for as long as satellite bandwidth is expensive.
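
Roll-off improvements alone are worth real bandwidth: the occupied width of a carrier is roughly the symbol rate times one plus the roll-off factor, so tighter filtering shrinks every carrier and the spacing between carriers. A quick arithmetic sketch with an illustrative carrier:

    # Occupied bandwidth of a single carrier: symbol_rate * (1 + roll-off factor).
    symbol_rate_sps = 10_000_000    # 10 Msps carrier, chosen only for illustration

    legacy_rolloff = 0.35           # a common older filter shape
    tight_rolloff = 0.05            # the sharper filtering newer modems pursue

    legacy_bw = symbol_rate_sps * (1 + legacy_rolloff)   # 13.5 MHz
    tight_bw = symbol_rate_sps * (1 + tight_rolloff)     # 10.5 MHz

    print(f"Bandwidth recovered per carrier: {(legacy_bw - tight_bw) / 1e6:.1f} MHz "
          f"({1 - tight_bw / legacy_bw:.0%} narrower)")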

About the author
Karl Fuchs is vice president of technology for iDirect Government Technologies (iGT). He joined iGT in 2004 as the director of sales engineering, just as the satellite-based IP communications company was expanding its very small aperture terminal (VSAT) market presence into the federal government and international Internet Protocol (IP) networking world.

With more than 20 years of experience in technology and the federal government, Fuchs leads iGT’s team of federal systems engineers and serves as chief architect for new product integration. 

Prior to joining iGT, Fuchs was director of systems engineering at Nortel Networks, where he oversaw the Verizon account team of systems engineers, leading the design of IP, frame relay, asynchronous transfer mode (ATM) and dense wavelength division multiplexing (DWDM) networks. Before joining Nortel, he designed IP and ATM networks for Sprint and the federal government. Fuchs holds a Bachelor of Science degree in electrical engineering from George Mason University, Fairfax, Virginia, and an MBA from Averett University, Danville, Virginia.