The International Telecommunication Union (ITU), through its Radiocommunication Sector (ITU-R), defines many of the standards that underpin wireless communication.
International Mobile Telecommunications-Advanced (IMT-Advanced Standard)
The International Mobile Telecommunications-Advanced (IMT-Advanced) Standard is a set of technical requirements issued in 2008 by the ITU Radiocommunication Sector (ITU-R), a division of the International Telecommunication Union (ITU). The requirements define a generation of mobile phone and Internet access services that became widely marketed as 4G; in some markets, notably Turkey, services based on the standard were even marketed as 4.5G.
Description
An IMT-Advanced system was envisioned as an all-IP based mobile broadband solution delivering high-speed connectivity to laptop computer wireless modems, smartphones, and other mobile devices. Beyond basic connectivity, the standard aimed to support advanced facilities and services, including ultra-broadband Internet access, voice over IP (VoIP), gaming services, and high-quality streamed multimedia.
The fundamental objective of IMT-Advanced was to accommodate the quality of service (QoS) and data-rate requirements of evolving applications such as mobile broadband access, Multimedia Messaging Service (MMS), video chat, and mobile TV, as well as new bandwidth-intensive services such as high-definition television (HDTV). 4G systems were also designed to roam seamlessly with existing wireless local area networks (WLANs) and to interwork with digital video broadcasting (DVB) systems. These goals went well beyond the International Mobile Telecommunications-2000 (IMT-2000) requirements that had defined the mobile phone systems marketed as 3G.
Requirements
The ITU-R report outlining the IMT-Advanced standard specified that a compliant system must:
- All-IP network: be based entirely on an all-IP packet-switched network architecture, moving away from the circuit-switched designs of earlier mobile generations.
- Interoperability: interoperate with existing wireless standards, allowing gradual transitions and backward compatibility rather than wholesale replacement.
- Data rates: provide a nominal data rate of 100 Mbit/s for users at high mobility (for example, in moving vehicles) and 1 Gbit/s for clients and stations in relatively fixed positions.
- Dynamic resource sharing: dynamically share and use network resources to support more simultaneous users per cell.
- Scalable channel bandwidth: support channel bandwidths of 5–20 MHz, optionally up to 40 MHz.
- Peak link spectral efficiency: reach 15 bit/s/Hz in the downlink and 6.75 bit/s/Hz in the uplink, so that 1 Gbit/s in the downlink is theoretically possible in less than 67 MHz of bandwidth.
- System spectral efficiency: reach up to 3 bit/s/Hz/cell in the downlink and 2.25 bit/s/Hz/cell for indoor usage.
- Seamless connectivity and global roaming: operate across multiple networks with smooth handovers between cells and network technologies.
- High-quality multimedia support: consistently deliver high quality of service for bandwidth-intensive applications such as video streaming and gaming.
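The bandwidth and spectral-efficiency targets above can be cross-checked with simple arithmetic. A minimal sketch in Python, using the ITU-R figures quoted in the list (the function name is illustrative, not from any standard):

```python
# Peak link spectral efficiency targets from the IMT-Advanced requirements
DL_PEAK_EFF = 15.0   # downlink, bit/s/Hz
UL_PEAK_EFF = 6.75   # uplink, bit/s/Hz

def min_bandwidth_mhz(target_rate_mbps, spectral_eff):
    """Minimum bandwidth (MHz) needed to carry a given bit rate (Mbit/s)
    at a given spectral efficiency (bit/s/Hz)."""
    return target_rate_mbps / spectral_eff

# 1 Gbit/s downlink at 15 bit/s/Hz needs just under 67 MHz of spectrum
print(min_bandwidth_mhz(1000, DL_PEAK_EFF))   # ~66.7

# Conversely, the peak rates achievable in the optional 40 MHz channel:
print(40 * DL_PEAK_EFF)   # 600.0 Mbit/s downlink
print(40 * UL_PEAK_EFF)   # 270.0 Mbit/s uplink
```

This confirms the text's observation that the 1 Gbit/s downlink target fits in less than 67 MHz of bandwidth at the required peak efficiency.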
The initial set of 3rd Generation Partnership Project (3GPP) requirements for LTE Advanced was approved in June 2008, setting the stage for a technology that would ultimately meet, and in some respects surpass, the ITU targets. A summary of the technologies studied as the basis for LTE Advanced is given in a dedicated technical report.
The ITU adopts and publishes these requirements and recommendations but does not perform the development work itself, and its standards are not binding on individual countries. Research, development, and standardization are instead carried out by other trade groups and standards bodies, notably the Institute of Electrical and Electronics Engineers (IEEE), the WiMAX Forum, and 3GPP.
Principal Technologies
Meeting the IMT-Advanced performance targets required a set of advanced physical-layer transmission techniques, including:
- MIMO (multiple-input multiple-output): achieves high spectral efficiency by using multiple antennas at both transmitter and receiver. Spatial processing, including multi-antenna and multi-user MIMO, carries several data streams simultaneously, multiplying channel capacity without additional bandwidth.
- Frequency-domain equalization: exploits the frequency-selective properties of wireless channels without complex time-domain equalizers, using multi-carrier modulation (orthogonal frequency-division multiplexing, OFDM) in the downlink and single-carrier frequency-domain equalization (SC-FDE) in the uplink.
- Frequency-domain statistical multiplexing: supports variable bit rates by assigning different sub-channels to different users according to their channel conditions, using orthogonal frequency-division multiple access (OFDMA) in the downlink, and OFDMA or single-carrier FDMA (SC-FDMA, also known as linearly precoded OFDMA, LP-OFDMA) in the uplink, which relaxes power-amplifier linearity requirements.
- Turbo-principle error-correcting codes: add redundancy to the data stream so that receivers can correct errors introduced during transmission, minimizing the required signal-to-noise ratio (SNR) at the receiver and extending coverage.
- Channel-dependent scheduling: allocates resources to users according to their instantaneous channel quality, exploiting the time-varying nature of the wireless channel to maximize overall throughput.
- Link adaptation: adjusts modulation schemes and coding rates in real time so the system operates at the highest efficiency and reliability the current radio environment allows.
- Relaying and cooperative relaying: extends coverage, throughput, and reliability through fixed relay networks, where dedicated relay nodes retransmit signals, and the cooperative relaying concept, often referred to as the multi-mode protocol, where multiple devices collaborate to relay data.
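The multi-carrier idea behind OFDM can be illustrated with a toy round trip: map symbols onto subcarriers with an inverse DFT, prepend a cyclic prefix, then strip the prefix and transform back at the receiver. This is a pure-Python sketch of the principle only (8 subcarriers, 2-sample prefix); real systems use fast FFTs, many more subcarriers, and a channel in between:

```python
import cmath

def idft(symbols):
    """Inverse DFT: frequency-domain subcarrier symbols -> time samples."""
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]

def dft(samples):
    """Forward DFT: time samples -> frequency-domain symbols."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(samples))
            for k in range(n)]

# QPSK symbols on 8 subcarriers
tx = [1+1j, 1-1j, -1+1j, -1-1j, 1+1j, -1-1j, 1-1j, -1+1j]
time_samples = idft(tx)

# Cyclic prefix: copy the last 2 samples to the front; in a real channel
# this absorbs multipath delay spread and keeps subcarriers orthogonal
frame = time_samples[-2:] + time_samples

# Receiver strips the prefix and transforms back to the frequency domain
rx = dft(frame[2:])
assert all(abs(a - b) < 1e-9 for a, b in zip(rx, tx))  # symbols recovered
```

The cyclic prefix is what lets the receiver treat multipath convolution as a simple per-subcarrier multiplication, which is the basis of the one-tap frequency-domain equalization described above.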
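Link adaptation can be sketched as a lookup from measured SNR to the most aggressive modulation-and-coding scheme (MCS) the channel will support. The thresholds and table entries below are illustrative assumptions, not values from any standard:

```python
# Hypothetical MCS table: (minimum SNR in dB, modulation, code rate),
# ordered from most to least aggressive
MCS_TABLE = [
    (22.0, "64-QAM", 5 / 6),
    (16.0, "64-QAM", 2 / 3),
    (10.0, "16-QAM", 1 / 2),
    (2.0,  "QPSK",   1 / 2),
]

BITS_PER_SYMBOL = {"QPSK": 2, "16-QAM": 4, "64-QAM": 6}

def select_mcs(snr_db):
    """Pick the highest-throughput MCS whose SNR threshold is met;
    fall back to the most robust entry for very poor channels."""
    for threshold, modulation, code_rate in MCS_TABLE:
        if snr_db >= threshold:
            return modulation, code_rate
    return MCS_TABLE[-1][1], MCS_TABLE[-1][2]

def spectral_efficiency(modulation, code_rate):
    # information bits per channel symbol = raw bits * code rate
    return BITS_PER_SYMBOL[modulation] * code_rate

mod, rate = select_mcs(18.0)
print(mod, rate, spectral_efficiency(mod, rate))  # 64-QAM, rate 2/3
```

A scheduler combining this with channel-dependent resource allocation would re-run the lookup each scheduling interval as the reported SNR changes.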
Predecessors
Several technologies served as stepping stones toward IMT-Advanced, some of which were marketed as "4G" before fully meeting its requirements.
Long Term Evolution
Long Term Evolution (LTE) offered, in its initial iterations, a theoretical net bit rate of up to 100 Mbit/s in the downlink and 50 Mbit/s in the uplink with a 20 MHz channel. These rates could be increased further with a MIMO (multiple-input and multiple-output) antenna array. The physical radio interface, initially named High Speed OFDM Packet Access (HSOPA), was later renamed E-UTRA.
A notable architectural departure in LTE was the abandonment of the CDMA spread-spectrum radio technology of previous 3G systems and the cdmaOne standard. In its place, LTE adopted orthogonal frequency-division multiple access (OFDMA) and other frequency-division multiple access schemes, combined with MIMO antenna arrays, dynamic channel allocation, and channel-dependent scheduling.
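The capacity gain from MIMO can be illustrated with the idealized spatial-multiplexing approximation, in which Shannon capacity scales with the number of parallel spatial streams, min(transmit antennas, receive antennas). This is a simplified upper-bound sketch, not an LTE link-budget calculation (real channels share power across streams and rarely achieve full multiplexing gain):

```python
import math

def mimo_capacity_mbps(bandwidth_mhz, snr_db, n_tx, n_rx):
    """Idealized MIMO Shannon capacity in Mbit/s: min(n_tx, n_rx)
    parallel streams, each at the full per-stream SNR (an upper-bound
    simplification for illustration)."""
    snr_linear = 10 ** (snr_db / 10)
    streams = min(n_tx, n_rx)
    return streams * bandwidth_mhz * math.log2(1 + snr_linear)

# 20 MHz channel at 20 dB SNR: a 2x2 array doubles the
# single-antenna capacity in this idealized model
print(mimo_capacity_mbps(20, 20, 1, 1))  # ~133 Mbit/s
print(mimo_capacity_mbps(20, 20, 2, 2))  # ~266 Mbit/s
```

Under these assumptions a single-antenna 20 MHz link already approaches LTE's 100 Mbit/s downlink figure, and adding spatial streams is what pushes practical systems toward the higher IMT-Advanced targets.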
The first publicly available LTE services, branded "4G", launched on 14 December 2009 in two Scandinavian capitals: Stockholm, Sweden, on an Ericsson system, and Oslo, Norway, on a Huawei system. The initial user terminals were manufactured by Samsung. All three major U.S. wireless carriers subsequently offered LTE services. In South Korea, SK Telecom and LG U+ launched LTE service for data devices in July 2011, with nationwide coverage planned for 2012.
Mobile WiMAX (IEEE 802.16e)
The Mobile WiMAX standard (IEEE 802.16e-2005), a mobile wireless broadband access (MWBA) technology marketed as WiBro in South Korea, was also often branded as "4G" despite not fully meeting the IMT-Advanced requirements in its initial form. It offered peak data rates of 128 Mbit/s in the downlink and 56 Mbit/s in the uplink over 20 MHz wide channels.
The first commercial mobile WiMAX service was launched by KT in Seoul, South Korea, in June 2006. In September 2008, Sprint Nextel in the U.S. began marketing its Mobile WiMAX offering as a "4G" network, a branding choice that was premature by strict technical standards but set a precedent. In Russia, Belarus, and Nicaragua, the Russian company Scartel offered WiMAX broadband Internet access under the "4G" banner through its Yota brand.
WiMAX peak data speeds:

| Technology | Peak download | Peak upload |
|---|---|---|
| WiMAX | 128 Mbit/s | 56 Mbit/s |
Ultra Mobile Broadband
Ultra Mobile Broadband (UMB) was the brand name for a 4G project within the 3rd Generation Partnership Project 2 (3GPP2) standardization group, intended to upgrade the CDMA2000 mobile phone standard for next-generation applications and requirements. In November 2008, Qualcomm, UMB's lead sponsor, announced it was ending development of the technology in favor of LTE, which had gained broader industry alignment. UMB had targeted data speeds of over 275 Mbit/s downstream and over 75 Mbit/s upstream.
Flash-OFDM
At an early stage in the conceptualization of 4G, the Flash-OFDM system was expected to be developed further into a 4G standard, before other approaches gained dominance.
iBurst and MBWA
The iBurst technology, based on High Capacity Spatial Division Multiple Access (HC-SDMA), was also initially considered a 4G predecessor. It was later incorporated by the Mobile Broadband Wireless Access (MBWA) working group into the IEEE 802.20 standard in 2008, part of the broader effort to standardize mobile broadband wireless access systems supporting vehicular mobility.
Candidate Systems
In October 2010, ITU-R Working Party 5D approved two industry-developed technologies as candidates that met the technical demands of IMT-Advanced.
On 6 December 2010, the ITU issued a clarifying statement: while the then-current versions of LTE, WiMAX, and other evolved 3G technologies did not fully fulfill the IMT-Advanced requirements for true 4G, the term "4G" could nevertheless be applied, in an undefined way, to these forerunners of IMT-Advanced and to technologies demonstrating "a substantial level of improvement in performance and capabilities with respect to the initial third generation systems now deployed."
LTE Advanced
LTE Advanced, an evolution of Long Term Evolution (LTE), was formally submitted by 3GPP to the ITU-R in autumn 2009 as a candidate for the IMT-Advanced standard, and was released in 2011. The explicit target for 3GPP LTE Advanced was not just to meet the ITU requirements but to surpass them. LTE Advanced is an incremental improvement built on the existing LTE network infrastructure.
LTE Release 8, published in 2009, supported download speeds of up to 300 Mbit/s, still short of the IMT-Advanced targets. Release 10 was expected to be the iteration that achieved full LTE Advanced speeds and capabilities.
WiMAX Release 2 (IEEE 802.16m)
The WirelessMAN-Advanced evolution of IEEE 802.16e, building on Mobile WiMAX, was published in May 2011 as standard IEEE 802.16m-2011 and given the marketing name WiMAX Release 2 by the industry consortium promoting it. It was designed to fulfill all IMT-Advanced requirements, and the IMT-Advanced group approved it as a compliant technology in October 2010. In the second half of 2012, the 802.16m-2011 standard was consolidated into the broader 802.16-2012 standard, with the WirelessMAN-Advanced radio interface portion moved to its own specification, IEEE Std 802.16.1-2012.
Comparison
The following table compares the IMT-Advanced candidate systems with competing and complementary technologies.
Comparison of mobile Internet access methods
| Common name | Family