>>> Click here to access this episode of the Syllab Podcast on Spotify <<<
a) Radio broadcasting
Propagating waves have long been reliable signal carriers, whether as the pressure waves of speech or, since the mid-19th century, as electrical signals, first with the telegraph (refer to S4 Section 4.c) and later with the telephone (refer to S4 Section 4.d). This leaves electromagnetic waves as another option and, well before optical fibre (refer to S4 Section 4.e), a whole industry spun around waves of much lower frequency, in the range of kHz or MHz rather than the hundreds of THz of the visible light spectrum. This is the domain of radio waves.
The fact that radio waves propagate through ambient air makes the required infrastructure, as well as the business models around the use of the technology and the consumption of the data carried by its signal, very different from the telegraph or the telephone. Indeed, no wiring is required. But this also means the transmitted content cannot be directed at one person in particular, hence the term broadcasting, as opposed to point-to-point communication. And since anybody with her own receiver can transduce the signal and access the content without controlled access, broadcasters cannot charge for the content and must derive revenues from advertising – unless the content is encrypted.
Let us not digress too much into the economics though, since there is enough to say about the technology. Before we explore the workings of transmitters and receivers, let us ask ourselves how several stations transmitting different signals can co-exist in the same geographic area. Wouldn’t the waves conflict, causing interference and therefore some signal loss?
If you have read about multiplexing in S4 Section 4.b you will know the answer is “no”, provided separate channels are created. Since broadcasting and listening happen in real time, time-division is not an option; nor, obviously, is spatial-division, because no spatial segregation can be maintained over such long distances and it goes against the very idea of broadcasting. This leaves us with frequency-division. In other words, channels can be created by adhering to specific frequency bands, and your receiver will decode the content it receives at the frequency you specify, that of the radio station you want to listen to. This is the physical dial you turn (or the buttons you press nowadays) and, when you are exactly at the relevant frequency, the signal should be clear because the demodulation algorithms can do their work.
Radio transmission was developed at the turn of the 20th century and used the carbon microphone mentioned in S4 Section 4.d to transduce voice onto carrier waves in what is called the “medium wave band”, roughly the 500 to 1700 kHz frequency range, with channels interspersed every 9 or 10 kHz depending on the country. The comparatively low frequency allows these waves to travel longer distances, especially at night when they can be reflected by the F layer of the ionosphere, something we saw in S3 Section 3.a, whereas by day the D layer attenuates radio waves with frequencies below 10 MHz.
Being technically simpler, the first method of encoding was amplitude modulation, hence the name “AM” stations: the ups and downs of the analog signal are translated into increases and decreases in the amplitude of the carrier waves. Simple and, accordingly, prone to interference and noise, in particular where metallic structures and electrical equipment are present.
Later on, most listeners and stations shifted to the higher-quality “FM” stations, where FM stands for frequency modulation. In this method, the analog signal variations are translated into increases and decreases in the frequency of the carrier waves. The typical frequency range for FM transmission is 88 to 108 MHz, which may indeed sound familiar, with spacing of 200kHz or more between channels.
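The two modulation schemes just described are easy to sketch in code. Below is a minimal Python illustration of both, a toy model rather than a real transmitter: the sample rate, the 10 kHz carrier and the FM deviation are arbitrary choices for the sketch, far below real broadcast values, and a pure 440 Hz tone stands in for the audio.

```python
import math

RATE = 100_000   # sample rate in Hz; an arbitrary choice for this sketch
FC = 10_000      # toy carrier frequency, far below real broadcast bands
FM_DEV = 2_000   # peak FM frequency deviation in Hz (also an assumption)

def modulate(message, kind):
    """Modulate `message` (a list of values in [-1, 1]) onto the carrier."""
    out, phase = [], 0.0
    for n, m in enumerate(message):
        if kind == "AM":
            # the message raises and lowers the carrier's amplitude
            carrier = math.cos(2 * math.pi * FC * n / RATE)
            out.append((1 + 0.5 * m) * carrier)
        else:
            # the message speeds up and slows down the carrier's phase
            phase += 2 * math.pi * (FC + FM_DEV * m) / RATE
            out.append(math.cos(phase))
    return out

# 10 ms of a 440 Hz tone standing in for the audio signal
tone = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(RATE // 100)]
am = modulate(tone, "AM")   # amplitude varies, frequency stays fixed
fm = modulate(tone, "FM")   # amplitude stays fixed, frequency varies
```

Note how the FM samples never exceed an amplitude of 1, which is one reason FM resists amplitude-corrupting noise better than AM.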
Some additional info regarding usage and channels that you may find interesting:
- Provided your receiver also has transmitting capabilities, it is possible to communicate one-on-one with another person though anybody tuned to that channel can listen. This is called two-way radio and in half-duplex mode one cannot both transmit and receive at the same time. Police forces use it and anybody can get their hands on such equipment in the form of walkie-talkies.
- CB radio stands for “Citizens band radio”; it is a two-way radio system with dedicated channels near the 27 MHz frequency. It is often used in a rather passive listening mode, collecting useful information, and sometimes providing some information to other listeners, in particular drivers…
- Of course, the digital revolution has also carried over to the world of radio transmission, though analog radio remains more popular. Radio over the internet is also increasingly listened to, as it allows access to most national, regional and even local radio stations from anywhere in the world, provided internet access is unrestricted. For digital audio, the most popular technique for encoding analog signals into digital ones is pulse-code modulation (PCM): it consists in measuring the amplitude of the analog waves at frequent intervals and mapping these measurements onto a finite set of values. The measurement is called sampling, with typical rates above 40,000 times per second, and the jargon term for mapping to a value is quantization, with 16 bits, or 65,536 possible values, being a popular standard resolution.
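As a rough illustration of sampling and quantization, here is a short Python sketch that encodes a pure tone into 16-bit PCM codes at the common 44.1 kHz rate; the tone and the 10 ms duration are arbitrary choices made for the sketch.

```python
import math

SAMPLE_RATE = 44_100      # samples per second, a typical audio rate
BITS = 16
LEVELS = 2 ** BITS        # 65,536 possible values

def pcm_encode(analog, duration_s):
    """Sample `analog(t)` (values in [-1, 1]) and quantize to 16-bit ints."""
    codes = []
    for n in range(int(SAMPLE_RATE * duration_s)):
        t = n / SAMPLE_RATE                          # sampling instant
        code = round(analog(t) * (LEVELS // 2 - 1))  # quantization step
        codes.append(code)
    return codes

tone = lambda t: math.sin(2 * math.pi * 440 * t)  # a pure 440 Hz tone
codes = pcm_encode(tone, 0.01)                    # 10 ms of audio
# 441 integer samples, each in [-32767, 32767]
```

Rounding to the nearest of the 65,536 levels is where the (small) quantization error comes from; more bits per sample means finer levels and less error.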
Finally, the hardware. On the transmitter side, a microphone converts the analog audio signal, be it voice or music, into an analog electrical signal – you may refer to S4 Section 4.d where we explained the basic technology of condenser microphones with a diaphragm inducing changes in capacitance and creating an electrical signal. This signal is then modulated to the intended radio frequency and applied to the antenna. And the magic happens: the antenna emits the radio waves carrying the signal.
What we should remember is that radio waves are electromagnetic in nature, so there is a fundamental correspondence between electrical signals and electromagnetic waves, and antennas serve to convert one into the other, from electric current conducted through a metallic medium to radio waves propagating through the air. And vice versa: the process is symmetrical, so the principle of transmission works in reverse for reception. From a physics standpoint, when transmitting, an alternating current is applied to the antenna rods, creating electric and magnetic fields of alternating polarization that are radiated away perpendicular to the rods and then propagate as concentric half-circle waves. When receiving, the electrons are physically pushed back and forth by the waves, thus creating an oscillating electrical signal. I am including the link to the Wikipedia entry for antennas (in the context of radio) at the end of this chapter, if you wish to learn more.
Once the signal is travelling through space – it actually propagates on a 2-dimensional plane, and transmitting antennas are oriented so that this plane is horizontal, both for maximum range and because that is where the signal is intended to be received – the distance it can travel, assuming no bouncing back from the ionosphere at night, is a function of the power with which and the direction in which it was emitted. To ensure broader coverage, relay stations may be used; they generally operate as repeaters, receiving the signal and re-transmitting it with more power.
On the receiver side, we have an antenna, which may be designed with several rods to capture waves from various frequencies (half-wavelength works best) and oriented in a particular direction or shaped to derive a better-quality signal. Then the receiver should have the ability to tune to different frequencies to receive separate channels and the electrical signal is ultimately transduced into sound waves via a loudspeaker or an earphone – a comparatively simple process where the electrical signal is translated into movements of a diaphragm by way of a motor and these movements create vibrations, the sound waves.
b) Cellular networks
As pointed out at the outset of the previous section, radio waves can be received in any location provided there is an antenna broadcasting in the vicinity, which can extend to several kilometres in rural areas. So what do we get when we combine radio waves and a switched voice or data network? Mobile phones of course and, increasingly, other “smart devices” capable of connecting to the Internet. Smartphones will be dealt with in Chapter 9 so here we will focus on the network, signal and data aspects.
The antennas used for radio wave transmission, whether for data or voice, are mounted on telecommunication towers. These are essentially tall masts acting as local relays in charge of providing inbound and outbound coverage for a particular area called a cell – this is where the names cell phone and cellular network come from. The rationale behind structuring the network into cells is three-fold:
- Adjacent towers can use different frequency bands thus allowing for a partitioning of the user base without interference between signals across cells. Since the number of bands is limited, they can be reused in other cells, provided they are not contiguous.
- The partitioning can be designed to be denser in locations with more population so all users in a given area can expect reliable service with calls that connect and good signal quality. There is indeed a limit to how many users can be multiplexed at one time within a certain frequency band, regardless of the multiplexing technique used.
- Arbitrarily reducing the geographic coverage area of any one cell allows for the use of low-power transmitters that will create less interference in cells using the same frequencies further away.
With mobile phones comes mobility, however, and cell coverage doesn’t follow precise borders, so ongoing communications may need to be shifted from one tower to another, once or even several times, to ensure proper signal quality is maintained. This process is called “handover” or “handoff” and can happen in one of two ways, dictated by the technology used. The first option consists in interrupting the communication before re-engaging it on a different channel; this is called “hard” handover and at today’s speeds it is perceived as seamless. An obvious advantage of hard handover is that the phone or receiving device only needs to be able to receive and decode one channel at a time; the downside is a potential loss of communication if connection to the second channel fails.
In contrast, the second option is dubbed “soft” handover and sees the receiving device maintaining at least one more connection and shifting to this channel before relinquishing the previous connection. This allows for the device to select the best signal available to carry or extract the information from and it provides increased reliability in geographic areas with poor coverage. However, from the perspective of the network, this means several channels from adjacent cells need to be dedicated to the same user, so there is a trade-off involved.
In terms of connecting two users, or handing over between cells, the underlying principle is the same as with the telephone network and consists in switching the signal at the relevant network nodes. In fact, for voice, the same public switched telephone network (PSTN) integrates the switching infrastructure and the list of phone numbers (serving as device addresses) as the traditional or “plain old” telephone service. With the rise of data transmission, the PSTN has been complemented by an analogous network called the packet-switched data network (PSDN), which relies on fixed-format streams of data called packets. These are routed following the information contained in the packet header, the rest of the data being called the payload, and this creates a natural form of time-division multiplexing, with the same channel being available to multiple users concurrently.
Since we are on the topic of multiplexing, we can now shift our attention to the encoding of messages and the various standards that have rolled out, and some phased out, since the start of mobile telephony. One thing which should be noted is that starting with 2G technology, the radio waves carried digital data, not analog, unlike the pre-early-1990s technologies retroactively named 1G and 0G where 0G involved a centralized operator and 1G a cellular network.
As we have seen in S4 Section 4.b, as well as in S4 Section 4.e on optical fibre and in the previous section for radio stations, it is possible to share the same channel by employing frequency-division or time-division. In mobile telephony, the incarnations of these are frequency-division multiple access (FDMA) and time-division multiple access (TDMA). I will not rehash the underlying concepts, but you may also have heard the name CDMA before: it means code-division multiple access and should be thought of as allowing multiple users to speak in different languages, as opposed to taking turns to speak or using different voice pitches – I must acknowledge this analogy is not mine but borrowed from the Wikipedia article on CDMA, for which I include the link in the last section because it also provides more detailed explanations of the mathematical reasoning behind this technology. Essentially, the idea is to superimpose the binary data on an underlying code unique to the user for the duration of the communication; generally an XOR logic is applied, this being an exclusive OR yielding 1 for (0,1) or (1,0) and 0 for (1,1) or (0,0). These codes can be demodulated without loss of information because the data strings are effectively vectors and the codes are selected to be orthogonal, i.e. the dot product of the vectors is equal to zero. The more bits in the code, the more channels can be created without interference. The underlying logic of orthogonal subcarrier signals has also been deployed for frequency-division, making possible orthogonal frequency-division multiplexing (OFDM), which is used in 4G and 5G technologies.
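To make the orthogonality argument concrete, here is a toy Python sketch of CDMA with two users sharing the channel at the same time and frequency. It uses ±1 chips, the arithmetic twin of the XOR formulation above (multiplying ±1 values corresponds to XOR-ing bits), and 4-chip Walsh codes, a textbook choice rather than anything a real network would deploy.

```python
# Two mutually orthogonal spreading codes: their dot product is zero.
CODE_A = [+1, +1, +1, +1]
CODE_B = [+1, -1, +1, -1]

def spread(bits, code):
    """Each data bit (0/1 mapped to -1/+1) is multiplied chip-by-chip by the code."""
    symbols = [1 if b else -1 for b in bits]
    return [s * c for s in symbols for c in code]

def despread(signal, code):
    """Correlate the combined signal with one user's code to recover its bits."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else 0)
    return bits

bits_a, bits_b = [1, 0, 1], [0, 0, 1]
# both users transmit at once: the channel simply adds their chips
channel = [a + b for a, b in zip(spread(bits_a, CODE_A), spread(bits_b, CODE_B))]
assert despread(channel, CODE_A) == bits_a
assert despread(channel, CODE_B) == bits_b
```

Because the codes are orthogonal, each correlation returns the targeted user's bit while the other user's contribution cancels exactly; longer codes allow more simultaneous users by the same mechanism.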
With the advent of 2G networks, i.e. digital information expressed as binary bits rather than analog, even if embedded in the analog carrier signal that radio waves are, it became easier to transmit data rather than just voice, as well as to encrypt communications. The most prevalent technology from this period was GSM (Global System for Mobile Communications), which supported the short message service (SMS); it was later upgraded to GPRS (General Packet Radio Service), sometimes called 2.5G, which saw the addition of packet-switching to the existing circuit-switched data transport. Unlike circuit-switching, packet-switching doesn’t require a dedicated circuit to be formed and the data can travel through different nodes, which is much better from a resource-optimization standpoint. With GPRS came “always on” connections and thus improved Internet access, a network and related technologies we will cover in Chapter 6.
Increasing data transfer speeds became the name of the game, and the third-generation technology (3G), deployed in the communication spectrum of 400 MHz to 3 GHz starting in the late 1990s, saw speeds reaching a minimum of 144 kb/s, compared to a theoretical maximum of 40 kb/s for non-enhanced single-slot GPRS, where data also experienced high latency since priority was given to voice. The two standards that came to dominate the 3G era were CDMA2000 and UMTS (Universal Mobile Telecommunications System). Both used CDMA for multiplexing, and UMTS evolved incrementally to High Speed Packet Access (HSPA) and then LTE (Long Term Evolution), which became the de facto standard for the transition into 4G technologies deployed from the late 2000s onward. In 4G there is no more circuit-switched telephony – voice has to follow the Internet Protocol – and of course bandwidth has continued to increase quite dramatically over time to support media streaming and Internet of Things (IoT) applications. The approved standards include LTE Advanced, an enhanced version of LTE with peak download rates of 1000 Mb/s and 500 Mb/s for upload, and WiMAX 2 (officially named IEEE 802.16m or WirelessMAN-Advanced), the successor of WiMAX (formally IEEE 802.16e).
So, what’s in store for mobile telephony? Higher speed of course, and lower latency. Think of latency as round-trip time; the ability to decrease this lag is not just good for gaming, if that is your thing, it will also enable technologies such as autonomous cars where two or more devices need to communicate in “real time”. More devices connected and talking to each other faster, with improved response times. Hence, 5G promises peak download speeds of 10 Gb/s with increased bandwidth and, though 6G is already being devised at a high level, there is some degree of wait-and-see with respect to customer needs and demands to decide which aspects should be prioritized. Furthermore, there is an explicit acknowledgment that the existing hardware in the network should continue to be used as much as possible and that licensing costs imposed by governments (including in the form of spectrum auctions) should not be unreasonable, to avoid massive funding requirements before a roll-out – painful lessons learned from 3G, 4G and 5G.
c) Wi-Fi & Bluetooth
As a result of the massive capital expenditures network operators had to make, mobile data subscriptions come at a significant cost. To be sure, mobile services are not expensive on a per-megabyte basis; the problem is that the content we access on our phones and personal computers is increasingly data-intensive because of higher definition and a growing share of pictures and videos rather than text. And of course we use it a lot more than we did in the past, since many of our interactions as consumers happen online. The solution, from a consumer standpoint, is to look for a cheaper source of data, and this is the landline infrastructure with its blazing-fast optical fibre. If we think of it as the end point of a trunk line, all we need is a technology to create a local area network. Enter Wi-Fi.
Wi-Fi bridges the gap between the fixed-line digital data stream and its receipt and transmission by mobile devices; plug in a wireless router, which acts as your local switchboard, and you can connect multiple devices, even improving coverage with repeaters or mesh units. Indeed, this may be required because signals at the radio frequencies of 2.4 GHz to 6 GHz are highly susceptible to interference from materials, metallic structures in particular, because radio waves are electromagnetic and metals are electrical conductors. This means line of sight is ideal, to the extent possible, and you should think of mesh units as mirrors – placing them right next to your desk might therefore not be the optimal configuration.
Within the network, each of your devices can be found thanks to its 48-bit MAC address (media access control address) and, since eavesdropping on wireless communication can theoretically be done by any other device within range, security and privacy are ensured first by restricting network access via a password and second by encrypting every communication between device and router; this is what WPA3-certified routers do – WPA stands for Wi-Fi Protected Access.
Channels are only half-duplex in Wi-Fi, so both directions are possible but only one at a time, like the walkie-talkie. Since radio waves propagate at close to the speed of light in air (the index of refraction of air is about 1.0003 at sea level on average), this is not a major issue, especially as channels can be time-shared between devices and the signal is modulated according to orthogonal frequency-division multiplexing – OFDM was explained in section b) above – so each channel can accommodate several sub-carrier signals and therefore the same number of separate data streams.
In terms of standard evolution and capabilities, Wi-Fi 5 (IEEE 802.11ac) uses the 5 GHz radio frequency band and can achieve link rates of up to 3.4 Gb/s on a handheld device and twice that on a laptop or tablet. Its successor, Wi-Fi 6 (IEEE 802.11ax), uses both the 2.4 GHz and 5 GHz bands, with the extended version, Wi-Fi 6E, also covering the 6 GHz band. It performs much better than Wi-Fi 5 in crowded places thanks to a much higher data throughput. The main reasons are the use of additional frequency bands, a multi-user technology (MU-MIMO) – essentially spatial-division multiplexing with targeted beams available in both the downlink and uplink directions (instead of downlink only) – lower-latency access protocols, and the use of OFDMA instead of OFDM, where “MA” stands for multiple access. OFDMA works by assigning subcarrier signals to different users depending on urgency and availability, in a way combining frequency- and time-division principles and resulting in higher spectral efficiency.
The Wi-Fi 7 standard (IEEE 802.11be) was adopted in 2024 but had not yet been widely rolled out in hardware at the time of this writing (Apr’25). Improvements over Wi-Fi 6 include a move from 10 to 12 bits per symbol, reduced interference losses compared with OFDMA thanks to a feature called “multiple resource units”, and enhanced MU-MIMO, from 8 spatial streams to 16.
All things considered, Wi-Fi is a potent tool for wirelessly accessing and transferring data to and from the Internet. But sometimes external access is not required and all you want is for devices to communicate among themselves in your own local area. The dominant technology for that is called Bluetooth. It works by creating personal area networks (PAN); each device has a unique 48-bit address and an easier-to-remember nickname, and gets paired with one or several other devices. At the time of pairing, a link key is shared, which is used to authenticate the devices on every future connection; post authentication, the data stream between the pair is encrypted for security and privacy reasons.
The technology is designed to work at short ranges of less than 10 m, so it is low-powered and works best when devices are in visual line of sight, though the radio waves will still travel through most thin partitions and, as with Wi-Fi, conductors such as metal structures will interfere with the signal. The technology operates across 79 channels, each 1 MHz wide, in the 2.4 GHz band (2.402 to 2.480 GHz to be exact, excluding the bottom and top guard bands). The digital signal is encoded using a variant of quadrature PSK (refer to S4 Section 4.a for explanations about phase-shift keying) called DQPSK, where the D stands for “differential”. Instead of the data being encoded in set values, it is encoded through changes in values; in other words, it is not the absolute value which contains the information but the change from one phase to the next, with a further offset of π/4 added in the case of π/4-DQPSK to avoid the signal moving through the centre of the circle between phases. To better understand this, I suggest looking up more information in video format.
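The differential idea can be illustrated with a small Python sketch of DQPSK encoding and decoding. The dibit-to-phase-step mapping below is one common convention (an assumption of the sketch), and real receivers of course work on noisy waveforms rather than exact phase values.

```python
import math

# Dibit -> phase change: the information lives in the *change* of phase
# between symbols, never in the absolute phase, and every change is an
# odd multiple of pi/4, so the trajectory avoids the centre of the circle.
PHASE_STEP = {(0, 0): math.pi / 4, (0, 1): 3 * math.pi / 4,
              (1, 1): -3 * math.pi / 4, (1, 0): -math.pi / 4}

def dqpsk_encode(bits):
    """Return the sequence of absolute carrier phases for a bit list."""
    phase, phases = 0.0, [0.0]           # start with a reference symbol
    for i in range(0, len(bits), 2):
        phase += PHASE_STEP[(bits[i], bits[i + 1])]
        phases.append(phase)
    return phases

def dqpsk_decode(phases):
    """Recover bits from phase *differences* between consecutive symbols."""
    inverse = {round(v / (math.pi / 4)) % 8: k for k, v in PHASE_STEP.items()}
    bits = []
    for prev, cur in zip(phases, phases[1:]):
        step = round((cur - prev) / (math.pi / 4)) % 8
        bits.extend(inverse[step])
    return bits

data = [0, 0, 1, 1, 0, 1, 1, 0]
# a constant, unknown phase rotation added by the channel
received = [p + 0.7 for p in dqpsk_encode(data)]
assert dqpsk_decode(received) == data
```

Because only phase differences carry information, the constant rotation introduced by the channel (the +0.7 above) has no effect on the decoded bits, which is precisely the appeal of the differential scheme.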
In terms of multiplexing, there is the usual time-division, but this spectral optimization is complemented by changes in the frequencies used to transmit the data packets. This method is called frequency-hopping spread spectrum (FHSS) and the shifts occur in a pre-determined order between carrier frequencies – in Bluetooth’s case, across the 79 channels mentioned above. This has advantages in terms of reducing interference and improving security, the latter since detection requires knowing the time interval between hops (typically 1600 times per second for Bluetooth), the number of carriers used, their frequency band, and the pattern of the hops.
The 1600 per second is of course not a random number and corresponds to the timing of exchanges between two devices, where one is a master holding the clock and the other a slave aligning itself with the master’s clock, which ticks every 3200th of a second (that is, every 312.5 microseconds). On even ticks the master will transmit data packets and the slave will do so on odd ticks.
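The scheme can be sketched in Python as follows. Note that in real Bluetooth the hop sequence is derived from the master’s address and clock, whereas this toy version simply seeds a pseudo-random generator identically on both ends (the seed value is an arbitrary assumption of the sketch).

```python
import random

CHANNELS = 79            # 1 MHz-wide channels in the 2.4 GHz band
SLOT_US = 625            # one hop per slot, i.e. 1600 hops per second

def hop_sequence(shared_seed, n_slots):
    """Both ends derive the same channel list from a shared seed."""
    rng = random.Random(shared_seed)
    return [rng.randrange(CHANNELS) for _ in range(n_slots)]

def role_for_slot(slot):
    """Time-division duplex: master transmits in even slots, slave in odd."""
    return "master" if slot % 2 == 0 else "slave"

master_hops = hop_sequence(0xC0FFEE, 8)   # master's view of the schedule
slave_hops = hop_sequence(0xC0FFEE, 8)    # slave's view, same seed
assert master_hops == slave_hops          # they meet on the same channels
```

An eavesdropper who does not know the seed sees only brief bursts scattered across 79 channels, which is why the hop pattern itself acts as a modest layer of protection.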
One last thing about Bluetooth, the name! You may be curious about its origin and it is not something that can be guessed. It was originally only a codename for the technology and the reference is to a Danish king called Harald Bluetooth who united tribes into his kingdom in the 10th century; the analogy is the unification of short-range wireless technologies under a common standard. As for the logo, it is the merging of the king’s initials in the Scandinavian runic alphabet. Something lighter to talk about at the dinner table as compared to π/4-DQPSK, FHSS or OFDMA.
d) Communication satellites
At the other end of the range spectrum from Bluetooth, going from short-range to long-range wireless, is telecommunication making use of satellites. Some satellites are used to make observations and measurements and send the data to ground-based receivers, but for communications they act as transponders, both relaying data and amplifying the signals to offset losses experienced across the entire communication channel, which often involves several satellites in line of sight of each other before the signal makes its way back to land or ships.
To avoid interference, frequency bands are allocated to different operators, mostly in the range between 12 and 18 GHz called the Ku-band, which sits below the resonance peak frequency of water vapor in the atmosphere at 22.24 GHz.
The interesting bit about satellites is not the frequency or the modulation methods but the coverage they provide and this is a function of both their number, which can be very high in the case of a constellation providing internet access, and their orbit. In S3 Section 10.f, we went through the concept of escape velocity and the balancing of local gravity with tangential speed to explain why an object can remain in orbit. It works equally well for stellar systems, the planets, moons and man-made devices.
Conceptually, there are two types of orbit: geostationary and non-geostationary with the latter being functionally divided into two more types of orbits: low earth (LEO) and medium earth (MEO).
Geostationary orbit means the satellite remains vertically fixed above a certain point along Earth’s equator, at an altitude which can be worked out mathematically. Since you probably want to know, the equation sets the centripetal force equal to the gravitational force; if those are unequal then the orbit will either decay or the object will end up being ejected. One side of the equation is thus the mass of the satellite multiplied by the square of its speed, divided by the radius “r” of its orbit measured from the centre of mass of the Earth; the speed around the circular orbit can itself be calculated as 2πr/T, with T being the orbital period, exactly one sidereal day of ~23h56’ (refer to S3 Section 5.b). The other side of the equation comprises the gravitational constant G multiplied by the mass of the Earth multiplied by the mass of the satellite, divided by the square of the radius r. The end result is r = 42,164 km which, above the equator, works out to an altitude of 35,786 km above sea level, with a travelling speed of just over 3 km/s in the direction of our planet’s rotation. This type of orbit means antennas on the ground need not move once they are correctly oriented, thus making it a natural choice for satellite TV relying on millions of antennas for reception; however, latency increases because the signal needs to travel a longer distance for a round trip. Note that geostationary orbit is a special case of geosynchronous orbit, where a satellite has an orbital period of one sidereal day but doesn’t necessarily remain in a fixed position for an Earth-based observer, instead tracing the same figure-8 path in the sky every sidereal day. I enclose the link to the relevant Wikipedia entry at the end of this chapter if you are curious about this.
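The calculation just described is easy to verify numerically: equating m·v²/r with G·M·m/r², and substituting v = 2πr/T, rearranges to r³ = G·M·T²/(4π²). A short Python sketch:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # mass of the Earth, kg
T = 86_164             # one sidereal day (~23h56'04") in seconds
R_EARTH = 6_378_000    # Earth's equatorial radius, m

# m*v^2/r = G*M*m/r^2 with v = 2*pi*r/T  =>  r^3 = G*M*T^2 / (4*pi^2)
r = (G * M_EARTH * T ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
altitude = r - R_EARTH
speed = 2 * math.pi * r / T

print(f"orbital radius: {r / 1000:,.0f} km")         # ~42,164 km
print(f"altitude:       {altitude / 1000:,.0f} km")  # ~35,786 km
print(f"speed:          {speed / 1000:.2f} km/s")    # ~3.07 km/s
```

Note that the satellite’s own mass m cancels out of the equation, which is why the altitude is the same for every geostationary satellite regardless of its size.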
Because satellites in MEO and LEO move in relation to antennas on the ground, any operator needs to launch several of them so that at any given time an antenna can be reoriented to face at least one. MEO is defined as orbits between the geostationary altitude of 35,786 km and as low as 2,000 km, whereas LEO spans from 2,000 km down to about 160 km – the threshold is only a matter of convention. Compared to other orbits, low Earth orbit is cheaper to launch into and requires less signal strength, but it requires more units to ensure uninterrupted communications and coverage. Unsurprisingly, more than 80% of the artificial satellites around our planet are in LEO and, while they are generally small, the International Space Station is among them, at an altitude ranging from 400 to 420 km.
e) Trivia – RFID
Back to short-range technology: sometimes only short packets of data need to be exchanged, such as for identifying an item (think of the anti-theft tags and tag-detectors in clothing stores or a hotel room key card), paying by placing your phone near a point-of-sale device, or even driving through a toll gate using an electronic tag. The technology behind these, called radio-frequency identification (RFID), hinges on the concept of electromagnetic induction, whereby a change of voltage in one wire results in a change in a magnetic field, which can in turn induce a voltage change in another wire within a certain range, this range being a function of the power of the magnetic field.
An RFID system thus comprises a reader and a tag. The reader is powered and “interrogates” the tag by changing its magnetic field. In a passive tag, which works best at distances of 4 cm or less, the induced voltage is used both to access stored data and to power a transponder that modulates the voltage inside the tag, which by symmetry, via the magnetic field, induces a change of voltage inside the reader as well; this change is then demodulated to extract the information. This principle works for near-field communication (NFC). For far-field communication (beyond a few wavelengths in distance), semi-passive or active tags with their own power source are used, as well as higher frequencies. The far-field technology is somewhat different as it relies on the backscattering of the original signal: the encoding of data is done by modulating the impedance of the tag’s antenna, therefore altering the proportion of the original signal which is backscattered. This change in backscattering is the signal the receiver extracts information from.
f) Further reading (S4C5)
Suggested reads:
- Wikipedia on Antenna: https://en.wikipedia.org/wiki/Antenna_(radio)
- Wikipedia on Code-division multiple access: https://en.wikipedia.org/wiki/Code-division_multiple_access
- Wikipedia on Geosynchronous orbit: https://en.wikipedia.org/wiki/Geosynchronous_orbit
Previous Chapter: Wired Telecommunications
Next Chapter: The Internet