African Internet Exchange System 

Following adoption of the African Regional Action Plan on the Knowledge Economy (“ARAPKE”) framework, the Second Ordinary Session of the African Union Conference of Ministers in charge of Communication and Information Technologies (“CITMC”) requested the African Union Commission and the United Nations Economic Commission for Africa to accelerate the implementation of the flagship projects, including the development of Internet Exchange Points (“IXPs”). IXPs, also referred to as Internet Exchanges, provide the opportunity for many Internet Service Providers (“ISPs”) to hand off and receive traffic at a convenient regional facility instead of having to establish several direct interconnections, or use the costly interconnection and traffic management services of other carriers.

The African Internet Exchange System (“AXIS”) project aims to create a robust Africa-wide internet system by installing IXPs in nations lacking any such facility, along with five regional internet hubs to serve as many African ISPs as possible. The African Union Commission has received funding from the Luxembourg Agency for Development Cooperation and the EU-Africa Infrastructure Fund. The Internet Society, a non-governmental organization with expertise in Internet technology, logistics and coordination, will assist the African Union to use the funds efficiently. Senegal, Burkina Faso, Burundi, Niger, Namibia and Guinea have been selected as the first countries to benefit from the project. The Internet Society’s Africa Interconnection and Traffic Exchange program aims to have 80 percent of local Internet traffic exchanged within Africa by 2020.

Implementation of the AXIS project involves several phases.  Before actual construction of facilities, representatives of the Internet Society visit candidate nations for an IXP with an eye toward explaining the benefits of the facility and how it would operate. These community mobilization and technology training workshops can help explain the cost savings, efficiency enhancements and consumer benefits accruing from having an in-country IXP.  IXPs can reduce yearly operating expenses for ISPs, including transiting costs imposed by other carriers. Additionally, multiple ISPs can share IXP infrastructure, reducing costs for each participant. Broadband subscribers benefit from the possibility of lower rates as well as the likelihood of faster service and less latency (delay) in accessing content, particularly content available from regional carriers.*
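As a rough illustration of the transit-cost savings described above, the sketch below compares paid international transit with flat-fee IXP peering. Every figure (traffic volume, transit price, port and membership fees) is an assumed example value, not data from any actual exchange.

```python
# Illustrative comparison of exchanging traffic locally at an IXP versus
# routing it through an international transit provider. All figures are
# assumed example values, not measurements from any specific network.

def monthly_transit_cost(traffic_gb, price_per_gb):
    """Cost of sending traffic via a paid transit carrier."""
    return traffic_gb * price_per_gb

def monthly_ixp_cost(port_fee, membership_fee):
    """Cost of peering at an IXP: flat port and membership fees."""
    return port_fee + membership_fee

traffic_gb = 50_000          # assumed monthly local traffic in gigabytes
transit = monthly_transit_cost(traffic_gb, price_per_gb=0.05)
peering = monthly_ixp_cost(port_fee=500, membership_fee=200)

print(f"Transit: ${transit:,.0f}/month, IXP peering: ${peering:,.0f}/month")
print(f"Savings: ${transit - peering:,.0f}/month")
```

Under these assumed figures the flat peering fees undercut per-gigabyte transit pricing once local traffic volume grows, which is the economic logic behind keeping local traffic local.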

Broadband Decision Tree

Both private and public network planners need to identify which of many broadband technological options best match specific requirements in a particular region or locality, taking into consideration such factors as the terrain, expanse of desired land coverage, population density, level of existing interest in broadband, ability to pay for service and distance from existing service options. Using these and other locality-specific factors, planners can begin to identify which broadband technologies constitute candidates for providing services. Having identified viable technological options, planners subsequently need to assess which one option provides the most cost-effective and efficient solution, taking into consideration whether grants and subsidies are available for projects identified as commercially unviable and unsustainable without one-time, or continuing subsidization.

Broadband network planners can begin to develop a decision tree based on experience gleaned from projects occurring in similarly situated areas.  The decision tree below can provide a baseline template.
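One hypothetical way to express such a template in code is a simple decision function over a few of the factors discussed above (terrain, population density, distance to existing backbone). The thresholds and technology labels below are illustrative assumptions, not planning standards.

```python
# A minimal, illustrative broadband technology decision tree. The thresholds
# and technology labels are assumptions for the sketch, not planning rules.

def recommend_technology(terrain, pop_density_per_km2, km_to_backbone):
    """Return a candidate broadband technology for a locality."""
    if terrain in ("mountains", "swamp", "desert"):
        # Difficult terrain makes ducts, poles and towers cost prohibitive.
        return "satellite"
    if km_to_backbone <= 5 and pop_density_per_km2 >= 200:
        # Close to existing plant with enough subscribers to justify a branch.
        return "wireline extension (e.g., DSL or fiber branch)"
    if pop_density_per_km2 >= 50:
        return "terrestrial wireless (e.g., WiMAX/cellular)"
    return "satellite (possibly shared VSAT)"

print(recommend_technology("plains", 300, 3))
print(recommend_technology("mountains", 300, 3))
```

A real decision tree would add branches for ability to pay, existing service options and subsidy availability, but the structure stays the same: eliminate infeasible options first, then rank the survivors by cost.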

Preliminary Assessments

Map Existing and Planned Narrowband and Broadband Plant

Before assessing which of many broadband technologies can optimally serve a specific geographical area, planners should map existing and planned infrastructure. Such documentation can help identify specific areas lacking broadband service and also start the process of determining which technological options are feasible and efficient. Broadband mapping also should identify the location of Internet Exchanges, telephone company switching locations, cable television headends, wireless carrier tower sites and existing broadband network locations.

In addition to mapping existing and prospective broadband assets, mapping projects also can identify the population density of locations having some forms of broadband access as well as those areas currently lacking any option. In the unserved areas, population density and geographical terrain will have a substantial impact on what broadband option is both technologically feasible and affordable. For example, in mapping existing broadband backbone network lines, planners can assess whether adjacent areas have sufficient population density to support either extending the backbone, building lower capacity branches, or using wireless networks.  Similarly, maps that identify the locations of telephone company switching facilities and cable television headends can provide planners with locations possibly served by retrofits of existing plant, e.g., Digital Subscriber Line broadband service to areas having sufficient population density and close proximity to a telephone company switch.

Identify the terrain of the targeted locality or region

If the terrain has swamps, deserts, mountains and other obstacles, technologies requiring the installation of ducts, poles and towers typically become cost prohibitive. Wireless options, particularly satellite earth stations, will offer the best choice based on this criterion. If the terrain supports installation of comparatively low cost ducts, poles and towers, wireline options may prove superior, including construction of a branch or backhaul link to an existing broadband network facility.

Identify the desired terrain coverage area and conduct an analysis of population density demographics and interest in broadband

Network planners should specify the locality and region targeted for broadband access.  The size of the desired broadband footprint and its population density have a significant impact on which technologies can provide the most cost-effective solution.  Most targeted locations will be in sparsely populated areas, but some may have the population comparatively more concentrated in villages than others.  Generally the more concentrated the population, the greater the likelihood that broadband options can include access from individual residences, or at least multiple facilities, instead of a single access point such as a telecentre or kiosk.  Planners should use surveys to assess interest in broadband and willingness to pay for service.

Inventory the nearest wireline and wireless broadband options and estimate the cost to extend them to the targeted locality or region

While network planners may have to construct “islands” of broadband access, typically using a satellite option, they first should determine the distance from the targeted locality to the closest existing broadband access options. Planners should assess whether and how existing networks can extend to the targeted locations. Some technologies, such as DSL, are distance constrained, meaning they cannot reach far into the hinterland.  Others have no such technological limitation, but planners will need to calculate whether a business case can be made for an extension on an unsubsidized basis, with a one-time infusion of capital, such as a grant, or only with ongoing subsidization.
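The distance constraint on DSL can be captured in a simple feasibility check. The roughly 5.5 kilometer copper-loop limit used here is a commonly cited rule of thumb, applied as an assumption for this sketch rather than a figure from any particular deployment.

```python
# Illustrative check of DSL reach. The ~5.5 km copper-loop limit is a
# commonly cited rule of thumb, applied here as an assumption.

ASSUMED_DSL_REACH_KM = 5.5

def dsl_feasible(loop_length_km):
    """DSL is distance constrained: signals degrade beyond the loop limit."""
    return loop_length_km <= ASSUMED_DSL_REACH_KM

for km in (2.0, 5.0, 8.0):
    print(f"{km} km loop: {'feasible' if dsl_feasible(km) else 'too far'}")
```

In practice achievable bit rates also fall off well before the hard limit, so planners usually model rate versus reach rather than a single cutoff.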

Broadband Options in Relation to Terrain and Population Density

Broadband planners cannot readily erect a flow chart that specifies which technology to install based solely on population density levels and terrain.  However, several basic “rules of thumb” can provide a baseline for analysis of particular circumstances.

Telephone and Cable Television Network Retrofits Typically Offer Timely and Lowest Cost per Prospective Subscriber Passed

Broadband planners wisely opt to retrofit existing telecommunications plant whenever possible. This strategy helps extend the useable life of existing “sunk investment” and helps conserve capital by reducing the amount of capital expenditures needed to offer a broadband option.  Areas already served by terrestrial, narrowband telephone service can gain broadband Digital Subscriber Line service with an investment of as little as a few hundred dollars per home passed. Of course the cost of a network retrofit will vary as a function of population density. Also one should appreciate that many remote locations may have wireline telephone service thanks to universal service subsidies and not because the location and population density supported the network installation free of government mandated financial support.

Wireless Options May Offer Cost-effective Service to Locales Lacking Terrestrial Wireline Options

Areas currently lacking cable television or wireline telephone service can be prime candidates for terrestrial wireless network installations. The ongoing buildout of cellular radiotelephone service well into the hinterland corroborates this point. Once a wireless voice network has extended into a specific geographical region, carriers may voluntarily make the additional investment to support broadband options. Targeted government subsidies and other universal service financial incentives can expedite the timetable.

Satellites May Offer the Carrier of Last Resort Option

Even now many geographical areas lack terrestrial broadband access, because the population density, terrain and proximity to existing broadband network assets do not support buildout farther into the hinterland. For these least populated, most remote locations satellite broadband access may constitute the only feasible option. Typically satellite broadband costs more than terrestrial options and may offer comparatively slow bit transmission speeds.  Subscribers to satellite service must acquire and install an antenna, receiver and possibly other devices such as a modem. This equipment has become less expensive and smaller over several generations of innovation, but it still adds costs typically not incurred by wireline subscribers.

Because the satellite option may constitute the best and only solution for people in quite remote areas, ICT development specialists have devoted much effort at finding innovative ways to economize and to maximize access. For example, rather than install a very small antenna for each subscriber, some communities have installed a somewhat larger satellite antenna capable of serving many users at the same time. The term very small aperture terminal (“VSAT”) refers to these satellite dishes that facilitate shared use. A single VSAT antenna linked with a terrestrial wireless delivery medium, such as Wi-Fi or WiMAX can serve an entire village. 
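A back-of-the-envelope calculation suggests how much bandwidth each villager might receive from such a shared link. The link capacity, user count and concurrency ratio below are all assumed example values.

```python
# Rough estimate of the bandwidth each user receives when a village shares
# one VSAT link over Wi-Fi. Link capacity and user counts are assumed values.

def per_user_kbps(link_kbps, users, concurrency=0.3):
    """Average bandwidth per active user, assuming only a fraction of
    users (the concurrency ratio) are online at the same moment."""
    active = max(1, round(users * concurrency))
    return link_kbps / active

# An assumed 2 Mbps shared VSAT link serving a village of 100 users
print(f"{per_user_kbps(2048, 100):.0f} kbps per active user")
```

Sizing the link around expected concurrent users, rather than total users, is what makes the shared-VSAT model affordable for a whole village.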

Many residents in remote communities also have limited discretionary income, thereby necessitating the search for ICT development grants and subsidies. Typically “community champions” help identify and aggregate demand for broadband with an eye toward demonstrating the viability of a satellite project to private, public and non-governmental organizations with grant money available.

For background on broadband network planning see:

For case studies on broadband projects see:

For broadband decision tree templates see:

Broadband Mapping 

Mapping existing geographical availability of broadband networks constitutes one of the first steps undertaken in formulating a strategic development plan.  This inventorying process may seem straightforward, but one should not underestimate the cost and complexity involved in acquiring an accurate assessment of existing installed physical plant.  For example, the United States federal government allocated $240 million to develop a comprehensive national broadband map as well as maps of individual states.*

Best practices include on-site and road tests to confirm reported availability as well as the solicitation of reports from end users using in-person interviews, online surveys and informal self-reporting, sometimes referred to as crowdsourcing.*

Additionally, formal reporting requirements of facilities-based carriers can promote mapping accuracy.  Map creators should make their findings readily available and provide consumers with the opportunity both to see what options exist and to submit corrections and updates to incorrect data.

Interactive maps can provide information about broadband access opportunities for businesses and residences as well as information about which community anchor institutions exist in a neighborhood or locality.  Maps also should provide contact information for broadband providers as well as information about any programs designed to stimulate and subsidize access.

For background on broadband mapping see:

Conversion from Analog to Digital Television and the Digital Dividend

The delivery of broadcast, satellite and cable television has migrated, or soon will migrate, from analog to digital transmission. While this conversion will impose costs on both operators and viewers, significant benefits will accrue, including the ability to view a higher quality image and the opportunity for nations to reallocate a significant portion of broadcast television spectrum for other uses, including wireless broadband. Because digital television offers a more efficient and higher quality transmission, carrying as much as six times more content for display, broadcasters can offer more than one video signal while still using a 6 megahertz (MHz) channel. National Regulatory Authorities can group all broadcasters within a smaller range of frequencies, thereby freeing up broadcast television spectrum for reallocation.

Digital television will require consumers to replace their existing television sets, or install a device that receives digital signals and converts them back to analog for viewing. New digital television sets can display a higher quality image that increases the number of columns and lines of pixels, the individual tiny squares of color that combine to form an image. High Definition Television (“HDTV”) is typically classified by the number of pixel lines and whether the video image is created in one line-by-line sequence, called progressive scanning, or by the sequencing of even lines in one scan followed by another scan of the odd lines, called interlacing. While standard definition, analog television generated as few as 350 visible lines of resolution, HDTV offers 1080 lines. HDTV also presents video signals in a ratio of width to height like that used in movie theaters. This aspect ratio makes it possible to reproduce entire movie images when broadcast on television. Previously, movie images were cropped from a 16:9 aspect ratio to the 4:3 ratio of analog television.
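The resolution figures above imply concrete pixel counts. A short worked calculation shows how the line count and aspect ratio combine: a 1080-line image at 16:9 has 1920 pixel columns.

```python
# Pixel counts implied by the resolutions and aspect ratios in the text:
# the number of pixel columns equals lines scaled by the aspect ratio.

def pixel_columns(lines, aspect_w, aspect_h):
    """Columns of pixels for a given line count and aspect ratio."""
    return lines * aspect_w // aspect_h

hd_cols = pixel_columns(1080, 16, 9)    # 1080 lines at 16:9
print(f"1080-line HDTV: {hd_cols} x 1080 = {hd_cols * 1080:,} pixels")
```

The roughly two-million-pixel HDTV frame, against a few hundred thousand for standard definition, is why digital compression is essential to fit the signal in the same 6 MHz channel.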

The migration from analog to digital television will generate what some call a Digital Dividend, because freed up broadcast television spectrum can expand the amount of bandwidth available for current and future broadband wireless networks. In many nations wireless carriers providing broadband services have expressed concerns about a scarcity of available spectrum, particularly for Next Generation Network (“NGN”) services that require substantial bandwidth to transmit content, such as full motion video, at bit transmission speeds comparable to wired networks. The reallocated broadcast spectrum offers superior signal transmission characteristics, known as propagation, because its frequencies are lower than those previously allocated in many nations for wireless mobile services.

For more information on digital television and the Digital Dividend see:

Deep Packet Inspection

As the Internet evolves subscribers will have diversifying network requirements that place different demands on broadband networks. For example, viewers of full motion video will need high bit transmission speeds with little tolerance for delays in delivering such “mission critical” content. On the other hand broadband networks can handle content with less time sensitivity in ways that conserve bandwidth and use network capacity during off peak times.

Internet Service Providers (“ISPs”) increasingly use technologies that can identify the nature of subscribers’ service requirements by inspecting labeling information contained in packets as well as the actual content being transmitted.

Deep Packet Inspection (“DPI”) provides ISPs with tools to identify subscribers’ bandwidth requirements, to prioritize traffic and to prevent piracy by implementing restrictions on copying content. Some consider this technology controversial, because it equips ISPs with the means to offer different levels of service and to charge for higher quality of service. On one hand, offering “better than best efforts” routing can enhance the user experience for subscribers requiring high quality service that conventional “best efforts” routing will not achieve. On the other hand, DPI can provide ISPs with many ways to avoid operating as neutral conduits, leading some to express concerns that the Internet will become less open and receptive to improvements and innovations.

DPI provides real time monitoring of packets as they travel through an ISP’s network. The technology can inspect header information that typically identifies the source and destination of the traffic. Additionally, DPI can examine packet payloads and identify the nature and type of traffic being transmitted. This capability makes it possible to identify “mission critical” content and provide it superior service, possibly at a premium rate. However, ISPs can also use this technology to prioritize, degrade, or block traffic even when network conditions, such as congestion, do not require it and no subscriber has requested it. Under a worst case scenario, DPI can provide ISPs with ways to identify traffic so that it can be subjected to inferior service with the goal of forcing subscribers or content sources to pay more to achieve a basic level of acceptable service. DPI also raises questions about privacy, as ISPs and even third parties can use the technology to track and profile online usage.
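As a minimal sketch of the “shallow” end of packet inspection, the code below reads the protocol and port fields that inspection systems examine before looking deeper into payloads. The sample packet bytes are fabricated for illustration; real DPI systems go much further, matching payload signatures against known application patterns.

```python
import struct

# Sketch of shallow header inspection: classify a raw IPv4 packet by its
# transport protocol and destination port. Packet bytes are fabricated.

def classify(ipv4_packet):
    """Classify a raw IPv4 packet by transport protocol and destination port."""
    version_ihl = ipv4_packet[0]
    header_len = (version_ihl & 0x0F) * 4          # IHL field is in 32-bit words
    protocol = ipv4_packet[9]                      # 6 = TCP, 17 = UDP
    if protocol == 6:                              # TCP ports follow the IP header
        src_port, dst_port = struct.unpack_from("!HH", ipv4_packet, header_len)
        if dst_port == 80:
            return "web traffic"
        return f"TCP to port {dst_port}"
    return "non-TCP traffic"

# Fabricated 20-byte IPv4 header (protocol 6 = TCP) plus TCP source/dest ports
packet = bytes([0x45] + [0] * 8 + [6] + [0] * 10) + struct.pack("!HH", 49152, 80)
print(classify(packet))
```

Port-based classification like this is easily evaded (applications can use any port), which is precisely why commercial DPI inspects payload contents as well.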

For more information on deep packet inspection see:

Featurephones, Smartphones and Tablets

As wireless networks become more sophisticated and able to handle data applications at high speeds, the nature, type and number of useable handsets also will expand. The handsets that transmit and receive wireless signals now range from simple devices designed primarily to provide telephone calls to ones that operate much like portable computers. Generally the cost of handsets increases as the number of functions and available services rises. For wireless subscribers interested primarily in voice communications, as well as the ability to send and receive text messages and photographs, a variety of “featurephones” are available at low cost. These handsets lack many of the newest features and either lack the capability, or offer less than optimal access, to Internet-based data services. Some recent vintage featurephones also offer additional features such as personal digital assistant note taking and scheduling, a media player, a touchscreen, Global Positioning Satellite (GPS) navigation and Wi-Fi access. Currently a majority of consumers use featurephones, but near term migration to smartphones will reduce this market share, particularly in developed nations.

Smartphones offer far more use options, because manufacturers have installed an operating system with data access in mind. Most smartphones use a mobile operating system created by Google, Apple, Nokia, Blackberry, or Microsoft. These devices use high performing computer chips and typically have larger screens capable of displaying high definition content, including full motion video. Some smartphone users rely on their wireless connection exclusively for broadband data services, while others continue to maintain subscriptions to both wireless and wireline broadband services. Other smartphone users may never have acquired a personal computer or laptop before resorting to smartphone access to the Internet.

Tablets offer users an even larger screen and more computer processing power coupled with wireless access that may include both Wi-Fi and mobile radio frequencies. Tablets still offer voice and text communications options, but their size and power favor data communications.  Many content providers now offer services optimized for access via tablets and smartphones. These so-called applications accommodate the smaller screen sizes of smartphones and tablets as compared to desktop and laptop computers. Applications also offer fast access to a specific service or function as compared to the possibly larger set of options available from a World Wide Web site containing many pages of content.

Feature Phones

Source: New York Times, Bits Blog site; available at:

Hybrid Broadband Using a Combination of Copper and Fiber Optic Cables

Carriers providing terrestrial broadband services typically attempt to upgrade and retrofit existing network facilities, rather than replace them entirely with costly new technologies such as fiber optic cables. A combination of newly installed glass fiber optic cables and already installed copper wire cables provides an opportunity to extend the usefulness of existing plant and also to reduce the amount of capital investment needed to provide next generation network services. The word hybrid identifies networks that combine older “legacy” facilities with newly installed plant. At some future date these hybrid combinations will be replaced with entirely new equipment that can offer even better transmission speeds and capacity. However, in the interim, carriers have found ways to expedite the introduction of networks offering improvements over an all copper wire medium.

Companies providing both basic wireline telephone service and cable television service have devised ways to combine fiber optic cable connections with existing copper wire. The replacement process typically starts between carrier facilities with the last wire replacement occurring for the wire providing the first and last link to individual subscribers. 

The term Hybrid Fiber Coax (“HFC”) identifies a network that combines fiber optic and copper coaxial cables. The terms Fiber to the Pedestal (“FTTP”) and Fiber to the Curb (“FTTC”) identify the location where copper wires continue to provide network delivery. The retained copper wire is located at the point where a network connects directly to an end user: on the curb near a street, or at a frame, called a pedestal, where the wire linking a residence is connected with another copper or fiber optic wire. For residential subscribers the “drop line” leading to and from a residence is located on the property of the subscriber, typically along a right of way or easement at the edge of the property.  The drop line may connect with another copper line, or to the first of many fiber optic links.

At the location where copper and fiber optic cables are connected, the carrier must also install equipment that can convert the transmitted signals from an optical carrier to a copper wire based carrier and vice versa. Telephone companies can extend the reach of their broadband services and increase the transmission speed and capacity by replacing the copper local loop with fiber optic cables also extending close to subscribers. The term Fiber to the Node refers to the installation of fiber optic cables to a switching facility in a neighborhood serving as many as 500 residences. Fiber to the Premises refers to the installation of fiber optic cables all the way to a pedestal serving one subscriber.

Coaxial Cable Pedestal Located on the Edge of a Residential Subscriber’s Property

Source: Dave Whitmore's Home Page; available at:

Fiber to the Pedestal Installation

Source: OSP Magazine; available at:

Hybrid Fiber-Cable Distribution

For background on hybrid fiber optic-copper cable networking see:

Infrastructure Sharing

Broadband service providers can share the cost of installing and maintaining infrastructure in ways that promote competition, operating efficiency and universal service. When multiple operators can spread the investment costs over a larger base of users, they can achieve scale economies represented by lower per unit costs of essential elements of service. Additionally consumers can benefit when infrastructure sharing helps expand the coverage area that carriers can afford to serve.
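The scale economies described above can be illustrated with a simple calculation of each operator's share of a shared passive site's lifetime cost. All figures below are assumed example values.

```python
# Illustration of scale economies from infrastructure sharing: the
# per-operator cost of a passive site falls as more operators join.
# All cost figures are assumed example values.

def per_operator_cost(capex, annual_opex, years, operators):
    """Each operator's share of a passive site's lifetime cost."""
    total = capex + annual_opex * years
    return total / operators

site = dict(capex=150_000, annual_opex=12_000, years=10)
for n in (1, 2, 3):
    print(f"{n} operator(s): ${per_operator_cost(**site, operators=n):,.0f} each")
```

A real sharing agreement would also apportion financing costs and account for the extra capacity each tenant demands, but the basic arithmetic of dividing a largely fixed cost explains why sharing widens the area carriers can afford to serve.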

Infrastructure costs consist of capital expenditures in the physical plant needed to provide service, the operating expenditures needed for ongoing service, and the interest and other expenses operators incur when they borrow funds to invest in new infrastructure. Infrastructure costs also divide into passive and active elements. Operators can share passive elements without affecting their ability to differentiate service and market their offerings as superior.

Passive elements constitute the civil engineering and non-electronic elements of infrastructure, including: physical sites, poles and ducts, power supplies, trenches, towers and masts, splitters, shelters, air conditioning equipment, diesel and other forms of backup power generation including batteries, and the premises easements and other authorizations to own or lease property. Because passive elements do not directly transmit content, shared management of these resources does not affect the coverage or capacity of a broadband network.

Passive Radio Communications Infrastructure

When operators share active elements they cooperate in the use and cost recovery of components such as spectrum, copper wire and fiber optic lines.  Such sharing raises more difficult coordination and cost allocation issues and may also trigger regulatory questions, such as who shall hold any required license, how that licensee will represent the interests of all parties before the regulatory agency, and how to ensure that a dominant carrier does not use sharing to handicap smaller competitors.

For more background on infrastructure sharing see:

Integrating Femtocell and Wi-Fi Coverage in Residences and Beyond

Currently some residential broadband subscribers have the option of using two separate wireless devices to extend the range and accessibility of their service.  A Wi-Fi router provides access to more than one computer, tablet and smartphone by sharing a single broadband subscription with any device equipped with a small receiver and transmitter operating on Wi-Fi frequencies, typically 2.4 GHz and 5 GHz.  Wi-Fi routers assign addresses to each computing device to avoid data stream collisions.  To reduce interference between computing devices operating in close proximity to each other, these routers can also place devices on different frequency channels.

At residences wireless broadband subscribers also might install a small femtocell that operates as a low-powered base station to improve network accessibility. Mobile wireless broadband services operate on very high frequencies that partially bounce off walls and other obstructions instead of penetrating them. To improve in-building signal penetration some wireless carriers offer subscribers the option of installing a device that receives weak incoming wireless signals, amplifies them and retransmits them inside a building. Subscribers with mobile radio handsets communicate with the nearby femtocell instead of the closest available network tower that might have a location far from the residence. 

Integrating the femtocell and Wi-Fi devices with a subscriber’s wired broadband service has the potential to generate benefits for both carriers and their subscribers. Wireless carriers can install a specific type of femtocell designed to interconnect with their subscribers’ wireline broadband service, which might be offered by an affiliate of the wireless carrier, or by another carrier. By connecting the femtocell with a wired broadband connection, the wireless carrier can offload subscriber traffic that otherwise would travel over its own network. The wireless carrier can reduce its volume of traffic and the potential for network congestion by routing traffic originating and terminating at residences via an available wireline broadband connection instead of the wireless connection. Subscribers benefit by having a more reliable service capable of delivering bandwidth intensive applications such as full motion video.

In the future manufacturers will combine the femtocell coverage extension function with the Wi-Fi ability to offload data from wireless to wireline networks and, perhaps more importantly, from one type of wireless network to another. This device also will provide the necessary modem function so that both wireless and wireline routing options are available depending on current network conditions.

Additionally wireless operators may plan on combining their 4G networks with small cell configurations using Wi-Fi frequencies. Rather than offload broadband traffic onto a wired carrier’s network the wireless carrier can assign traffic to either its 4G network, or localized Wi-Fi small cells based on the nature of the traffic to be delivered and the potential for congestion. Carriers might install the small cell option in places where high demand and the potential for congestion is likely, such as shopping malls, stadiums, university campuses and public transportation like airports and train stations.
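One hypothetical way to express the assignment logic described above is a small decision function that routes each traffic flow to the carrier's 4G network or a localized Wi-Fi small cell, based on congestion and the nature of the traffic. The load threshold is an assumption for the sketch.

```python
# Sketch of a traffic assignment decision between a 4G macro network and a
# Wi-Fi small cell. The 0.8 congestion threshold is an assumed value.

def assign_network(cell_load, latency_sensitive, small_cell_available):
    """Pick a delivery network for one traffic flow."""
    if latency_sensitive:
        # Keep delay-critical traffic (e.g., voice) on the managed 4G network.
        return "4G"
    if small_cell_available and cell_load > 0.8:
        # Offload bulk data when the macro cell nears congestion.
        return "Wi-Fi small cell"
    return "4G"

print(assign_network(cell_load=0.9, latency_sensitive=False, small_cell_available=True))
```

In a stadium or shopping mall deployment this decision would run continuously per flow, shifting bulk data to small cells at peak times while keeping voice on the licensed network.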

Femtocells vs. Wi-Fi - Summary

Source: Pradeep De Almeida, How Wi-Fi and Femtocells Complement One Another To Optimize Coverage and Capacity (May, 2012); available at:

For more information on femtocell-Wi-Fi integration see:

Internet of Things

Currently the Internet provides a medium for the transmission and processing of information created and used by humans. Computers, servers and other devices store, switch and transmit the information, but human involvement must occur in one or more instances. The Internet of Things refers to the prospect for the creation of data by devices, such as sensors, that do not involve humans in the collection, processing, storage and even interpretation of the information:

[T]he predictable pathways of information are changing: the physical world itself is becoming a type of information system. In what’s called the Internet of Things, sensors and actuators embedded in physical objects—from roadways to pacemakers—are linked through wired and wireless networks, often using the same Internet Protocol (IP) that connects the Internet. These networks churn out huge volumes of data that flow to computers for analysis. When objects can both sense the environment and communicate, they become tools for understanding complexity and responding to it swiftly. What’s revolutionary in all this is that these physical information systems are now beginning to be deployed, and some of them even work largely without human intervention.* 

The Internet of Things requires physical objects to have the ability to identify themselves and regularly transmit data measurements via the Internet. Tiny measurement devices, commonly referred to as sensors, can operate in a variety of hostile, mobile and other environments where ongoing human monitoring would be impossible or too expensive. For example, monitors can be installed under or on the skin of people so that medical data, such as heart rate, blood pressure and glucose levels, can be transmitted on an ongoing basis. So long as the reported data does not fall above or below a prescribed level, the receiving computer would collect the data and do nothing more with it. However, should the reported data exceed set parameters, the computer could have programmed instructions to issue an alert triggering human intervention.
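The monitoring loop just described might be sketched as follows: readings inside the prescribed range are simply collected, while out-of-range values trigger an alert for human intervention. The glucose thresholds and readings are assumed example values.

```python
# Sketch of an Internet of Things monitoring loop: in-range readings are
# stored; out-of-range readings raise an alert. All values are assumed.

def process_reading(value, low, high, log, alerts):
    """Store every reading; flag values outside the prescribed range."""
    log.append(value)
    if value < low or value > high:
        alerts.append(f"ALERT: {value} outside [{low}, {high}]")

log, alerts = [], []
for reading in (95, 110, 185, 100):       # simulated glucose readings (mg/dL)
    process_reading(reading, low=70, high=140, log=log, alerts=alerts)

print(f"{len(log)} readings stored, {len(alerts)} alert(s) raised")
```

The same store-unless-out-of-range pattern applies to the utility metering example below: automated collection by default, human intervention only on exceptions.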

Device miniaturization, wider and cheaper Internet access and drastic drops in the cost of data storage make it possible for computer and network intelligence to become part of new networks serving households and businesses. Significant operating efficiency gains can occur, because regular monitoring can occur automatically and frequently without human intervention. For example, a utility company can measure power demand on an immediate, “real time” basis, rather than send meter readers to make monthly on-site visits to each subscriber. With immediate power demand information, utility management and programmed computers can use price changes to stimulate or retard demand and better avoid outages.

The Evolution of Internet of Things
The Internet of Things

For more information see:


New African Submarine Cables

As never before, residents throughout the continent of Africa have access to high speed fiber optic cables including several submarine cables that link Africa with other transoceanic cables traversing the world. African submarine capacity divides between west and east coast systems.  On the west coast the SAT-2 cable provided the first available bandwidth in 1993 followed by the SAT-3/SAFE cable in 2002.  Between 2010 and 2012 four new systems began service: Glo-1 and Main One in 2010 and Africa Coast to Europe (“ACE”) and the West Africa Cable System (“WACS”) in 2012. WACS offers a 14,000 kilometer route from South Africa to Portugal and the United Kingdom with a total bit transmission speed of 5.12 Terabits per second. The ACE cable serves 21 landing points in Africa.

On the east coast of Africa the East Africa Marine System delivered the first fiber optic submarine cable capacity in 2009. The Seacom cable, launched in 2009, provides connectivity between the east coast of Africa onward to Europe and India. The key markets served are: Tanzania, Kenya, Uganda, Mozambique, South Africa and Rwanda. The East Africa Submarine Cable System (“EASSy”) started operations in July 2010, providing a 10,800 kilometer fiber optic pathway running from Sudan to South Africa. This $263 million project has a total capacity of 3.84 Terabits per second divided into lines currently offering up to 30 Gigabits per second transmission speed. There are nine landing stations in South Africa, Madagascar, Mozambique, Comoros Islands, Tanzania, Kenya, Somalia, Djibouti and Sudan.

Existing and Planned Submarine Cables in Africa
Number of Submarine Cables Available by African Nation

For background on African submarine cable projects see:

Powering Remote Broadband Access

Access to broadband facilities in remote areas often requires consideration of how to install and maintain other necessary infrastructure such as a reliable source of electrical power.  A major challenge to rural connectivity lies in the lack of a “last mile” infrastructure providing a link to a regional or national backbone network. Telecenters and broadband kiosks may not have a direct link to the power grid and therefore must have a sustainable, self-contained power source. Power options include rechargeable batteries, solar power, diesel/gas generators, micro-hydroelectric dams and small windmills. 

Solar-powered charging stations at the Jokko Telecenter, Senegal.

Source: Joko Initiative Blog site (October 12, 2010); available at:

For background on rural power generation options see:

Satellite Backhaul

Satellites provide broadband network access to and from local distribution facilities located in the most remote areas, as well as locations where topography restricts connection to backbone networks and where emergency telecommunications capacity is needed. The satellite option typically triggers high operating costs with comparatively slower transmission speeds and problems with latency, the transmission delay caused by the time it takes signals to travel to and from satellites operating as far as 22,300 miles from Earth.
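The latency penalty follows directly from the distances involved. A back-of-the-envelope calculation, using the approximately 22,300 mile (35,786 km) geostationary altitude cited above and the speed of light:

```python
# Propagation delay for a geostationary satellite link, computed from
# the ~22,300 mile (35,786 km) altitude cited above.

SPEED_OF_LIGHT_KM_S = 299_792.458   # km per second in vacuum
GEO_ALTITUDE_KM = 35_786            # approx. 22,300 miles

# One "hop" is earth -> satellite -> earth: two traversals of the altitude.
one_way_hop_s = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_hop_s    # request up/down plus response up/down

print(f"one-way hop: {one_way_hop_s * 1000:.0f} ms")   # ~239 ms
print(f"round trip:  {round_trip_s * 1000:.0f} ms")    # ~477 ms
```

Nearly half a second of round-trip delay is unavoidable physics, before any switching or queuing delay is added, which explains why interactive applications suffer on geostationary links.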

Despite their limitations satellites may provide the only viable means for users in the remote locations to access the Internet cloud. Best practices in satellite backhaul address how to build and maintain facilities in a timely and efficient manner, taking into consideration particular site requirements such as the need for a reliable power source where no installed grid access option exists.

Recent developments in satellite backhaul include the installation of smaller sized satellite dishes that operate in the 20-30 GigaHertz, Ka-band.   These “[s]atellites can use more spot beams. Rather than broadcast the same signal across their whole footprint, the satellites can reuse the spectrum many times over because they have been fitted with a number of small spot beam antennas for specific geographic coverage. The same spectrum can be reused in every second spot beam. This greatly increases the overall system capacity and total throughput available.”* 

The second major advance has been the change from using dedicated bandwidth to a packet switched architecture. The older fixed capacity allocation method [Single Channel per Carrier] left bandwidth unused, wasting system capacity that could have been used elsewhere. Allocating bandwidth on demand means that statistical multiplexing gains can increase total system capacity by anything from 30 to 80%. Rather than charge for a fixed bandwidth link, regardless of how much of it is being used, satellite operators can be more creative in their tariff plans; for example, they can charge on usage rather than on a fixed capacity basis. For small cells, where there are potentially thousands of sites to be connected, it does not make economic sense to use dedicated bandwidth, so solutions that can centrally manage bandwidth will be used.
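The source of the statistical multiplexing gain can be illustrated with a toy simulation. The traffic figures below are invented for demonstration, assuming sites rarely hit their peak demand at the same moment:

```python
# Illustrative comparison of fixed per-site allocation (as in Single
# Channel per Carrier) versus shared bandwidth-on-demand. The traffic
# model here is hypothetical, purely to show where the gain comes from.

import random

random.seed(42)
sites = 100
peak_per_site_mbps = 4.0   # each site sized for its own peak demand

# Under fixed allocation, each site permanently reserves its peak.
fixed_capacity = sites * peak_per_site_mbps

# Under bandwidth-on-demand, sites rarely peak simultaneously; sample
# each site's instantaneous demand as a fraction of its peak.
demand = [random.uniform(0.1, 1.0) * peak_per_site_mbps for _ in range(sites)]
shared_capacity_needed = sum(demand)

gain = 1 - shared_capacity_needed / fixed_capacity
print(f"capacity saved by sharing: {gain:.0%}")
```

With average demand around half of peak, the shared pool needs roughly half the capacity of dedicated links, consistent with the 30 to 80% gains quoted above.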

For background on satellite backhaul see:

Types of 4G Wireless Service

The latest generation of wireless networking offers the promise of bit transmission speeds that rival what wired terrestrial systems can offer. So-called 4G networks represent the fourth major technological change in wireless networking with dedicated bandwidth for data services and transmission formats conducive to very fast broadband service. The preceding generations did not offer networks optimized for data services and accordingly offered significantly slower transmission speeds.  1G service offered analog voice services and no data. 2G services offered a digital transmission format, but no special accommodation for data services. In the third generation, wireless carriers retrofitted their voice networks to handle data services, but the bit transmission rate rarely exceeded 200-400 kilobits per second. 

While the 4G networks currently in operation do not fully comply with the transmission speeds identified in international standards, these networks, providing Long Term Evolution (“LTE”) service, regularly offer speeds between 5 and 12 megabits per second (“Mbps”). The International Telecommunication Union official standard for 4G, the International Mobile Telecommunications Advanced (IMT-Advanced) specification, establishes a peak transmission speed standard for 4G service at 100 Mbps for high mobility communication (such as from trains and cars) and 1 gigabit per second (“Gbps”) for low mobility communication (such as pedestrians and stationary users).

The IMT-Advanced 4G specification establishes a number of operating standards, including the use of Internet Protocol packet switching instead of formats primarily suited for voice communications. Networks must efficiently use available bandwidth both in terms of supporting shared use of the same channel by multiple users and the ability to scale up the use of allocated bandwidth as demand grows. Carriers use Orthogonal Frequency-Division Multiple Access (“OFDMA”) technology, which divides available bandwidth into many channels and also multiplexes data streams into multiple pieces, each of which is modulated onto a separate carrier; the carriers are later combined.
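The bandwidth-division idea is concrete in the LTE numerology: each channel is split into 15 kHz subcarriers, grouped twelve at a time into resource blocks, with guard band left at the channel edges. A short calculation using the standard LTE channel sizes:

```python
# How OFDMA divides an LTE channel into many narrow subcarriers,
# using the LTE numerology: 15 kHz subcarrier spacing, 12 subcarriers
# per resource block, with guard band at the channel edges.

SUBCARRIER_SPACING_HZ = 15_000
SUBCARRIERS_PER_RESOURCE_BLOCK = 12

# Nominal channel bandwidth (MHz) -> resource blocks, per the LTE spec
RESOURCE_BLOCKS = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

for bw_mhz, rbs in RESOURCE_BLOCKS.items():
    subcarriers = rbs * SUBCARRIERS_PER_RESOURCE_BLOCK
    occupied_mhz = subcarriers * SUBCARRIER_SPACING_HZ / 1e6
    print(f"{bw_mhz:>4} MHz channel: {subcarriers:4d} subcarriers "
          f"({occupied_mhz:.2f} MHz occupied)")
```

A 10 MHz channel, for example, carries 600 subcarriers occupying 9 MHz, and the scheduler can assign different resource blocks to different users in the same channel, which is how shared use and demand-driven scaling are achieved.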

Another spectrum efficiency requirement imposes a minimum number of bits that can be transmitted per channel and by each individual transmission cell.  The standard also requires that all network operators can handle the traffic of other carriers, thereby eliminating the format incompatibility that exists among 3G networks operating on the same spectrum but using different transmission formats, e.g., Time Division Multiple Access versus Code Division Multiple Access. 4G carriers also have to provide subscribers with the ability to transmit and receive data that represents full motion video and high fidelity sound.

It appears that the LTE format for 4G service has become the consensus standard in light of the decision by many carriers to purchase and install LTE equipment.  Previously, a significant number of formats competed for adoption, including High Speed Packet Access, WiMAX, WCDMA, EDGE and EV-DO.

For background on 4G wireless service, see:

Ultra High Definition Television

The next generation of high definition television sets will have even more resolution than currently available. Video screen resolution is measured in terms of the number of columns and lines as well as the total number of pixels, the smallest unit of video display. The current best standard for high definition television combines 1920 vertical columns with 1080 horizontal lines. Multiplying the number of columns by lines identifies the total number of pixels displayed.

Ultra high definition television doubles or quadruples the number of columns and lines. So-called 4K Ultra High Definition contains 2160 lines of resolution, while 8K contains 4320 lines. Ultra high definition video resolution will make it possible for television manufacturers to offer larger sets with screens exceeding 84 inches as measured diagonally.
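The multiplication described above shows how quickly pixel counts grow with each generation:

```python
# Pixel counts for each display generation, computed as columns x lines
# as the text describes.

FORMATS = {
    "HD (1080p)": (1920, 1080),
    "4K UHD":     (3840, 2160),
    "8K UHD":     (7680, 4320),
}

hd_pixels = 1920 * 1080
for name, (columns, lines) in FORMATS.items():
    pixels = columns * lines
    print(f"{name}: {columns} x {lines} = {pixels:,} pixels "
          f"({pixels // hd_pixels}x HD)")
```

Doubling both dimensions quadruples the pixel count: 4K carries four times the roughly 2.07 million pixels of HD, and 8K sixteen times, which is why each step up demands so much more transmission capacity.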

Ultra High Definition video will likely generate even greater consumer demand for faster broadband transmission speeds to accommodate the increased amount of content delivered per video frame. Terrestrial broadcasters and cable operators can repurpose some bandwidth by eliminating analog transmissions. However, Internet Service Providers, as operators of a fully digital medium, will have no ability to repurpose bandwidth to accommodate rising demand.

For more information on ultra high definition television see:

Ultra Wideband Networks

Next generation network options use even higher spectrum to satisfy the ever increasing demand for wireless broadband service. So called Ultra Wideband (“UWB”) networks provide very high speed bit transmission using a wide range of extremely high frequencies, at or above the 2.4 GHz band currently used for Wi-Fi service. UWB transfers large amounts of data wirelessly over short distances, typically less than ten meters.  Unlike other wireless systems, which are limited to relatively narrow allocations of spectrum, UWB operates by transmitting signals over a very wide range of spectrum, but at very low power.

UWB networks can satisfy individual short-range broadband requirements and provide a wireless alternative to possibly inconvenient wire-based services. So-called Personal Area Networks will provide broadband connectivity like that currently served by Bluetooth applications that can link mobile phones, portable computers, cars, stereo headsets, and MP3 players with sources of content. The low power and short range of these technologies supports unlicensed use. 

In light of the proliferation of sensors, which typically need to transmit over very short distances, UWB technologies can power an Internet of Things.  UWB can avoid causing interference with current narrowband and wideband radio services and between unlicensed users. It can operate in hostile environments and has been miniaturized so that it can be embedded in chip sets attached to other devices.

An Internet of Overlapping Networks

For more information on Ultra Wideband Networks see:

White Spaces Explained 

White spaces refer to radio spectrum allocated for a specific use, but available for other uses in many locations where such secondary uses will not cause interference to the primary, authorized users. Historically the International Telecommunication Union, on an international, multilateral basis, and individual nations, on a domestic basis, typically allocate spectrum for a single, specific use. This results in many instances where no primary spectrum user operates, but administrative rules prevent other uses. Put another way, concerns about the potential for interference have motivated spectrum allocation decisions that result in inefficient use, because significant amounts of bandwidth remain unused even though many uses could occur without causing harmful interference. 

For example, most nations have allocated a large amount of spectrum for radio and television stations, typically on an exclusive basis.  Such exclusivity ensures that the broadcast signal encounters no interference, but there are many locations where use of broadcaster-assigned frequencies has no potential to interfere with actual broadcast transmissions. White spaces refers to the geographic areas where no potential for interference exists, because actual users are located at great distances away; in some instances there may be no actual users whatsoever. 

The broadcast television band constitutes a likely candidate for identifying white spaces, because national regulatory authorities created a very large frequency band for this service and often substantial distances separate actual users of any specific broadcast channel. “White spaces exist primarily because analog television receivers were highly susceptible to interference, requiring the FCC to create frequency ‘guard bands’ between television channels in order to prevent interference. For example, in a given viewing market, if channel 9 is licensed, channel 8 and 10 will be vacant, as will channel 9 in any neighboring viewing market.”* 

With the conversion from analog to digital television broadcasting in many locations, the amount of white space increases significantly.  Some nations have identified spectrum for reallocation to other services such as wireless broadband. In some instances nations auction the newly available spectrum and accrue a substantial monetary infusion into the national treasury, a so-called Digital Dividend. Even for nations reallocating some of the freed-up spectrum, the expanded availability of white spaces has resulted in changes in policies allowing non-interfering uses.

White Spaces Interference Avoidance 

The opportunity to use white spaces for broadband access depends on the ability of secondary users to apply techniques that ensure the ability of primary users to continue operating without interference. For broadcast television white spaces this means that even unlicensed broadband applications must use sophisticated techniques that can sense other uses and change frequencies to avoid causing interference. 

Technologies such as software-defined and cognitive radio offer such frequency agility. They have sensing capabilities that can identify frequencies where TV channels exist and can find and quickly move transmissions to open spectrum. This means that instead of simply tuning into a specific frequency, white spaces devices must have built in intelligence for detecting other spectrum users and quickly finding other frequencies on which to operate.

In addition to using receivers to sense whether a specific frequency has an existing user, white space devices can interrogate databases that map and identify preexisting registered uses for specific locations. In light of inexpensive access to the very accurate Global Positioning System service, white spaces transmitters can employ geo-location procedures to assess the interference potential before operating.

Additionally, white spaces devices can operate at very low power to provide location-specific broadband access in much the same way as Wi-Fi operates.  With low power, Wi-Fi devices can reduce interference even for other users in close proximity. Also, Wi-Fi devices can change channels once interference is sensed. In addition to operating at low power and having the ability to change transmitting frequencies, white spaces devices have sensing capabilities that trigger a change in frequency to avoid causing interference in the first place.
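The combination of database lookup and spectrum sensing described above can be sketched as a channel-selection routine. The database contents, location name and detection threshold here are all hypothetical, purely to illustrate the decision logic:

```python
# Minimal sketch of white-spaces channel selection: first consult a
# database of registered primary users for this location, then rely on
# energy sensing before transmitting. The database entries, location
# name and threshold are hypothetical.

SENSING_THRESHOLD_DBM = -84.0   # illustrative energy-detection threshold

def pick_channel(location, registered_db, sensed_power_dbm):
    """Return the first channel safe to use at this location, or None."""
    occupied = registered_db.get(location, set())
    for channel, power in sorted(sensed_power_dbm.items()):
        if channel in occupied:
            continue                      # protected registered primary user
        if power > SENSING_THRESHOLD_DBM:
            continue                      # energy detected: stay off the channel
        return channel
    return None                           # no safe channel: do not transmit

# Example: channel 9 is registered here; channel 8 shows energy anyway.
db = {"market_A": {9}}
sensed = {8: -70.0, 9: -95.0, 10: -90.0}
print(pick_channel("market_A", db, sensed))  # -> 10
```

Channel 8 is rejected by sensing, channel 9 by the database, leaving channel 10; a real device would repeat this check continuously and vacate the channel the moment either test fails.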

For more background on white spaces interference avoidance techniques see:

  • LS Research, Understanding TV White Spaces (Feb. 1, 2011); available at:
  • Maziar Nekovee, A Survey of Cognitive Radio Access to TV White Spaces, International Journal of Digital Multimedia Broadcasting (2010); available at:
  • Michael Fitch, Maziar Nekovee, Santosh Kawade, Keith Briggs, and Richard MacKenzie, Wireless Service Provision in TV White Space with Cognitive Radio Technology: A Telecom Operator’s Perspective and Experience, IEEE Communications Magazine, 64-73 (March 2011).
White Spaces Test and Demonstration Project

Source: Google Africa Blog, Announcing a new TV White Spaces trial in South Africa (March 25, 2013); available at:

For background on how nations have allowed the use of white spaces see:

Wireless Device Tethering

Tethering refers to the ability to link two devices so that they can share a function such as wireless access to the Internet. Cellphone users might want to use personal computers and tablets for accessing the Internet instead of their phones, which may not have a sufficiently large screen or the capability of accessing the World Wide Web. Some personal computers and tablets can achieve direct wireless access using installed electronics or with the insertion of a dongle into an interface such as a Universal Serial Bus (“USB”) jack. Devices lacking direct access capability can tether to a cellphone for indirect access.

Currently many areas in the world have cellular radio access, but lack local broadband access options such as Wi-Fi. In such locations one can tether a personal computer or tablet to a cellphone, thereby securing the Internet access available via the cellphone. Note that the cellphone must have software supporting tethering, and the wireless carrier may impose a surcharge for this feature or prohibit the option.

Example of a Dongle Providing Wireless Access to a Personal Computer or Tablet

Source: Pete’s Tech Ramblings, The $50 wireless tethering solution (May 9, 2009)

Cellphone Tethering

Source: Cell Phone Tethering: Secure or just another hole in the wall? (July 29, 2009); available at:

Creating Your Own Mobile Wi-Fi Hotspot

In areas where Wi-Fi broadband access does not exist, subscribers to a broadband service can create a limited, short range alternative.  So called mobile Wi-Fi hotspots use a portable wireless router to provide shared access to a broadband service such as 4G wireless data. This small, battery powered device offers a portable hotspot that taps into wireless data services, just like a smartphone does, and then wirelessly shares its data connection with other nearby Wi-Fi-enabled devices such as a personal computer or handset. Several users can share the single Wi-Fi connection and encryption techniques provide a password authentication process to prevent unauthorized access.

A Portable Mobile Wi-Fi Hotspot Router

Source: The Cool Gadgets, MiFi 4510L: Novatel Wireless Mobile Hotspot – Introduced For Verizon Wireless Network; available at:

Wireless Mesh Networking

Wireless mesh networks provide broadband access through the coordination and interconnection of nodes that can receive and retransmit traffic. In much the same way as the Internet provides managed, “best efforts” routing of traffic, the software-configured management of nodes achieves the same “dynamic routing” based on intelligent selection of which node can help move traffic to the final destination or closer to that destination.

Wireless mesh networking requires at least one broadband connection to the Internet, e.g., a cable modem or DSL link. Access to that link can be shared among many geographically separated users, who can secure a connection to the broadband link via one or more intermediary nodes. The geographical range of the mesh network extends with increases in the number of installed nodes. 

Wireless mesh networking can provide a low cost way to extend the reach of a broadband connection. However, its open and shared networking characteristics do create security risks, particularly when nodes are installed in a variety of locations not under the control of a single manager.  Nevertheless, wireless mesh networks can offer a quickly installed and inexpensive way to share broadband access, particularly on an unlicensed and noncommercial basis.
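The "dynamic routing" over intermediary nodes described above can be illustrated with a toy breadth-first search over node adjacencies, finding the fewest-hop path from any node to a gateway that holds the wired broadband connection. The topology below is invented for illustration:

```python
# Toy illustration of mesh dynamic routing: breadth-first search over
# node adjacencies to find the fewest-hop path from a node to any
# gateway holding the wired broadband connection. Topology is made up.

from collections import deque

def route_to_gateway(links, start, gateways):
    """Return a shortest hop-count path from start to any gateway."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node in gateways:
            return path
        for neighbor in links.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route: node is partitioned from every gateway

# Hypothetical neighborhood mesh: node A reaches gateway G via B or via C-D.
links = {
    "A": ["B", "C"],
    "B": ["A", "G"],
    "C": ["A", "D"],
    "D": ["C", "G"],
    "G": ["B", "D"],
}
print(route_to_gateway(links, "A", {"G"}))  # -> ['A', 'B', 'G']
```

Real mesh protocols weigh link quality as well as hop count and re-run this selection continuously as nodes appear and disappear, but the principle is the same: each node only needs to know its neighbors, and coverage grows as nodes are added.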

Wireless Mesh Networking: Internet Access via Many Nodes and a Few Gateways

Source: Wireless Networking in the Developing World, Mesh networking; available at:

For more information on mesh networking see: