The common understanding of net neutrality is a regulatory stance against any form of discrimination by telecom networks against users of the Internet, whether as suppliers of services and content or as consumers. On the supply side, this implies that Internet companies should not be charged for delivering their products to end users unless they have come to a commercial agreement with the network, for example to act as a local billing agent or as a content distribution network (CDN). On the demand side, it implies that customers should not experience blocking of sites that has not been sanctioned in law, throttling or quality degradation of the bandwidth they are entitled to, or discriminatory fees. What is normally permitted is a tiered set of bandwidth prices from which each customer can choose a preferred package. It is worth noting that when consumers buy access devices, such as tablet computers, prices vary according to the speed of the networks the devices can access; equality of consumer choice therefore seems consistent with different price levels for different levels of service.
On top of this, regulators usually recognize that telecom companies, most of which are also Internet service providers (ISPs), have the right to manage their networks in the most cost-efficient manner to meet quality-of-service obligations. Quality-of-service standards for network availability ideally stand above 99%. For example, the Infocomm Development Authority (IDA) in Singapore requires 99.85% for narrowband and 99.9% for broadband,* while in Chile the narrowband standards are set at 97% in urban areas and 90% in rural areas.* By contrast, in the 1990s Internet traffic was treated as ‘best effort’ unless it was sent over a public or private managed network. In the broadband era the public expects consistently high access rates, although speeds differ widely from market to market. Quality of service is also vital to the success of the digital economy.
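To make these availability percentages concrete, each target translates into a maximum allowed downtime per year. A minimal sketch of that arithmetic, using the thresholds cited above (the function name is illustrative):

```python
# Convert a network-availability target into maximum annual downtime.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def max_downtime_minutes(availability_pct: float) -> float:
    """Maximum minutes of outage per year permitted by an availability target."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

# Thresholds cited in the text:
for label, pct in [("Singapore narrowband", 99.85),
                   ("Singapore broadband", 99.9),
                   ("Chile urban narrowband", 97.0)]:
    print(f"{label} ({pct}%): {max_downtime_minutes(pct):.0f} minutes/year")
```

The gap is striking: 99.9% availability permits under nine hours of outage per year, while 97% permits more than ten days.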
The issues at stake are principally two-fold. First, whether the network operator is using the need for quality-of-service network management as a cover for bandwidth throttling or degradation of some services, such as peer-to-peer communications. Second, whether operators should have the right to charge fees to Internet companies for the use of their networks, a way, they argue, to raise the funds required to invest in new network capacity. To confront these issues, regulators need to examine their own policy goals.
3.7.1 Goals of Net Neutrality
In 2011, a study by the Body of European Regulators for Electronic Communications (BEREC) found that the blocking of voice-over-Internet protocol (VoIP) and peer-to-peer traffic by telecom operators and Internet service providers was a common practice. In 2013, the EU Commission announced it would proceed to require all telecom carriers to observe net neutrality, meaning no throttling, no degradation of Internet services, and unrestricted user access to Internet content providers, without discrimination between low- and high-volume users. What is permitted is the practice of charging users different prices for different bandwidth packages.
In many developing economies there are no effective regulations and the incumbent operator pretty much does as it pleases. For example, a study of the Union of the Comoros, off the east coast of Africa, reveals that Comores Telecom (CT), which holds a monopoly in both fixed-line and mobile telephony and acts as the sole Internet service provider (ISP), faces declining international call revenues from competing OTT voice services such as Skype and Viber.* Its strategy has been to deliberately degrade the quality of the Internet service it provides to its subscribers, on both fixed and mobile networks. By increasing the latency, or delay, in Internet traffic, it makes VoIP effectively unusable. This has proved to be a highly controversial policy because it also affects other legal Internet services, such as webmail and instant messaging, used by Comorian citizens. Clearly, this is not a sustainable long-term solution for CT or for the Union of the Comoros.
In the US, the Federal Communications Commission (FCC) accused cable TV operator and Internet service provider Comcast of selectively blocking connections to peer-to-peer (P2P) applications. Comcast was found guilty, but a ruling by the Court of Appeals in 2010 found that the Commission did not have legal jurisdiction over the Internet services of Comcast. Subsequently, in 2010, the FCC published an Open Internet Report & Order, setting out guidelines to preserve the free and open nature of the Internet around three basic principles: transparency, no blocking and no unreasonable discrimination. But beyond these guidelines there is no legislation or formal regulation of Internet neutrality.
Besides the commercial interests of the carriers and Internet companies involved, there are two opposing camps: the ‘deregulationists’, including property-rights advocates, and supporters of the ‘open access’ and ‘commons’ approach. For property-rights advocates, carriers should retain the right to manage their networks to their own best advantage with minimal interference from regulators, an argument that works best where well-established competition gives consumers a choice. Consumer advocates of the ‘open access’ approach point to the lack of competition that results from violations of net neutrality, and to the adverse effects this has upon investment in, and growth of, the digital economy.*
3.7.2 Regulatory Approaches
The FCC, in considering the Comcast case, issued a consultation paper asking what ISPs using traffic management techniques were trying to achieve: was it to prioritize latency-sensitive applications, to avoid network congestion, to block unwanted traffic, to implement parental controls, or to gain advantage over competitors? For regulators concerned that network management may be used as a pretext for discrimination against sources or users of services over the Internet, the devil lies in the detail. Network management tools can perform blocking, traffic shaping and quality-of-service functions, and each can be used for discriminatory and non-discriminatory purposes.
The question asked by the FCC was whether the network management in question was ‘reasonable’, although the definition of ‘reasonable’ is itself open to question. In other words, the concept of net neutrality in operational terms is often arrived at only after a judgement has been made on which actual network management practices are reasonable and which unreasonable. There is a huge literature debating the finer points of net neutrality along these lines, and a useful approach, although not the only one, has been put forward by Scott Jordan and Arijit Ghosh of the University of California, Irvine.* They start their analysis from the three-layered stack of the Internet, as compared with the standard seven-layer ISO model traditionally used by telecom engineers. (See figure 3.12.)
They suggest that four criteria could be used by regulators to judge whether or not network management practices raise red warning flags.
- Where within the network the tools are applied: typical Internet design assumes that management techniques are applied above the transport layer where possible. If they are applied in transit between networks (the source network and the carrier network), in routers below the transport layer, this should raise a red flag.
- What type of tool is applied: for short- or medium-term congestion (for example, less than one minute), tools such as traffic shaping and queuing, applied for example at endpoints in the network, are effective; the delay occurs at final delivery. For congestion lasting over a minute, access control may be required, and if this involves blocking or termination, as opposed to quality-of-service degradation, it should raise a red flag.
- Who decides which tool should be applied: it may be at the request of the Internet source or the end user, but if it is a unilateral decision of the ISP then this should raise a red flag.
- When and on what basis a tool is applied: it may be applied to (i) an application, (ii) the source/destination, (iii) the service provider, and/or (iv) the payments processor. Tools applied to traffic on the basis of (ii), or only to traffic on the basis of (iii), should raise a red flag.
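The four criteria lend themselves to a simple checklist. The sketch below is one hypothetical encoding, not Jordan and Ghosh's own notation: the field names and the decision to treat each criterion as a boolean are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ManagementPractice:
    """Illustrative encoding of a traffic-management practice (fields are hypothetical)."""
    below_transport_layer: bool     # where: applied in routers below the transport layer?
    in_network_transit: bool        # where: applied in transit between networks?
    blocks_or_terminates: bool      # what: blocking/termination rather than QoS degradation?
    unilateral_isp_decision: bool   # who: decided by the ISP alone?
    targets_source_or_dest: bool    # basis: applied per source/destination?
    targets_service_provider: bool  # basis: applied only per service provider?

def red_flags(p: ManagementPractice) -> list[str]:
    """Return the list of red flags a practice raises under the four criteria."""
    flags = []
    if p.below_transport_layer and p.in_network_transit:
        flags.append("where: below transport layer in inter-network transit")
    if p.blocks_or_terminates:
        flags.append("what: blocking/termination rather than degradation")
    if p.unilateral_isp_decision:
        flags.append("who: unilateral ISP decision")
    if p.targets_source_or_dest or p.targets_service_provider:
        flags.append("basis: targets specific sources/destinations or providers")
    return flags
```

Consistent with the framework's intent, an empty list does not certify a practice as reasonable; the flags are alerts for closer scrutiny, not a verdict.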
What is useful about this framework is that it is not deterministic because the red flags are only alerts that can help regulators. However, this does focus attention on the fact that in an interconnected world the old distinctions between what were telecoms services and what are services over the Internet can no longer be a good guide to policy. For the ‘deregulationists’ these red flags will be redundant and for some advocates of ‘open access’ they may not go far enough, but regulators do need some practical points of reference going forward.
3.7.3 Net Neutrality and Wireless Networks
Under the FCC’s Open Internet Report & Order (2010), “Fixed and mobile broadband providers must disclose the network management practices, performance characteristics, and terms and conditions of their broadband services”, but the Order excludes mobile from the restrictions on blocking and “unreasonable discrimination”.* The broadband wireless sector was exempted because it was seen as a young growth sector. Services over unlicensed spectrum are not covered by the Order.
As noted above, many wireless broadband devices place limits on what services can be received: for example, Apple’s iPhone does not play Adobe Flash content and restricts the types of apps that can be downloaded, and certain devices will not stream YouTube. Device vendors often have partnerships with particular telecom networks and CDNs, and some even have their own networks. Mobile networks are often converged with fixed networks (FMC), and these crossovers do not lend themselves to universal net neutrality regulations. For regulators the important issue is to keep the mobile wireless market as competitive as possible so that consumers always have choice.
3.7.4 Governance Issues
Governance of the Internet is fundamental to its openness. The Internet began as an American creation that has now become part of the everyday life of the modern world. That means it also becomes part of every country’s national interest. It can only be hoped that a multi-stakeholder approach does not politicize the Internet, which would damage the digital economy. Good regulation should guard against that danger.
Two issues in particular have featured significantly in recent debates, and they relate to the respective roles of states and other stakeholders in Internet governance. The first issue concerns the rights of states within their own borders to govern the use of Internet domain names at the country level. The Domain Name System (DNS) evolved for technical reasons using Latin script for Top-Level Domains (TLDs) such as .cn for China and .com for commercial entities. Later it became possible to use non-Latin scripts such as Cyrillic, Hebrew, Korean, Thai, etc., which have been adopted by many countries for their country code TLDs or ccTLDs, also known as Internationalized Domain Names or IDNs.*
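Under the hood, a non-Latin IDN is carried in the DNS as an ASCII string via the Punycode encoding (the "xn--" form). Python's standard library includes an `idna` codec (implementing the original IDNA 2003 rules, sufficient for illustration) that demonstrates the round trip; the domain below is just an example, not a registered name:

```python
# Round-trip a non-ASCII domain name through its DNS-compatible ASCII form.
# Python's built-in 'idna' codec splits the name on dots and Punycode-encodes
# each non-ASCII label with the "xn--" prefix.
ascii_form = "bücher.example".encode("idna")
print(ascii_form)                  # the ACE form actually stored in the DNS
print(ascii_form.decode("idna"))   # decoded back to the Unicode form
```

The user sees the native-script name, while resolvers and registries only ever handle the ASCII form, which is why search and resolution tooling must be IDN-aware to serve non-Latin-script users well.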
Over time search engines may adapt to these as well, but until they do, searching for materials in non-Latin languages will remain an obstacle. For many states this raises both cultural and political concerns. The second issue concerns agency. Some Internet issues, such as the DNS and Internet engineering protocols, are handled within internationally recognized bodies, such as the Internet Corporation for Assigned Names and Numbers (ICANN) and the Internet Engineering Task Force (IETF), but as of 2013 there is no international agency that deals with other issues, such as cyber-security over the Internet. Some states have suggested the ITU could extend its reach into this area, a suggestion vigorously opposed by others, including some other states and many in the Internet community itself, who do not see an inter-governmental telecommunications organization as the appropriate forum. Whatever the outcome of this debate, the issues it raises are very real, and ways need to be found that are genuinely multi-stakeholder, with no-one claiming to have all the answers. The Internet is a continuously evolving, technology-based mode of communications with truly profound economic and social consequences. The role of regulators is probably best described as two-fold: to regulate with a light touch in order to encourage continuing innovation and the benefits it brings, and to encourage the active involvement of all stakeholders in addressing the challenges the spread of the Internet poses for society.