[Some unusual territory for me: An amateur linguistic analysis of some recent trends on the internet.]
The internet, glorified in the abstract like much of modern technology, is often idealized as a universal medium of communication that promotes globalization, convenient and cheap education, community cohesion, and a host of other social goods. John Perry Barlow, a co-founder of the Electronic Frontier Foundation, claimed that “cyberspace consists of transactions, relationships, and thought itself, arrayed like a standing wave in the web of our communications… We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth… a world where anyone, anywhere may express his or her beliefs… We believe that from ethics, enlightened self-interest, and the commonweal, our governance will emerge… The only law that we recognize is the Golden Rule” (Barlow).
The assumption made in this veneration is that humanity, if somehow able to achieve perfect communication, will naturally recoil from all evils- that communicative unity can only bring about good. A powerful counter-example can be found in Orwell’s 1984, in which an all-powerful totalitarian central government enslaves the minds of its citizens by enforcing universal patterns of speech. This language, termed “Newspeak”, “was intended to make all other forms of thought impossible… a heretical thought- a thought diverging from the principles of Ingsoc [the totalitarian government]- would be literally unthinkable” (Orwell 300). Certainly, in this case, the universal communication established in the fictional Oceania is not for the good. It illustrates that communication, no matter how universal, is ultimately modified by the dynamics of power, for better, worse, or neither.
This principle is not limited to fictional exemplification: it is embodied in the very creation of the internet. An analysis of the development of the internet, from ARPANET in 1969 to the dominance of the World Wide Web in the 1990s, leads us to abandon any traditional narrative of a communication network novel in its neutrality, in favor of an internet that, while explicitly striving for neutrality, is deeply rooted in an English language bias that arose from American and British dominance in the internet’s early development. Continued analysis of more recent internet trends and “Web 2.0”, the modified form of the World Wide Web in the 2000s, shows a sudden shift: a renewed movement towards language neutrality. While this trend is often regarded with optimism, applying a power-centric analytic framework yields a darker, more familiar conclusion. Rather than an elimination of language bias altogether (and the supremacy of neutrality), the economic and political forces motivating recent changes suggest that a replacement of English bias with a bias favoring a different language of power is, in fact, more likely.
Though the internet constantly produces novelty, the conflict between neutrality and language bias is not itself new. The internet consists of shared code (most websites are written in HyperText Markup Language [HTML]) and thus inherits traits of neutrality and universality from its basis in programming languages. These languages are complex expressions of the fundamental logic that governs all computer-based technology: 0 and 1, on and off. This binary manipulation is the most basic system of logic and thus universally comprehensible and completely language neutral. In so far as any portion of the internet is “translated” into binary, the design of the internet is inherently neutral: it is universally intelligible and wieldable and lends no advantage to speakers of a specific language.
Yet a language bias persists, provoked (but not necessitated) by an increasing complexity in expression. The internet is a complex construct, and designing it in raw binary would be nigh impossible. To tackle such projects, programming languages utilize complex expressions that fulfill objectives efficiently. Nearly every language uses numerical and algorithmic logic (as opposed to binary, yes vs. no logic), which requires mathematical fluency for comprehension. Such mathematical knowledge is nearly as universal as logic and, since the widespread adoption of Arabic numerals, language neutral as well. However, many languages (such as HTML and C++) reach such heights of logical complexity that algorithmic representation becomes cumbersome, and thus, designers turn to grammatical constructs. For example, an if-then logical construction represented grammatically is simply [If (X=0); Then (Y+2);]. The same construction represented (long-form) algorithmically is [X=-1,Y+0; X=0,Y+2; X=1,Y+0;] and so on, for every value of X. For this reason, in C++, an if-then statement is written:
if (x == 100)
  cout << "x is 100";
else
  cout << "x is not 100"; (Soulié)
Here, at this point of complexity, a language bias is introduced, for the if-then construction is expressed in a specific language (English). One could learn and manipulate the underlying logic without fluency in English (though with greater difficulty, lacking the benefit of instant grammatical understanding), but one must still employ the English markers to write the code. As programming languages are used in tandem, and multitudes of code (user interfaces, encryption methods, graphical outputs, etc.) are assembled to form the internet, we have another jump in complexity and thus another jump in language bias: an increase in the dependency on a specific pre-established language to communicate.
The pioneers and designers of the internet tended to explicitly pursue a standard of neutrality and divorcement from real-world bias (Barlow’s quote that began this paper comes from a work titled A Declaration of the Independence of Cyberspace), yet, despite their efforts at utilizing complexity without bias, the internet developed with a strong English bias. This occurred because the internet, being a cutting-edge technology, was strongly influenced throughout its development by the power of English speaking nations and groups. In 1958, in response to the Russian success with Sputnik, the United States Department of Defense established the Advanced Research Projects Agency (ARPA) to “improve the military’s use of computer technology” (Gromov). With a focus on information networking and heavy utilization of universities (such as Stanford and M.I.T.), ARPANET, the first significant network of computers, was established in 1969. The English language influence here is apparent: the entirety of the project was executed by either American or British scientists, and the initiative came from the United States government itself. An international rival to ARPANET (the International Telecommunication Union’s public data networks) would not arise until 1976, and by then ARPANET had expanded to nearly 200 interconnected hosts: a dominating English speaking majority was already established (International Telecommunication Union).
Two years later, the differences would not matter. With the proliferation of networks outside of ARPANET (Usenet, Telenet, and many other local and university-based networks), a standard was created to unite them. TCP/IP (also called the Internet Protocol Suite) reformulated networking protocol and forged a unified system- bringing most current users into the same system and allowing significant network use outside the U.S. for the first time. While this created the opportunity for internationalization, the vast majority of the now-united communities were already English speaking, and, with the dearth of business and local hosts outside of the U.S. and England, the few non-English networks were either governmental or university based (Gromov). English retained its power over the internet.
However, the “internet” (short for “internetworking”, first attested in its modern meaning after the adoption of TCP/IP) was adopted by the European Organization for Nuclear Research (CERN), which established its own CERNET in 1981. In 1990, Tim Berners-Lee, a physicist at CERN, completed his design for the World Wide Web, the now universally adopted system and interface of the internet. He wrote HTML, the design code for the Web, and posted the first webpage. Despite the European and multilingual origins of CERNET, Berners-Lee, an Englishman, did all his work in English. HTML was designed with (and continues to have) English keywords (tags such as “head”, “body”, and “title”), the first webpage was written in English, and Uniform Resource Locators, the labels assigned to webpages to allow the browser protocols to find them, were only encoded for Latin letter combinations (Berners-Lee).
The World Wide Web did not dominate the internet overnight; in fact, until 1993 the World Wide Web had an adoption reminiscent of ARPANET- exclusively universities and science laboratories. In the summer of 1993, the popularity of the World Wide Web exploded with the release of Mosaic, the first widely available graphical browser. While previous interfaces for the internet required a minority operating system (like UNIX) and significant technical expertise, Mosaic, with its graphical interface and online availability, allowed almost anyone to browse the Web. With Mosaic, the English bias of the designers was transferred wholesale to the common user. Mosaic, developed at the University of Illinois at Urbana-Champaign, only offered an English interface- non-expert users (of any nation) had no choice but to experience the internet through an English filter (About NCSA Mosaic).
Beginning in 2000 and continuing to the present day, a dramatic shift of bias has occurred: neutrality seems to be reasserting itself. Between 1999 and 2000, English speakers went from 60% of internet users to 40%, and English webpages from 75% to 60% (Pimienta, Prado and Blanco). In 2000, the number of newly created non-English sites doubled that of newly created English sites (Crystal 218). Most recently (June 2011), the Internet Corporation for Assigned Names and Numbers (ICANN) made a fundamental change to one of the World Wide Web’s original English biases: the internationalization of domain names. Previously, no matter the language of a website or user, in order to list a website or access it, one had to use Latin letters. For example, the Greek news site “Ta Nea” had to be listed and accessed under “www.tanea.gr”; with the recent change, the URL could instead be typed or listed as “Τανέα” (Arthur).
This seems like a radically positive step towards neutrality, but a reexamination, with power in mind, leads us to a different interpretation. Perhaps the changes that have occurred are not moves towards neutrality, but merely away from an English bias. This may seem a meaningless semantic difference, but if recent trends away from an English bias can be attributed to the loss of English speaking states’ power, then it is possible that, rather than moving towards neutrality, the internet is in fact slowly moving towards a new language bias, backed by more powerful nations. The shifts of power within the last decade seem to support this hypothesis. In 2000, the American “dot-com boom” turned into a bust, as hundreds of startup companies plummeted into bankruptcy and stock-holders abandoned online companies in droves (Lowenstein). This loss of economic power neatly coincides with the beginning of the renewed shift towards neutrality. In contrast to this loss, other nations have been experiencing surges of economic power on the internet. For example, Asian businesses have been experiencing a dot-com-boom-like rush for domain names, as corporations increase their worth by acquiring domain names newly created by the ICANN internationalization. Furthermore, the demographics of the internet have also radically changed. In 2000, the United States had 95 million internet users, the highest number at the time. In 2012, the United States had 245 million internet users, but today that is only about 10% of the total internet population: China has 500 million users, India 137 million, and Japan 100 million (Miniwatts Marketing Group). Finally, more direct evidence comes from analysis of lobby groups.
One of the largest supporters of the ICANN internationalization of domain names was the Multilingual Internet Names Consortium, a lobby group that “focuses on developing and promoting a truly multilingual Internet domain names and keywords [and] internationalization of Internet names standards and protocols…” (Multilingual Internet Names Consortium). However, an examination of their sources of funding and representatives reveals a large proportion of Asian businesses- the very groups that stand to gain power from ICANN’s move.
In conclusion, we have traced the manifestations and origins of an English bias in the development of the internet, noted a renewed movement towards neutrality, and questioned whether neutrality is truly being served, or whether a different language bias is simply beginning its ascension. We located the source and spirit of internet neutrality in the universal logic of binary, and then found the opposition to neutrality flowing from the necessity of complex semantics. This dichotomy seemingly ended with an English bias victory in the 1990s, but we now have reason to question that conclusion, as either a resurgence of neutrality or a replacement of English seems likely.