Law in the Internet Society
The Internet and Capitalist Gain: The Cost of Lunch

It was fall 1990. I was a freshman at Syracuse and my high school girlfriend was a freshman at Dartmouth. In one of her letters, she described a program called “ELM” that would allow us to write to each other through our universities’ computers. I was intrigued and slightly skeptical (I had always thought of myself as the more tech-savvy of the two of us). The next day, an assistant in Syracuse’s personal computer (PC) lab demoed the “ELM” system, and I sent my girlfriend an email message: my first experience with the Internet.

Columbia Professor Eben Moglen would have us believe that the Internet’s architects designed it with the altruistic goal of reaching, and then making education available to, every human on earth, and that corporations such as Google and Facebook have plundered this ideal in pursuit of “capitalist gain.” That may be true. But it is also true that, without the pursuit of “gain,” the Internet would never have experienced such a colossal expansion in global usage. This “gain” is a quid pro quo, the cost of corporate contribution to the Internet’s growth.

The Internet is the worldwide, public network of interconnected computer networks. The modern-day Internet is commonly thought to have descended from the ARPAnet, a network developed by the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA, renamed DARPA in 1972). In February 1958, the U.S. government created ARPA after being caught off guard by the Soviet Union’s successful test of an intercontinental ballistic missile and its launch of the world’s first artificial satellites, Sputnik 1 and 2. In 1962, amidst fears of what might happen if the Soviet Union attacked the nation’s telephone system, M.I.T. scientist J.C.R. Licklider proposed, as a solution, a “galactic network” of computers that could talk to each other, a vision ARPA later realized as the ARPAnet.

Although a network in name, the Internet is a creature of the computer. During the early computing age, computers were incredibly expensive to produce and operate. An early computer, the Electronic Numerical Integrator and Computer (ENIAC), cost $500,000 (roughly $6.8 million in today’s dollars), weighed 30 tons, covered nearly 2,000 square feet, and contained almost 18,000 vacuum tubes. The pursuit of “gain” motivated “for-profit” corporations to produce smaller, faster, and more affordable computers, with more memory and more user-friendly software.

In 1948, Bell Laboratories introduced the transistor, an electronic device that switches and amplifies electrical current but is far smaller than the vacuum tube. Ten years later, scientists at Texas Instruments and Fairchild Semiconductor invented the integrated circuit, which combined a computer’s electronic components on a single silicon chip.

In 1971, an Intel engineer developed the microprocessor, one of the most significant advancements in computer technology. Before this invention, computers needed a separate integrated circuit for each function (hence the need for such large machines). The microprocessor was the size of a thumbnail, yet it could run a computer’s programs and manage its data. Intel’s first microprocessor, the 4004, had the same computing power as the massive ENIAC.

These innovations led to the birth of the small, relatively inexpensive “microcomputer,” now known as the “personal computer.” In 1974, a corporation called Micro Instrumentation and Telemetry Systems (MITS) introduced the Altair, a mail-order, build-it-yourself PC kit. In 1975, MITS hired two young programmers, Bill Gates and Paul Allen, to adapt the BASIC programming language for the Altair. In April 1975, the two formed Microsoft, later responsible for the hugely popular Windows operating systems. By some estimates, Windows runs on more than 90% of all PCs.

Over the next two years, Steve Wozniak and Steve Jobs built the Apple I and Apple II PCs in Silicon Valley, with more memory and a cheaper microprocessor than the Altair, along with a monitor and a keyboard. Innovations like the “graphical user interface,” which let users select icons on a screen instead of typing complicated commands, and the computer mouse made PCs more user-friendly. Bell Laboratories, Texas Instruments, Fairchild Semiconductor, Intel, MITS, Microsoft, and Apple were all “for-profit” corporations. These corporations and their inventions spearheaded the PC revolution.

Soon, other “for-profit” corporations, like Xerox, Tandy, Commodore, and IBM, entered the PC market. PCs, networked over the global telephony infrastructure, created the Internet we have today. Innovations in personal computing facilitated the Internet’s expansion to 201 countries and to 3.8 billion people, or 51.7% of the human population. There might have been an Internet without PCs, but it would have been uninteresting, and probably confined to the research community and computer scientists.

In my Syracuse finance classes, professors inculcated the axiom that “for-profit” corporations exist for the sole purpose of maximizing shareholder wealth. According to Columbia alumnus Milton Friedman, “[t]here is . . . only one social responsibility of business: to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, engages in open and free competition, without deception or fraud.”

Moreover, shareholder wealth maximization is the law. For decades, Delaware courts, which dominate U.S. corporate jurisprudence, have espoused the tenet that the maximization of shareholder wealth must be every “for-profit” corporation’s ultimate objective. In essence, a corporation that pursues “capitalist gain” is merely following its legal mandate and honoring its contractual obligations to shareholders. As Senator Al Franken reminded us, “it is literally malfeasance for a corporation not to do everything it legally can to maximize its profits.”

No doubt, the U.S. government, through DARPA, funded some of the research and technological development that made the ARPAnet, and eventually the Internet, possible. In many cases, however, this funding went to private “for-profit” corporations, like Xerox. It was the desire for “capitalist gain” that led these corporations to develop the technology products DARPA commissioned.

Given that “for-profit” corporations advanced the Internet architects’ goal of reaching every human on earth, it should come as no surprise that they would then seek to exploit the Internet for “gain.” These corporations are simply following market and legal expectations. The “gain” they seek is a cost of their facilitation of the Internet’s expansion. As Friedman put it, “[t]here’s no such thing as a free lunch.”


r6 - 11 Nov 2017 - 03:01:07 - TravisMcMillian