The idea of a
computer network intended to allow general communication between users of various
computers has developed through a large number of stages since the earliest versions
of such an idea appeared in the late 1950s.
Before the Internet
The world's first operational packet switching network was the ARPANET, which first
went online in 1969. The program which produced the ARPANET, initiated by J. C.
R. Licklider, had already conceived most of the basic ideas of what such a network
would look like to its users; today's Internet is not much more than a better-engineered
and very widespread (much more so than perhaps even those visionary pioneers
foresaw) implementation of those ideas.
The Internet's roots
lie within the ARPANET, which not only was the intellectual forerunner of the
Internet, but was also initially the core network in the collection of networks
in the Internet; it was also an important tool in developing the Internet (being
used for communication between the groups working on internetworking research).
Motivation for the Internet
The need for an internetwork appeared with ARPA's sponsorship, under Robert E. Kahn,
of the development of a number of innovative networking technologies: in particular,
the first packet radio networks (inspired by the ALOHA network) and a satellite
packet communication program. Later, local area networks (LANs) would also join the mix.
Connecting these disparate networking technologies
was not possible with the kind of protocols used on the ARPANET, which depended
on the exact nature of the subnetwork. A wholly new kind of networking architecture was required.
Early Internet work
Kahn recruited Vint Cerf of Stanford University to work with him on the
problem. They soon worked out a fundamental reformulation, where the differences
between network protocols were hidden by using a common internetwork protocol,
and instead of the network being responsible for reliability, as in the ARPANET,
the hosts became responsible. Cerf credits Hubert Zimmermann and Louis Pouzin
(designer of the CYCLADES network) with important influences on this design. Some
accounts also credit the early networking work at Xerox PARC as an important technical
influence. With the role of the network reduced to the bare
minimum, it became possible to join almost any networks together, no matter what
their characteristics were, thereby solving Kahn's initial problem. (One popular
saying has it that TCP/IP, the eventual product of Cerf and Kahn's work, will
run over "two tin cans and a string".) A computer called a gateway (later
changed to router to avoid confusion with other types of gateway) is provided
with an interface to each network, and forwards packets back and forth between
them. Happily, this new concept was a perfect fit with the newly emerging local
area networks, which were revolutionizing communication between computers within
a single site.
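The gateway idea can be sketched in a few lines of Python. This is a toy illustration, not any historical implementation; the network names, routing table, and packet format are all invented. The point it shows is that forwarding depends only on a table lookup, never on the internals of each attached subnetwork:

```python
# Toy sketch of a gateway: one interface per attached network, and a
# routing table keyed by destination network. All names are invented.
ROUTES = {
    "net-a": "interface-0",  # e.g. a packet radio network
    "net-b": "interface-1",  # e.g. a local area network
}

def forward(packet):
    """Choose an outgoing interface from the destination's network prefix.

    The gateway never inspects each subnetwork's internal protocol, and
    it makes no reliability guarantees -- those are left to the end hosts.
    """
    dest_net = packet["dest"].split(".")[0]  # "net-b.host7" -> "net-b"
    return ROUTES.get(dest_net)  # None models "no route to network"

print(forward({"dest": "net-b.host7", "payload": "hello"}))  # interface-1
```

The design point mirrored in the sketch is the thin network layer: attaching a third network means adding one interface and one table entry, not re-engineering the protocol.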
After the ARPANET had been up and running for a decade, ARPA looked for another agency to hand off
the network to. After all, ARPA's primary business was funding cutting-edge research
and development, not running a communications utility. Eventually the network
was turned over to the Defense Communications Agency, also part of the Department of Defense.
In 1983, TCP/IP protocols replaced the earlier
NCP protocol as the principal protocol of the ARPANET; in 1984, the U.S. military
portion of the ARPANET was broken off as a separate network, the MILNET. At the
same time, Paul Mockapetris and Jon Postel were working on what would become the
Domain Name System.
The early Internet, based around the
ARPANET, was government-funded and therefore restricted to non-commercial uses
such as research; unrelated commercial use was strictly forbidden. This initially
restricted connections to military sites and universities. During the 1980s, the
connections expanded to more educational institutions, and even to a growing number
of companies such as Digital Equipment Corporation and Hewlett-Packard, which
were participating in research projects, or providing services to those who were.
Another branch of the U.S. government, the National Science Foundation, became heavily
involved in Internet research in the mid-1980s. The NSFNet backbone, intended
to connect and provide access to a number of supercomputing centers established
by the NSF, was established in 1986.
At the end of the 1980s,
the U.S. Department of Defense decided the network was developed enough for its
initial purposes, and decided to stop further funding of the core Internet backbone.
The ARPANET was gradually shut down (it was finally decommissioned in 1990), and
NSF, a civilian agency, took over responsibility for providing long-haul connectivity
in the U.S.
In another NSF initiative, regional TCP/IP-based
networks such as NYSERNet (New York State Education and Research Network) and
BARRNet (Bay Area Regional Research Network), grew up and started interconnecting
with the nascent Internet. This greatly expanded the reach of the rapidly growing
network. On April 30, 1995, the NSF privatized access to the network it had created.
It was at this point that the growth of the Internet really took off.
Commercialization and Privatization of the Internet
Parallel to the
Internet, other networks were growing. Some were educational and centrally-organized
like BITNET and CSNET. Others were grass-roots mixtures of academic, commercial, and
hobbyist users, like the UUCP network.
During the late 1980s the first
Internet Service Provider (ISP) companies were formed. Companies like PSINet,
UUNET, Netcom, and Portal were formed to provide service to the regional research
networks and provide alternate network access (like UUCP-based email and Usenet
News) to the public. The first dial-up ISP, world.std.com, opened in 1989.
Interest in commercial use of the Internet became a hotly-debated topic. Although
commercial use was forbidden, the exact definition of commercial use could be
unclear and subjective. Everyone agreed that one company sending an invoice to
another company was clearly commercial use, but anything less was up for debate.
The alternate networks, like UUCP, had no such restrictions, so many people were
skirting grey areas in the interconnection of the various networks.
Some university users were outraged at the idea of non-educational use of their networks.
Ironically it was the commercial Internet service providers who brought prices
low enough that junior colleges and other schools could afford to participate
in the new arenas of education and research.
By 1994, the
NSFNet lost its standing as the backbone of the Internet. Other competing commercial
providers created their own backbones and interconnections. Regional NAPs (network
access points) became the primary interconnections between the many networks.
The NSFNet was dropped as the main backbone, and the remaining commercial restrictions
were lifted.
E-mail originated as a message service on early time-sharing mainframe computers connected
to a number of terminals. Around 1971 it developed into the first system of exchanging
addressed messages between different, networked computers; in 1972 Ray Tomlinson
introduced the "name@computer" notation that is still used today. E-mail
turned into the Internet "killer application" of the 1980s.
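The name@computer convention simply separates a user's mailbox name from the machine that holds it. A minimal sketch (the address below is invented for illustration):

```python
def split_address(addr):
    """Split a name@computer style address into its two halves."""
    name, _, host = addr.partition("@")
    return name, host

print(split_address("alice@example-host"))  # ('alice', 'example-host')
```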
The early e-mail system was not limited to the Internet. Gateway machines connected
Internet SMTP e-mail with UUCP mail, BITNET, the Fidonet BBS network, and other
services. Commercial e-mail providers such as CompuServe and The Source could
also be reached.
The second most popular application of the
early Internet was Usenet, a system of distributed discussion groups which is
still going strong today. Usenet had existed even before the Internet, as an application
of Unix computers connected by telephone lines via UUCP. The Network News Transfer
Protocol (NNTP), similar in flavor to SMTP, slowly replaced UUCP for the relaying
of news articles. Today, almost all Usenet traffic is carried over high-speed Internet links.
Other early protocols include the File Transfer
Protocol (1985) and Telnet (1983), a networked terminal emulator allowing users
on one computer to log in to other computers.
The Internet has developed a significant subculture
dedicated to the idea that the Internet is not owned or controlled by any one
person, company, group, or organization. Nevertheless, some standardization and
control is necessary for anything to function.
Many people working on the network wanted to put their ideas into the standards for communication between the computers
that made up this network, so a system was devised for putting forward ideas.
One would write one's ideas in a paper called a "Request for Comments"
(RFC for short), and let everyone else read it. People commented on and improved
those ideas in new RFCs.
With its basis as an educational
research project, much of the documentation was written by students or others
who played significant roles in developing the network (as part of the original
Network Working Group) but did not have official responsibility for defining standards.
This is the reason for the very low-key name of "Request for Comments"
rather than something like "Declaration of Official Standards".
The first RFC (RFC 1) was written on 7 April 1969. As of 2004 there are over 3500 RFCs,
describing every aspect of how the Internet functions.
The liberal RFC publication procedure has engendered confusion about the Internet
standardization process, and has led to more formalization of official accepted
standards. Acceptance of an RFC by the RFC Editor for publication does not automatically
make the RFC into a standard. It may be recognized as such by the IETF only after
many years of experimentation, use, and acceptance have proven it to be worthy
of that designation. Official standards are numbered with a prefix "STD"
and a number, similar to the RFC naming style. However, even after becoming a
standard, most are still commonly referred to by their RFC number.
The Internet standards process has been as innovative as the Internet technology.
Prior to the Internet, standardization was a slow process run by committees with
arguing vendor-driven factions and lengthy delays. In networking in particular,
the results were monstrous patchworks of bloated specifications.
A fundamental requirement for a networking protocol to become an Internet standard
is the existence of at least two existing, working implementations that inter-operate
with each other. This makes sense in retrospect, but it was a new concept at the
time made practical because significant physical layer standards were already
in place. Other standardization efforts (e.g., Integrated Services Digital Network,
ISDN) created complex specifications with many optional capabilities. Creating
standards for the physical layer before testing them is unavoidable. Creating complex
higher layer standards, with many options (necessary to support the many different
requirements proposed), before testing them, proved nearly impossible.
In the 1980s, the International Organization for Standardization (ISO) documented
a new effort in networking called Open Systems Interconnection (OSI). Prior to OSI,
networking was completely vendor-developed and proprietary. OSI was a new industry
effort, attempting to get everyone to agree to common network standards to provide
multi-vendor interoperability. The OSI model was an important advance in understanding
network concepts. However, the OSI protocols or "stack" that were specified
as part of the project became extensive in order to support the many different
capabilities that could be imagined. While the functionality of the Internet evolved
over 20 years, the OSI standardization efforts attempted to give birth to a mature
network. Standards like X.400 for e-mail took up several large books, while Internet
e-mail took only a few dozen pages at most in RFC 821 and RFC 822. Most protocols
and specifications in the OSI stack, such as token-bus media, CLNP packet delivery,
FTAM file transfer, and X.400 e-mail, are long-gone today; in 1996, ISO finally
acknowledged that TCP/IP had won and killed the OSI project. Only one OSI standard,
X.500 directory service, still survives with significant usage, mainly because
the original unwieldy protocol has been stripped away and effectively replaced by the lighter-weight LDAP.
Some formal organization is necessary to make
everything operate. The first central authority was the NIC (Network Information
Center) at SRI (Stanford Research Institute) in Menlo Park, California.
Emergence of the World Wide Web
One of the Internet applications
many people are most familiar with is the World Wide Web, with many people thinking
that it is the only possible use of the Internet.
As the Internet grew through the 1980s and early 1990s, many people realized the growing
need to be able to find and organize files and related information. Projects such
as Gopher, WAIS, and the Archie search engine attempted to create schemes to organize
distributed data and present it to people in an easy-to-use form. Unfortunately,
these projects fell short in being able to accommodate all the various existing
file and data types, and in being able to grow without centralized bottlenecks.
Gopher's development would later be halted when the University of Minnesota asserted
its intellectual property rights over the technology.
Meanwhile, one of the most promising user interface paradigms during this period was hypertext.
The technology's creation had been inspired by Vannevar Bush's "memex"
and developed through Ted Nelson's research on Project Xanadu and Douglas Engelbart's
research on NLS. Many small self-contained hypertext systems had been created
before, such as Apple Computer's HyperCard, but before the Internet, nobody had
worked out how to scale the technology up so that a document could refer to another
document anywhere in the world. Both Nelson and one of Engelbart's collaborators
had speculated about how to do it, but such speculations had gone nowhere.
The actual solution was invented by Tim Berners-Lee in 1989, out of sheer exasperation
after he kept raising his idea at conferences and no one in the Internet or hypertext
communities would implement it for him. He was a computer programmer working at
CERN, the European Particle Physics Laboratory, and wanted a way for physicists
to share information about their research. His documentation project was the source
of the two key inventions that made the World Wide Web possible.
The two inventions were the Uniform Resource Locator (URL) and HyperText Markup Language
(HTML). The URL was a simple way to specify the location of a document anywhere
on the Internet in one simple name that specified a computer name, a file "path"
on that machine, and a protocol to use in retrieving that file. HTML was an easy
way to embed codes into a text file that could define the structure of a document
and also include links pointing to other documents. An additional network protocol,
the HyperText Transfer Protocol (HTTP), was also invented for reduced overhead and improved speed during file
transfers, but the true genius of the new system was that a new protocol was useful
but not necessary. The URL and HTML system was backwards compatible with existing
protocols like FTP and Gopher.
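The URL's three-part shape described above, a protocol, a computer name, and a file path, can be seen by parsing a couple of hypothetical addresses (the hosts and paths below are invented) with Python's standard library; note that the scheme is not limited to HTTP:

```python
from urllib.parse import urlparse

# Hypothetical URLs for illustration; the same syntax covers pre-Web
# protocols such as FTP and Gopher as well as HTTP.
for url in ("http://example.org/papers/info.html",
            "ftp://example.org/pub/readme.txt"):
    parts = urlparse(url)
    print(parts.scheme, parts.netloc, parts.path)
# http example.org /papers/info.html
# ftp example.org /pub/readme.txt
```

This protocol-agnostic naming is what made the system backwards compatible: a single link syntax could point into FTP and Gopher archives as easily as into Web servers.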
Tim Berners-Lee's original
WorldWideWeb browser was able to include so-called inline graphics in HTML pages
(also now known as image transclusion), but it was not until around 1992 that
other people also started implementing this in their browsers. The first popular
graphical web browsers were developed: Viola and Mosaic.
Funding for Mosaic (considered by many to be a turning point in the mainstreaming of the
Internet) came from the High-Performance Computing and Communications Initiative,
a program created by then-Senator Al Gore's High Performance Computing Act of
1991. Indeed, Mosaic was developed by a team at the National Center for Supercomputing
Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led
by Marc Andreessen.
In 1994, Andreessen left NCSA-UIUC and joined
Jim Clark, one of the founders of SGI (Silicon Graphics, Inc.). They started Mosaic
Communications which became Netscape Communications Corporation, making Netscape
Navigator the first commercially successful browser. Microsoft acquired Mosaic
code from Spyglass, Inc., (who got it from NCSA) to develop Internet Explorer.
The ease of creating new Web documents and linking to existing ones caused exponential
growth. As the Web grew, search engines and Web directories were created to track
pages on the web and allow people to find things. One of the first Web search engines,
Lycos, was created in 1994 as a university project. In 1993, the first web magazine,
The Virtual Journal, was published by a University of Maine student. At the end
of 1994, Lycos indexed a total of 800,000 web pages. Around this time, many corporations
began web sites, including Time Warner with its Pathfinder service and some online
magazines. Online brochures began appearing as well, initially for firms targeting
younger and presumably web-savvy buyers, such as Saturn.
By August 2001, the Google search engine tracked over 1.3 billion web pages and the
growth continues. In early 2004, Google's index exceeded 4 billion pages. On November
11, 2004, this number had doubled to just over 8 billion.
On August 8, 2005, Yahoo! announced that its online search engine index spanned more
than 20 billion Web documents and images.