THE LITTLE HISTORY OF THE WORLD WIDE WEB
Wolfgang Mulzer, Shunfeng Zhang
What is the World-Wide Web?
- The basic idea of the WWW is to merge the techniques of computer networking and hypertext into a powerful and easy-to-use global information system.
- Hypertext is text with links to further information, on the model of references in a scientific paper or cross-references in a dictionary.
- With electronic documents, these cross-references can be followed by a mouse-click, and with the World-Wide Web, they can be anywhere in the world.
Hypertext is a way to organise documents so that your computer can help you find items of interest.
The blue rectangles are electronic documents, similar to pages of paper documents. Inside them, sensitive spots are indicated (the little red rectangles). A computer displays the pages on its screen and uses the sensitive spots to switch automatically from one page to another when the user clicks on one.
This navigation by wandering from one page to another is called "browsing".
WWW is "seamless" in the sense that a user can see the whole Web of information as one vast hypertext document. There is no need to know where information is stored, or any details of its format or organisation. Behind this apparent simplicity of course there is a set of ingenious design concepts, protocols and conventions.
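The page-and-link structure described above amounts to a small directed graph: pages are nodes, and each sensitive spot is a named link to another page. The sketch below (all page names are hypothetical illustrations) models browsing as repeatedly following a clicked spot.

```python
# A minimal model of hypertext: pages are nodes, and each
# "sensitive spot" is a named link to another page.
# All page names here are hypothetical illustrations.
pages = {
    "home":  {"text": "Welcome!",           "links": {"spot1": "paper"}},
    "paper": {"text": "A scientific paper", "links": {"ref3": "glossary"}},
    "glossary": {"text": "A cross-reference", "links": {}},
}

def browse(start, clicks):
    """Follow a sequence of clicked spots, returning the pages visited."""
    trail, current = [start], start
    for spot in clicks:
        current = pages[current]["links"][spot]
        trail.append(current)
    return trail

print(browse("home", ["spot1", "ref3"]))
# → ['home', 'paper', 'glossary']
```

Wandering from page to page in this way is exactly the "browsing" described above; the Web's contribution is that the link targets may live on any machine in the world.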
1. History of Hypertext
1.1 Memex
The history of hypertext begins in July 1945, when Dr. Vannevar Bush proposed the Memex in an article titled "As We May Think", published in The Atlantic Monthly. In the article, Bush outlined the ideas for a machine that would have the capacity to store textual and graphical information in such a way that any piece of information could be arbitrarily linked to any other piece.
1.2 Hypertext
- In 1965, Ted Nelson coined the terms "hypertext" and "hypermedia" in a paper presented at the ACM 20th National Conference. He later elaborated on these ideas in his book Literary Machines.
- The first hypertext-based system was developed in 1967 by a team of researchers led by Dr. Andries van Dam at Brown University. In 1968, van Dam developed FRESS, a File Retrieval and Editing System, which was an improvement on his original Hypertext Editing System and was used commercially by Philips.
- Doug Engelbart of the Stanford Research Institute, inventor of the mouse, introduced NLS, the oN-Line System, in 1968. NLS kept its documents in a "shared journal", so that collaborators could reference and build on each other's work.
- In 1972, researchers at Carnegie-Mellon University began development of ZOG (the name doesn't stand for anything!). ZOG was a large database designed for a multiuser environment. The ZOG database consisted of frames which, in turn, consisted of a title, a description, a line with standard ZOG commands, and a set of menu items (called selections) leading to other frames. The ZOG database was text-only and originally ran on an IBM mainframe. A PERQ workstation implementation of ZOG was used on the nuclear-powered aircraft carrier USS Carl Vinson. Two of the original developers of ZOG, Donald McCracken and Robert Akscyn, later developed KMS, the Knowledge Management System, which was an improved version of ZOG. KMS ran on Sun and HP Apollo workstations with much enhanced performance. Though KMS included a GUI, it still remained a text-based system. It was intended to be a collaborative tool: users could modify the contents of a frame, and the changes would be immediately visible to other users through dynamically updated displays.
- In 1978, Andrew Lippman of the MIT Architecture Machine Group led a team of researchers that developed what is argued to be the first true hypermedia system, the Aspen Movie Map. This application was a virtual ride simulation through the city of Aspen, Colorado.
1.3 Xanadu
Theodor Holm Nelson, a writer, film-maker, and software designer, described his long-running Xanadu project in Literary Machines (1981). In his own words, "explaining it quickly:"
- Xanadu is a system for the network sale of documents with automatic royalty on every byte.
- The transclusion feature allows quotation of fragments of any size with royalty to the original publisher.
- This is an implementation of a connected literature.
- It is a system for a point-and-click universe.
- This is a completely interactive docuverse.
In the Xanadu scheme, a universal document database (the "docuverse") would allow addressing of any substring of any document from any other document. Additionally, Xanadu would permanently keep every version of every document, thereby eliminating the possibility of a broken link and the ever-so-familiar "404 Document Not Found" error. Xanadu was never fully implemented; Gary Wolf chronicled the project's troubled history in his Wired Magazine article, "The Curse of Xanadu".
1.4 Other Landmarks
- In 1985, Janet Walker developed the Symbolics Document Examiner, the first hypertext-based system to gain widespread acceptance and usage. The system provided the manual for Symbolics computers in hypertext format, as opposed to the 8,000-page printed version. This application was significant in that it was generic enough to be used for general purposes. This was a change from other hypertext applications of that time, which were written for specific needs. The application gave users the option to bookmark nodes within the document database.
- In 1985, Xerox released NoteCards, a LISP-based hypertext system. NoteCards' unique features included scrolling windows for each notecard, pre-formatted specialized notecards, and a separate browser/navigator window. Another hypertext application released in 1985 was Brown University's Intermedia for the Macintosh A/UX system.
- In 1986, Office Workstations Ltd (OWL) introduced OWL-Guide, a hypertext system developed for the Macintosh. The original version of Guide was a PERQ workstation hypertext system based on the work of Peter Brown of the University of Kent at Canterbury, developed in 1982. OWL-Guide was later ported to the IBM-PC platform and became the first multi-platform hypertext system. The application gained widespread acceptance due to the popularity of the Macintosh platform.
- In 1987, Bill Atkinson of Apple Computer introduced HyperCard. Apple bundled the application free with all Macintosh machines. HyperCard soon after became the most widely used hypertext system, and many HyperCard-based applications were developed. Many believe HyperCard to be the application that contributed the most to the popularization of the hypertext model. ACM held the first Conference on Hypertext later that year.
- In 1989, the World Wide Web came along...
2. History of HTTP
Introduction
The main protocol behind the World Wide Web (WWW) is the Hypertext Transfer Protocol (HTTP). It forms the basis for the transfer of documents between WWW servers and clients. It runs on top of the TCP/IP protocol suite.
History of HTTP
Around the time when the WWW started gaining popularity, there was already a host of specialized data transfer protocols for file transfer, news broadcasting, mail transfer, and search and retrieval. But the WWW brought along its own specific needs. The protocol needed mainly a subset of file transfer functionality, the ability to request an index search, automatic format negotiation, and the ability to refer the client to another server. This initiated the development of HTTP.
- The original HTTP, developed in 1991, was a very simple protocol. Its main purpose was to do raw data transfer. This protocol is known as HTTP/0.9. It had facilities for just making connections, initiating requests, and getting back responses. All responses were simply streams of ASCII characters.
- This protocol was replaced by HTTP/1.0, described in RFC 1945. This protocol added MIME-like messages, header lines containing information about the data being transferred and modifiers for the request/response messages.
- This protocol also had disadvantages, which were overcome by the next proposed enhancement, HTTP/1.1. HTTP/1.0 and HTTP/1.1 are described in more detail in the next two sections. HTTP/1.1 added support for hierarchical proxy servers, caching, persistent connections, and virtual hosts.
- HTTP/1.1 is being considered as a standard by the Internet Engineering Task Force (IETF).
- HTTP-NG (Next Generation) is another proposed modification to HTTP/1.0. It preceded development of HTTP/1.1 (draft 7). It was supposed to be an enhanced replacement for HTTP/1.0. Some of the key ideas behind HTTP-NG include persistent connections, the ability to multiplex multiple requests/responses over a single transport connection, and the ability to pass URLs to a protocol in a response message. S-HTTP (Secure HTTP) was developed to provide secure communication mechanisms between an HTTP client-server pair, in order to enable spontaneous commercial transactions for a wide range of applications.
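The evolution sketched above is easiest to see on the wire. The following sketch builds the request bytes each protocol version would send for the same document (the host and path are hypothetical illustrations): HTTP/0.9 is a bare request line, HTTP/1.0 adds a version token and MIME-like headers, and HTTP/1.1 makes the Host header mandatory, which is what enables virtual hosting.

```python
def request_09(path):
    # HTTP/0.9: a single request line with no version token and no
    # headers. The server replies with the raw document bytes and
    # then closes the connection; there is no status line or metadata.
    return f"GET {path}\r\n"

def request_10(path, host):
    # HTTP/1.0 (RFC 1945): a version token plus MIME-like header
    # lines; the response gains a status line and headers describing
    # the data being transferred (e.g. Content-Type).
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"      # optional in 1.0
            f"Accept: text/html\r\n"
            f"\r\n")

def request_11(path, host):
    # HTTP/1.1: the Host header is mandatory, so one server can serve
    # many virtual hosts from a single IP address; connections are
    # persistent by default (the header below just makes it explicit).
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: keep-alive\r\n"
            f"\r\n")

for req in (request_09("/index.html"),
            request_10("/index.html", "info.cern.ch"),
            request_11("/index.html", "info.cern.ch")):
    print(repr(req))
```

Because an HTTP/0.9 request carries no version token, a server distinguishes it from later versions simply by the absence of "HTTP/x.y" at the end of the request line.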
And the second part, by Shunfeng Zhang.