Wednesday 22 November 2006

History and Evolution of the Internet Part III

This ensured that individual networks would retain their specific character while having access to a larger community of computers. The second feature was the establishment of a ‘gateway’ within each network. This would be its means of linking up to the larger network outside of itself. Basically, this ‘gateway’ would take the form of a larger computer capable of handling large volumes of traffic, running software that would enable it to redirect and transmit data ‘packets’. Another peculiarity of this ‘gateway’ was that it would retain no memory of the data being transferred and transmitted through it. While this was basically designed to cut down on the workload of the computer, it had the added advantage of deterring any sort of censorship or control of the traffic.

Data packets were to be transferred through the fastest accessible path. For example, if one of the computers in the network was either slow or blocked, packets would be rerouted through other computers until they reached their final destination. Any gateway linking different sorts of networks together would have to, perforce, always remain open, and it could not discriminate between the traffic being routed through it and out of it. The implicit principle of this sort of ‘open architecture’ system, of course, is that the underlying operating principles of the network are accessible to all the networks participating in it. This immediately democratizes the whole organization. Since the information needed to design such an interconnected network was available to any individual or organization, theoretically any new network could easily be linked up to the existing one. This feature would later enable a range of technological innovations on the internet.
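To make the rerouting idea concrete, here is a minimal Python sketch of fastest-available-path forwarding. It is only an illustration: the four-node topology is invented for the example (the node names merely echo early ARPANET sites), and the real gateways of the era used far more involved mechanisms.

```python
from collections import deque

def route_packet(network, source, destination, blocked):
    """Breadth-first search for the shortest open path, skipping nodes marked unavailable."""
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == destination:
            return path
        for neighbour in network.get(node, []):
            if neighbour not in visited and neighbour not in blocked:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None  # no open path exists

# Invented four-node network: each entry lists a node's direct links.
network = {
    "UCLA": ["SRI", "UCSB"],
    "SRI": ["UCLA", "Utah"],
    "UCSB": ["UCLA", "Utah"],
    "Utah": ["SRI", "UCSB"],
}

print(route_packet(network, "UCLA", "Utah", blocked=set()))    # ['UCLA', 'SRI', 'Utah']
print(route_packet(network, "UCLA", "Utah", blocked={"SRI"}))  # ['UCLA', 'UCSB', 'Utah']
```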

We need to keep in mind that at this point in the history of the internet, we are basically talking about huge mainframe computers only. These machines were not accessible to the public and were largely owned by huge corporations, universities or government organizations. We are still far away from the user-friendly World Wide Web of today. Initially it was thought that this kind of system would ultimately depend on only a select few national networks or sub-networks.

By this time, a number of independent computer networks had come into being. One of the more important of these was Telenet, developed in 1974. This was also the first communication network available to the public at large, and it functioned somewhat like a commercial version of the ARPANET. Markedly different in character was the network developed by the US Department of Energy. It was called MFENet and was meant to facilitate research into Magnetic Fusion Energy. This spurred NASA to develop its own SPAN for the use of space physicists. 1976 saw networking expand to reach the larger academic community, with the development of a UNIX-to-UNIX protocol by AT&T Bell Laboratories. This expansion was possible because AT&T provided free access to the software for all UNIX users, and UNIX was the main operating system employed in academia at the time.

Further developments were the establishment of the still operational Usenet in 1979 and Bitnet (by the City University of New York) in 1981. The US National Science Foundation funded the development of CSNet to enhance communication between computer scientists situated in disparate locations across government, industry and universities. So far we have talked only about the developments taking place in computer networking in the United States. This is not to say that similar experimentation was not being carried out beyond the boundaries of the US. In 1982, Eunet was launched. This was primarily a European adaptation of the American UNIX network, and it linked together networks based in the UK, the Netherlands and Scandinavia. EARN (European Academic and Research Network) was established in 1984 and was modeled on Bitnet.

Sunday 19 November 2006

History and Evolution of the Internet Part II

At first, it was thought that the network could take advantage of the existing telephone network infrastructure for this kind of information routing. However, when the first remote computers, located at MIT and Berkeley, were linked together, it was found that the phone lines were just too slow to allow for a successful transfer of data and the running of programs between the two locations. But the experiment was successful in one regard: this was the first time two computers had been linked to each other to form a ‘Wide Area Network’ or WAN. Finally, in 1966-67, the new head of computer research at ARPA, Lawrence Roberts, put forward a blueprint for an interconnected computer network.

It was to be called the ARPANET. Following this announcement, it was discovered that a few other research institutions were also studying the advantages of computer networking for speedier transfer of data and communication. Independent of each other, and in fact in total ignorance of each other's work, scientists at MIT, the RAND Corporation and the UK-based National Physical Laboratory were working in the same field. The first functional version of the ARPANET integrated within itself the best features of all these investigations.

This was just the hardware part. The need now was to develop a suitable software platform which could integrate these computers and enable the transfer of data between them. This was the impetus behind the development of the ‘interface message processors’ or IMPs, the work on which was finished only in 1968. Finally, in 1969, IMPs were installed at the remote locations of UCLA and Stanford. This was the first computer network of its kind, and it enabled the students at UCLA to ‘log in’ to the Stanford computer, gain access to its database and transfer data between the two locations. The success of this experiment led to the addition of four more host computers to the ARPANET and the establishment of research centers at Utah and Santa Barbara. By December 1971, ARPANET had linked together 23 host computers. This was the first true, fully functional computer network. At the First International Conference on Computers and Communication in October 1972, the ARPANET was presented to the public. Another development to come out of this conference was the formation of the IWG, or Internetworking Working Group, whose main task was to coordinate the research happening in this field.

Simultaneously, research was being carried out on means to increase the functionality, flexibility and reach of this system. 1972 marked the launch of the prototype of the program we know today as e-mail. The early 70s also saw the development of host-to-host protocols, which enabled two remote computers to work together as if they were one host (though only for the period that they were linked). However, the most important innovation of this period was the development of the ‘transmission control protocol/internet protocol’, or TCP/IP.

This was a common language which would enable disparate computer networks to interact with each other on a shared platform. Finally, we come to the one innovation which definitively determined the present nature of the internet as we know it today. The scientists working on computer networking decided that such a network would have an ‘open architecture’. In this sense, they remained true to the basic idea of Licklider's ‘Galactic Network’. Firstly, in an ‘open architecture’ system, every network is given the freedom to develop applications specific to that particular network. At the same time, ‘open architecture’ ensured that these networks would not need to undergo any modification to join the larger complex of interconnected computers.

Wednesday 15 November 2006

The History and the Evolution of the Internet Part I

Today, the internet has become an integral part of our lives. In fact, it is hard to name a technology which has left so big an impact on modern life. Apart from the fact that the internet is a mine of information, a viable business platform and a means of social interaction, what is truly astounding about it is its global reach; and this is the reason why the internet today has emerged as the primary platform for business, academia and pleasure, regardless of location. It is a compendium of a range of technologies: it integrates the capabilities of the telegraph and the telephone with those of the radio and the computer.

The form in which most individuals access it today, the World Wide Web, is the result of a concerted effort bringing together continual investment, path-breaking research and committed infrastructure development. What started out as an experiment in military communication technology is now a means of global information gathering and dissemination. The history of the internet involves three major aspects: technology, infrastructure organization and the role of community in shaping its current form. We should also keep in mind that the evolution of the internet has been the result of government, academia and industry working together.

The story begins in the year 1957. This was the year the erstwhile USSR launched the satellite Sputnik, stealing a march over the US in the space race. Spurred by this defeat, the US Department of Defense created the Advanced Research Projects Agency (ARPA). While the primary function of this agency was to supply state-of-the-art defense technology, it also became the nodal site for computer research in the United States. It is here that the first computer network of its kind would be developed. The need for such a communication network arose from the fact that, from the beginning, ARPA was interested in developing a communication interface between the computers at its operational base and those at the sites of its various sub-contractors located in various top academic institutes and research laboratories.

In August 1962, J. C. R. Licklider circulated a series of memos discussing the possibility of building a ‘Galactic Network’. He envisioned this as a global network of interconnected computers through which any individual would be able to access data from any location in the world. This was the first time that the possibility of such a network was posited in the scientific community. Licklider was the first head of the computer research program at ARPA and was thus particularly influential in directing its research. At the same time, another scientist working with ARPA was developing another technology which would later form an integral part of the internet.

This scientist was Leonard Kleinrock, and he was involved in the development of ‘packet switching’. This is a method of sending information by breaking the intended message down into ‘packets’. These individual ‘packets’ are sent separately, and the computer at the other end reassembles them to form the complete message. The main advantage of this method was that it increased the flexibility of the network and its capacity to handle traffic. It was also a more secure way of sending data, as it makes eavesdropping more difficult and removes the need to rely on a single route.
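To see the basic idea in code, here is a toy Python sketch of splitting a message into numbered packets and reassembling it at the other end. It deliberately ignores everything a real protocol needs (addressing, checksums, retransmission), and the packet size and message are made up for the example.

```python
def to_packets(message, size=4):
    """Break a message into numbered chunks so each can travel independently."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return list(enumerate(chunks))

def reassemble(packets):
    """The receiving computer sorts packets by sequence number and rebuilds the message."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("LOGIN REQUEST")
# Packets may arrive out of order; reassembly still recovers the original message.
print(reassemble(reversed(packets)))  # prints: LOGIN REQUEST
```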
Continued in the next few posts...

Thursday 9 November 2006

An Introduction to the Semantic Web

I believe that in the last few months all of us have heard a lot about the Web 2.0 revolution. There is some awesome work going on in these technologies. With websites like Digg and Delicious becoming billion-dollar success stories, it is certain that Web 2.0 is going to be a hit.

One of the earliest Web 2.0 ideas was the internet directory Dmoz, where surfers could index any website of their choice under a particular category, thus creating a large user-built directory. It was a marathon of an effort and is now quite successful. Ideas like Digg and Delicious take this a step further.

Ajax has also played a pivotal role in all this success. The people who invested in technologies like Ajax a year ago are now reaping the fruits of their investment.

That is the success story so far, but what's next? What comes after Web 2.0? Well, the idea is called Web 3.0, or the Semantic Web. Great institutions like the W3C have already started research on Web 3.0.

So what is the Semantic Web?
The idea is that we should be able to organize the internet on the basis of meanings rather than just words, which is how the current web works. For example, a search engine like Google returns around half a billion results when I query it for the keyword “Apple”, and all these results are related to Steve Jobs's Apple. What if I am talking about the fruit? Shouldn't a search engine that gives me half a billion results first ask in what context I am querying? This is an argument worth entering into.
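As a rough illustration of the difference, here is a toy Python sketch. The page data is invented, and the simple "sense" tag merely stands in for the kind of machine-readable metadata (RDF, ontologies) a real Semantic Web search engine would rely on.

```python
# Invented mini-index: each page carries a machine-readable "sense" annotation.
pages = [
    {"title": "Apple unveils a new iPod", "keywords": ["apple", "ipod"], "sense": "company"},
    {"title": "Growing apple trees at home", "keywords": ["apple", "orchard"], "sense": "fruit"},
    {"title": "Apple announces quarterly results", "keywords": ["apple", "stock"], "sense": "company"},
]

def keyword_search(query):
    """Today's model: match on the word alone, regardless of meaning."""
    return [p["title"] for p in pages if query in p["keywords"]]

def semantic_search(query, sense):
    """The Web 3.0 idea: match on the word *and* the intended meaning."""
    return [p["title"] for p in pages if query in p["keywords"] and p["sense"] == sense]

print(keyword_search("apple"))            # all three pages, senses mixed together
print(semantic_search("apple", "fruit"))  # only the page about the fruit
```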


I am not suggesting that we should stop using Google; all I am saying is that in the next generation of search engines, the context, or rather the meaning, of a word will play a much more crucial role than just the word itself.

And apart from that, think about the videos, images and audio files that are there on the internet. With the onset of Web 3.0, we would be able not only to search these multimedia sources but also to search for the objects present within them.

These ideas are still in white papers and will take some time to actually get implemented on the internet, and thus the time is right for an inquisitive soul to research the Semantic Web.