but it did open up use of the Internet to many more people in universities in particular. Other departments besides the libraries, computer, physics, and engineering departments found ways to make good use of the nets--to communicate with colleagues around the world and to share files and resources.

While the number of sites on the Internet was small, it was fairly easy to keep track of the resources of interest that were available. But as more and more universities and organizations--and their libraries--connected, the Internet became harder and harder to track. There was more and more need for tools to index the resources that were available.

In 1991, the first really friendly interface to the Internet was developed at the University of Minnesota. The University wanted to develop a simple menu system to access files and information on campus through their local network. A debate followed between mainframe adherents and those who believed in smaller systems with client-server architecture. The mainframe adherents "won" the debate initially, but since the client-server advocates said they could put up a prototype very quickly, they were given the go-ahead to do a demonstration system.

The demonstration system was called a gopher after the U of Minnesota mascot--the golden gopher. The gopher proved to be very prolific, and within a few years there were over 10,000 gophers around the world. It took no knowledge of UNIX or computer architecture to use. In a gopher system, you type or click on a number to select the menu item you want.
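To make the menu mechanics concrete, here is a minimal sketch (not part of the original article) of how a client of that era fetched a Gopher menu: open a TCP connection to port 70, send a selector string, and read back tab-separated menu lines ended by a lone period. The host name is a placeholder, not a real server named in the text.

    import socket

    def fetch_gopher_menu(host, selector="", port=70):
        """Fetch one Gopher menu: send a selector, read tab-separated item lines."""
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            data = b""
            while chunk := sock.recv(4096):
                data += chunk
        items = []
        for line in data.decode("latin-1").splitlines():
            if line == ".":        # a lone period ends the listing
                break
            # Each item: type character + display string, then selector, host, port.
            item_type, rest = line[:1], line[1:]
            fields = rest.split("\t")
            if len(fields) >= 4:
                items.append((item_type, fields[0], fields[1], fields[2], fields[3]))
        return items

    # Print a numbered menu, the way an early-1990s gopher client presented it.
    # "gopher.example.org" is a placeholder host, used only for illustration.
    if __name__ == "__main__":
        for n, (itype, title, sel, host, port) in enumerate(
                fetch_gopher_menu("gopher.example.org"), 1):
            print(f"{n:3d}. [{itype}] {title}")

Selecting an item by number simply repeats the same request with that item's selector, host, and port, which is also all an index crawler needs to do to walk menus from server to server.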
Gopher's usability was enhanced much more when the University of Nevada at Reno developed the VERONICA searchable index of gopher menus. It was purported to be an acronym for Very Easy Rodent-Oriented Netwide Index to Computerized Archives. A spider crawled gopher menus around the world, collecting links and retrieving them for the index. It was so popular that it was very hard to connect to, even though a number of other VERONICA sites were developed to ease the load. Similar indexing software was developed for single sites, called JUGHEAD (Jonzy's Universal Gopher Hierarchy Excavation And Display).

In 1989 another significant event took place in making the nets easier to use. Tim Berners-Lee and others at the European Laboratory for Particle Physics, more popularly known as CERN, proposed a new protocol for information distribution. This protocol, which became the World Wide Web in 1991, was based on hypertext--a system of embedding links in text to link to other text, which you have been using every time you selected a text link while reading these pages. Although started before gopher, it was slower to develop.
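As a small illustration of hypertext (again an added sketch, not something from the original article), the snippet below fetches a page over HTTP and lists the links embedded in it; the URL is a placeholder. Following such embedded links from page to page is, at bottom, what both a browser and a web crawler do.

    from html.parser import HTMLParser
    from urllib.request import urlopen
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collect the targets of <a href="..."> anchors, i.e. the embedded links."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    # "http://example.com/" is a placeholder URL, not a site named in the article.
    base = "http://example.com/"
    page = urlopen(base, timeout=10).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    for href in collector.links:
        print(urljoin(base, href))   # resolve relative links against the page's URL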
The development in 1993 of the graphical browser Mosaic by Marc Andreessen and his team at the National Center for Supercomputing Applications (NCSA) gave the protocol its big boost. Later, Andreessen moved to become the brains behind Netscape Corp., which produced the most successful graphical browser and server until Microsoft declared war and developed its Microsoft Internet Explorer.

Since the Internet was initially funded by the government, it was originally limited to research, education, and government uses. Commercial uses were prohibited unless they directly served the goals of research and education. This policy continued until the early 1990s, when independent commercial networks began to grow. It then became possible to route traffic across the country from one commercial site to another without passing through the government-funded NSFNet Internet backbone.

Microsoft's full-scale entry into the browser, server, and Internet Service Provider market completed the major shift over to a commercially based Internet. The release of Windows 98 in June 1998, with the Microsoft browser well integrated into the desktop, shows Bill Gates' determination to capitalize on the enormous growth of the Internet. Microsoft's success over the past few years has brought court challenges to their dominance.

A current trend with major implications for the future is the growth of high-speed connections. 56K modems and the providers who support them are spreading widely, but this is just a small step compared to what will follow. 56K is not fast enough to carry multimedia, such as sound and video, except in low quality. But new technologies many times faster, such as cable modems, digital subscriber lines (DSL), and satellite broadcast, are available in limited locations now and will become widely available in the next few years. These technologies present problems, not just in the user's connection, but in maintaining high-speed data flow reliably from source to the user. Those problems are being worked on, too.

During this period of enormous growth, businesses entering the Internet arena scrambled to find economic models that work. Free services supported by advertising shifted some of the direct