Network Computing
In the 1990s the Internet spread to every part of the world and became an effective tool for communicating information and for collaborative work. More importantly, thousands of computing resources, data resources, software resources, and all kinds of digital devices and control systems gathered on the Internet, together forming an important carrier for the production, dissemination, and application of knowledge. People began to think about how to make these physically interconnected resources jointly provide services, and to re-examine the essence of network computing technology.
At present network computing is still developing, but a broadly accepted consensus has formed: "network computing" connects various autonomous resources and systems to realize resource sharing, collaborative work, and joint computation, providing users with a variety of network-based integrated services.
Network computing takes four main forms:
1) Enterprise computing
2) Grid computing
3) Peer-to-peer computing
4) Pervasive computing
Enterprise computing: middleware at the core
What is middleware?
Middleware is software that sits between the operating system and applications. In practice, middleware products are usually integrated into a platform (covering both development and runtime), and such a platform always contains a communication middleware component, so the following definition is widely accepted:
Middleware = Platform + Communication
This definition restricts middleware to distributed systems, and it also distinguishes middleware from support software and utilities. An everyday analogy makes the point clear: regard a distributed system as the transportation system of Shanghai, the network as the city's roads, and communication as the vehicles travelling on them. With tens of thousands of vehicles on the road, if there were no corresponding traffic facilities and management rules, Shanghai would descend into chaos and accidents would happen everywhere.
Enterprise computing is the form of network computing that grew out of realizing information sharing and collaborative work within and between large organizations; its core technologies are the client/server computing model and the related middleware technology.
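The client/server model can be illustrated with a minimal sketch, here written in Python using the standard socket module; the address, port, and message are illustrative assumptions, not part of any product described in this article.

# Minimal client/server sketch (illustrative only): a toy server that
# uppercases whatever a client sends it over one TCP connection.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9090  # hypothetical address and port for the demo


def run_server():
    """Server side: accept one connection, answer it, and exit."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(data.upper())


def run_client():
    """Client side: send a request and print the server's reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"order #42: 3 widgets")
        print(cli.recv(1024).decode())  # -> ORDER #42: 3 WIDGETS


if __name__ == "__main__":
    threading.Thread(target=run_server, daemon=True).start()
    time.sleep(0.2)  # give the server a moment to start listening
    run_client()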
As early as the 1980s, people proposed building a new type of distributed operating system on top of interconnected computer hardware, managing the whole system uniformly and presenting the user with a single system image. Although this effort produced many technical results and experimental systems, no practical products emerged, and people came to feel that centrally managing the resources of constantly expanding, locally autonomous, heterogeneous systems is almost impossible. They therefore turned to middleware platform technology, which shields the heterogeneity of the underlying systems and supports information exchange and cooperation among locally autonomous systems. After more than ten years of development, middleware has made remarkable progress, and there are now middleware products for remote database access, remote procedure call, message delivery, transaction management, and other purposes.
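As a concrete illustration of one of these categories, remote procedure call middleware, the sketch below uses Python's built-in xmlrpc package. It only stands in for the middleware products the text describes; the host, port, and the get_stock function are invented for the example.

# RPC middleware sketch: xmlrpc hides sockets and wire formats behind what
# looks like an ordinary function call. All names here are illustrative.
import threading
import time
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer


def get_stock(item):
    """Pretend to look up inventory in a back-end system."""
    return {"widget": 12, "gadget": 0}.get(item, -1)


def serve_once():
    with SimpleXMLRPCServer(("127.0.0.1", 8000), logRequests=False) as server:
        server.register_function(get_stock, "get_stock")
        server.handle_request()  # answer a single request, then exit


if __name__ == "__main__":
    threading.Thread(target=serve_once, daemon=True).start()
    time.sleep(0.2)  # give the server a moment to bind
    proxy = ServerProxy("http://127.0.0.1:8000/")
    print(proxy.get_stock("widget"))  # the remote call reads like a local one -> 12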
At the end of the 1990s, object-oriented middleware became the mainstream middleware platform technology, with three technical branches represented by Sun's EJB/J2EE, Microsoft's COM/DNA, and OMG's CORBA. The research focus was to establish a standardized object request broker that shields the heterogeneity and complexity of computing platforms, operating systems, programming languages, and network protocols in a networked environment, so that application systems distributed across the network can work together, and on that basis to provide general advanced network management services and value-added services tied to particular application domains.
Entering the new century, driven by the needs of e-commerce, enterprise computing turned to information sharing and collaborative work between enterprises, and Web-oriented enterprise computing solutions became a hot topic. For this purpose the W3C (World Wide Web Consortium) proposed the Web Services technical framework, Microsoft launched .NET, and Sun launched the Sun ONE architecture; enterprise computing thus entered the Internet era.
Grid computing: making computing power a "public utility"
Grid computing is another branch of network computing, one with important innovative ideas and huge development potential. Initially, the goal of grid computing research was to connect supercomputers into a "metacomputer"; this goal has since broadened into building a general-purpose support infrastructure for large-scale computing and data processing. Such an infrastructure integrates high-performance computers, servers, PCs, information systems, massive data storage and processing systems, application simulation systems, virtual reality systems, and instruments and data-acquisition devices (such as sensors), provides underlying technical support for all kinds of application development, and turns the Internet into a powerful, ubiquitous computing facility.
Grid computing can be understood from three aspects.
(A) First, conceptually, the goal of grid computing is resource sharing and distributed collaboration. This concept can guide an industry or enterprise to plan, deploy, integrate, and share the resources of its various departments in a unified way, instead of each department building, owning, and using resources on its own. Communicating and agreeing on this idea is critical for industries and enterprises, because it will add to or change the planning, deployment, operation, and management mechanisms of the entire industry's or enterprise's information systems.
(B) Second, the grid is a technology. To share and coordinate many types of distributed resources, grid computing must solve resource sharing and cooperation problems at multiple levels and formulate grid standards, upgrading the Internet from a platform for communication and information exchange to a platform for resource sharing.
(C) Finally, the grid is an infrastructure that uses networks of all kinds to integrate resources such as computers, data, equipment, and services. As grid technology matures, large resource nodes distributed across a region or the world can be built and integrated over the network to jointly provide comprehensive information services to the whole of society. Once such a facility exists, users will be able to obtain the various services the grid provides as easily as we use the transportation system today.
As with the transportation system, planning, building, and operating grid facilities is enormously complex and difficult. It involves changes in ideas and concepts, technical hurdles, and questions of national law and policy, and will require many years of hard work. However, large companies, industries, defense departments, and similar organizations can begin implementing grid infrastructure strategies right now.
The strategic significance of grid computing and its broad application prospects have made it a research hotspot that attracts many researchers and large amounts of funding, and a number of large grid computing research projects have been launched one after another. The best-known grid computing research projects to date include the following:
● The "distributed grid" research project implemented by the US Natural Science Foundation began in the end of 1997, its goal is to establish all over the United States and the national computing grid, support significant science and engineering calculations, provide users with virtual high performance on the desktop. Calculate the environment.
● IPG (INFORMATION POWER GE) project of NASA (NASA). This is a 20-year research program that allows people to use computing resources and information resources as convenient as transportation resources provided by the transportation network.
● The ASCI grid developed by the US Department of Energy has been put into productive use, and its main use is nuclear weapon research.
● The US Department of Defense's Global Information Grid (GIG) project is the largest grid plan for the US military new century combat support, which is expected to be completed in 2020. ● EURO grid and data grid of the European Community. Mainly used in applications including high energy physics, biological computing, climate simulation and other fields.
● In August 2001 the US NSF announced a major scientific research project to develop a grid system named the Distributed Terascale Facility, or TeraGrid. It is the world's first wide-area supercomputing platform designed from the outset as a grid, and also the first computing infrastructure intended to make computing ubiquitous.
● China's Ministry of Science and Technology carried out the construction of the national high-performance computing environment (grid) and research on its key technologies during the Ninth Five-Year Plan. During the Tenth Five-Year Plan it has stepped up its investment in grid technology research and promotion. The goals are to break through key grid technologies, establish technical standards for grid computing, apply grid computing technology to industry and enterprise application grids, further strengthen the construction of the national high-performance grid computing environment serving the whole of society, and promote the formation and development of China's grid industry.
At present, large grid research and implementation projects share a notable feature: each is application-oriented and closely tied to its application field. Companies such as IBM, HP, Sun, LSF, and Boeing have entered the field of grid computing and are stepping up research on related technologies and products.
The relationship between "grid computing" and "high-performance computers" deserves emphasis. High-performance computers are nodes and important components of a grid computing environment; grid computing technology is one direction in which high-performance computing technology is developing, and it cannot replace ultra-high-performance computer systems. Future ultra-high-performance computer systems, however, must support grid computing environments: they should be easy to integrate into a grid and should provide powerful computing, data storage, and processing capabilities to large numbers of users. The purpose of grid computing technology is to combine high-performance computing with network computing so as to release the power of high-performance computers, constructing a public infrastructure for high-performance processing and massive information storage whose resources all kinds of users and applications can share. Grid computing will therefore promote the development of high-performance computer applications and of the market for high-performance computing services, and stimulate demand for high-performance computers and mass storage systems.
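To make the idea of sharing resources across organizations concrete, here is a toy, purely hypothetical sketch of grid-style resource brokering in Python. It is not based on any real grid toolkit or project named above; the sites, jobs, and numbers are invented.

# Hypothetical grid-style brokering: match a job to any site in a shared
# resource pool that can satisfy it. Illustrative only.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Resource:
    site: str        # owning organization
    cpus_free: int   # currently idle CPUs
    gb_free: int     # currently free storage (GB)


@dataclass
class Job:
    name: str
    cpus_needed: int
    gb_needed: int


def broker(job: Job, pool: List[Resource]) -> Optional[Resource]:
    """Pick the first site, from any organization, that can run the job."""
    for res in pool:
        if res.cpus_free >= job.cpus_needed and res.gb_free >= job.gb_needed:
            res.cpus_free -= job.cpus_needed
            res.gb_free -= job.gb_needed
            return res
    return None


if __name__ == "__main__":
    pool = [Resource("university-a", 64, 500), Resource("lab-b", 1024, 8000)]
    job = Job("climate-run", cpus_needed=256, gb_needed=2000)
    chosen = broker(job, pool)
    print(chosen.site if chosen else "no site can run the job")  # -> lab-b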
Peer-to-peer computing: advocating "equal" sharing
Peer-to-peer computing, or P2P, is a new mode of network computing on the Internet. In this mode the boundary between server and client disappears: every node on the network can share its computing resources with other nodes as an "equal".
IBM defines P2P as follows:
A P2P system consists of a number of interconnected computers and has at least one of the following characteristics: the system operates through the active collaboration of devices at the edge rather than a central server, with each member obtaining services directly from other members instead of from a server; members of the system play the roles of server and client at the same time; and the users of the system are aware of one another and form a virtual or actual community.
It is not hard to see that P2P pushes the network computing model from centralized and distributed computing out toward the edge of the Internet, embodying the Internet's spirit of sharing. Some believe that hundreds of applications could eventually be built on it. Judging from current applications, however, the power of P2P lies mainly in its advantages for large-scale sharing and search:
1) Peer computing
2) Collaborative work
3) Search engines
4) File exchange
In other words, P2P diffuses the core of network applications from central servers to the terminal devices at the network's edge. Server to server, server to PC, PC to PC, PC to WAP phone: any two network nodes can establish a P2P session.
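A minimal sketch of that symmetry, assuming two toy nodes running on one machine: each node both listens for peers (server role) and calls out to a peer (client role). The ports and message format are invented for the example.

# P2P sketch: the same code plays server and client, so any node can open
# a session with any other. Ports and messages are illustrative only.
import socket
import threading
import time


def listen(port):
    """Server role: accept one connection from a peer and answer it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen()
        conn, _ = srv.accept()
        with conn:
            msg = conn.recv(1024).decode()
            conn.sendall(f"pong from node {port} (got: {msg})".encode())


def call(port, msg):
    """Client role: open a session to another peer and exchange one message."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(msg.encode())
        return cli.recv(1024).decode()


if __name__ == "__main__":
    # Two "nodes" in one process for the demo; on a real network each would
    # run on its own machine and keep its own list of known peers.
    threading.Thread(target=listen, args=(9001,), daemon=True).start()
    threading.Thread(target=listen, args=(9002,), daemon=True).start()
    time.sleep(0.2)
    print(call(9002, "ping from node 9001"))
    print(call(9001, "ping from node 9002"))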
Pervasive computing: computing everywhere
Ubiquitous computing, or pervasive computing, emphasizes the close relationship between people and their computing environment, integrating computers and networks more thoroughly into daily life so that people can obtain all kinds of network computing services easily and quickly at any time. Research on pervasive computing mainly covers two aspects: natural human-computer interaction and network computing. Without exception, the top ten universities in the United States have all set up research programs with pervasive computing as a main direction. Four representative programs are listed below; their common goal is to propose new architectures, application models, and programming models for pervasive computing.
● MIT's Oxygen research program
Researchers in this program believe that the future world will be an environment full of embedded computers that are thoroughly woven into people's daily lives. Oxygen aims to make full use of these computing resources so that people can "do more by doing less".
● CMU's (Carnegie Mellon University) Aura research program
Aura studies a software layer (called Aura) between the user and the computing environment. Acting as the user's proxy, Aura maintains the frequently changing, loosely coupled collection of computing devices in a distributed environment in order to complete the user's tasks. The idea behind Aura is that the user's attention is the most valuable resource and should be concentrated on the task at hand, not on managing and configuring hardware and software.
● UC Berkeley's Endeavour research program
This program plans to use information technology to build a new, global information infrastructure through which people interact with information, devices, and other people. The infrastructure should be able to coordinate, in real time, any available resources in the world to meet users' needs. One of its innovations is "fluid software": software that adaptively chooses where to execute and where to store its data, acquiring the resources it needs through negotiation and in turn serving other entities.
● The University of Washington's Portolano program
This program proposes a "data-centric network" to meet the requirement of making computing itself invisible. It argues that the development of computer technology is still technology-driven rather than driven by user demand. To change this, the program works on user-interface mechanisms that adapt to changes in the user's location, on data-centric networking, and on new distributed service models.
Although the four kinds of network computing described above differ, their ultimate goals are the same:
1) Wide sharing:
Wide sharing means using a variety of methods, technologies, and strategies to share and use the various resources on the network.
2) Effective aggregation:
Effective aggregation means connecting the enormous resources on the network through collaborative work so that, once integrated, they produce a huge combined effect and jointly complete application tasks.
3) Full release:
Full release means providing users with good development methods and usage environments, delivering the aggregated power of the network's many resources to them, and offering personalized information services, computing services, and decision-support services.
Faced with so many network computing technologies and applications, people sometimes find it hard to tell them apart technically and do not know which will dominate future network computing. In fact, although their ultimate goals are the same, the various technologies differ in application scope, in the scale of the objects they address, and in the level at which they operate.
Object-oriented distributed computing emphasizes the integration of distributed systems. It takes the two-tier or multi-tier client/server model as its main computing model, seeks to simplify the client's work and strengthen the functions of the multi-tier servers, pays attention to collaborative work among distributed systems and to rapid application development and deployment, emphasizes interactivity, interoperability, and code portability among application services, and usually focuses on resource sharing within a single organization. P2P technology weakens the role of the centralized server and stresses the role of every individual node, emphasizing direct communication and contact among individuals, systems, and computers; every participant is both a client and a server. This carries sharing on the Internet to a much broader level and lets people take part in the network in a more active way, and it is essentially different from the client/server model used by today's middleware-based distributed computing.
Grid computing emphasizes integrating basic network resources such as computing, data, and equipment across the Internet, striving to make the Internet a computing infrastructure for the whole of society. In its computing model, technical route, and research goals, grid computing differs from today's middleware-based distributed computing, although the two remain interconnected, interoperable, and co-evolving. Grid computing emphasizes large-scale resource sharing and cooperation among multiple institutions and provides basic methods for such sharing, whereas distributed computing technology offers no general framework for resource sharing across institutions. Clearly, grid computing is building a new basic support structure for the Internet (much as TCP/IP, the WWW protocols, and the corresponding software systems laid the foundation of today's Internet); it is a pioneering practice for the large-scale information processing infrastructure of the 21st century.
The pervasive computing model aims to overturn the traditional pattern of "people using computers" and to change the relationship between people and computers into "computers serving people". In a sense, it seeks a better integration of people with their computing environment.
Although the various network computing technologies differ, they do not conflict with one another; they are orthogonal and sometimes even merge, so they can coexist. For example, grid computing can be combined with CORBA (Common Object Request Broker Architecture), SOAP (Simple Object Access Protocol), and XML (eXtensible Markup Language) to access the resources of a virtual organization spanning multiple institutions.
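As a small illustration of the XML side of that combination, the sketch below builds and parses a made-up resource description with Python's standard xml.etree.ElementTree; the element and attribute names are assumptions, not any published grid or SOAP schema.

# Build a made-up XML description of one shared resource in a virtual
# organization, then parse it back. Names are illustrative, not a standard.
import xml.etree.ElementTree as ET

resource = ET.Element("resource", attrib={"org": "university-a"})
ET.SubElement(resource, "type").text = "compute-cluster"
ET.SubElement(resource, "cpus").text = "256"
ET.SubElement(resource, "policy").text = "members-of-virtual-org-only"

xml_text = ET.tostring(resource, encoding="unicode")
print(xml_text)  # a document like this could travel between institutions

parsed = ET.fromstring(xml_text)
print(parsed.find("cpus").text)  # -> 256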
The changeability of information technology makes it impossible to say for certain how network computing will develop over the next ten years, but it is certain that multiple forms of network computing will coexist, complement one another, and merge. In any case, today's Internet-based network computing practice and research show that sharing network resources, providing large-scale coordinated computing capability, and enabling effective access to resources are the trends in the future development of network computing and the technical foundation of the next-generation Internet.