Chapter 1
Introduction
Grid computing is an extension of parallel and distributed computing. It is an emerging environment for solving large-scale, complex problems. It enables the sharing, coordination and aggregation of computational machines to fulfill user demands. The computational grid is an innovative technology for succeeding generations. It is a collection of machines that are geographically distributed across different organizations, forming a heterogeneous high-performance computing environment.
Task scheduling and machine management are essential components of a computational grid. Because resources are so widely distributed, such systems are highly prone to errors and failures. Fault tolerance therefore plays a key role in the grid to mitigate this problem.
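One common fault-tolerance technique in grid scheduling is to reschedule a failed task on another machine. The sketch below is a minimal illustration of that idea; the function name, the machine identifiers and the use of a plain exception to signal failure are all assumptions for this example, not part of any real grid middleware.

```python
import random

def run_with_retries(task, machines, max_retries=3):
    """Retry a task on another machine when the current one fails.

    `task` is a callable taking a machine name; `machines` is a list of
    hypothetical machine identifiers. A RuntimeError stands in for a
    machine failure in this sketch.
    """
    for attempt in range(max_retries):
        machine = random.choice(machines)
        try:
            return task(machine)
        except RuntimeError:
            continue  # machine failed; reschedule on another one
    raise RuntimeError("task failed on all attempts")
```

Real grid schedulers combine this kind of retry with checkpointing and replication, but the retry loop captures the core idea: a single machine failure should not fail the whole job.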
• Data Grid: Data grids primarily deal with providing services and infrastructure for distributed data-intensive applications that need to access, transfer and modify massive datasets stored in distributed storage resources [4].
1.1.3 Basic grid model:
The basic grid model is generally composed of a number of hosts, each comprising several computational resources, which may be homogeneous or heterogeneous. The four basic building blocks of the grid model are the user, the resource broker, the grid information service (GIS) and, lastly, the resources.
Figure 1.2: Basic Grid Model [5]
When a user requires high-speed execution, the job is submitted to the broker in the grid.
1. Users: The user submits the jobs to be executed on processors in the computational grid.
2. Resource Broker: Users typically do not interact with Grid services directly. The resource broker discovers computing resources with the help of the information system and provides jobs with suitable resources for their computation. To find an appropriate resource for a job, the broker contacts the grid information service, which keeps the status of all resources currently available in the grid.
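The broker's matching step can be sketched as a simple lookup over the status table a grid information service would maintain. The dictionary layout, field names and capacity units below are assumptions for illustration only, not a real GIS interface.

```python
def pick_resource(job_mips, resources):
    """Choose the least-loaded resource that can satisfy the job's demand.

    `resources` stands in for a GIS status table of the form
    {name: {"mips": capacity, "load": queued work}}; both the fields
    and the MIPS-based demand model are hypothetical.
    """
    candidates = {n: r for n, r in resources.items() if r["mips"] >= job_mips}
    if not candidates:
        return None  # no resource in the grid can run this job
    # among capable resources, prefer the one with the lowest current load
    return min(candidates, key=lambda n: candidates[n]["load"])
```

A production broker would also weigh network distance, queue wait times and organizational policies, but the capable-and-least-loaded rule is a common baseline.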
Grid was originally conceived and designed in this community to allow access to computing resources that were geographically dispersed. The notion was that underutilized resources in places other than where the researchers were physically located could be used. Also fundamental in the formative thinking was the prospect of sharing access to data, typically in the form of files that were being jointly produced and used by collaborators in disparate locations.
Before discussing Grids further, let us go back to the birth of distributed computing:
In the early 1970s, when computers were first linked by networks, the idea of harnessing unused CPU cycles was born. A few early experiments with distributed computing, including a pair of programs called Creeper and Reaper, ran on the Internet's predecessor, the ARPANET.
The project will bring several changes to the company. First, it will expand the current physical IT environment, providing the ability to increase storage capacity to meet both current requirements and expected data growth, while establishing a new data warehouse along with business-analytics applications and user interfaces. The project will also improve security by establishing security policies, and it will leverage newer cloud-based technology to provide a highly redundant, flexible and scalable IT environment while also allowing a low-cost disaster recovery site to be established.
The internet works on the basis that some computers act as 'servers'. These computers offer services to other computers, known as 'clients', that access or request information. The term "server" may refer to both the hardware and software (the entire computer system) or just the software that performs the service. For example, "Web server" may refer to the Web server software on a computer that also runs other applications, or it may refer to a computer system dedicated only to the Web server application. A large Web site, for instance, could have several dedicated Web servers or one very large Web server.
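The client-server exchange described above can be demonstrated with a minimal sketch using Python's standard socket library. The echo protocol, port choice and message contents are invented for illustration; a real Web server would speak HTTP rather than this toy protocol.

```python
import socket
import threading

def serve_once(sock):
    """Accept one client, answer its request, and exit (a toy 'server')."""
    conn, _ = sock.accept()
    with conn:
        request = conn.recv(1024)          # the client's request
        conn.sendall(b"echo: " + request)  # the service the server offers

# the 'server': listen on an ephemeral local port
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# the 'client': connect and request the service
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
```

The same division of roles holds at every scale: the server waits and answers, the client initiates and consumes.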
... and so with each separate grid, the cost of every grid will be spread across a smaller number of subscribers, and therefore the cost per subscriber, and hence the price, will be higher (textbook, p. 445).
At its core, cloud computing is a specialized form of grid and distributed computing that varies in terms of infrastructure, deployment, service and geographic dispersion (Veeramachanenin, September 2015). The cloud enhances scalability, collaboration and availability; adapts to fluctuations in demand; accelerates development work; and provides options for cost reduction through efficient, optimized computing (BH Kawljeet, June 2015). Cloud computing (CC) has recently emerged as a new paradigm for the delivery and hosting of services over the internet. There are three main service delivery models. In Software as a Service (SaaS), the required software, operating system and network are provided; the customer accesses the hosted software instead of installing it on a local computer, typically through a web browser (e.g., web-enabled e-mail). The user only pays for the service, while the cloud service provider is responsible for managing and controlling the cloud infrastructure; companies providing such services include Google, Microsoft, Salesforce and Facebook. In Infrastructure as a Service (IaaS), the cloud provider supplies hardware resources such as networking and virtualization.
Cloud computing is computing in which large groups of remote servers are networked to allow centralized data storage and online access to computer services or resources. Clouds can be classified as public, private or hybrid.
Also, it requires fast and secure communication to monitor real-time connection status in order to act as an Energy Manager [SANCHEZ].
What we know today as the Internet began as a Defense Advanced Research Projects Agency (DARPA) project in 1969, which was designed to connect several research databases across the country. However, until the end of 1991, the advances were almost completely technical, as the goals set by those responsible in its growth were beyond what the hardware was capable of providing. In 1988, the Internet began to receive attention in the popular press, when the first documented computer virus was released at Cornell University. 1991 marked the beginning of the transition of the Internet as we know it today, with the National Science Foundation’s reinterpretation of its Acceptable Use Policy to allow for commercial traffic across its network, the development of the first graphic interfaces, the formation of the Internet Society, and the formation of ECHO (East Coast Hang Out), one of the first publicly available online communities.
The United States federal government funded new developments in computer science, which resulted in the creation of ARPANET, a project that connected computer systems at five universities with the intent that if one server were destroyed, the connection would remain thanks to the four other locations. This fundamental structure of the internet was developed as a peer-to-peer system, meaning there is no central control point in the network; the internet is therefore arranged like a web in which all pieces of information travel as equals. The interconnectivity of the internet led to the creation of the World Wide Web in the early 1990s, an internet program that developed the internet into a massive, interactive mass medium.
4. Increase speed at low cost: bring out new solutions and services quickly using cloud-based shared development operations.
Why NetBatch? At my workplace, our computing needs far exceed the number of machines we own, so it would be economically infeasible to buy enough machines to satisfy our peak consumption, which is growing constantly. NetBatch is a tool that allows our organization to maximize utilization of the available computing resources. This paper discusses NetBatch and NBS, a package around NetBatch that handles job management, which uses principles of queuing, job scheduling and sequencing to achieve its goals.
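The queuing and scheduling principles mentioned above can be sketched as a small FIFO batch queue with a fixed number of machine slots per scheduling cycle. The class and method names are invented for this sketch; they are not part of the real NetBatch or NBS interface.

```python
from collections import deque

class BatchQueue:
    """A toy FIFO batch queue in the spirit of NetBatch-style scheduling.

    `slots` models how many machines can run jobs in one scheduling
    cycle; jobs beyond that wait in submission order.
    """
    def __init__(self, slots):
        self.slots = slots        # machines available per cycle
        self.pending = deque()    # jobs waiting for a slot, FIFO
        self.finished = []        # results of completed jobs

    def submit(self, job):
        """Queue a zero-argument callable for later execution."""
        self.pending.append(job)

    def run_cycle(self):
        """Run at most `slots` pending jobs, oldest first."""
        ran = 0
        while self.pending and ran < self.slots:
            job = self.pending.popleft()
            self.finished.append(job())
            ran += 1
```

With two slots and three submitted jobs, the first cycle runs two jobs and the third waits for the next cycle, which is the essential batch-queue behavior: utilization stays high while demand beyond capacity is absorbed by the queue rather than by buying more machines.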
The author convincingly compared cloud computing with conventional data storage using a graph of the computing process and a chart of the advantages of each method. The author's detailed calculations also illustrate the efficiency of cloud computing. The article helps readers understand why cloud computing is better than the conventional way of computing.
The SDI should use open data with collaboration between government and private sector. The data supports interoperability by adhering to the standards. Some of the databases will be deployed on the cloud infrastructure using Infrastructure as a Service (IaaS), Data Storage as a Service (dSaaS), etc.
It simplifies the storage and processing of large amounts of data, eases the deployment and operation of large-scale global products and services, and automates much of the administration of large-scale clusters of computers.
A cloud service is any resource that is provided over the internet. Service delivery in cloud computing comprises three different service models.