
Cloud computing notes unit I as per RGPV syllabus


Truba College of Science & Technology, Bhopal Cloud Computing Unit I

CLOUD COMPUTING

Definition

Cloud computing is defined as a type of computing that relies on sharing computing resources rather than having local servers or personal devices to handle applications.

In cloud computing, the word cloud (also phrased as "the cloud") is used as a metaphor for "the Internet," so the phrase cloud computing means "a type of Internet-based computing," where different services — such as servers, storage and applications —are delivered to an organization's computers and devices through the Internet.

Examples of cloud computing

Facebook, LinkedIn, MySpace, Twitter; e-mail services such as Hotmail (Windows Live Mail); Google Docs; Zoho Office; Yahoo!'s Flickr; and Google's Picasa.

Goal of cloud computing

The goal of cloud computing is to apply traditional supercomputing, or high-performance computing power (normally used by military and research facilities to perform tens of trillions of computations per second), to consumer-oriented applications: managing financial portfolios, delivering personalized information, providing data storage, or powering large, immersive online computer games.

To do this, cloud computing uses networks of large groups of servers, typically running low-cost consumer PC technology, with specialized connections to spread data-processing chores across them. This shared IT infrastructure contains large pools of systems that are linked together. In the cloud, virtualization techniques are used to maximize the power of cloud computing.
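To illustrate the idea of spreading chores across many machines, here is a toy Java sketch in which a fixed thread pool stands in for a pool of cloud servers; the task list and the computation are invented purely for illustration.

import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Toy illustration only: a fixed thread pool stands in for a pool of
// cloud servers, and each "chore" is a unit of data processing that is
// spread across the pool and gathered back.
public class ChoreSpreader {

    static int expensiveComputation(int input) {
        return input * input; // placeholder for a real data-processing chore
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Callable<Integer>> chores = List.of(
                () -> expensiveComputation(1),
                () -> expensiveComputation(2),
                () -> expensiveComputation(3));
        List<Future<Integer>> results = pool.invokeAll(chores); // fan out
        for (Future<Integer> r : results) {
            System.out.println(r.get()); // gather results
        }
        pool.shutdown();
    }
}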

Advantages of cloud computing

1. Worldwide Access. Cloud computing increases mobility, as you can access your documents from any device in any part of the world. For businesses, this means that employees can work from home or on business trips, without having to carry around documents. This increases productivity and allows faster exchange of information. Employees can also work on the same document without having to be in the same place.

2. More Storage. In the past, storage was limited by the particular device in question. If you ran out of space, you would need a USB drive to back up your current device. Cloud computing provides increased storage, so you won’t have to worry about running out of space on your hard drive.

3. Easy Set-Up. You can set up a cloud computing service in a matter of minutes. Adjusting your individual settings, such as choosing a password or selecting which devices you want to connect to the network, is similarly simple. After that, you can immediately start using the resources, software, or information in question.


4. Automatic Updates. The cloud computing provider is responsible for making sure that updates are available – you just have to download them. This saves you time, and furthermore, you don’t need to be an expert to update your device; the cloud computing provider will automatically notify you and provide you with instructions.

5. Reduced Cost. Cloud computing is often inexpensive. The software is already installed online, so you won’t need to install it yourself. There are numerous cloud computing applications available for free, such as Dropbox, and increasing storage size and memory is affordable. If you need to pay for a cloud computing service, it is paid for incrementally on a monthly or yearly basis. By choosing a plan that has no contract, you can terminate your use of the services at any time; therefore, you only pay for the services when you need them.

Disadvantages of cloud computing

1. Security. When using a cloud computing service, you are essentially handing over your data to a third party. Because the provider, as well as users from all over the world, may access the same servers, security can become an issue. Companies handling confidential information might be particularly concerned about using cloud computing, as data could possibly be harmed by viruses and other malware. That said, some services, such as Google Cloud Connect, come with customizable spam filtering, email encryption, and SSL enforcement for secure HTTPS access, among other security measures.

2. Privacy. Cloud computing comes with the risk that unauthorized users might access your information. To protect against this happening, cloud computing services offer password protection and operate on secure servers with data encryption technology.

3. Loss of Control. Cloud computing providers set the terms under which users operate. This includes not only how much you have to pay to use the service, but also what information you can store, where you can access it from, and many other factors. You depend on the provider for updates and backups; if for some reason their server ceases to operate, you run the risk of losing all your information.

4. Internet Reliance. While Internet access is increasingly widespread, it is not available everywhere just yet. If the area that you are in doesn’t have Internet access, you won’t be able to open any of the documents you have stored in the cloud.

Historical Development

It was a gradual evolution that started in the 1950s with mainframe computing.

Multiple users were capable of accessing a central computer through dumb terminals, whose only function was to provide access to the mainframe. Because of the cost of buying and maintaining mainframe computers, it was not practical for an organization to buy and maintain one for every employee, nor did the typical user need the large (at the time) storage capacity and processing power that a mainframe provided. Providing shared access to a single resource was the solution that made economic sense for this sophisticated piece of technology.

After some time, around 1970, the concept of virtual machines (VMs) was created.

Using virtualization software like VMware, it became possible to execute one or more operating systems simultaneously in isolated environments: complete (virtual) computers could be executed inside one physical machine, which in turn could run a completely different operating system.

The VM operating system took the 1950s’ shared access mainframe to the next level, permitting multiple distinct computing environments to reside on one physical environment. Virtualization came to drive the technology, and was an important catalyst in the communication and information evolution.

In the 1990s, telecommunications companies started offering virtualized private network connections.

Historically, telecommunications companies only offered single dedicated point–to-point data connections. The newly offered virtualized private network connections had the same service quality as their dedicated services at a reduced cost. Instead of building out physical infrastructure to allow for more users to have their own connections, telecommunications companies were now able to provide users with shared access to the same physical infrastructure.

The following list briefly explains the evolution of cloud computing:

Grid computing: Solving large problems with parallel computing.

Utility computing: Offering computing resources as a metered service (see the billing sketch after this list).

SaaS: Network-based subscriptions to applications.

Cloud computing: Anytime, anywhere access to IT resources delivered dynamically as a service.
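To make "metered service" concrete, here is a minimal Java sketch that bills usage the way a utility bills electricity. The resource names and rates are invented for illustration and are not any provider's actual price list.

// Minimal utility-computing billing sketch; rates are hypothetical.
public class MeteredBilling {

    static final double RATE_PER_INSTANCE_HOUR = 0.10; // assumed $/hour
    static final double RATE_PER_GB_MONTH = 0.05;      // assumed $/GB-month

    // Pay only for what the meter recorded, like an electricity bill.
    static double monthlyBill(double instanceHours, double gbMonths) {
        return instanceHours * RATE_PER_INSTANCE_HOUR
                + gbMonths * RATE_PER_GB_MONTH;
    }

    public static void main(String[] args) {
        // Three servers for 200 hours each, plus 50 GB stored for the month.
        System.out.printf("Bill: $%.2f%n", monthlyBill(3 * 200, 50.0));
    }
}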

About the present

SoftLayer is one of the largest global providers of cloud computing infrastructure.

IBM already has platforms in its portfolio that include private, public and hybrid cloud solutions. The purchase of SoftLayer guarantees an even more comprehensive infrastructure as a service (IaaS) solution. While many companies look to maintain some applications in data centers, many others are moving to public clouds.

Even now, the purchase of bare metal can be modeled in a commercial cloud (for example, billing by usage or, put another way, physical server billing by the hour). The result is that a bare metal server request with all the resources needed, and nothing more, can be delivered within a matter of hours.


The story is not finished here; the evolution of cloud computing has only begun. What do you think the future holds for cloud computing?

Vision of cloud computing

A cloud is simply a centralised technology platform which provides specific IT services to a selected range of users, offering the ability to log in from anywhere, ideally from any device and over any connection, including the Internet.

A true cloud computing service is one which removes the traditional barriers which exist between software applications, data and devices. In other words, it is the nirvana of computing from a user’s perspective: no need to worry about location, device, or type of connection; all the data and the software applications required by the user are fully available, and the experience remains consistent. The highest standards of data protection must be a given, whereby users do not have to think about protecting the integrity of the data they use and store.

Such a provider offers a broad spectrum of application delivery services to its clients, ranging from the design, implementation and management of private clouds through to the provision of hosted cloud solutions delivered via its own cloud infrastructure.

Characteristics of Cloud computing as per NIST

The NIST Definition of Cloud Computing

National Institute of Standards and Technology, Information Technology Laboratory

Note 1: Cloud computing is still an evolving paradigm. Its definitions, use cases, underlying technologies, issues, risks, and benefits will be refined in a spirited debate by the public and private sectors. These definitions, attributes, and characteristics will evolve and change over time.

Note 2: The cloud computing industry represents a large ecosystem of many models, vendors, and market niches. This definition attempts to encompass all of the various cloud approaches.

Definition of Cloud Computing:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models.

Essential Characteristics:

On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service’s provider.


Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).

Resource pooling. The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources include storage, processing, memory, network bandwidth, and virtual machines.

Rapid elasticity. Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out, and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.

Measured Service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.

Service Models:

Cloud Software as a Service (SaaS). The capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure. The applications are accessible from various client devices through a thin client interface such as a web browser (e.g., web-based email). The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.

Cloud Platform as a Service (PaaS). The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations.

Cloud Infrastructure as a Service (IaaS). The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).

Deployment Models:

Private cloud. The cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on premise or off premise.

Community cloud. The cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on premise or off premise.

Public cloud. The cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.

Hybrid cloud. The cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load-balancing between clouds).

Cloud computing reference model

Reference models share the following characteristics:

They represent a problem domain

They are often defined for problem domains that are not well understood, or understood in a variety of different ways by different people, or that are sufficiently complex that understanding them requires the problem domain for which they’re created to be decomposed into lower-level entities that promote common understanding

They often consist of a diagram of entities, the relationships between the entities, and descriptive text that clearly defines each entity and relationship in the diagram

They are typically vendor/product-agnostic and standards-agnostic to allow for various implementations that are based on them

They provide common terminology in the problem domain for which they’re created

They can serve as a foundation for designing and implementing solutions in the same problem domain for which they were created

The problem domain for the Cloud Services Foundation Reference Model (CSFRM) is cloud services foundation. Although the term is defined extensively in the Overview article of this article set, the short definition is:

The minimum amount of vendor-agnostic hardware and software technical capabilities and operational processes necessary to provide information technology (IT) services that exhibit cloud characteristics, or simply, cloud services.

It’s important to note that although the problem domain is the foundation for providing cloud services, it does not include cloud services.

Usage of reference model

In addition to the attributes of reference models already listed, the CSFRM serves as a framework that can be used to help cloud services providers answer the following questions:

What kinds of service level requirements should I define before I either design or implement a new cloud service or technical capabilities that support or enable cloud services?

What kinds of operational processes do I require to operate a cloud service over its lifetime?

What technical capabilities do I require to host, support, or manage cloud services?

How will the services I provide be offered and presented to my consumers? 

Cloud Services Foundation Reference Model


The model includes three types of entities:

1. Subdomains: The large blue and green boxes, some of which contain components

2. Components: The small boxes inside many of the subdomains

3. Relationships: The arrows between subdomains

Subdomains exist in the CSFRM to:

Divide the cloud services foundation problem domain so that each subdomain can be defined separately.

Enable a collection of components to be referred to collectively. For example, the components in the Infrastructure subdomain are Infrastructure components.


Enable a relationship entity to represent the relationship between all of the components in a subdomain and the components of other subdomains. As a result, the relationships between subdomains also collectively apply to the components that are contained in each subdomain. The relationships are represented by arrows in the model. The verbs by the arrows describe the relationship between the components in the subdomain that the arrow points from and the components in the subdomain that the arrow points to. Therefore, you could say that the Service Delivery subdomain components define the Service Operations subdomain components.

Cloud and dynamic infrastructure

A dynamic infrastructure is designed for today’s instrumented and interconnected world, helping clients integrate their growing intelligent business infrastructure with the necessary underlying design of a flexible, secure and seamlessly managed IT infrastructure.

To leverage the advantages of a dynamic infrastructure—designed to be service-oriented and focused on supporting and enabling end users in a highly responsive way—businesses need to investigate their needs and create a plan of action.

As an IBM Business Partner, we can offer in-depth briefings, collaborative workshops and assessments, and testing centers, as well as many services, to help you integrate both the business and IT infrastructures while taking a smarter, more streamlined approach to helping improve service, reduce cost, and manage risk.

A dynamic infrastructure aligns business and IT assets to support the overall goals of the business while taking a smarter, new and more streamlined approach that:

Integrates visibility, control, and automation across all business and IT assets.

Is highly optimized to do more with less.

Addresses the information challenge.

Manages and mitigates risks.

Utilizes flexible delivery choices like clouds.

Overview of cloud applications


ECG Cloud is an award-winning, cloud-based, remote 12-lead resting ECG reporting SaaS (software as a service) application developed by Technomed Limited.

ECG Cloud is operated by Technomed's own in-house telemedicine service, the Technomed Monitoring Centre. ECG Cloud is also available for licence to other third-party cardiology service providers.

The Technomed Monitoring Centre, using ECG Cloud, offers GP practices, medical centres, and hospitals access to immediate, expert, clinician interpretation of ECGs at the point of care. This has the potential to save the NHS money by reducing the need for outpatient referrals. It also improves patient care by providing support for clinician patient management together with reduced waiting times for diagnostic tests.

By directly engaging specialist cardiology expertise at an early stage, a secondary care referral only occurs if the diagnostic result indicates that secondary care attention is immediately required or that all diagnostic or treatment options have been exhausted in primary care. This strategy has significant economic and patient healthcare benefits. As the team of experts operates remotely, the service is scalable and flexible, easily accommodating customers' requirements 365 days a year, 24 hours a day.

ECG acquisition & interpretation issues

The ability to acquire a high quality electrocardiogram and subsequently accurately interpret it without specialist cardiology training is a recognised problem both inside and outside a hospital environment.

Many ECG machines are available with built-in, computer-generated ECG interpretation. Whilst these are sensitive, they lack specificity; the large number of false-positive results produced leads to unnecessary patient referrals and anxiety. In addition, the absence of a relevant patient history reduces the likelihood of providing accurate patient-specific advice.

Clinical studies suggest that non-cardiology clinicians have difficulty in interpreting all types of ECG when compared to cardiologists. The 2007 SAFE trial concluded:

Many primary care professionals cannot accurately detect atrial fibrillation on an electrocardiogram, and interpretative software is not sufficiently accurate to circumvent this problem, even when combined with interpretation by a general practitioner. Diagnosis of atrial fibrillation in the community needs to factor in the reading of electrocardiograms by appropriately trained people.

The historically paper-based process of printing and scanning followed by faxing or posting of ECGs for expert interpretation is often extremely time-consuming, results in poor-quality tracings, and is generally inefficient. ECG Cloud was developed to preserve the fidelity of the original recordings in digital format, speed up the reporting process, and improve efficiency by integrating with existing systems to streamline referrals and subsequent patient management.

Although we recommend ECG Cloud is used with the Mortara range of ECG machines, ECG Cloud allows the option for digital upload of an ECG from any ECG machine brand.

What is ECG Cloud?

ECG Cloud is a browser-based reporting and automated interpretation system. It allows test data and accompanying patient history to be acquired from multiple remote sites and analysed centrally by competent ECG experts. An ECG machine can be placed in each clinical environment or can be deployed in a hub-and-spoke configuration. Both acquisition and technical reporting are carried out in a quality-controlled environment. Interpretation and patient management best practice is provided in a reproducible manner by using a consultant-cardiologist-board-adjusted algorithm. Acquisition, reporting and algorithm interpretation are subject to continuous audit, and the audit output is used to further enhance and refine the system.

Why is ECG Cloud different?

Conventional primary care ECG service models use computer-based methods for automated ECG measurement and pattern recognition, followed by human interpretation of the results. The developers of ECG Cloud recognise that computer-based methods for automated ECG measurement and pattern recognition are highly susceptible to signal artifact, and that the common ECG environment is prone to sources of signal artifact. The developers believe that a well-trained human brain is more effective at rejecting the noise and artifact that arise in real-life environments than a computer.


The ECG Cloud developers also recognise that presenting the same ECG to a number of ECG experts is likely to result in variance in interpretation; in fact, presenting the same ECG to the same expert on different days will sometimes result in variance between interpretations. Algorithms are more reproducible in this respect.

ECG Cloud therefore turns the traditional model of ECG interpretation on its head by employing a human for pattern recognition and measurement using a standardised analysis protocol, together with subsequent results processing by a computer algorithm to derive the optimum patient management recommendation.

Methodology

A detailed breakdown of the methodology, including visuals and a step-by-step process, is available in the supporting videos.

The ECG Cloud system allows ECGs to be recorded and immediately transmitted to a remote cardiology expert, with the scope to return the results on a while-you-wait basis. The technology has proven easy to use in general practice and can be operated by healthcare assistants with a minimum of training. Using a Mortara ELI-10 with barcode data entry, an operator can process up to 20 patients per hour per workstation using a 6-lead ECG configuration (rhythm check), or 8 patients per hour with a standard 12-lead ECG configuration.

Practices subscribing to the service send ECGs digitally to the Technomed Monitoring Centre and receive an immediate verbal interpretation, if required, followed by a full written clinician interpretation. The cardiology specialists at the reporting and analysis centre are fully qualified and are routinely audited. The telemedicine facility is operated on the NHS N3 network. The built-in quality control processes assure the highest standards of care and clinical governance.

The Virtual Outpatient Department deploys accredited ECG acquisition centres (hubs).

Protein structure prediction

The goal of protein structure prediction programs is to predict the secondary, tertiary, or quaternary structure of proteins based on the sequence of amino acids. Protein structure prediction is important because the structure of a protein often gives clues to its function. Besides being an interesting computational problem, determining a protein's function is important for rational drug design, genetic engineering, modelling cellular pathways, and studying organismal function and evolution. Currently, protein structures may be found via complicated crystallography experiments. Homology studies, mutagenesis, biochemical analysis, and other modeling studies on the solved structure can then be used to deduce the protein's function. As the whole process is long and uncertain, computer algorithms capable of shortening the structure prediction step greatly enhance protein studies.

Protein Structure

Proteins are composed of monomers called amino acids. Amino acids contain amine and carboxyl functional groups and variable R side chains. There are twenty types of amino acids, i.e., twenty different R groups, and they can be joined together via peptide bond formation (dehydration synthesis). Depending on the polarity of the side chains, amino acids can be hydrophobic or hydrophilic to varying degrees.

Proteins have four levels of structure:

Primary: the sequence of amino acids

Secondary: basic structures, such as alpha helices, beta sheets, and loops

Tertiary: the three-dimensional conformation of the protein

Quaternary: how several peptide strands interact with each other. For example, haemoglobin has four protein subunits.

Protein folding generally follows several principles that may be implemented by algorithms to predict structure:

Rigidity of the protein backbone: may be determined by the size and structure of amino acids

Steric complementarity: whether the shape of a section of protein fits with another section. If atoms are brought too close together, there is an energy cost due to overlapping electron clouds.

Secondary structure preferences/hydrogen bonds: chemical groups of opposite polarities tend to be attracted to each other.

Hydrophobic/polar patterning: sections of protein that are hydrophobic tend to be shielded from water, which usually surrounds the protein (see the sketch after this list).

Electrostatics: some amino acids have polar side chains, so proteins typically have sections that are positively or negatively charged.
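As a toy illustration of hydrophobic/polar patterning, the following Java sketch classifies each residue of a sequence as hydrophobic or polar using the simplified binary HP-model alphabet, then scores a sliding window by its fraction of hydrophobic residues; high-scoring windows are candidates for buried, water-shielded regions. The residue classification, window size, threshold, and example sequence are simplifying assumptions, not a production prediction method.

// Toy hydrophobic/polar (HP-model) patterning scan; the hydrophobic
// residue set, window size, and threshold are simplifying assumptions.
public class HpPatterning {

    // One-letter codes commonly treated as hydrophobic in the binary HP model.
    static final String HYDROPHOBIC = "AVLIMFWC";

    // Fraction of hydrophobic residues in seq[start, start + window).
    static double windowScore(String seq, int start, int window) {
        int h = 0;
        for (int i = start; i < start + window; i++) {
            if (HYDROPHOBIC.indexOf(seq.charAt(i)) >= 0) h++;
        }
        return (double) h / window;
    }

    public static void main(String[] args) {
        String seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"; // made-up example
        int window = 7;
        for (int s = 0; s + window <= seq.length(); s++) {
            double score = windowScore(seq, s, window);
            if (score >= 0.7) { // likely buried, water-shielded stretch
                System.out.printf("window %2d..%2d %s score %.2f%n",
                        s, s + window - 1, seq.substring(s, s + window), score);
            }
        }
    }
}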

Protein Databases

The software, known as Myrna, uses "cloud computing," an Internet-based method of sharing computer resources. Faster, cost-effective analysis of gene expression could be a valuable tool in understanding the genetic causes of disease. The findings are published in the current edition of the journal Genome Biology.

Cloud computing bundles together the processing power of individual computers using the Internet. A number of firms with large computing centers, including Amazon and Microsoft, rent unused computers over the Internet for a fee.

"Cloud computing makes economic sense because cloud vendors are very efficient at running

and maintaining huge collections of computers. Researchers struggling to keep pace with

their sequencing instruments can use the cloud to scale up their analyses while avoiding the

headaches associated with building and running their own computer center," said lead author,

Ben Langmead, a research associate in the Bloomberg School's Department of Biostatistics. "

Satellite Image Processing

The specific process to be implemented is the matching of satellite earth observation imagery to road vectors by correlation, in order to precisely geo-locate the image on the ground, using the road vectors as reference. This process is also known as georeferencing. To accomplish this task, the images are divided into a predefined number of subimages (also called correlation cells), and for each subimage the displacement vector in the x and y dimensions is calculated for maximal correlation with the road reference image. The steps to perform for each subimage are the following (a code sketch follows the list):

1. Extract the satellite subimage and the road vector subimage with the given coordinates and dimensions.

2. Apply an edge filter on the satellite subimage to extract edges.

3. Correlate the edge-filtered subimage with the road subimage for a given number of x/y offsets and identify the x/y combination with maximal correlation.
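The following Java sketch shows one way steps 2 and 3 could look for 8-bit greyscale subimages held as int arrays. The gradient edge filter, the multiplicative correlation score, and the exhaustive offset search are illustrative stand-ins, not the PoC's actual implementation.

// Illustrative subimage matching: edge filter + exhaustive offset search.
public class SubimageMatcher {

    // Simple gradient-magnitude edge filter on an 8-bit greyscale image.
    static int[][] edgeFilter(int[][] img) {
        int h = img.length, w = img[0].length;
        int[][] edges = new int[h][w];
        for (int y = 0; y < h - 1; y++) {
            for (int x = 0; x < w - 1; x++) {
                int gx = img[y][x + 1] - img[y][x];
                int gy = img[y + 1][x] - img[y][x];
                edges[y][x] = Math.abs(gx) + Math.abs(gy);
            }
        }
        return edges;
    }

    // Correlation score of the edge image against roads shifted by (dx, dy).
    static long correlate(int[][] edges, int[][] roads, int dx, int dy) {
        long score = 0;
        for (int y = 0; y < edges.length; y++) {
            for (int x = 0; x < edges[0].length; x++) {
                int ry = y + dy, rx = x + dx;
                if (ry >= 0 && ry < roads.length && rx >= 0 && rx < roads[0].length) {
                    score += (long) edges[y][x] * roads[ry][rx];
                }
            }
        }
        return score;
    }

    // Exhaustive search over offsets in [-maxShift, maxShift] in x and y.
    static int[] bestOffset(int[][] sat, int[][] roads, int maxShift) {
        int[][] edges = edgeFilter(sat);
        long best = Long.MIN_VALUE;
        int[] bestXY = {0, 0};
        for (int dy = -maxShift; dy <= maxShift; dy++) {
            for (int dx = -maxShift; dx <= maxShift; dx++) {
                long s = correlate(edges, roads, dx, dy);
                if (s > best) { best = s; bestXY = new int[]{dx, dy}; }
            }
        }
        return bestXY; // the displacement vector for this correlation cell
    }
}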


Input Data

As a representative real-world example, the PoC was carried out with a single satellite scene over Germany with approximately 5 m ground resolution. The parameters of this scene are typical values:

Image data
Format: raw byte array
Pixel rows: 44000
Pixel columns: 40000
Bytes per pixel: 1 (greyscale)
Files / bands: 3

Road reference data
Same as image data, one single file
Bytes per pixel: 1

A table containing the processing steps to perform on the data was provided as a CSV file with the following structure (a parsing sketch follows the table):

Field  Description
Type   Defines the type of processing step (extract and filter image, extract roads, correlate)
Band   Which band (file) shall be processed
X      x position in the image file for extraction, or x offset for correlation
Y      y position in the image file for extraction, or y offset for correlation
Xdim   Horizontal dimension of the subimage
Ydim   Vertical dimension of the subimage
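A minimal Java sketch of reading this step table, assuming one comma-separated record per line in the field order above; the file name, the header row, and the record type are assumptions for illustration, since the PoC's actual CSV is not published.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// One row of the processing-step table (field order assumed from the table above).
record ProcessingStep(String type, int band, int x, int y, int xdim, int ydim) {

    static ProcessingStep parse(String csvLine) {
        String[] f = csvLine.split(",");
        return new ProcessingStep(f[0].trim(),
                Integer.parseInt(f[1].trim()), Integer.parseInt(f[2].trim()),
                Integer.parseInt(f[3].trim()), Integer.parseInt(f[4].trim()),
                Integer.parseInt(f[5].trim()));
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical file name, purely for the example.
        List<String> lines = Files.readAllLines(Path.of("steps.csv"));
        lines.stream().skip(1) // skip the assumed header row
             .map(ProcessingStep::parse)
             .forEach(System.out::println);
    }
}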

Solution Design on Google Cloud Platform


For solving the task using the Google Cloud Platform, we decided to store the satellite images on Google Cloud Storage. Each file has a size of about 1.6 GB, and there were four of them: three satellite images (red, green and blue channels) and one road reference image.

For the processing of the image data we had the alternatives of using App Engine or Compute Engine. As we would have had to orchestrate Compute Engine from an App Engine application anyway, and the scope of the PoC was only five man-days, we chose to solve the task entirely with App Engine, using Java as the programming language.

The main components of the high-level solution design are:

A web servlet showing a simple UI which allows the user to set configuration parameters, start a new job, or see the current status of the job.

The application core (controller), which controls the processing of the image data. It reads the processing steps and puts new tasks into the task queue. We also implemented the Pipeline API as an alternative; in both cases we interact with the App Engine Datastore to store the configuration of the individual tasks.

Child tasks that are spawned automatically by the task queue / Pipeline API and that operate on subimages of the image data. They access the image data located on Google Cloud Storage using the Google Cloud Storage Java API. The API provides methods to position the read cursor at a specific location inside the file, so it is possible to read subimages without having to read the whole file (see the sketch after this list).

The child tasks will also perform the image processing itself (edge detection and correlation). 

Calculation results are stored in the Datastore for later display and download.
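The positioned-read trick works because the raw images are row-major with 1 byte per pixel, so the byte offset of pixel (x, y) is simply y * columns + x. The sketch below shows the arithmetic, using java.nio's SeekableByteChannel as a generic stand-in for the positioned reads of the Google Cloud Storage Java API mentioned above:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;

// Reads a subimage from a raw, row-major, 1-byte-per-pixel image without
// reading the whole file. SeekableByteChannel stands in for the positioned
// reads offered by the cloud storage client.
public class SubimageReader {

    static final long COLUMNS = 40000; // pixel columns of the full scene

    static byte[][] readSubimage(Path raw, int x, int y, int xdim, int ydim)
            throws IOException {
        byte[][] sub = new byte[ydim][xdim];
        try (SeekableByteChannel ch = Files.newByteChannel(raw)) {
            for (int row = 0; row < ydim; row++) {
                // Byte offset of pixel (x, y + row) in the row-major file.
                ch.position((y + (long) row) * COLUMNS + x);
                ch.read(ByteBuffer.wrap(sub[row])); // one subimage row
            }
        }
        return sub;
    }
}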

CRM in Cloud Computing

What is CRM? 


CRM (Customer Relationship Management) cloud apps allow sales managers to monitor and analyse their team's activities so they can forecast sales and plan ahead. For sales reps, CRM cloud apps make it easy to manage customer profile and case history information, freeing up their time and empowering them with expertise.

For sales and marketing

For sales managers, CRM cloud apps provide real-time visibility into their team’s activities so they can forecast sales with confidence. For sales reps, CRM cloud apps make it easy to manage customer information so reps spend less time handling data and more time with customers.

For marketers, nothing is more important than tracking the sales that result from leads generated through marketing campaigns on your Web site, in email, or with Google AdWords.

For customer service

Your customers have questions about your products. Today, they might go to Google or Twitter to look for answers and only contact your call center if they can’t find what they need. To deliver stellar customer service, you need to connect all the conversations that happen on social networks with the internal knowledge your agents use every day.

CRM Cloud Platform

CRM cloud apps need to be easy to use for sales, marketing, and service professionals in any industry. That’s why smart companies rely on a CRM platform that gives them complete freedom to customize CRM for their business. It’s the best way to boost adoption and make sure your CRM apps are working the way you do.

CRM Cloud Infrastructure

Successful CRM customers rely on a proven, trusted infrastructure—the servers and software in a data center—for running their CRM applications. For CRM to work effectively, it must have three characteristics:


High reliability – uptime that exceeds 99.9% (see the downtime sketch after this list)

High performance – data access in less than 300 ms

High security – industry certifications such as ISO27001 and SAS 70 Type II
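To see what the reliability figure means in practice, the following sketch (an illustrative helper, not part of any CRM product) converts an SLA uptime percentage into the downtime it allows per year; 99.9% works out to roughly 8.8 hours.

// Converts an SLA uptime percentage into allowed downtime per year.
public class SlaDowntime {

    static double allowedDowntimeHoursPerYear(double uptimePercent) {
        double hoursPerYear = 365 * 24; // 8760
        return hoursPerYear * (1 - uptimePercent / 100.0);
    }

    public static void main(String[] args) {
        // 99.9% uptime leaves about 8.76 hours of downtime per year.
        System.out.printf("99.9%% -> %.2f h/year%n",
                allowedDowntimeHoursPerYear(99.9));
    }
}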

An effective CRM infrastructure is based on multitenancy: multiple customers sharing common technology and all running on the latest release, much like Amazon.com or Google. With multitenancy, you don’t have to worry about application or infrastructure upgrades—they happen automatically. In fact, multitenancy lets companies focus on managing CRM, not managing technology.

ERP is short for enterprise resource planning.

Enterprise resource planning (ERP) is business process management software that allows an organization to use a system of integrated applications to manage the business and automate many back office functions related to technology, services and human resources. ERP software integrates all facets of an operation, including product planning, development, manufacturing, sales and marketing.

ERP software is considered an enterprise application, as it is designed to be used by larger businesses and often requires dedicated teams to customize and analyze the data and to handle upgrades and deployment. In contrast, small-business ERP applications are lightweight business management software solutions, customized for the industry you work in.


ERP Software Modules

ERP software typically consists of multiple enterprise software modules that are individually purchased, based on what best meets the specific needs and technical capabilities of the organization. Each ERP module is focused on one area of business processes, such as product development or marketing. A business can use ERP software to manage back-office activities and tasks including the following:

Distribution process management, supply chain management, services knowledge base, configuring prices, improving the accuracy of financial data, facilitating better project planning, automating the employee life-cycle, standardizing critical business procedures, reducing redundant tasks, assessing business needs, accounting and financial applications, lowering purchasing costs, and managing human resources and payroll.

Some of the most common ERP modules include those for product planning, material purchasing, inventory control, distribution, accounting, marketing, finance and HR.


As the ERP methodology has become more popular, software applications have emerged to help business managers implement ERP into other business activities, and may incorporate modules for CRM and business intelligence, presenting it all as a single unified package.

The basic goal of using an enterprise resource planning system is to provide one central repository for all information that is shared by all the various ERP facets to improve the flow of data across the organization.

Top ERP Trends

The ERP field can be slow to change, but the last couple of years have unleashed forces which are fundamentally shifting the entire area. According to Enterprise Apps Today, the following new and continuing trends affect enterprise ERP software:

1. Mobile ERP 

Executives and employees want real-time access to information, regardless of where they are. It is expected that businesses will embrace mobile ERP for reports and dashboards and to conduct key business processes.

2. Cloud ERP

The cloud has been advancing steadily into the enterprise for some time, but many ERP users have been reluctant to place their data in the cloud. Those reservations have gradually been evaporating, however, as the advantages of the cloud become apparent.

3. Social ERP

There has been much hype around social media and how important – or not – it is to add to ERP systems. Certainly, vendors have been quick to seize the initiative, adding social media packages to their ERP systems with much fanfare. But some wonder if there is really much to be gained by integrating social media with ERP.

4. Two-tier ERP

Enterprises once attempted to build an all-encompassing ERP system to take care of every aspect of organizational systems. But some expensive failures have gradually brought about a change in strategy: adopting two tiers of ERP.

ERP Vendors

Depending on your organization's size and needs there are a number of enterprise resource planning software vendors to choose from in the large enterprise, mid-market and the small business ERP market.


Large Enterprise ERP (ERP Tier I)

The ERP market for large enterprises is dominated by three companies: SAP, Oracle and Microsoft. (Source: Enterprise Apps Today, "Enterprise ERP Buyer's Guide: SAP, Oracle and Microsoft", Drew Robb)

Mid Market ERP (ERP Tier II)

For the midmarket, vendors include Infor, QAD, Lawson, Epicor, Sage and IFS. (Source: Enterprise Apps Today, "Midmarket ERP Buyer's Guide", Drew Robb)

Small Business ERP (ERP Tier III)

Exact Globe, Syspro, NetSuite, Visibility, Consona, CDC Software and Activant Solutions round out the ERP vendors for small businesses. (Source: Enterprise Apps Today, "ERP Buyer's Guide for Small Businesses", Drew Robb)

Social Networking

Millions of people are connected to the Internet, and a lot of those people are connected on social networking sites.

Social networks have become an excellent platform for sharing and communication that reflects real-world relationships. Social networking plays a major part in the everyday lives of many people. Facebook is one social networking site that has more than 400 million active users. The possibility of social media and cloud integration is compelling.

Social networks are becoming more than an online gathering of friends; they are becoming a destination for ideation, e-commerce and marketing. For instance, some organizations and integrated applications make use of Facebook credentials for authentication rather than requiring their own credentials (for example, the Calgary Airport authority in Canada uses Facebook Connect to grant access to their WiFi network).

One report aims to create a Social Storage Cloud, looking at the mechanisms that could be used to create a dynamic cloud infrastructure in a social network environment. It is believed that combining pre-established trust with suitable incentive mechanisms can be a way to generate sustainable resource sharing.


A social network is a dynamic virtual organization with inherent trust relationships between friends. This dynamic virtual organization can be created because social networks reflect real-world relationships. They allow users to interact, form connections and share information with one another. This trust can be used as a foundation for information, hardware and service sharing in a Social Cloud.

Typically, cloud environments provide low-level abstractions of computation and storage. Computation and storage clouds act as building blocks from which high-level service clouds and mash-ups can be created. Storage clouds are often used to extend the capabilities of storage-limited devices and to provide transparent access to data from anywhere.

A large number of commercial cloud providers, such as Microsoft Azure, Amazon EC2/S3 and Google App Engine, and smaller-scale open clouds like Nimbus and Eucalyptus, provide access to scalable virtualized resources. These computation, storage and application resources are accessed predominantly through posted-price mechanisms.

Thus, a Social Cloud is a scalable computing model wherein virtualized resources contributed by users are dynamically provisioned amongst a group of friends. Users may choose to share these resources freely or to make use of a reciprocal credit-based model; the compensation-free model is similar to the volunteer computing approach, while guarantees can be offered through customized SLAs. In either case, accountability exists through the existing friend relationships in this model.

By leveraging social networking platforms, people can gain access to huge user communities, exploit existing user management functionality and rely on pre-established trust formed through user relationships.

Compiled By – Ms. Nandini Sharma