
Docker Datacentre



WHY BANKS NEED THE DOCKER DATACENTER

BRIEFING

DOCKER AUTHORISED CONSULTANCY & TRAINING PARTNER


E: [email protected] W: www.contino.io


This Briefing describes:

• The complex, interconnected, legacy application stacks which make up modern banking platforms;

• The challenging environment of digital disruption potentially impacting the banks, and the need to speed up software deployment cycles in response to this;

• Why containerization, specifically implemented via the Docker Datacenter toolchain is a key tool in solving this dichotomy;

• Specific challenges and situations found within banking application delivery which the Docker Datacenter stack helps to overcome;

• Why banks need to develop a container strategy to drive successful production deployment of the Docker Datacenter and the elements of a holistic container strategy.

The volume and scale of software underpinning the operation of a modern bank is mind-boggling.

Within the largest global banks, you’ll find thousands of applications, many of which consist of millions of lines of code. These applications are integrated with each other into a complex spider’s web of dependencies across the bank, exchanging data and messages to fulfil various business processes.

This large, interconnected application landscape will have evolved over decades, resulting in a great deal of legacy technology with which modern applications need to integrate. Within a large bank, you will find almost any technology under the sun if you look hard enough - right the way up the stack from physical servers, operating systems, middleware and programming languages to application frameworks.

Moving down a level, the infrastructure landscape is no simpler. Hundreds of thousands of physical and virtualised servers are likely spread across tens of data centres, again with their own legacy and diversity of technology in evidence.

Together, this complex stack of technology will be processing transactions on a 24x7x365 basis in data centres around the world, and thousands of engineers will be working hard and pushing out changes into the environment every day.

If any piece of this process fails badly at any time, the cost, regulatory and reputational impacts to the bank can be significant, so banks must work hard to maintain quality at all times in this incredibly complex environment.

When you put it like this, running banking technology sounds like a challenging prospect!

DIGITAL DISRUPTION

At the same time, banking is under the same pressure as many other industries - the need to go faster and be more agile with their software delivery in order to build better and more fully featured online experiences for their customers.

This “digital” agenda means there is an increasing need to innovate and ship even more high quality software more frequently than before into the complex technology environment described above.

This is not a simple challenge to meet, however. As well as the complex technology legacy described above, there are also legacy ways of working to modernise, increasingly competitive market environments, and more regulatory changes to implement - all of which must be overcome if banks are to get to where they need to be in terms of their software delivery.

The imperative to get this right is huge. For the first time, banks are at risk of genuine disruption, with industry insiders talking about an “Uber moment” where startups with less technology legacy and better ability to execute have the potential to disrupt long established business models.

This all adds up to the fact that banks are finding themselves stuck between a rock and a hard place - a complex, legacy landscape in terms of technology and ways of working on one side, and the need to innovate at scale on the other.




LEFT: Virtual machines incorporate a separate guest operating system and hypervisor, making them a heavyweight solution to achieving process isolation.

RIGHT: Docker containers package libraries and applications together and can run on a virtual machine or on top of bare metal without any hypervisor.


We believe that containerisation, specifically supported by the Docker Datacenter stack, is a key technology approach and toolset which will allow banks to solve this dichotomy.

This document aims to comprehensively explain why this is the case, illustrating the advantages of the Docker Datacenter platform with reference to real situations that are commonly found within software development in banking environments.

After making the case for why we believe the Docker Datacenter is so important for banks, this document goes on to describe how to be successful in adopting the platform in a banking environment. Though the Docker Datacenter is an incredibly powerful toolkit, organizations typically need to go on a journey of alignment, upskilling and governance definition to understand how best to harness the technology and take it through to a successful production deployment.

As compelling supporting evidence of the potential here, investment bank Goldman Sachs recently went on record with their aim of containerising 90% of application workloads, including legacy platform applications. The publicly stated aim of this incredibly ambitious project is to secure operational efficiencies in how software is deployed and run. Goldman Sachs would not be embarking on such an enterprise-wide project without a large prize at the end of it.

ABOUT DOCKER

Before jumping into the detail of why containers and the Docker Datacenter are such a powerful platform for banks, we wanted to spend some time explaining the concept of Docker for people new to the platform.

Typically, developers deploy their software onto physical or virtual machines. Within banking environments this is usually within private data centres, though occasionally and increasingly within public cloud environments.

An operating system is deployed on top of these machines (e.g. Red Hat Linux or Microsoft Windows).

On top of the operating system might be some middleware (e.g. a Java Virtual Machine or an application server) and some dependent libraries (e.g. Java JAR files or C++ shared libraries).

On top of all of that, our application or service is deployed. This is the actual code which delivers business value for our users and the thing we want to iterate on quickly.

Docker allows us to package up this operating system, the application's dependencies and the middleware into a reusable and shippable binary called an image.

Docker then allows us to move that image around easily - deploying it to different machines from the developer’s laptop, into development and test machines, and ultimately into production.
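As a sketch of this packaging step, a minimal Dockerfile for a hypothetical Java service might look like the following (the base image, paths and service name are illustrative, not taken from any specific environment):

```dockerfile
# The base image supplies the operating system layer and the Java runtime.
FROM openjdk:8-jre

# Bake the application's dependent libraries (e.g. JAR files) and the
# service itself into the image. Paths and names are hypothetical.
COPY lib/ /opt/payments-service/lib/
COPY payments-service.jar /opt/payments-service/

# The resulting image runs the same command identically on a laptop,
# a test server or in production.
CMD ["java", "-jar", "/opt/payments-service/payments-service.jar"]
```

Building this with `docker build` produces the reusable, shippable image described above.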

CONTAINERS AND THE DOCKER DATACENTER

Containers and the Docker Datacenter stack are an incredibly powerful toolset which makes it easier to build, ship and run applications. By leveraging these tools, banks can solve the challenge of needing to innovate against a complex, interconnected legacy software environment.

[Diagram: a virtual machine stack (Server → Host OS → Hypervisor → Guest OS → Bins/Libs → App) alongside a Docker stack (Server → Host OS → Docker Engine → Bins/Libs → App)]


“451 Research believes Docker and modern application containers are undoubtedly a disruptive trend in enterprise IT, particularly given the number of startups they have prompted, the number of established vendors that are supporting and integrating with containers, and end user adoption, including production implementations.

Driven by speed, simplicity, manageability and efficiency benefits, as well as good timing with enterprise adoption of cloud, containers are having a broad impact on the industry. We also see additional innovation and disruption in container management and orchestration software, which is drawing interesting vendors, integrations and partnerships, as well as competition.”

Jay Lyman, Research Manager, Cloud Management and Containers, 451 Research


Once the images are deployed, we can then instantiate live instances of those images into running containers - and here is where the key benefit is realised.

Because our applications are packaged alongside their dependencies and the underlying operating system, they are guaranteed to run in exactly the same way, regardless of where they are deployed to - laptop, data centre, private or public cloud.

We can run many of these containers on a single host. Processes within these containers share resources of the underlying machine, but only the resources to which they are explicitly granted permission. Containers also can’t communicate with other containers on the same host unless specifically wired up to do so. This fine grained isolation gives us many security benefits whilst allowing us to make more efficient use of the server resources.

It’s a simple idea, but the ability to containerise applications in this way is a powerful way of building, shipping and running software which solves real challenges in the software development lifecycle. So much so that over 2 billion pulls have occurred from Docker’s central image registry, making Docker the fastest growing open source tool of all time.

ABOUT THE DOCKER DATACENTER

Docker is about more than just containers, however. Docker Inc, the commercial entity behind Docker, has an integrated stack of tooling which supports the entire lifecycle of building, shipping and running Docker containers. Together, this tooling is integrated into what is called the Docker Datacenter.

First, the Docker Datacenter incorporates the Docker Toolbox, which encapsulates the development tools and developer experience for building containers. It allows the developer to provision Docker environments on their laptop, and provides a set of tools for building and signing containers.

Second, it includes the Docker Trusted Registry for the management and versioning of the images. This incorporates a workflow called Content Trust which allows us to sign and verify our images as they move between environments and machines.
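As a minimal sketch of how this signing workflow is switched on from the command line, Docker's Content Trust feature is controlled by an environment variable (the registry and image names below are purely illustrative):

```shell
# With Content Trust enabled, `docker push` signs images and
# `docker pull` refuses images whose signatures cannot be verified.
export DOCKER_CONTENT_TRUST=1

# A push like the following would then sign the image against the
# repository's keys (commented out: it needs a reachable registry):
# docker push dtr.bank.example/payments/api:1.4.2
```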

Third, Docker Datacenter has a runtime component called Swarm. This manages the orchestration and scaling of containers when they are deployed into the production environment. We can deploy our containers into Swarm, and Swarm will handle the scaling and deployment of them across the nodes.
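As an illustrative sketch, a Compose-format stack file deployed onto Swarm might declare the desired scale and failure handling like this (the service and image names are hypothetical):

```yaml
version: "3"
services:
  payments-api:
    image: dtr.bank.example/payments/api:1.4.2   # hypothetical image
    deploy:
      replicas: 3                 # Swarm keeps three instances running
      restart_policy:
        condition: on-failure     # reschedule if a container or its node dies
```

Swarm continuously reconciles the containers that are actually running against this declared state across the nodes of the cluster.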

Next, the Docker Datacenter includes the Universal Control Plane for managing the containers, including versioning and management through a GUI which incorporates role-based access control.

Together, the Docker Datacenter components combine into an integrated toolchain which allows us to securely build, manage, deploy and run our containers within the enterprise.

[Diagram: running many containers on a single host, isolated from each other - Server → Host OS → Docker Engine → Bins/Libs → instances of App A and App B]

[Diagram: Docker Datacenter - an integrated pipeline of tools for the build (Docker Toolbox, development environments), shipping (secure content and collaboration) and running (deploy, manage, scale) of containerized applications, spanning developers and IT operations as Docker Containers as a Service (CaaS)]


THE SIMPLIFYING EFFECT OF THE CONTAINER ABSTRACTION LAYER

As mentioned above, banking environments are like a museum of software technologies. If you look hard enough, you can find pretty much any piece of software in existence.

This technology diversity is incredibly heavyweight for banks to maintain and limits their economies of scale.

For instance, a common situation is to find both Weblogic and Websphere application servers inside a bank. This means that we need to duplicate all of the resources, skills and processes for managing these technologies, even if we have a valid reason for keeping both in place.

When you Dockerise your software and services, the common unit of management and operation becomes the container. In the example above, we would containerise both Weblogic and Websphere, and from that point forward it would be the container which is versioned, deployed, run, scaled, monitored, started and stopped, all in a consistent way. The container becomes a common abstraction layer, with less concern for what is running inside it.

With the examples above, it’s still likely we would need specialist knowledge related to the technologies within the containers to effectively run them, but much of the management and operation of them becomes consistent and simpler through the container abstraction layer.

Though Docker began as a Linux-centric tool, it will also extend to the Windows ecosystem over the next year as it graduates from the Windows technical previews. Windows containers will continue to run Windows applications, but the container will be managed through the same API as the containers we manage on Linux. This is incredibly powerful: for the first time we have a common tool and way of managing the lifecycle of applications in both Linux and Windows environments, further driving economies of scale.

INCREASED PLATFORM PORTABILITY

There is an incredible amount of churn in the platform space currently. Just as banks are finally breaking the back of virtualising their infrastructure, they are now looking at private and public cloud options for their future state, and there are lots of competing offerings within both of these worlds.

Banking applications, however, tend to become wedded to the servers that they run on - so-called “snowflake” servers, because of how they become unique over time. Teams then lose confidence that they can recreate the server if it were to break, due to a lack of configuration management maturity. Lifting and shifting applications onto new servers in this situation is a time-consuming and risky project in itself.

Containers add a very powerful layer of portability which helps with this problem. Once we have containerised a service, we can move it onto another server with a much higher degree of confidence that it will continue working in the same way, and with much less effort.

At this point, a lot of platform risk goes away. If the organization successfully adopts private or public cloud, we can pick up the container and relatively simply move it onto a new server.

IMPROVED RESILIENCE AND ROBUSTNESS

Banking applications are often mission critical - they need to stay up and remain performant, sometimes in a 24x7 operational environment. The regulatory and reputational risks of downtime are massive for many banking systems.

Banks can add a huge degree of resilience to their containerised applications by leveraging the Swarm orchestration layer within the Docker Datacenter. Swarm forms a cluster over a pool of underlying servers, and then manages the allocation of containers on top of that pool. If a server within the pool dies, Swarm will reallocate the container onto another server with no loss of service.

Development and operations teams have built this resilience into their applications for some time, but containers allow you to solve this at the platform level.


WHY BANKING TECHNOLOGY NEEDS THE DOCKERISED DATACENTER

Containers are a key tool within a banking environment, and the tooling within the Docker Datacenter is the best way to realize these benefits.

There are lots of benefits to Docker, but below are some of the key ones which are specifically relevant to a banking environment:


ENHANCED PROVENANCE AND TRACEABILITY

For regulatory reasons, banks need to understand exactly what is running in their environments: what version of the code and dependencies is deployed, where those assets came from, who pushed the code on the path to production, and when.

Today, they achieve this with fairly stringent change control processes, but often fail to get end-to-end traceability due to people side-stepping processes or failing to integrate tooling. Even where it does work, these processes massively slow development teams' ability to ship code quickly.

The Docker Datacenter stack gives us this ability out of the box - to clearly understand the provenance of our containers and verify authenticity of the container on the complete path to production using a signing and checking workflow. All of the shipped images remain in the versioned registry with all changes audited and reported.

GREATER ENVIRONMENT CONSISTENCY

Because of their application and environment complexity, many banks struggle with environments. Sometimes there might not be enough environments to support all of the parallel work that is going on, whilst at the same time there are too many environments which are not production realistic and are inconsistent with each other.

Docker containers drive a huge increase in consistency in the environment. Because the application processes are running in containers, less configuration needs to be applied to the environments. And because configuration is embedded within the container, we can promote the containers from development, to test, and into production in a much more consistent way.

IMPROVED COMPUTE UTILISATION AND DENSITY

The average server in a banking environment is poorly utilized. Servers may be sized correctly for peak loads, but can consume much less CPU and memory for much of the week due to limited use of cloud-like autoscaling and workloads which are not dynamically allocated across nodes. This is a significant waste of resources when you account for everything, including data center space, manpower, cooling and software license costs.

Through the Swarm orchestration layer, Docker Datacenter allows us to improve this situation and squeeze much more use out of the servers, realising cost savings by requiring less hardware.

For instance, we can use scheduling algorithms to tightly pack containers onto a single host until all resources are used. Then, if the CPU starts spiking we can choose to move our container to another host or scale up the particular service to more nodes in a very dynamic fashion.

This has a very real impact on the cost of running the system whether it’s ultimately deployed on an internal data center or the public cloud.
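As a sketch of how this packing is expressed, a Swarm-deployed service can declare resource reservations and limits which the scheduler uses when placing containers (the numbers and names are illustrative):

```yaml
services:
  risk-batch:
    image: dtr.bank.example/risk/batch:2.0   # hypothetical image
    deploy:
      resources:
        reservations:
          memory: 256M    # the scheduler only places the container where this fits
        limits:
          cpus: "0.50"    # cap CPU so neighbours on the same host are protected
          memory: 512M    # hard memory ceiling for the container
```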

BENEFIT FROM MULTI-TENANT ENVIRONMENTS

Related to the above, the process isolation features of Docker allow us to prevent processes from impacting or interacting with each other.

This capability allows different teams within banks to share machines, further improving compute density whilst ensuring that the processes are isolated and secured from each other.

FROM PLATFORM TO CONTAINERS AS A SERVICE

Many banks have tried to implement ‘Platform as a Service’. This was the dream that developers could easily deploy their applications into a central platform with much less concern for how the application was deployed, run and scaled, freeing them from infrastructure and operational concerns.

Unfortunately, Platform as a Service doesn’t appear to have been a huge success within banking and financial services, with limited implementations and even more limited adoption. Often, the underlying reason was that those platforms were too restrictive for developers.

The Docker Datacenter potentially overcomes this by moving Platform as a Service slightly down the stack to “Containers as a Service”. Developers still have the workflow of pushing their containers into a platform, with fewer operational concerns, but they also retain full visibility, flexibility and control within the container.

If this is successful, the efficiency benefits - highly enabled development teams and fewer operational staff - are significant.

PROMOTING CODE REUSE AND CONSISTENCY

Banks have wanted to drive reuse for a long while. In most banking environments there are people working on libraries and frameworks in an attempt to drive reuse across various development teams.

Docker Datacenter gives us a really powerful tool in this arsenal. We can create a tree of reusable containers which developers inherit from within the registry, driving consistency into the container ecosystem.

If a base image needs to be updated, for instance because of a vulnerability, any dependent containers can also be rebuilt and redeployed.
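A sketch of how this inheritance looks in practice: team images extend a centrally maintained base image, so patching the base and rebuilding the children propagates a fix everywhere (the image names are hypothetical):

```dockerfile
# Inherit from a bank-maintained, security-hardened base image
# held in the trusted registry.
FROM dtr.bank.example/base/java8:1.3

# Only the team's own application is layered on top.
COPY app.jar /opt/app/app.jar
CMD ["java", "-jar", "/opt/app/app.jar"]

# When base/java8 is rebuilt with a patched library, rebuilding and
# redeploying this image picks the fix up automatically.
```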



STOP PAYING THE VIRTUALIZATION TAX

For many years, banks have been virtualising their infrastructure. The aim was to turn a few big machines into lots of logically isolated machines which can be distributed to teams, and to achieve some resiliency benefits to boot.

Containerisation allows us to achieve the same end goals without any virtualization platform. We can run many containers on the host and limit them from impacting each other with fine-grained controls.

Containers are also much more lightweight and dynamic than virtual machines. We can easily scale them up and down and port them across machines, stopping and starting them in seconds instead of the minutes it would take to move bulky virtual machines around.

When the virtualization layer is removed, we save licensing costs and potentially the headcount dedicated to managing virtualization platforms.

This also has performance benefits as we remove a layer of the infrastructure if we move to running containers on bare metal machines.

“Container management and orchestration is also emerging as a source of significant market activity, including coopetition among the providers and mixed use by enterprise end users.

A separate Voice of the Enterprise: Cloud Q3 2015 survey question for 534 IT decision-makers also shows significant use of container orchestration. When asked about plans for container orchestration, 9% of respondents said they were currently using it, 36% said they were considering use in the next two years, 40% said they were not familiar with the software, and 15% reported no plans for container orchestration in the next two years.”

Jay Lyman, Research Manager, Cloud Management and Containers, 451 Research


Containers are an incredibly powerful technology, and the benefits, as illustrated above, are compelling.

However, because they are a new approach to building, shipping and running applications, many organizations need help to understand and upskill on the technology, and then align on how and where to use containers within their application portfolio.

To achieve this alignment, we recommend that organizations develop a container strategy. This covers the main considerations for a successful deployment across both governance and technical streams.

CONTAINER STRATEGY - GOVERNANCE & STRATEGY STREAM

This work stream of your container strategy is all about aligning on how the container technology will be used within your organization.

The elements of this work stream are as follows:

• Education: Educating management, enterprise architects and other senior stakeholders on what Docker is and the implications of adopting the technology.

• Deployment Roadmap: Aligning on where and how containers should be used within the application portfolio, identifying which applications and platforms are most suited to incorporating containers.

• Enterprise Architecture: Understanding the implications for your target enterprise architecture and uncovering any barriers to adoption.

• Governance: Developing a control and governance model around the use of containers, incorporating development, deployment, management and operations considerations.

• Consensus: Achieving consensus across management and senior technical stakeholders on how Docker will be used, through education, PR, training and bringing together disparate parties.

• Reference Architecture: Developing a reference architecture which incorporates operational and security best practices, which other teams in the bank can use in future.

• Best Practices: Developing standards and best practices in a reusable and accessible format which makes it easy to adopt Docker consistently, according to operational and security best practices.

• Audit: Supporting Docker and its supporting toolsets through any internal audit or acceptance processes.

CONTAINER STRATEGY - TECHNICAL STREAM

This work stream of your container strategy involves implementing Docker Datacenter and Dockerising a number of services into production as a proof of concept.

The elements of this work stream are as follows:

• Upskilling: Upskilling developers and operations engineers in the use, deployment and running of a Dockerised application through a tailored combination of classroom-based training, workshops and pairing.

• Operational Readiness: Understanding and articulating how you will ensure the necessary resilience, robustness and scalability of your containerised applications in a production setting.

• Tooling: Deploying and configuring the tools that support the Docker Datacenter - the Docker Toolbox, the daemon, the registry and the Universal Control Plane.

• Proof Of Concept: Identifying and delivering an appropriate PoC in a real productionised application according to operational and secure best practices.

• Support: Helping to support any productionised application services until fully handed over to client operations engineers.


SUCCESSFULLY DEPLOYING DOCKER DATACENTER WITH A CONTAINER STRATEGY

To successfully deploy containers and the Docker Datacenter stack, we advise our clients to initiate a consolidated and focussed container strategy. This covers all of the technical and governance streams in order to accelerate a production deployment.


CONCLUSION

Banks are at a critical juncture. On the one hand they have an ageing and complex technical platform, and on the other they are under pressure to go faster and innovate more to stave off industry disruption.

Containers, specifically as implemented by the Docker Datacenter stack, are an incredibly powerful tool in overcoming this challenge.

Implemented successfully, banks will secure benefits such as simplified workflows, more efficient use of infrastructure, more reuse, platform portability, increased resiliency and higher security.

To get there however, a consolidated container strategy is needed in order to rapidly drive adoption of this powerful technology.

To read more about the Docker Datacenter please visit

https://www.docker.com/products/docker-datacenter

To read more about the Contino container strategy, please visit

http://www.contino.io/container-strategy


ABOUT US

Contino is a technology and services company specializing in DevOps, Continuous Delivery, and transformational programs. The company’s Rapid Prototyping and DevOps Acceleration services help organizations speed time-to-market for high quality new and re-tooled applications. From strategy and operations to culture and technology, Contino helps business and technology leaders identify and address opportunities for growth and profitability. Contino provides training, development, deployment and optimization services for the full stack of DevOps and Agile technologies including application lifecycle management (ALM), modern development and Continuous Delivery tools, micro-services architecture, containerization, security, analytics, testing and cloud infrastructure platforms.

Learn more at CONTINO.IO

E: [email protected] W: www.contino.io