
Sybase IQ and Big Data


IN THIS ISSUE

Introduction

SAP Sybase IQ - Turning Big Data into a Big Advantage

Gartner Research: Magic Quadrant for Data Warehouse Database Management Systems

About Sybase

Issue 1 2012

Introduction

“Big Data” is the new hot topic for IT managers, and it is causing quite a panic in some organizations. But there is no need to panic: Big Data can be looked upon as a Big Opportunity. With the data explosion, companies now have access to more information than ever before, and if that data is exploited properly it can lead to a big competitive advantage.

Companies are acquiring massive amounts of data in different forms from different sources, ranging from traditional channels with structured formats to social media channels with unstructured formats, and this has changed the focus of real-world analytics. Throughout organizations there are changes in the way data is analyzed. In marketing, the focus has shifted to digital channels (click streams and social media) to understand buying patterns and target marketing activities for maximum impact. In sales, the focus is on what we call “deal DNA”: correlating emails, meeting notes and chatter to assess the probability that a sales deal will close. On the financial side, simulation is being used to predict margins and portfolio values, while on the operational side, machine data from sensors and other kinds of digital data are being analyzed to track down operational inefficiencies. It’s no wonder companies are suffering information overload and are at a loss as to how to manage the information, let alone how to use it intelligently.

The key to Big Data is the ability to access and connect all the data, no matter what type it is or where it came from. To achieve this you have to break the information silos that trap data, turning massive amounts of data into actionable insight while providing complete access to decision makers and creating an environment that offers “intelligence for everyone”.

Featuring research from Gartner


SAP Sybase IQ - Turning Big Data into a Big Advantage is published by Sybase. Editorial supplied by Sybase is independent of Gartner analysis. All Gartner research is © 2012 by Gartner, Inc. All rights reserved. All Gartner materials are used with Gartner’s permission. The use or publication of Gartner research does not indicate Gartner’s endorsement of Sybase’s products and/or strategies. Reproduction or distribution of this publication in any form without prior written permission is forbidden. The information contained herein has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information. Gartner shall have no liability for errors, omissions or inadequacies in the information contained herein or for interpretations thereof. The opinions expressed herein are subject to change without notice. Although Gartner research may include a discussion of related legal issues, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner is a public company, and its shareholders may include firms and funds that have financial interests in entities covered in Gartner research. Gartner’s Board of Directors may include senior managers of these firms or funds. Gartner research is produced independently by its research organization without input or influence from these firms, funds or their managers. For further information on the independence and integrity of Gartner research, see “Guiding Principles on Independence and Objectivity” on its website, http://www.gartner.com/technology/about/ombudsman/omb_guide2.jsp.

SAP Sybase IQ – Advanced Analytics Platform for Big Data

SAP Sybase IQ is an analytic DBMS designed specifically for advanced analytics, data warehousing, and business intelligence environments. Able to work with massive volumes of structured and unstructured data, it is ideally suited to Big Data.

Sybase IQ is built on an open, flexible column-store technology. Unlike traditional relational databases, which store data by row and slowly work through each row of entire tables, clogging I/O channels, memory, and disk, Sybase IQ uses a strategy called “vertical partitioning” that stores data by column, reading only the columns of data used by the query. Using columns, not rows, delivers a 10 to 100 times performance boost compared to traditional row-based approaches, and Sybase IQ supports most popular hardware and OS configurations.
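To make the column-store intuition concrete, here is a minimal, self-contained Java sketch. It is purely illustrative (it is not Sybase IQ code): it contrasts a row layout, where scanning one column still walks every full record, with a column layout, where the scan touches only the values it needs.

    // Illustrative only: why a column store touches less data when a query
    // needs one column out of many. A toy model, not Sybase IQ code.
    public class ColumnScanDemo {
        public static void main(String[] args) {
            int rows = 100_000, cols = 20;

            // Row store: each record holds all 20 column values together.
            long[][] rowStore = new long[rows][cols];

            // Column store: each column is stored contiguously on its own.
            long[][] colStore = new long[cols][rows];

            long rowSum = 0;
            for (long[] record : rowStore) {  // row-wise: walks every full record
                rowSum += record[3];
            }

            long colSum = 0;
            for (long v : colStore[3]) {      // column-wise: reads only column 3
                colSum += v;
            }

            System.out.println(rowSum + " " + colSum);
            // With 20 columns, the columnar scan reads roughly 1/20th of the
            // data: the intuition behind the 10-100x claims for I/O-bound
            // analytic queries.
        }
    }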

Big Data is not new to Sybase. Sybase IQ has been building on the vision of a big data analytics platform for several years now. The Sybase IQ 15 family has been a steady progression of releases following a conscious roadmap, each one adding innovations that build upon the foundation and strengths of the previous release. Sybase IQ is designed to meet the growing needs of IT and business analysts to tame the volume, variety and velocity of today’s massive data demands in a cost-effective and attainable manner.

Sybase IQ is based on a three-layer architecture. The foundation is a strong data management layer, with a highly compressed column store and a shared-everything distributed MPP elastic cluster that supports a variety of workloads and an active user community. The application services layer sits above that to provide a variety of drivers, APIs, web services, and federation capabilities to empower developers. And wrapped around these two technology layers is a rich ecosystem of BI tools, partner libraries, packaged applications, and data integration tools to give you an end-to-end solution. (See Figure 1)

Centralized Access to All Your Enterprise Data

Sybase IQ centralizes “Big Data” analysis, bringing massive volumes of structured and unstructured data together using a wide range of advanced techniques and technologies. Its engine is data-type agnostic: Sybase IQ doesn’t care what format your data is in or where it came from. Whether it is structured in a defined format, semi-structured and available electronically, unstructured and requiring text mining or analytics-tool extraction, or web data such as social media, it simply doesn’t matter to Sybase IQ.

Massive Scalability

With a state-of-the-art query processor, Sybase IQ thrives on heavy ad hoc query usage by large numbers of concurrent users; it’s designed to handle it. Built on the PlexQ™ technology framework, which delivers a shared-everything massively parallel processing (MPP) architecture based on a columnar data store, it delivers new levels of performance. Unlike shared-nothing solutions, a PlexQ grid dynamically manages analytics workloads across an easily expandable grid of computing resources dedicated to different groups and processes, making it simpler and more cost-effective to support growing volumes of data and rapidly growing user communities.

With PlexQ grid technology, enterprise IT departments can more easily overcome the scalability limitations of traditional data warehouses. Organizations are now able to support user communities across the enterprise and integrate analytics into business workflows. And it’s easy to leverage advanced analytics within applications by using the hundreds of algorithms and data mining models that can run inside Sybase IQ. Elastic computing of logical servers in the PlexQ™ technology framework allows IT staff to group the compute resources in a PlexQ grid into virtual groups in order to isolate the impact of different workloads and users from each other. When a user connects to a logical server and runs a query, query execution is distributed only to the member nodes of that logical server, and member nodes can be dynamically added or dropped as necessary.

Figure 1: SAP Sybase IQ - a complete and comprehensive big data analytics platform. Source: Sybase

Specialized Tools & Techniques

Sybase IQ has partnered with a number of advanced analytics vendors to provide key in-database analytics techniques. Using in-database analytics, enterprises and application vendors can answer complex questions without having to move mountains of data to third-party tools. With hundreds of statistical and data mining techniques, advanced text analytics capabilities, and APIs to execute proprietary algorithms safely inside Sybase IQ, companies can gain insights in unparalleled time.

For statistics and data mining, Sybase IQ supports the DBLytix library from Fuzzy Logix, containing hundreds of advanced analytic, statistical and data mining algorithms that can run inside Sybase IQ.

For text analytics, Sybase IQ provides comprehensive in-database text search capabilities. Through Sybase IQ’s key analytics partnerships, both internal and external, such as SAP BusinessObjects, ISYS and KAPOW, hundreds of document formats and web content can be ingested and/or extracted into Sybase IQ for analysis.

Sybase IQ provides a native MapReduce API that can leverage massively parallel processing across a PlexQ™ grid. Using MapReduce allows you to move beyond the limitations of SQL queries, enabling you to more easily execute alternative techniques such as network analysis or searches over large amounts of unstructured data that is not indexed.
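As a reminder of the programming model the text refers to, the sketch below is a generic MapReduce-style word count in plain Java streams. It illustrates only the two phases, map (emit key/value records) and reduce (aggregate by key); it does not use Sybase IQ’s actual MapReduce API, whose signatures are defined in Sybase’s documentation.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Generic MapReduce-style word count; a reminder of the paradigm only,
    // not an example of Sybase IQ's native MapReduce API.
    public class WordCount {
        public static void main(String[] args) {
            List<String> docs = Arrays.asList("big data big advantage",
                                              "data warehouse data");

            Map<String, Long> counts = docs.stream()
                    // "map" phase: split each document into individual words
                    .flatMap(doc -> Arrays.stream(doc.split("\\s+")))
                    // "reduce" phase: group identical words and count them
                    .collect(Collectors.groupingBy(w -> w, Collectors.counting()));

            System.out.println(counts); // e.g. {advantage=1, big=2, data=3, warehouse=1}
        }
    }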

In addition to a native MapReduce API, Sybase IQ offers four ways to integrate results from third-party Hadoop frameworks into Sybase IQ queries, giving a tiered approach to analyzing massive data sets. In essence, massive volumes of data can be searched in distributed file systems, and the data returned from a Hadoop analysis can then be integrated into a Sybase IQ database in any of four ways:

• ETL Processing, which bulk loads data from Hadoop data stores into Sybase IQ using Sqoop, the open-source utility from Sybase’s partner Cloudera.

• Data Federation, which exposes HDFS files as tables in a Sybase IQ database that participate in SQL queries; HDFS files do not need to be loaded into Sybase IQ (see the sketch after this list).

• Query Federation, which allows SQL queries in Sybase IQ to execute Hadoop processes that return data to be incorporated into the SQL result set.

• And finally, Client-side Federation, which federates queries across Sybase IQ databases and Hadoop files using the Toad SQL tool from Sybase’s partner Quest.
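To give a feel for the data federation option above: once an HDFS file is exposed as a table, a client queries it with ordinary SQL. The following sketch uses the standard JDBC API, but the connection URL, credentials and the hdfs_clicks/customers table names are hypothetical placeholders for illustration, not verified Sybase IQ specifics; consult the IQ documentation for the real connection settings.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch: querying an HDFS-backed federated table through plain JDBC.
    // URL, credentials and table names are hypothetical; the appropriate
    // Sybase IQ JDBC driver must be on the classpath.
    public class FederatedQuery {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sybase:Tds:iqhost:2638"; // assumed jConnect-style URL
            try (Connection conn = DriverManager.getConnection(url, "dba", "sql");
                 Statement stmt = conn.createStatement();
                 // To the SQL layer, the HDFS-backed table joins like any other.
                 ResultSet rs = stmt.executeQuery(
                         "SELECT c.page, COUNT(*) AS hits "
                       + "FROM hdfs_clicks c JOIN customers cu ON c.cust_id = cu.id "
                       + "GROUP BY c.page")) {
                while (rs.next()) {
                    System.out.println(rs.getString("page") + ": " + rs.getLong("hits"));
                }
            }
        }
    }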

You can also use R, the popular open-source statistical tool, to query Sybase IQ databases through an RJDBC interface. Furthermore, you can execute R libraries from Sybase IQ as function calls within SQL queries and return result sets.

Sybase IQ also offers in-database execution of Predictive Model Markup Language (PMML) models through a certified plug-in from Zementis. This allows you to automate the execution of analytic models that are defined in an industry-standard language and created in SAS, SPSS Clementine, and other popular predictive workbench products. Using industry-standard languages lets you leverage your existing investments while gaining better performance and scalability.

Within Sybase IQ, the row-store SQL Anywhere engine also allows you to create indexes of geospatial information to search and filter data for analysis in combination with the column-store engine. Following the SQL Multimedia (SQL/MM) standard for storing and accessing geospatial data, Sybase IQ supports 2D geometries in the form of points, curves (line strings and strings of circular arcs), and polygons. Sybase IQ also supports flat and round-Earth representations, allowing you to choose the approach that best addresses your situation.
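A hedged sketch of what such a spatial filter might look like from a Java client follows. The SQL string uses an SQL/MM-style point constructor and containment predicate, but the exact constructor and method names, the table, the column and the SRID are assumptions for illustration; check them against the SQL Anywhere spatial documentation for your release.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Sketch: filtering rows with an SQL/MM-style spatial predicate.
    // Table, column names, SRID and exact geometry syntax are hypothetical.
    public class SpatialFilter {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sybase:Tds:iqhost:2638"; // assumed connection URL
            String sql = "SELECT store_id FROM stores "
                       + "WHERE region.ST_Contains(NEW ST_Point(?, ?, 4326)) = 1";
            try (Connection conn = DriverManager.getConnection(url, "dba", "sql");
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setDouble(1, -122.42); // longitude
                ps.setDouble(2, 37.77);   // latitude
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getInt("store_id"));
                    }
                }
            }
        }
    }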

Sybase IQ provides enterprises with APIs to create proprietary analytic algorithms that run inside the Sybase IQ database server for top performance. In particular, Sybase IQ offers Java and C++ APIs with which you can create User Defined Functions (UDFs) that are called through SQL queries. The UDFs can access all of the data within a Sybase IQ database and can leverage a PlexQ™ grid for massively parallel processing. Sybase IQ also offers an in-database analytics simulator, which allows you to test a custom-built UDF before deploying it into a production database.
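The shape of the UDF pattern can be sketched as follows. The ScalarUdf interface below is a hypothetical stand-in, not the real Sybase IQ Java UDF API (which defines its own classes, signatures and deployment DDL); the point is simply that the function body runs inside the server, next to the data, and is invoked from SQL, for example as SELECT risk_score(balance, days_late) FROM loans.

    // Hypothetical shape of a scalar Java UDF. The ScalarUdf interface is an
    // illustrative stand-in; the actual Sybase IQ Java UDF API defines its
    // own classes, signatures and registration DDL.
    interface ScalarUdf {
        double evaluate(double balance, int daysLate);
    }

    // The body executes inside the database server, so no rows leave the
    // DBMS; SQL might invoke it as: SELECT risk_score(balance, days_late) ...
    public class RiskScoreUdf implements ScalarUdf {
        @Override
        public double evaluate(double balance, int daysLate) {
            double latenessPenalty = Math.min(daysLate / 90.0, 1.0); // toy rule
            return Math.log1p(balance) * (1.0 + latenessPenalty);
        }
    }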

As you can see, in-database analytics is a key component of Sybase IQ’s success as an advanced analytics platform for Big Data. Data volume, accuracy, and swift processing time are all critical success factors, but balancing these components continues to pose serious challenges for most organizations. With traditional analytics, data movement can impose severe constraints on timely delivery of the results you need to succeed and can account for up to 75% of cycle time. By running analytic techniques inside the database, Sybase IQ dramatically accelerates performance while avoiding the governance and security concerns caused by data movement. You need an analytics environment that can analyze large volumes of data from diverse sources and provide fast, accurate results; Sybase IQ’s in-database capabilities give you this advantage.

Successful Analytics Platform for Big Data

Sybase is on a mission to revolutionize Big Data analytics with Sybase IQ. With centralized data analysis delivering insights across your enterprise, and support for large user communities running a wide range of analytics workloads, organizations can analyze hundreds of terabytes, even petabytes, of data at speeds up to 100 times faster. The big data challenges introduced at the beginning of this article (volume, velocity, variety, costs and skills) are matched by the growing set of features and capabilities offered by SAP Sybase IQ. Now, with accurate, complete information across your enterprise, Big Data no longer seems like such a Big Problem; it has turned into a Big Advantage with SAP Sybase IQ!

Source: Sybase


Gartner Research: Magic Quadrant for Data Warehouse Database Management Systems

The data warehouse DBMS market is undergoing a transformation, with the introduction of “big data” and the logical data warehouse creating demand for new techniques in practices and technology. The integration of professional services with product offerings also increased in importance in 2011.

Market Definition/Description

This document was revised on 05 March 2012. The document you are viewing is the corrected version. For more information, see the Corrections page on gartner.com.

The supplier side of the data warehouse database management system (DBMS) market consists of those vendors supplying DBMS products for the database infrastructure of a data warehouse and the required operational management controls.

For the purposes of this Magic Quadrant analysis, a DBMS is defined as a complete software system that supports and manages a logical database or databases in storage. Data warehouse DBMSs are systems that, in addition to supporting the relational data model (extended to support new structures and data types such as materialized views, XML and metadata-enabled access to content), support data availability to independent front-end application software and include mechanisms to isolate workload requirements (see Note 2) and control various parameters of end-user access within a single instance of the data.

This market is specific to DBMSs used as a platform for a data warehouse. It is important to note that a DBMS cannot be used as a data warehouse; rather, a data warehouse (solution/data architecture) is deployed on a DBMS platform. A data warehouse solution architecture can, and often does, use many different data constructs and repositories. Importantly, the definition of this market is changing, and a DBMS will become only part of the overall market definition as the logical data warehouse (LDW) continues to grow in acceptance and deployment.

A data warehouse is a database in which two or more disparate data sources can be brought together in an integrated, time-variant information management strategy. Its logical design includes the flexibility to introduce additional disparate data without significant modification of any existing entity design. A data warehouse DBMS is now expected to coordinate virtualization strategies, as well as distributed processing approaches such as MapReduce, to handle one aspect of big or extreme data situations.

A data warehouse can be of any size. The sizing definitions of traditional warehouses remain as:

• Small data warehouses are less than 5 TB.

• Midsize data warehouses are 5 TB to 20 TB.

• Large data warehouses are greater than 20 TB.

Importantly, none of these categories qualifies a warehouse as a “big data” warehouse; volume alone is not “big data.” For the purpose of measuring the size of a data warehouse database, we define data as source-system-extracted data (SSED), excluding all data warehouse design-specific structures (such as indexes, cubes, stars and summary tables). SSED is the actual row/byte count of data extracted from all sources. From 2012 onwards, defining the size of a warehouse will become less important and information asset access will become more important. Within SSED it is important to separate the actual data size in a data warehouse from the total database size. Gartner clients report that many 100-terabyte warehouses often hold less than 30 terabytes of actual data. Throughout 2012 and 2013, the size of a warehouse will evolve toward a combined metric, relative to the repositories under direct management of the warehouse and complemented by the volume of available information accessed by the warehouse, as well as its performance in doing so (see Note 3).

In addition, for the purposes of this analysis, we treat all of a vendor’s products as a set. If a vendor markets more than one DBMS that can be used as a data warehouse DBMS, we note this fact in the section related to the specific vendor, but evaluate its products together as a single entity. Further, a DBMS product must be part of a vendor’s product set for the majority of the calendar year in question. If a product or vendor is acquired mid-year, it will be labeled appropriately but placed separately on the Magic Quadrant until the following year (see Figure 1).

There are many different delivery models, such as stand-alone DBMS software, certified configurations, data warehouse appliances (see Note 1) and cloud (public and private) offerings. These are also evaluated together within the analysis of each vendor.


Magic Quadrant

Figure 1. Magic Quadrant for Data Warehouse Database Management Systems. Source: Gartner (February 2012). [The quadrant chart (ability to execute vs. completeness of vision; quadrants: leaders, challengers, visionaries, niche players; as of February 2012) plots Teradata, Oracle, IBM, EMC/Greenplum, Sybase (an SAP Company), Microsoft, Vertica, 1010data, ParAccel, Kognitio, SAND Technology, Infobright, Actian and Exasol.]

Vendor Strengths and Cautions

1010data

1010data (www.1010data.com) was established 11 years ago as a managed service data warehouse provider with an integrated DBMS and business intelligence (BI) solution, primarily for the financial sector and, more recently, the retail/consumer packaged goods (CPG) sector. 1010data can host its solution using a traditional software as a service (SaaS) model or support a managed solution at the customer’s site. 1010data has approximately 200 customers.

Strengths

• Since 1010data offers a complete SaaS solution, the customer’s business unit and IT organization need little experience of data warehousing or BI. The SaaS model also allows multiple organizations to share large amounts of data without needing to manage it locally – for example, large quantities of CPG data can be shared by multiple retail companies.

As a managed service solution vendor, 1010data can complement the customer’s internal IT department with fast-to-market solutions for business units, so reducing resource consumption within the IT department. More importantly, the managed service model enables 1010data to leverage software solutions across multiple customers. As new applications are created, they become available to all clients, increasing the availability of these applications to businesses. With more than 200 customers, 1010data has reached a position to break out of its former niche status. The problem is that the company is either a visionary with cloud and data warehouse as a service, but does not execute against the rest of the market, or it is good at execution against two of the many use cases in the market with little vision for the remainder.

The 1010data position is almost perpendicular to our combined evaluation criteria. Therefore, we have placed it with high execution against a sub-section of the market we evaluate. From a visionary perspective, 1010data is difficult to evaluate under current criteria. Its approach in using a cloud-based and “as a service” DBMS/analytics solution is the primary business model and technology approach. Cloud-based analytics as a service and the ability to deliver under a managed on-premises model leave 1010data short of the much broader vision desired by the greatest portion of the data warehouse market, but in these few delivery segments of the market 1010data is a formidable performance competitor.

• 1010data is expected to add probabilistic matching in 2012. The company has exhibited significantly shorter load times than some of its significant big data competitors, as well as orders-of-magnitude faster performance on extremely large datasets. 1010data products read SQL, but also utilize their own non-SQL language that performs high-speed joins with unplanned data rationalization built into the queries, without the performance disadvantages of using interim return datasets.

• Perhaps the most important point raised by those customers referenced is that 1010data is utilized by both IT and the business, with fast response times on queries running against tables of hundreds of billions of rows (with combined row counts across the entire database exceeding a trillion in some instances). The company also serves as a data aggregator and data marketplace, providing datasets for rapid enhancement and enrichment of analytics normally bound to internal datasets only.

• Our reference checks and discussions with Gartner clients also show that 1010data is price-competitive with non-SaaS alternatives, especially by reducing the management overheads needed to support a data warehouse environment. 1010data has expanded from the financial sector (where it began) into a broader market, including the retail sector. 1010data now claims more than 200 customers and its customer references support our belief that it is one of the stronger small data warehouse DBMS vendors. In addition, the company has a small number of customers that install its system on-premises as a managed solution, with several using 1010data as an enterprise data warehouse solution vendor. Therefore, from an execution standpoint, 1010data matches performance, pricing and delivery model for two specific needs in the market quite well and it is expanding both its scope of delivery and its vertical customer base.

Cautions

• The market continues to resist fully managed data warehouse services in many verticals and horizontal use cases. 1010data is susceptible to resistance from IT departments requiring all their data warehouses to be located in-house, along with in-house governance of the organization’s data assets. The IT market is not fickle: it persists in its use of better name-branded vendors, and not simply because they are name-branded.

As the demand for hybrid analytics mixing structured data with content increases, 1010data will need to introduce unstructured data analysis as well as operational technology or machine-generated data analysis. 1010data’s competitors have greater financial resources and already are in the process of building out this part of the data warehouse vision.

• One of 1010data’s strengths also acts as a caution. While the business prefers a solution that is a complete, deployment-ready stack, IT departments and purchasing offices do not. 1010data’s offering is sold as a fully integrated DBMS and BI solution, which limits potential customers to those wanting a full solution (primarily because of 1010data’s pricing model). 1010data’s product is a compliant, relational DBMS (RDBMS) that customers can use as a stand-alone system if desired – but fees are charged as if the entire solution is managed. Customers are advised to check the total cost of ownership in such cases, as it may not be advantageous to use 1010data in this way.

• As a solution vendor, 1010data has a different competitive model from vendors of pure-play DBMS offerings. In addition to competing in the data warehouse DBMS market, it competes with system integration vendors that offer outsourced solutions, such as Cognizant and HP (via EDS). Additionally, IBM, Oracle and other large vendors with professional service organizations compete with 1010data in two markets, data warehouse DBMSs and services. It remains to be seen if this is a bias to be overcome or if the cloud and on-premises mix will ultimately exclude a vendor like 1010data. However, based on its extremely positive customer references, it is very unlikely 1010data will be excluded from such a mix.

Actian

Actian (www.actian.com) offers two products, the general-purpose Ingres DBMS and Vectorwise, a new offering introduced in June 2010 and targeted at analytic data warehouses. Open-source Ingres, one of the original RDBMS engines, has a 30-year history and claims more than 10,000 customers running mission-critical applications, including data warehouses.

Strengths

• The Actian database contains most of the features necessary for data warehousing, such as partitioning, compression, parallel querying and multidimensional structures. Release 10 added bulk load, scalar subqueries, long identifiers and a geospatial offering that was community driven, with hundreds of committers contributing code. The performance of Vectorwise, especially in analytic applications, was cited by customers interviewed by Gartner. With the emergence of new server platforms with storage-class memory (of 1 TB and more), Vectorwise will prove a valuable asset for data warehousing and analytics as more of the data warehouse moves to memory.

• Actian has aggressively pursued partners, including independent software vendors (ISVs) in the BI market, the primary driver of new installations in data warehousing. Both new and existing customers are looking for an open-source BI stack with partners such as Jaspersoft, and commercial BI vendors such as MicroStrategy have also engaged with Actian. Ingres and Vectorwise are gaining attention from vertical application vendors, system integrators and resellers. Vectorwise uses some Ingres software atop a column store from the MonetDB project and uses hardware assists, turning columns into vectors and processing them in x86 chip registers to leverage instruction parallelism and on-chip caching (illustrated conceptually below). Vectorwise has delivered several top non-clustered TPC-H benchmark results at 1 TB and below. The company was renamed in late 2011 and introduced another new product offering, the Cloud Action Platform, to support the delivery of “Action Apps” that will act on the analytic capabilities Actian supports.
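The effect described above can be hinted at with a generic sketch (our illustration, not Vectorwise code): tight loops over contiguous primitive arrays are exactly the shape that compilers and JITs can turn into register-resident, SIMD-friendly machine code, in contrast to interpreting one row at a time.

    // Generic illustration of vector-at-a-time processing; not Vectorwise code.
    // A tight loop over a contiguous primitive array lets the JIT keep values
    // in registers and emit SIMD instructions, unlike per-row interpretation.
    public class VectorizedFilterSum {
        // Process one block ("vector") of a column per call: sum values above a threshold.
        static long sumAbove(int[] block, int threshold) {
            long acc = 0;
            for (int v : block) {
                if (v > threshold) acc += v; // simple predicate the JIT can vectorize
            }
            return acc;
        }

        public static void main(String[] args) {
            int[] column = new int[1 << 20];
            for (int i = 0; i < column.length; i++) column[i] = i % 100;
            System.out.println(sumAbove(column, 50));
        }
    }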

• Previous reference checks have shown Ingres customers to be very loyal. Most have online transaction processing (OLTP) applications, but Ingres has also been used for smaller data warehouses (historically up to about 2 TB; the company is targeting warehouses smaller than 10 TB). Among open-source DBMSs, only Oracle’s MySQL compares with proven maturity for mission-critical applications, including data warehousing. Vectorwise has begun to gain new customers and software partners, targeting another set of use cases. Now in version 2.0, it has added Windows as a platform and has a clear road map for several future releases.

Cautions

• Although Vectorwise enhances Actian’s ability to support analytic data marts, the company must continue to address enhanced data warehouse functionality, storage management and mixed workload management if it is to compete with larger, equally mature vendors and meet the needs of the broader data warehouse DBMS market. Vectorwise needs to support more analytic SQL constructs than it does now, and add stored procedures, user-defined functions and data types, to move closer to competitors. Its new product and restructuring around Action Apps can be synergistic, but could also prove distracting.

• Actian offers professional services in data warehousing and has a go-to-market strategy with a growing stable of partners; it claims half of its 2011 Vectorwise sales came through channels. However, it lacks data models and must continue to add marketing and sales expertise for data warehousing. Additionally, Actian has strength in open source, but the overall adoption of open source for data warehousing remains weak. While Actian has professional services, it tends to lack some of the tools and methodology support that other organizations have readily available.

• Actian’s new brand and name, as well as its portfolio expansions, can help overcome Ingres’s reputation as an older product that has not regained much market traction. Importantly, Actian has taken a bold stance in attempting to re-establish itself with a new vision and new plans for execution. The initial response to Vectorwise is significant, with the addition of more than 20 customers in its first year on the market, and users should consider Actian’s Vectorwise to be a new and innovative solution in that respect. However, market perception is difficult to change. Both offerings have gained new customers and third-party relationships, but to become a serious competitor in this market Actian must continue to show increased growth in both revenue and numbers of new customers at a higher rate than it has thus far. Effective marketing execution is a must-have for Actian to compete.

EMC/Greenplum

Greenplum (www.greenplum.com) is part of the Data Products division of EMC, with a massively parallel processing (MPP) data warehouse DBMS running on Linux and Unix. It can be sold as an appliance or as a stand-alone DBMS and has more than 400 customers worldwide.

Strengths

• Greenplum’s understanding and vision of the data warehouse market was ahead of the market, as it was one of the first to work with MapReduce, manage external files from within the DBMS and optimize for very large database sizes. As big data is now important in the market and the LDW is emerging as a necessary functionality to support today’s mix of volume, velocity, variety and complexity, Greenplum has a base to support this that was launched several years ago, which translates into its high ability to execute.

Greenplum announced the first unified analytics appliance addressing big data (a modular solution for structured and unstructured data) in May 2011; it was released in September 2011. The EMC Greenplum Data Computing Appliance (DCA) uses the Greenplum Database, Greenplum HD (Hadoop) and Greenplum Data Integration Accelerator (DIA) modules, which can be configured within one single appliance cluster. In addition, Greenplum has Chorus, its analytics productivity software, leveraging VMware’s technology to support automated, self-service data services and collaborative analytics. EMC also recently announced the first Hadoop NAS-attached HDFS system: HDFS running natively on EMC Isilon connected to Greenplum HD or the Greenplum DCA. Finally, through its external file mechanisms and user-defined functions (UDFs), Greenplum has started along the path to supporting the LDW. Greenplum even supports an iOS, Linux and Windows single-user development system downloadable as free (not open-source) software.

• As Greenplum has settled into the EMC organization, we have seen an increase in hiring directly related to development. This, coupled with the EMC development organization, has led Greenplum to offer its DCA supporting big data for both structured and unstructured data and integrated MapReduce processing. The DCA is now assembled by EMC and sold by its sales force. In an interesting manufacturing cost management model, EMC is assembling its appliances in different countries around the world, affording EMC Greenplum a tax advantage in many countries where others (such as Oracle and Teradata) are subject to stiff import duties. This positions the company for easier entry into global markets. Due to the acquisition, Greenplum has been able to work more closely with VMware, for example rearchitecting the Chorus private cloud offering.

• Our customer references support the claims of high performance as well as advantageous price/performance ratios. These references also support the Greenplum claim of scalability to very large database sizes; reported sizes range from 10 terabytes to more than 500 terabytes. When this combination of performance and scalability is joined to an appliance, the potential of EMC/Greenplum to compete in the data warehouse market is increased.

Cautions

• Although acquired by EMC 18 months ago, and despite doubling its installed base, Greenplum’s market position is sixth or seventh worldwide. To really increase velocity and gain market share, Greenplum must continue to develop the EMC sales force so that it has the necessary skills in the DBMS software market. Greenplum must also continue to leverage the EMC worldwide presence to compete with all the incumbent, large DBMS vendors. Importantly, EMC’s customer base is primarily within the IT unit of the organization. Data warehousing is the technical infrastructure for an intensely business-oriented use case. EMC will need to learn from its acquired Greenplum knowledge base, specifically how to solution-sell a data warehouse and analytics offering.

• Interestingly, this year our customer references have raised several issues around support. In these cases the issue was not the attention to rapid support and fixes (all customers stated that fixes were available in an expected, timely manner), but rather the presence of the bugs in the first place. We would classify these as “growing pains”, especially for a small organization (as Greenplum was pre-acquisition) being integrated into a large organization such as EMC. We should also note that in our inquiries with Gartner clients, we have seen this issue diminish, coupled with consistently high marks for personalized customer support.

• As Greenplum leverages EMC more, it will find itself competing at a higher level with the mature, incumbent vendors. The major vendors (such as IBM, Oracle, SAP and Teradata) have a much larger customer base, allowing them, as incumbents, a stronger position. EMC/Greenplum must continue to demonstrate differentiation as it addresses the data warehouse market; big data is one specific area, as is cloud. The company must continue to support customers accustomed to the type of service provided by a small company, with focused, customer-specific professional services solutions, issue-focused support and the leveraging of key customer inputs for product enhancements.

Exasol

Exasol (www.exasol.com) is a small DBMS vendor in Nuremberg, Germany. Exasol has been in business since 2000, with the first in-memory column-store DBMS, EXASolution, available since 2004 and primarily used as a data mart for analytic applications.

Strengths

• Exasol offers an in-memory column-store DBMS for data warehousing. As we have stated, this technology is one of the critical capabilities of the future for the data warehouse DBMS market. Exasol runs in a clustered environment offering scalability across multiple servers. Not only does this allow for high availability in the case of a server failure using EXACluster OS, but also scaling for larger memory sizes. EXASolution maintains redundant copies of the data in memory to reduce the downtime associated with server failures.

Exasol also includes the use of disk for persistence and overflow (if all the data does not fit in memory). However, when data is loaded into Exasol, it is loaded into memory first and then written to disk, allowing applications to begin before the slower activity of disk input/output (I/O) is completed. This separation of the data access and data persistence models is a visionary change for the market. Additionally, as a column store, Exasol has excellent data compression (reported to average four times), thus reducing the amount of memory necessary. EXASolution is sold by the amount of memory used for the data.

• Another advantage of Exasol, as with other in-memory DBMSs, is the high speed of the DBMS. In published benchmarks, Exasol has attained data warehouse transaction speeds up to 20 times those of the closest competitor. Server memory is expensive, but these same benchmarks demonstrated costs of approximately one-third of those of a standard DBMS. Our reference checks also validate the claims of cost reduction and speed. Another strength of the in-memory nature of Exasol is removing the necessity of optimization and calculation structures within the database.

There is no need to build summaries, aggregates and cubes for use in business intelligence and analytics. This reduces the overhead in the DBMS by as much as 10 times, as well as reducing the database administrator (DBA) resources used to maintain such structures. In addition, this also leads to very fast load times, as there are no complicated structures to build during loading.

• Customer references clearly attest to the abilities of EXASolution for both pure performance and cost/performance. The references (although few in number) also state that customer support is excellent. Finally, references corroborate the results of the benchmarks mentioned here, with better than 20 times performance at half to a third of the cost. They also support the claims of four times (or more) compression.

Cautions

• The primary challenge Exasol faces is the small size of the company and its previous lack of expansion beyond Germany. Exasol was primarily engaged in product development for its first five years of operations and, following changes in management two years ago, has obtained the vast majority of its 30-plus customer base in the past two years. These customers are mostly located in Germany, with several in Italy and Japan. Until very recently, Exasol lacked a marketing vision to grow beyond the borders of its European base. The company began an expansion plan in 2011 and will begin to open offices in other locations, including North America.

• Another issue is the increasing competition, both in column-store and in-memory technology. Exasol had a clear advantage in being the first with an in-memory column-store DBMS. Now, most DBMS vendors offer some form of column-store capability. Further, when Exasol began, there were only a handful of in-memory DBMSs, mostly used for streaming data applications. There are now many in-memory DBMSs available in both the column- and row-store variety. Finally, SAP has released its SAP HANA appliance with an in-memory column-store DBMS for an analytics data mart, now also available under the SAP NetWeaver Business Warehouse. As with many technologies, being first is not sufficient unless capitalized on through growth in market share. Exasol has missed the window of opportunity of being first and now faces increased competition.

• Customer references report one major issue with the use of EXASolution: the lack of interfaces to common BI tools. Exasol offers the standard ODBC and JDBC interfaces, but this can be a performance drawback with tools such as BusinessObjects, Cognos and SAS. As Exasol has a small installed base, it is difficult to engage the tools vendors to assist in creating native interfaces to the DBMS. We do expect to see this remedied over the next few years as the size of the installed base grows. Similarly, there is a reported lack of software to manage the Exasol environment (EXASolution). Again, with a small installed base, third-party management software vendors such as Quest are less likely to support the DBMS, requiring Exasol to create its own management software.

IBM

IBM (www.ibm.com) offers stand-alone DBMS solutions as well as data warehouse appliances, currently marketed as the IBM Smart Analytics System family (ISAS) and the Netezza brand. IBM’s data warehouse software, InfoSphere Warehouse, is available on Unix, Linux, Windows and z/OS. IBM has also continued research and development and market execution for the Netezza brand and product line following its acquisition. IBM has thousands of database customers worldwide and more than 500 appliance customers (Netezza and ISAS combined).

Strengths

• The breadth of IBM technology offerings is complementary to and part of its solution delivery capability. InfoSphere Warehouse, a data warehouse offering based on IBM DB2, is a software-only solution. IBM’s data warehouse appliance solution, the IBM Smart Analytics System (ISAS), is a combined server and storage hardware solution (using the IBM Power Systems server with AIX, the System x server with Linux or Windows, and the IBM InfoSphere Warehouse, and a robust System z ISAS data warehouse solution), complete with service and support.

IBM’s introduction of InfoSphere BigInsights includes offerings to aid the design, installation, integration and monitoring of the use of Hadoop technologies within an IBM-supported environment. In IBM’s case, it is important to note that it has embraced the vision for the LDW, which Gartner describes as the emerging new best practices in analytics management. By tying together relational data, data streams and Hadoop files, IBM’s stack builds confidence among managers of existing warehouse implementations that the product is evolving as new demands for these two components of the logical data warehouse emerge.

Additionally, for Smart Consolidation – rather than developing tooling in isolation, IBM focused on tooling that existed in its Information Integration portfolio (InfoSphere BluePrint Director). This resulted in improvements in the area of integration, including but not limited to the common Data Warehouse Packs and Models now supported on DB2 and Netezza platforms alike.

• IBM combines product sales with solution services. This market demands a widely varied level of sophistication and knowledge depending on each client organization’s maturity in analytics and information management. As noted in the overview, the data warehouse market in 2011 has multiple visions for the future. IBM has embraced the logical data warehouse (via “Smart Consolidation”) approach while continuing to advance its technology solutions and implementation practices supporting traditional data warehousing architectures.

Professional services available from IBM range from expert education through turnkey solutions to managed services for data warehousing. Importantly, where IBM leverages its services organization most is in feeding field experiences into the overall data warehouse vision. In 2010, clients reported that IBM’s support appeared disconnected from its product strategy; this improved in 2011, with an even larger reference base reporting. This does not mean the issue has been resolved, but it appears that IBM’s focus on solution services is paying off (for example, IBM specifically assigns technical account managers to support accounts). Additionally, IBM’s focus on prospect qualification resulted in higher growth in 2011 versus 2009 to 2010 for all of its products.

• The overall effect is that referenced customers are confident regarding release dates and the road map. Customers list concurrency, scalability, performance optimization and support as positives; these were the most often repeated phrases in the reference survey in 2011. References elaborated by indicating that partitioning, compression and reduced administrative hours all contribute to their experience of optimized performance.

At the same time, some references reported that query optimization should be targeted, rather than their being forced to optimize every single query so that the system can engage a solid query plan for execution. This evaluation considers the LDW concept to be innovative, but it has yet to see a wider embrace in the market. IBM’s early adoption of the LDW concept in both its messaging and its product road map has established the vendor as an early resource for the market. However, the majority of the market for data warehousing will remain significantly focused on traditional solutions for a minimum of the next three years.

Cautions

• IBM has embraced the logical data warehouse vision as the likely successor to current best practices in traditional data warehousing. The market has not yet determined if it is ready to adopt this approach as the new vision for the data warehouse and abandon 20 years of traditional best practices. IBM’s professional services have experience in delivering various aspects of the LDW under its own methodology, and IBM highlights that the traditional enterprise data warehouse (EDW) is vital to all data warehouse strategies, including as a base component for the LDW.

This was IBM’s first incarnation of the LDW approach. The market is acknowledging that the EDW does not have to be the center of the strategy, but it will remain significant. However, the justification for the LDW, and for evolving existing warehouses or replacing them, will be difficult at first, because to supporters of traditional data warehouses it appears to be a radical departure from their beloved traditional practices. Gartner’s own research indicates that the LDW approach is quickly emerging as the newest data warehouse best practice; Gartner anticipates the LDW will become a best-practice approach during 2013-2015. With market leadership there is risk commensurate with the anticipated rewards. IBM will need to continue its careful education message regarding its leadership approach in LDW practices. When engaging in an LDW approach with IBM, clients should ensure they completely understand IBM’s positioning for implementing this solution.

• Gartner inquiries indicate that IBM data warehouse solutions are also marketed and delivered in isolation from each other. There are strategic reasons to continue such an approach with any acquisition, but Netezza products tend to have their own niche in customers’ minds that is viewed as being separate and distinct from IBM (although Netezza’s growth was more than 30% in 2011, which is faster than its previous growth rate as an independent company).

As a result, IBM customers often engage only part of the organization for solutions and, at least in the customers’ minds, eliminate the others. This creates both marketing and sales process challenges. This is not an issue with shortlisted solutions (IBM should recommend one solution or another), but it does carry over into the solution delivery team, and IBM is missing some opportunities for the different parts of the sales organization to leverage each other. IBM has implemented organizational changes intended to address these issues.

Netezza and IBM personnel do interact and coordinate with each other behind the scenes. A marketing solution would simply begin branding software and hardware combinations for limited purposes. However, IBM will choose the more difficult (and more appropriate) solution of creating an educational sales and implementation process which will demonstrate how software and hardware capabilities can be leveraged effectively to support each use case.

• IBM customers report (via inquiry and reference survey results) a scattering of intermittent and irregular issues with product performance or their implementation experience. Some of these are possibly attributable to the implementation process and not the products. However, these same customers report that IBM support addresses these issues with efficiency. Nonetheless, as with any IT product, an assumption that appliances or certified configurations alleviate all issues is incorrect. Most issues are irregular in nature and IBM support is intimately involved in the resolution process.

Infobright

Infobright (www.infobright.com) has offices in Canada, Europe and the U.S. and offers a combination of a column-vectored DBMS and a fully compressed DBMS. The company provides both an open-source version (Infobright Community Edition [ICE]) and a commercial version (Infobright Enterprise Edition [IEE]). Infobright has approximately 200 customers worldwide.

Strengths

• Infobright remains one of the only column-store DBMSs in the open-source software environment. Its revenue is generated from the Enterprise Edition (using a commercial license, rather than a General Public License [GPL]) with a subscription support model based on the amount of SSED stored in the system. As we stated in 2011, Infobright decided in mid-2010 to focus on operational technology data (which it calls machine-generated data). This encompasses data from sources such as smart meter data (in the utilities space), customer data records (in the telco space) and clickstream data from Internet interactions.

This focus helped Infobright during 2011, when its customer base grew to more than 200 direct and OEM channel customers. Not only has this focus increased the number of customers, it has also attracted a number of additional OEMs (now accounting for approximately 40% of customers). This, along with partnerships with Pentaho, Jaspersoft, Talend and others, will help the company grow substantially faster than through direct sales alone.

• Infobright has several unique technologies in the DBMS. In addition to the column-store file system for MySQL, the Knowledge Grid in-memory metadata store is a major differentiator for Infobright, as it analyzes queries to minimize the number of “data packs” that have to be decompressed to give a result (data packs are the compressed domains/regions of data in Infobright’s offering; see the conceptual sketch after this item).

Infobright also released an option for the Enterprise Edition called the Distributed Load Processor (DLP) which allows for the parallel loading of data into the system at very high speeds. Infobright has also added connectivity to Hadoop MapReduce for the processing of “Big data.” This is extremely important to the machine-generated data world as much of this data is stored in Hadoop or other such file systems and needs to be extracted into a DBMS for processing.
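The pruning idea behind the Knowledge Grid can be illustrated with a zone-map-style analogy: keep tiny min/max summaries per compressed data pack, and decompress a pack only if its value range can overlap the query predicate. The sketch below is our conceptual analogy in Java, not Infobright’s implementation.

    import java.util.ArrayList;
    import java.util.List;

    // Conceptual zone-map-style pruning, analogous to (but not the same as)
    // Infobright's Knowledge Grid: per-pack min/max metadata lets the engine
    // skip decompressing packs that cannot contain matching rows.
    public class PackPruning {
        record PackMeta(int id, long min, long max) {}

        // Return the ids of packs whose [min, max] range overlaps [lo, hi].
        static List<Integer> packsToScan(List<PackMeta> packs, long lo, long hi) {
            List<Integer> candidates = new ArrayList<>();
            for (PackMeta p : packs) {
                if (p.max() >= lo && p.min() <= hi) {
                    candidates.add(p.id());
                }
            }
            return candidates;
        }

        public static void main(String[] args) {
            List<PackMeta> packs = List.of(new PackMeta(0, 1, 100),
                                           new PackMeta(1, 101, 200),
                                           new PackMeta(2, 201, 300));
            // WHERE value BETWEEN 150 AND 180 -> only pack 1 needs decompression.
            System.out.println(packsToScan(packs, 150, 180)); // prints [1]
        }
    }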

• Our customer references are clear on several points. Infobright is extremely fast compared to other systems, including MySQL: increases of up to an average 500% in performance over MySQL deployments have been reported. We believe this comes not only from the column-store design, but also from the Knowledge Grid. References describe Infobright replacing an existing MySQL environment with great gains in stability, compression and performance. Some cases report a year or more without an outage.

Finally, many references state that simplicity is a factor in their choice of Infobright. We also believe this will interest OEMs that want to build Infobright into their existing systems for resale. The simplicity of management, scalability and compression all interest an OEM looking for a DBMS to embed that requires little support on its part. The focus on machine-generated data has been important to Infobright, but we believe that the future will greatly depend on the company’s ability to leverage these OEM partners.

Cautions

• One of the biggest challenges for a small vendor is to focus on what it does well. Infobright has done this with machine-generated data. However, as a small, relatively young vendor, Infobright must continue to differentiate its offerings and open-source model from mature column-store DBMSs. Sometimes these two statements are contradictory, not least because the focus on machine-generated data cannot be an excuse for ignoring existing customers addressing other data management use cases, reported in several customer references as an issue. An example is workload management software, where the managed workloads are basically for machine-generated data and may lack the robustness needed for management of the overall workload.

• There are other issues raised by our reference checks. As with most small startup vendors, stability from one release to another can suffer. Customer references reveal that there have been issues with new releases, but they are quick to point out that the problems are quickly resolved. The lack of management software (also an issue for smaller vendors) was raised. Third-party software vendors are not quick to pick up new, young software companies, as the potential market is small, so this puts more pressure on Infobright to produce its own management software.

• Finally, Infobright is open-source and makes use of portions of MySQL, under a Commercial OEM License with Oracle. We always question the open-source model for revenue generation. First, Infobright has a community version with less functionality than the Enterprise Edition. This has proven useful as a trial system to attract new customers, but some may opt for the ICE version in lieu of the Enterprise Edition.

The other issue is specifically the use of MySQL, as it is owned by Oracle. This implies that risks remain due to the uncertain future of MySQL. To date, Oracle has done nothing other than enhance the product. However, once Oracle’s commitments to the EU have run their course, we cannot guarantee that Oracle will not change the agreements, especially those with OEMs. This is an issue customers of Infobright should monitor in the future.

Kognitio

Kognitio (www.kognitio.com) started by offering data warehouse appliances and warehousing as a hosted service. Today, it has a mixture of fewer than 50 customers using its DBMS (WX2) separately as an appliance, a data warehouse DBMS engine, or data warehousing as a managed service (hosted on hardware located at Kognitio’s sites or those of its partners).

Strengths
• Kognitio pioneered the data warehousing database as a service (dbSaaS) model, where a data warehouse DBMS is delivered as a managed service from the DBMS vendor. Clients buy data warehousing services from Kognitio, while Kognitio hosts the database. Data warehousing dbSaaS permits clients to expand their warehouses incrementally, and clients note that this model provides low upfront costs with virtually no capital expenditure required to get started. This is a growing segment of the data warehouse DBMS market. Kognitio also works with deployment partners such as Capgemini (and contributes to Capgemini’s Immediate cloud computing offering).

Additionally, in line with existing market demand, Kognitio has an appliance to install on-site for customers requiring their own infrastructures. Kognitio opened offices in the U.S. three years ago, in addition to its U.K. headquarters, and has continued to expand its U.S. presence by hiring additional resources. This has started to produce results, with several new customers. Kognitio has also added several hosting partners in the U.S. and the U.K. offering managed services on WX2. dbSaaS sales now make up almost half of its revenue and have supported much of the company’s growth this year.

• Kognitio continues to invest in in-memory capabilities. Gartner considers that in-memory DBMSs can play a major role in enterprises’ information infrastructures, and as such Kognitio’s technology has an opportunity to meet customer demand, given the maturity of its offering compared with more recent alternatives. Kognitio’s DBMS, WX2 version 7, already includes in-memory analytics, and customer references continue to report that query and load performance is excellent. In 2011, Kognitio added Pablo in-memory online analytical processing (OLAP) capabilities to further strengthen its analytical capabilities. The DBMS is already an in-memory DBMS, with hot data held in memory and cold data on disk, managed automatically by the DBMS.

• Customer references reported significant concurrency capabilities, as well as excellent support and product management. Kognitio is gaining visibility thanks to the current market interest in in-memory technologies. Kognitio’s customers report that deployment of large-scale data warehouse efforts takes as little as 10 weeks using this model. References also report predictable, linear scaling of performance; under the “as a service” model, customers report that scale-up and scale-down needs are met as part of a solid account management approach. Finally, and possibly most importantly, references indicate that new queries and new variations on existing analytics can be deployed rapidly.

Cautions
• Kognitio has a very substantial opportunity in the small or midsize business data warehouse and BI market thanks to its dbSaaS model. However, over the past year, managed services offerings from IBM and HP/Vertica have experienced growing acceptance and penetration in the market. These offerings are not direct competitors to Kognitio’s solution, but the customer base views them as an equal alternative from more established vendors.

Kognitio has not yet addressed some of the very large data volume and variety issues – more specifically, support for the content and complexity aspects of extreme information. However, Kognitio’s in-memory analytical capabilities can be of value in low-latency, high-volume analytics. Market demand shifted dramatically during 2011; Kognitio did not stand still, but demand for new functionality expanded more rapidly than Kognitio’s product feature set. This appears to be only a temporary condition while Kognitio addresses these new expectations.

• While Kognitio continues to grow its installed base (with an additional seven clients in 2011), the company remains a small vendor with fewer than 50 customers worldwide. This makes it increasingly difficult to sell to organizations that have incumbent vendors, and to compete with some of the lower-priced appliance offerings. Additionally, as with any data warehouse outsourcing solution, organizations should be aware that they remain responsible for contracting and auditing data security procedures.

• Clients report that interoperability with popular third-party BI tools, such as those of IBM (Cognos) and SAP (BusinessObjects), is difficult to manage. This problem is compounded by Kognitio’s small market penetration and the resulting scarcity of tool expertise in the market. References also report that the absence of any developers’ forum or marketplace, the scarcity of skills in the market and an extremely lean global presence make commitment to the product and consistent delivery difficult.

Microsoft
Microsoft (www.microsoft.com) continues to market its SQL Server 2008 R2 DBMS, Business Data Warehouse and Fast Track Data Warehouse for data warehousing customers not requiring an MPP DBMS. Microsoft released its own MPP data warehouse appliance, the SQL Server 2008 R2 Parallel Data Warehouse (PDW), in November 2010.

Strengths

• Microsoft spent 2011 revitalizing its vision for the data warehouse market. Additionally, it announced two Apache Hadoop connectors – for SQL Server (SMP) and for PDW – in support of the market’s big data issues. Many would be surprised to learn that Microsoft already provided combined structured and unstructured analysis in SQL Server 2008/R2. A third-quarter appliance update included support and enhancements for integration with SAP BusinessObjects, MicroStrategy and Informatica.

In addition, Microsoft offers the SQL Server Fast Track Data Warehouse, which includes validated reference architectures for building a balanced data warehouse infrastructure. This road map contributes significantly to the company’s vision for the market and its customers. Microsoft can also leverage SharePoint and PowerPivot, and the ability to include unstructured information types in analytics is the result of this technology blend – a strength that should definitely not be ignored.

• References report that Microsoft exhibits one of the best value propositions on the market, with a low cost and a highly favorable price/performance ratio. Skills are widely available in the marketplace to operate a Microsoft data warehouse, and there is an easy learning curve to acquire those same skills as needed. As an added bonus, customers report that the integration and continuity of a complete Microsoft data warehouse and business intelligence stack is highly advantageous to time-to-value in delivery. Noticeably absent are any fears regarding vendor lock-in. According to our reference checks and discussions with our clients, worldwide support from Microsoft is extensive, encompassing partners, value-added resellers, vendors of third-party software and tools, and widely available SQL Server skills.

• Microsoft references indicate a dominant presence in midsize data warehouses – especially those end-user organizations reporting that their companies and their data management needs are growing. According to customer references, Microsoft assures its customers of a solid data warehouse platform, including features and functions that run the gamut of traditional warehouse functionality.

For connectivity in a multivendor environment, Microsoft offers SAP BW, Teradata and Oracle connectors. The DBMS supports compression and backup compression, partitioned table parallelism, policy-based administration and even star-join query optimization. Microsoft also offers analytics capability, coordinated through its data warehouse products, to perform hybrid analytics, which combine data and content – representing an area of important vision in the logical data warehouse space.
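
To make two of the features just listed concrete – compression and partitioned tables – here is a hedged sketch using Python and pyodbc; the DSN, credentials and table name are hypothetical, and the T-SQL follows the documented SQL Server 2008 R2 syntax for rebuilding partitions with page compression.

```python
# Hedged sketch: enabling page compression on a partitioned SQL Server
# table via ODBC. DSN, credentials and table name are hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=sqldw;UID=dw_admin;PWD=secret", autocommit=True)
cur = conn.cursor()

# Rebuild every partition of the fact table with PAGE compression
# (an Enterprise Edition feature in SQL Server 2008 R2).
cur.execute("""
    ALTER TABLE dbo.FactSales
    REBUILD PARTITION = ALL
    WITH (DATA_COMPRESSION = PAGE)
""")

# Verify the compression setting per partition.
cur.execute("""
    SELECT partition_number, data_compression_desc
    FROM sys.partitions
    WHERE object_id = OBJECT_ID('dbo.FactSales')
""")
for row in cur.fetchall():
    print(row)
conn.close()
```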

Cautions
• As of the completion of our research for this Magic Quadrant analysis (November 2011), Microsoft could not provide a production reference for the PDW – though it does have paying customers. Last year our view, while late, was that the PDW was arriving just in time for later adopters in the data warehouse appliance market. This window is not closing anytime soon, but credibility needs to be established, and some issues continue to be reported by references. Difficulty with high availability using active-passive server clustering, and a relative lack of performance-monitoring tools specifically related to SQL Server Integration Services (SSIS), are still reported. Customers are waiting for SQL Server 2012 to fix many of these issues, but for now, Microsoft’s execution in the data warehouse market is suffering.

• The best summary of issues probably comes from one of Microsoft’s customer references: “Easy to use. Hard to make perfect.” Several customers report issues such as not scaling well across a grid of servers, performance problems with complex queries, manual rebuilds of database indexes, a need for multi-server staging environments and more. These same references also report difficulty in changing versions. Put simply, based on the strengths and weaknesses cited by references, Microsoft offers all of the “parts” of a solution, but it is difficult to assemble and use those parts out of the box.

Nevertheless, the strength of its price/performance remains to balance these issues. Microsoft maintains that these issues are mitigated by the reference architectures in Fast Track (which does receive high praise in the market) and by appliances such as Parallel Data Warehouse and Business Data Warehouse.

• It is important to note that Microsoft’s much-improved vision now needs to come to fruition. We have seen other vendors bounce back and forth between focusing on market execution and on product and market vision. If this type of bouncing is the result of revitalizing product and solution delivery and simply working out the “kinks,” then the vendor usually goes on to achieve greater success in the marketplace. However, we have also seen this phenomenon in vendors with inconsistent data warehouse product leadership.

At this point, Microsoft is an inconsistent leader with lapses in meeting market demands. Organizations considering Microsoft should have a clear understanding of the expected pace of product enhancements in Microsoft’s road map. This is not a “bug” or release issue, but a release management issue. Make sure Microsoft’s product features and release road map remain 10 months ahead of your plans to use new features and releases. Importantly, many aspects of Microsoft’s vision align with the LDW, but for now the vision suffers from a lack of cross-product coordination, which will become necessary as coordination across information asset types grows more prevalent in the market.

Oracle
Oracle (www.oracle.com) offers a choice of products, allowing customers to build a custom warehouse, use a certified configuration or purchase an appliance ready for warehouse design and load. In addition to the DBMS and certified configurations, Oracle offers three different Exadata-branded products: Oracle Exadata X2-2 for data warehousing and mixed workloads, Oracle Exadata X2-8 for cloud solutions and Oracle Exadata Storage Expansion Rack X2-2 for additional storage capacity. Oracle reports more than 300,000 customers worldwide.

Strengths
• The Oracle DBMS versions represent approximately 43% of the total DBMS market share (not just warehouses) by revenue worldwide. Customer references report a tendency to continue to use Oracle’s DBMS and deploy more applications on the database. Further, while there is no accurate representation of data warehouse market share (DBMS licenses can be used for multiple purposes), Oracle announced that 1,000 Oracle Exadata systems for data warehousing and OLTP were installed as of June 2011, purchased as quarter-, half- and full-rack units.

Frequently, customers have multiple units working on one warehouse. Internally, Oracle’s sales training has focused on building value statements with customers and prospects throughout the year – the account positioning remains technology focused, but has begun to move toward solution selling that will become critical for promoting Oracle product use to support the “logical data warehouse.”

• Oracle introduced significant optimization in Oracle Database 11g (version 11.2.0.3) with cluster-wide parallel operations executing in memory (the large memory across all Real Application Clusters [RAC] nodes in the cluster is treated as a single, large memory pool, and data is transformed/optimized for different storage in memory, rather than on disk). Exadata has Exadata Smart Scan (to offload some DBMS functionality to the storage server), Exadata Hybrid Columnar Compression (which reduces storage requirements and increases performance) and Exadata Smart Flash Cache (up to about 5 TB of flash memory to optimize data access and queries).
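
As an illustrative aside (this sketch is ours, not part of the research), Exadata Hybrid Columnar Compression is typically enabled with the documented COMPRESS FOR QUERY HIGH clause on a table stored on Exadata cells. Connection details and table names below are hypothetical.

```python
# Hedged sketch: creating a warehouse table with Exadata Hybrid Columnar
# Compression. Credentials and table names are hypothetical; the
# COMPRESS FOR QUERY HIGH clause requires Exadata storage.
import cx_Oracle

conn = cx_Oracle.connect(user="dw", password="secret",
                         dsn="exadata-scan.example.com/dwsvc")
cur = conn.cursor()

# Build a compressed copy of a fact table; queries against it can be
# served by Smart Scan offload on the storage cells where applicable.
cur.execute("""
    CREATE TABLE sales_hcc
    COMPRESS FOR QUERY HIGH
    AS SELECT * FROM sales
""")
conn.close()
```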

According to Gartner’s clients and client references (beginning in late 2010, Gartner spoke with over 100 Oracle Exadata warehouse implementations over 18 months), Oracle Exadata exhibits up to a tenfold increase in average performance compared with a similar workload running Oracle on stand-alone hardware. It should be noted this is increased performance measuring Oracle against Oracle. Oracle Exadata is not required to deploy a data warehouse on Oracle Database, but many of the solution design issues solved by Oracle Exadata must then be addressed by clients’ implementation teams on their own.

• Oracle’s customer references indicate the product is highly stable, consistently meets the performance requirements of their use cases, is easily supported by a widely available skill base in the market and has a solid list of features (such as Exadata Hybrid Columnar Compression, Oracle Partitioning, Oracle RAC and Oracle Automatic Storage Management). The net result is that implementers who choose Oracle will be able to proceed in a straightforward manner.

Oracle references also report that the products and appliances advance with additional functionality at the desired pace for their expanding requirements (functionality, data volume scaling and the types and number of anticipated queries). In 2011, references showed that some of the previously reported issues regarding management tools were resolved. The result is that Oracle customers have a predisposition to using more Oracle products, including for data warehousing, which increases market sales generally and, more specifically, creates a solid pipeline with existing Oracle customers.

Cautions
• Oracle clients are practical and realistic: they recognize the consistent strengths of the products and treat most issues as inconveniences or nuisances. Oracle has the most customer references of any data warehouse platform within Gartner’s client base, and so has the greatest mix of comments across these references. However, three types of issue are commonly reported directly by customers. First, frequent “bugs” or software issues are reported in new releases, followed by a difficult patching process. Second, when customers experience specific performance issues involving complex queries, these are difficult to resolve due to “poor visibility into the actual bottleneck.”

Some improvements in customer management experiences are reported, but according to references, these issues result in somewhat higher efforts in administering and managing the environment. Gartner considers Oracle’s support regarding these issues to be effective, and 2011 saw continuing improvement. Finally, the Oracle licensing model continues to be an issue for those reference customers who complain of difficulty in cost planning.

• Oracle faces several challenges with its LDW vision, but it has time to respond. It is possible that an appliance could contain a complete array of services management, application servers and a data management platform to manage a logical data warehouse, or that many appliances could be configured to address the LDW. Oracle’s current strategy for analytics information management remains tightly bound to a repository-based implementation approach. Oracle Big Data Appliance includes its own NoSQL database (based on Berkeley DB) and an Apache Hadoop distribution.

Prior to the publication of this Magic Quadrant analysis, Oracle announced that Oracle Big Data Appliance incorporates the Cloudera version of Hadoop. Oracle already has the capability in its various tools (such as Oracle Fusion Middleware, Oracle Data Integrator and various DBMS solutions) to address distributed data and distributed processing, but its primary message for warehousing still emphasizes the repository management approach over virtualization or distributed processing. This combination of challenges and events will not necessarily force Oracle to settle on a single vision for the market, but the temptation to do so will be significant. During 2012 to 2013, it will be important for Oracle to establish its future vision for this market.

• As a new hardware vendor, Oracle may pursue a vision of hardware exclusivity, or it may continue its agnostic view of hardware configurations. Gartner believes that an “appliance only” view of the market is a dangerous strategy for Oracle to adopt. The traditional data warehouse exhibits significant hardware and storage optimization issues, and solving them represents a significant revenue opportunity. The company’s announcement of support for Hadoop/MapReduce implementations via the introduction of Oracle Big Data Appliance reinforces its approach of pursuing large portions of the information management space in an almost appliance-centric manner.

Customers and implementers should be wary of assuming that purpose-built appliances will solve their analytics issues. This may not be the best implementation approach for some organizations, whereas for others it will be the preferred approach. Additionally, Oracle has retained a significant portion of Sun’s personnel, which also constitutes the Sun institutional memory, and 2012 will be an important year in determining how well Oracle executes on its hardware support strategy. The company considers this an opportunity for innovative resolutions to how it addresses the market.

ParAccel
ParAccel (www.paraccel.com) offers an analytic platform, based on a column-vectored database designed to enhance multi-recursive analytics, especially those exhibiting self-join requirements. ParAccel has approximately 40 customers worldwide.

Strengths
• ParAccel has been in operation for five years, with some of its earlier customers running production data warehouses for more than three years. In 2010, the company began to put a new senior management team in place and completed the process in 2011. New marketing, a new chief executive and a new COO have already begun to change the messaging; more importantly, substantial changes have taken place in the company’s approach to delivery.

As a result of the new team’s efforts, Amazon has invested in ParAccel, and other channel partner agreements are in place (or under final negotiation) with MicroStrategy, Cisco, Dell, Birst and Accenture. ParAccel has also secured a significantly expanded sales staff with experience at other data warehouse DBMS vendors. Finally, the company has introduced a licensing model with significant potential to serve as a cloud-friendly model that supports the LDW: a single license is charged based on the intended use, regardless of how many, or what type of, servers and storage are used.

• ParAccel shipped version 3.1 in June 2011. The new version included high-performance connectors for Hadoop/MapReduce and addressed an issue raised by some customers regarding online and incremental backup and restore. The release also included query optimizer and storage enhancements. The net result is that ParAccel is expanding its vision for the logical data warehouse (with much more work to complete, specifically for virtualization support and semantic management layers), but it will need to address this larger environment either directly with new features or, more likely, through technology partnering.

As a smaller vendor, ParAccel may have an advantage in developing new partner channels, as it offers an alternative to the mega-vendor solutions. Customer references show performance, support, minimal tuning and administration as ParAccel’s strengths.

• From its inception, the ParAccel Analytic Database (PADB) was designed for multi-level, highly recursive analytics on self-joined datasets. Initially, customers adopted the product for queries related to topics such as market basket analysis and clinical trials data. However, this particular strength also makes it well suited to social analytics, as the same processing techniques that apply to market basket and trials data analysis also provide significant capability for social analytics and many of the new big data dataset issues. ParAccel customers will be able to use the new Hadoop connectors either to bring data into PADB for in-database operations (using MapReduce almost in an extraction, transformation and loading role) or to call out to Hadoop clusters and receive the results.
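
To illustrate the self-join pattern described above (this sketch is ours, not ParAccel’s): a classic market basket co-occurrence query, shown here over a generic ODBC connection. The DSN and table are hypothetical; the SQL itself is standard.

```python
# A minimal sketch of the self-join style of market basket analysis that
# column-vectored engines such as PADB are designed to accelerate.
import pyodbc

conn = pyodbc.connect("DSN=padb;UID=analyst;PWD=secret")  # hypothetical DSN
cur = conn.cursor()

# Count how often pairs of products appear in the same basket: a classic
# self-join that vectorized column stores handle well.
cur.execute("""
    SELECT a.product_id, b.product_id, COUNT(*) AS together
    FROM basket_items a
    JOIN basket_items b
      ON a.basket_id = b.basket_id
     AND a.product_id < b.product_id
    GROUP BY a.product_id, b.product_id
    ORDER BY together DESC
""")
for row in cur.fetchmany(10):
    print(row)
conn.close()
```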

ParAccel has shown endurance as its independent niche shrank following the acquisition of its analytic DBMS competitors. In 2011, it pursued additional funding, changed its executive team (CEO and heads of sales, engineering and marketing) and hired a new set of senior sales directors to compete in a changed market, leveraging a reference base of enthusiastically loyal customers that value its performance. A point release and an aggressive partnering strategy have also helped the company to hold its ground and lay the foundation for future accelerated growth.

Cautions
• ParAccel remains a small company. There are clear advantages and disadvantages to this in today’s market. As a smaller company, ParAccel will become increasingly attractive to larger IT vendor partners as an alternative to inviting a “stack” vendor into their accounts. Additionally, vendors of new cloud offerings initially sought low license costs but now spend funds on their development models.

However, cloud offerings are maturing, which means that functionality requirements for high availability and backup and restore will become more important to cloud solution providers – and these are among the unresolved issues raised by PADB users. At the same time, ParAccel is increasing its research and development spending in this area, which represents a commitment to addressing cloud provider requirements.

• ParAccel customer references show that they are looking for features such as online scaling/reorganization and Lightweight Directory Access Protocol (LDAP) integration. Maturity issues are also raised, some of which may relate to the product and others to a lack of implementation experience. Customers also report DBMS crashes, and would like to see a storage environment that integrates solid-state storage with hard-disk drives, along with better backup.

While customers suggest a solid-state drive (SSD)/disk hybrid solution, Gartner does not necessarily support this approach; ParAccel is monitoring specific use cases for the benefits of such a configuration. We hear mixed reviews on the standard functionality, with occasional reports of too many system parameters, slow real-time loads and even infrequent issues with bulk loads. Microsoft’s T-SQL is supported, but PADB does not support extensions. In general, customer references indicate that more elasticity is required.

• ParAccel has not won significant mind share in the marketplace. For all of its new partnerships, there are many competing solutions also vying for attention with these new channel partners, and ParAccel cannot be all things to all partners. In the past year, the company has significantly revised its go-to-market plans, building on what PADB already does and introducing new channel strategies for exposing the solution. It will become increasingly important for ParAccel to focus on one or two specific differentiators and proceed accordingly – for example, seeking investment in the selected areas, spending research and development funds there, and identifying partners with use cases and customers that support these differentiators. Cloud analytics is one such area, and ParAccel has made a good start. Social analytics could be another.

SAND Technology
SAND Technology (www.sand.com) is a tokenized, column-store DBMS vendor. It has been in existence for approximately nine years and reports more than 600 customers using its tools via OEMs and direct sales (approximately 100 direct customers). SAND uses techniques such as tokenization and compression to strengthen its column-store design. Its technology is used as an analytic engine and as an archive engine.

Strengths
• In 2011, SAND Technology endeavored to alter its overall position in the data management market, and largely succeeded. Its new vision is to promote the use of software features and functionality to support skilled data management professionals, so they can concentrate on those tasks that continue to demand human insight. SAND Technology has specifically targeted the rising role of the data scientist, as it promotes in-database and external analytics against big data, as well as planned support for hybrid analytics use cases that combine structured data and content (using a unified query language for SQL-like access to a wide variety of information asset types in a single query structure).

From an execution standpoint, SAND Technology has completely revised its product distribution channels and sales organization, reducing its heavy dependence on its role as SAP’s searchable archive solution by building up new revenue streams. Importantly, this answers one challenge Gartner identified previously.

In 2011, SAND Technology made significant progress relative to Gartner’s Magic Quadrant criteria and, while remaining a niche vendor, it advanced in a year when the entire market was challenged with greater demands in both execution and vision, and when most vendors actually moved backward or struggled to maintain their position. The company has also focused on analysis of customer data – not only structured data, as stated previously. SAND Technology’s new vision of the market, following its improvements during 2011, could mean that 2012 is a proof-point year.

• SAND Technology already had text search capabilities (sounds-like/spelled-like matching, relevance ranking and other text-based capabilities) and an extremely compact column-store DBMS, as well as cloud support functionality (shared processor/storage and distributed processing management). In 2010, the company added managed, dependent, disconnected data marts, enabling synchronization and updates to intermittently connected data marts.

As an archive tool, SAND Technology’s solution achieves greater compression than other DBMSs because it uses tokenization in addition to the column store, and the resulting archive is SQL-accessible. SAND Technology plans to introduce Hadoop/MapReduce capabilities and to extend its unified query language (UQL) to cover big data, structured data, text analytics/search and content analytics. The result is that SAND Technology is another vendor aggressively seeking to identify its role in the LDW.

• Almost all customer references report that the compression rate of SAND’s column-store DBMS is impressive. Additionally, those using it as an archive or an enhancement to SAP’s Business Warehouse Accelerator report solid integration, although direct interfacing proves more difficult when it is the primary warehouse. SAND Technology refers to its core engineering as “infinite optimization” because of the tokenization and column store.

References report significant advantage from indexing features, the power of the database engine, ease of implementation and upgrades, and support services (reporting that SAND Technology’s professional staff have high levels of data warehouse and product expertise). It is also a good choice for analytic data marts that off-load workloads from an enterprise data warehouse. With solid references and reported stability, more implementers expressed an interest in SAND Technology’s vision during 2011.

Cautions
• SAND Technology is small. Despite the larger customer count attributed to it this year (by including its much smaller OEM indirect customers), company revenue is low (SAND Technology is a publicly held company with regular earnings announcements). SAND reports that it has adequate funding for its current run rate and planning horizon. As a company experienced in partnering and channel delivery, SAND Technology can take advantage of the current trend among server, storage and network vendors, which have determined that it is in their best interests also to become data management software vendors.

SAND’s internal expertise is also a valuable asset, both for building channels and as an acquisition target. The company has determined that big data customer demands provide it with an opportunity to win mind share as a niche expert in this new demand area, and it is expanding its consulting capability specifically for this purpose.

• SAND Technology is present in practically all the same large brand-name customers that every other start-up or mega-vendor lists as its customer base. Additionally, it lists many of the same channel partners. This is more evidence of the fickle and inconsistent way enterprises seek out new technology and relate new solutions to their enterprise “standards” – which is to say that experimental or “skunkworks” projects and new technology projects both ignore standards and do not necessarily see wide adoption.

However, in SAND Technology’s case, its OEM and channel experience has not yet resulted in significant revenue growth. The use of channels is imperative to the company’s success, and the development of two or three significant channel relationships is necessary at this point. After a year of retrenchment, it is time to succeed; the indicators to watch are channel uptake and delivery on its big data road map.

• Customer references report intermittent difficulties of a widely varied nature. While SAND Technology staff are deeply skilled in the products and data warehousing, those skills are almost completely absent from the general marketplace. Some customers report that retiring or archiving data off of SAND Technology is difficult (which is ironic, as that was the company’s forte with SAP BW). Some report a lack of metadata capabilities and poor capability to manage data models in the database. The high praise for SAND Technology staff, and the wide variety of reported issues without any specific trend, seem to be symptoms that there simply are not enough experienced personnel available. SAND Technology’s support and small professional services team cannot be everywhere at once.

Sybase, an SAP Company
In 2010, Sybase (www.sybase.com) was acquired by SAP. Both have several DBMS products, but our analysis is based on Sybase IQ, which was the first column-store DBMS and is SAP/Sybase’s primary data warehouse DBMS. It is available both as a stand-alone DBMS and as a data warehouse appliance through several system integration vendors. We mention SAP HANA here, but we do not have any production references at the time of this analysis. Sybase has thousands of Sybase IQ customers worldwide.

Strengths
• With the release of Sybase IQ 15.3 and 15.4, Sybase IQ has moved from being an analytics data mart to a data warehouse that now supports big data and the LDW. It has added substantial mixed workload management, faster loading capabilities (to address the biggest issue with disk-based column-store DBMSs), query parallelism across multiple processors and the ability to scale horizontally across a cluster of servers with MPP capabilities. Additionally, Sybase has added features to IQ such as integrated text search and analysis, in-database data mining and Web-enabled language drivers for Python, PHP and Perl – each targeted at a new generation of analytical applications.

With Sybase IQ 15.4, it has added capabilities to address the LDW, such as distributed data sources and the ability to combine both structured and unstructured data in the same query. In addition, Sybase IQ now has in-DBMS MapReduce capabilities and connectors to Hadoop, adding to its ability to work with big data. Finally, Sybase IQ is the underlying technology of Sybase RAP, The Trading Edition, which includes a built-in package for time-series analytics. Sybase IQ and Sybase RAP, in combination with the Sybase Event Stream Processor (ESP) complex event processing (CEP) technology, create a general real-time analytics platform for applications requiring low latency.
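
As a hedged illustration of the language-driver point above, the sketch below queries an IQ warehouse from Python over a generic ODBC connection rather than any specific Sybase-supplied driver; the DSN and the trades table are hypothetical.

```python
# Hedged sketch: querying a Sybase IQ warehouse from Python over ODBC.
# The DSN and table are hypothetical; IQ executes the aggregation with
# its column-store engine and, in 15.3+, can parallelize the query.
import pyodbc

conn = pyodbc.connect("DSN=sybase_iq;UID=dba;PWD=secret")
cur = conn.cursor()

cur.execute("""
    SELECT trade_date, symbol, AVG(price), SUM(volume)
    FROM trades
    WHERE trade_date >= '2011-01-01'
    GROUP BY trade_date, symbol
""")
for row in cur.fetchmany(20):
    print(row)
conn.close()
```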

• Increasingly, Sybase IQ is on the shortlist for a complete data warehouse solution, as described by our client inquiries. In addition, we see Sybase IQ increasing its participation in, and winning, proofs of concept (POCs), with performance up to 100 times that of its nearest competitor. Sybase has enhanced the workload management capabilities of IQ over the past several releases and now shows good performance in a complex mixed-workload environment. The market perception of Sybase has changed over the past 18 months since the acquisition by SAP: we seldom get inquiries asking about the viability of Sybase in the market. This has also strengthened Sybase’s ability to lead with IQ as a data warehouse.

Further, SAP positions Sybase IQ as the DBMS of choice for the non-SAP data warehouse sitting beside the SAP NetWeaver Business Warehouse for those customers requiring a separate DBMS for the general data warehouse. To date, Gartner has not had a significant number of inquiry clients reporting that they use Sybase IQ combined with SAP NetWeaver Business Warehouse in a dual warehouse approach. Most report they use SAP NetWeaver Business Warehouse and another product (Oracle, IBM-DB2, Teradata, for example) or Sybase IQ and another product. We believe this will only increase as customers begin to use Sybase ASE under the SAP Business Suite and SAP BW.

• From our reference checks, Sybase IQ enjoys some of the most loyal and happy customers; there were virtually no negative comments. The most common and consistent comment concerns the performance of IQ – all stating that query processing is extremely fast with no optimization necessary. Customer references also claim that compression in Sybase IQ can reach up to 10 times, which is a high amount of compression for a column-store DBMS. It is the combination of the compression and the column store that allows Sybase IQ to achieve its high performance for data warehousing. It is important to note that Sybase exhibited execution and vision similar to that of 2010; the broader expectations represented in the 2011 to 2012 market influenced Sybase’s position in 2012.

Cautions
• As we described in our analysis for 2011, column-store DBMS technology is gaining in customer acceptance and has become almost a standard technology in data warehousing, as all vendors are adding this capability. Not only are there many other column-store DBMSs available (such as Exasol, Infobright, ParAccel and Vertica), but the general DBMS vendors are also adding column storage capabilities (EMC/Greenplum, IBM, Microsoft, Oracle and Teradata/Aster Data, for example). Sybase IQ is the most mature of the column-store DBMSs and holds the largest market share among them.

However, Sybase IQ must now differentiate beyond column-store capabilities, as it faces competition from most of the other vendors. Some of this pressure will be offset as Sybase IQ matures into a complete data warehouse solution, and by SAP’s acquisition of Sybase, which gives SAP/Sybase a complete DBMS offering with Sybase ASE, Sybase IQ, Sybase ESP (CEP) and SAP HANA.

• A limitation of Sybase IQ is the lack of a Sybase IQ appliance. Sybase has tinkered in this space with appliance agreements with third-party ISVs, but has seen little traction with this model. As appliances are widely accepted as the model for a data warehouse, this may slow adoption of Sybase IQ. Sybase does offer certified configurations and automated tools for reference configurations, but the market is keenly aware of the difference, forcing Sybase IQ out of the final list of choices for a data warehouse in many situations. With SAP moving toward an appliance model with SAP HANA, we believe we are not far from seeing a renewed effort with hardware vendors to supply a Sybase IQ data warehouse appliance.

• A new challenge for Sybase IQ is emerging because of the acquisition by SAP. As SAP defines its go-to-market strategy for data warehousing and information management, SAP and Sybase have been slow to articulate to customers where and how HANA and Sybase IQ coexist in the IT environment, and the overall data management strategy for the combined database offerings.

Gartner’s opinion is that SAP would like to see SAP HANA eventually emerge as a data warehouse appliance option, with Sybase ASE serving as the DBMS engine for the Business Suite. In this scenario, there is a need for clarity around the role of Sybase IQ, which has had significant capabilities added in recent releases, such as the ability to virtually manage storage and even processing capacity. HANA has similar technology capabilities, but these advances in Sybase IQ may make it more suitable as an EDW platform.

Teradata
Teradata (www.teradata.com) has a 30-year history in the data warehouse market, supplying a combination of tuned hardware and analytics-specific database software. Teradata has more than 1,000 end-user organizations as customers worldwide.

Strengths
• Teradata’s products include departmental, data-mining-focused and enterprise solutions. Its portfolio also includes cloud and big data solutions. Aster Data added new capabilities to Teradata’s product line (such as MapReduce, unstructured data and graph analysis). The acquisition demonstrated a persistent strategy to address the growing role of the data warehouse as an information management platform – one that goes beyond data-volume issues to address information variety and velocity and the complexity of information types and analytic processing.

Teradata has continued to grow the capabilities of its DBMS technology by adding, for example, bi-temporal support, columnar support and automatic advanced compression options, further demonstrating its technology investment. The attention to big data, the extensibility framework for processing languages and mixed workload management indicate a growing capability to support the LDW, and Teradata’s strong professional services provide the implementation field experience to support its deployment.

• Teradata’s management software, including Teradata Active System Management (TASM) and Viewpoint, is a clear strength. The management software manages the entire data warehouse environment, presenting organizations with multiple options as their data volumes and query complexity grow by allowing management of dual warehouses, single platforms, various appliances and more. Teradata’s Unity supports multi-system deployments and confers the ability to gain a single operational view across Teradata systems, and to move and manage data and applications between multiple analytical systems in an enterprise.

Teradata has a formalized strategy for combining older equipment with new generations (“investment protection”): virtual work units can be distributed, with more work units on newer-generation nodes, relieving some of the performance pressure on older equipment. In addition to an enterprise active data warehouse for operational analytics support, features such as object access and query resource filtering, throttles that can be applied to named users, connections or the entire system, and performance groups based on relative priority contribute to its software management capabilities.

• Teradata’s installed base continues to grow. References are most satisfied by the technology’s scalability, stability, predictability of performance and mixed-workload capabilities. For customers starting new data warehousing projects, Teradata’s industry models continue to be seen as being of great value. At the same time, Teradata also offers professional services with a cohesive strategy for warehouse deployment that combines tools and technology with expertise. References also report near-100% uptime.

As a result, Teradata is able to deliver a DBMS, an appliance, professional services, logical data models and tools to manage both the project delivery and the eventual deployment. In 2011, the leveraging of the company’s purchase of Aster Data added a distributed processing extensible framework with the immediate addition of MapReduce capabilities (integration into its overall product portfolio is under way), with the potential for further extensibility (for example, graph language commands). The addition of new data types (such as bi-temporality) has extended the analytic capabilities. At the same time, the 2600 series of Teradata’s appliances is seeing increasing market adoption, providing entry-level customers with a lower-priced solution.

Cautions
• The pre-eminence of appliance solutions in data warehouse bids results in a more competitive environment, in which Teradata needs to educate prospects on its differentiating value. This has increased Teradata’s exposure to competition. In the smaller data warehouse markets, Gartner continues to note that some clients report they selected competitors because there was “no discernible difference” in performance between Teradata’s offerings and its competitors’ appliances in such situations.

The same customers also report that their warehouse workloads are either somewhat predictable, or that they usually require four of the six data warehouse DBMS workloads that Gartner defines, and rarely five. In such cases, Teradata must leverage its 2600 series products, which have a lower first cost, or intentionally lengthen the sales cycle to allow for an education process regarding the enterprise-class benefits of higher-end solutions.

• IT vendor management and purchasing practices, as well as current market consolidation, have renewed the appeal of a single-vendor approach to IT shops. Teradata continues to take a best-of-breed approach and is aware of this issue, as its partnerships involve both marketing and technological cooperation. We advise organizations to focus on decision criteria relating to mixed workload demands, balanced system management and data optimization, and a timeline for adding big data capabilities – critical factors in selecting a data warehouse DBMS.

However, some organizations will adopt these solutions more slowly, and the result could be that Teradata commits resources to prospect education with a lower rate of conversion to revenue, compared with the majority of the data warehouse market. That said, Teradata has lengthy market experience in balancing the championing of new practices with product development to support new analytics and data warehouse practices.

• Gartner clients indicate during inquiries that they do not use the platform to its full potential because some of its optimization features are difficult to understand and use. However, the most important issue is that prospective clients are expected to understand the differentiation between Teradata’s appliance offerings and the enterprise-class product when deciding on a purchase. Most entry-level and even second-generation warehouse implementers have difficulty determining the future needs of their users.

In short, prospective customers need to be educated about Teradata’s approach before they can determine the difference between its products and, more importantly, between Teradata’s appliances and those offered by competitors. In the current survey, clients outlined the overall cost of the Teradata platform as a negative point. However, while the price point seems to be an issue, it has not come up as a reason to consider an alternative technology for the overall data warehouse platform; it has instead generated more interest in managing older, less frequently accessed data with cheaper options.

Vertica
Vertica, an HP company (www.vertica.com), offers a fully integrated column-store analytic DBMS with a number of additional capabilities for high performance and high availability. Derived from research originally done at the Massachusetts Institute of Technology, its acquisition by HP closed in May 2011. Vertica has more than 500 direct and OEM customers. HP also offers an appliance version.

Strengths
• Vertica has earned its reputation as an advanced DBMS with a rich array of features. Designed for scalable shared-nothing deployment on MPP clusters of commodity servers with built-in high availability, including active replicas and automated node recovery, Vertica now claims to have scaled to 460 nodes. Vertica was an early leader in data compression; in the combined use of memory and disk storage to provide three levels of storage for hot and cold data; in connectivity to Cloudera’s Distribution for Apache Hadoop (CDH); and in extensibility via a software development kit that has permitted the addition of in-database execution for pattern matching, statistical and linear regression, geospatial, social graphing and more.

Its separation of a write-optimized store in memory from a read-optimized store on disk, which facilitated improved load speed, not only tackled a substantial problem for columnar databases, it also pointed the way to more general use of in-memory database technology. Release 5, which arrived a month after Vertica’s acquisition closed, added rich event and statistical processing, in-database user-defined analytic functions, geospatial processing and bulk transfer between clusters.
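
The write-optimized/read-optimized split just described is visible in Vertica’s documented COPY statement, where the DIRECT option routes a bulk load straight to the on-disk read-optimized store. The sketch below is illustrative only; the DSN, table and file path are hypothetical.

```python
# Hedged sketch: bulk-loading Vertica over ODBC. Table, path and DSN are
# hypothetical. DIRECT sends rows straight to the on-disk read-optimized
# store, bypassing the in-memory write-optimized store used for trickle loads.
import pyodbc

conn = pyodbc.connect("DSN=vertica;UID=dbadmin;PWD=secret", autocommit=True)
cur = conn.cursor()

cur.execute("""
    COPY public.clicks
    FROM LOCAL '/data/clicks.csv'
    DELIMITER ','
    DIRECT
""")

# Row count as a quick sanity check on the load.
cur.execute("SELECT COUNT(*) FROM public.clicks")
print(cur.fetchone()[0])
conn.close()
```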

• Vertica’s customers cite its pricing model (based on the amount of SSED loaded into the DBMS, rather than on the number of users, servers, chips or cores) and its performance as key reasons for selection and satisfaction. Gartner spoke with several of Vertica’s claimed 500 customers using more than 100 TB of data (a number of which have a year or more of production history) that also cited availability and reliability, as well as ease of version migration. Setup and automated database design were also frequently mentioned; together with price, these have helped the company reach into the midmarket for prospects.

• Flexible deployment options have helped Vertica’s reach since its earliest days and it was the first DBMS to run on cloud infrastructure, using Amazon Elastic Compute Cloud (EC2). Its addition of bulk transfer features for rapid provisioning and cloning of clusters across public and private cloud and dedicated hardware continues this theme. This will continue to allow Vertica to perform more POCs and should be an opportunity to leverage HP’s substantial commitment to cloud infrastructure.

Vertica will also continue to benefit from its SaaS and embedded models and will presumably leverage the ecosystem of HP partners to good advantage. Finally, Vertica Community Edition (free up to 1 TB and three nodes) was introduced along with version 5.0 to exploit HP’s reach and fill Vertica’s pipeline. Follow-up is also a key opportunity.

Cautions
• HP’s acquisition brings uncertainty along with opportunity. It claims substantial retention of Vertica staff, but its recent history of unsuccessful technology acquisitions, public dithering over strategy and management turmoil do not inspire confidence. HP’s prior acquisitions of data warehousing services leader Knightsbridge and system integrator EDS did not result in significant marketplace presence or visible wins in the data warehouse implementation services market, and much of the experienced talent has departed – leaving EDS without significant data-warehouse-skilled teams and HP weak in data warehouse implementation knowledge and experience. None of the customer references we spoke to mentioned HP (negatively or positively).

• HP’s extensive partnerships seem to be an asset, but could also create challenges. For example, in the data warehouse appliance segment of the market, Microsoft is a strategic global partner whose flagship SQL Server product will receive its first refresh in several years during 2012, and HP is one of two vendors Microsoft will depend on to sell its Parallel Data Warehouse.

How HP’s sales force treats these two different offerings will create uncertainty, as marketing of both the Vertica appliance and its Microsoft appliance has been underwhelming while Oracle’s Exadata has dominated the airwaves. HP’s acquisition of Autonomy in October 2011 adds another element that could prove synergistic, or could delay the formation of a coherent strategy while the management turnover settles. A substantial drop in inquiries to Gartner reinforces the impression of a troubling slowdown in Vertica’s momentum.

• The column-store DBMS is no longer sufficient for differentiation. Vertica continues to face competition from more mature DBMS vendors as they add column-store compression and other capabilities (hybrid column and row store) to their DBMSs. Oracle Exadata’s and Sybase’s marketing have drowned out Vertica’s message, aided by HP’s relative absence from the scene. Vertica must continue to add high data volume customers with large numbers of users to prove its improving workload management capabilities or be relegated to analytic data mart installations instead of bidding for the strategic information platform role HP aspires to build for it.

Vendors Added or Dropped
Exclusion from the Magic Quadrant should not disqualify a vendor or its products from consideration. Clients should examine vendors and products based on their specific needs and circumstances.

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor appearing in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may reflect a change in the market and, therefore, changed evaluation criteria, or a change of focus by a vendor.

Added
In 2012, we added Exasol to the Magic Quadrant. Exasol is a small DBMS vendor based in Nuremberg, Germany. It has been in business since 2000, with the first in-memory column-store DBMS, EXASolution, available since 2004 and primarily used in deploying end-user-specific or departmental analytic applications.

Dropped
We were unable to collect sufficient information, including client references, market information, product road map and strategic direction, to qualify Illuminate for inclusion in this Magic Quadrant.

Aster Data Systems was acquired by Teradata.

Vertica Systems was acquired by HP.

Inclusion and Exclusion Criteria
• Vendors in this market must have DBMS software that has been generally available for at least a year. We use the most recent release of the software for our evaluation. We do not consider beta releases.

• Vendors must have generated revenue from a minimum of 10 verifiable and distinct organizations with data warehouse DBMSs in production.

• Customers in production must have deployed enterprise-scale data warehouses that integrate data from at least two operational source systems for more than one end-user community (such as separate business lines or differing levels of analytics).

• Support for these data warehouse DBMS products must be available from the vendor. We also consider open-source DBMS products from vendors that control or participate in the engineering of DBMSs.

• Vendors participating in the data warehouse DBMS market must demonstrate their ability to deliver the necessary infrastructure and services to support an enterprise data warehouse.

• Products that include unique file management systems embedded in front-end tools, or that exclusively support an integrated front-end tool, do not qualify for this Magic Quadrant.

Evaluation Criteria

Ability to Execute
Ability to Execute is primarily concerned with the ability and maturity of the product and the vendor organization. Criteria under this heading also consider the product’s portability, its ability to run and scale in different operating environments (giving the customer a range of options), and the differentiation between data warehouse DBMS solutions and data warehouse appliances. Ability to Execute criteria are critical to customers’ satisfaction and success with a product, so customer references are weighted heavily throughout.

Specific Criteria
Product/service includes the technical attributes of the DBMS. We include high availability/disaster recovery, support and management of mixed workloads, speed and scalability of data loading, and support for new hardware and memory models. These attributes are measured across a variety of database sizes and workloads. We also consider the automated management and resources necessary to manage the data warehouse, especially as it scales to accommodate larger and more complex workloads.

Overall viability includes corporate aspects such as the skills of the personnel, financial stability, research and development investment and merger and acquisition activity. It also covers the management’s ability to respond to market changes and the company’s ability to survive market difficulties (crucial for long-term survival).

Under sales execution/pricing we examine the price/performance and pricing models of the DBMS, and the ability of the sales force to manage accounts (judging from feedback from our clients). We also consider DBMS software market share.

Market responsiveness and track record covers references (for example, the number and size of client companies, the nature of configurations and the workload mix), general customer perceptions of the vendor and its products, and the diversity of delivery models. We also consider the vendor’s ability to adapt to market changes and its history of flexibility when it comes to market dynamics, including use of POCs as required by the market.

Marketing execution explores how well the vendor understands and builds its products in response to the needs of customers (from novices to advanced implementers), and how it develops offerings to meet those needs and the needs of the market in general. We also consider the vendor’s geographical ability to deliver solutions.

We evaluate customer support and professional services as part of the customer experience criterion, together with input from customer references. Also considered are the track record of POCs, customers’ perceptions of the product, and customers’ loyalty to the vendor (this reflects their tolerance of its practices and can indicate their degree of satisfaction).

Operations covers the alignment of the vendor’s operations, as well as whether, and how, they enhance its ability to deliver. We also include channel partnerships and the vendor’s ability to create and use a partnership model.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria                                          Weighting
Product/Service                                              high
Overall Viability (Business Unit, Financial,
  Strategy, Organization)                                    low
Sales Execution/Pricing                                      low
Market Responsiveness and Track Record                       high
Marketing Execution                                          standard
Customer Experience                                          high
Operations                                                   low

Source: Gartner (February 2012)

Completeness of Vision
Completeness of Vision encompasses a vendor’s ability to understand the functionality necessary to support the data warehouse workload design, the product strategy to meet market requirements, and the ability to comprehend overall market trends and to influence or lead the market when necessary. A visionary leadership role is necessary for the long-term viability of both product and company. A vendor’s vision is enhanced by its willingness to extend its influence throughout the market by working with independent, third-party application software vendors that deliver data-warehouse-driven solutions (such as BI). A successful vendor will be able not only to understand the competitive landscape of data warehouses, but also to shape the future of this field.

Specific Criteria

Market understanding covers a vendor’s ability to understand and shape the data warehouse DBMS market and show leadership in it. In addition to examining a vendor’s core competencies in this market, we consider its awareness of new trends, such as the increasing sophistication of end users, the growth in data volumes and the changing concept of the enterprise data warehouse.

Marketing strategy refers to a vendor’s marketing messages, product focus, and ability to choose appropriate target markets and third-party software vendor partnerships to enhance the marketability of its products. We consider, for example, whether the vendor encourages and supports independent software vendors in their efforts to support the DBMS in native mode.

An important criterion is sales strategy. This encompasses all channels and partnerships developed to assist with sales and is especially important for younger organizations, as it enables them to greatly increase their market presence while maintaining a lower cost of sales. This criterion also includes the vendor’s ability to communicate its vision to its field organization and, therefore, to clients and prospective customers.

Offering (product) strategy covers the areas of product portability and packaging. Vendors should demonstrate a diverse strategy that enables customers to choose what they need to build a complete data warehouse solution. We also consider the availability of certified configurations and appliances based on the vendor’s DBMS.

Business model covers how a vendor’s model of a target market combines with its products and pricing, and whether it can generate profits with this model, judging from its packaging and offerings.

We do not believe that vertical/industry strategy is a major focus of the data warehouse DBMS market, but it does affect a vendor’s ability to understand its clients. Items such as vertical sales teams and specific vertical data models are considered here.

Innovation is a major criterion when evaluating the vision of data warehouse DBMS vendors in developing new functionality, allocating R&D spending and leading the market in new directions. It also includes a vendor’s ability to innovate and develop new functionality in the DBMS, specifically for data warehouses. The use of new storage and hardware models is key. Increasingly, users expect a DBMS to become self-tuning, reducing the resources involved in optimizing the data warehouse, especially as mixed workloads increase. Also addressed here is the maturation of alternative delivery methods such as infrastructure-as-a-service and cloud infrastructures.

We evaluate a vendor’s worldwide reach and geographic strategy by considering its ability to use its own resources in different regions, as well as those of subsidiaries and partners. This criterion includes a vendor’s ability to support clients throughout the world, around the clock, in many languages.

Quadrant Descriptions

Leaders

The Leaders quadrant contains the vendors that demonstrate the greatest support for data warehouses of all sizes, with large numbers of concurrent users and management of mixed data warehousing workloads. These vendors lead in data warehousing by consistently demonstrating customer satisfaction and strong support, as well as longevity in the data warehouse DBMS market, with strong hardware alliances. Hence, Leaders also represent the lowest risk for data warehouse implementations, in relation to, among other things, performance as mixed workloads, database sizes and complexity increase. Additionally, the market’s maturity demands that Leaders maintain a strong vision for the key trends of the past year: mixed-workload management for end-user service-level satisfaction and data volume management.

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria | Weighting
Market Understanding | high
Marketing Strategy | standard
Sales Strategy | standard
Offering (Product) Strategy | high
Business Model | standard
Vertical/Industry Strategy | low
Innovation | high
Geographic Strategy | low

Source: Gartner (February 2012)

Challengers

The Challengers quadrant includes stable vendors with strong, established offerings but a relative lack of vision. These vendors have presence in the data warehouse DBMS space, proven products and demonstrable corporate stability. They generally have a highly capable execution model. Ease of implementation, clarity of message and engagement with clients contribute to these vendors’ success. Challengers offer a wide variety of data warehousing implementations for different sizes of data warehouse with mixed workloads. Organizations often purchase Challengers’ products initially for limited deployments, such as a departmental warehouse or a large data mart, with the intention of later scaling them up to an enterprise-class deployment.

Visionaries

Visionaries take a forward-thinking approach to managing the hardware, software and end-user aspects of a data warehouse. However, they often suffer from a lack of a global, and even strong regional, presence. They normally have smaller market shares than Leaders and Challengers. New entrants with exceptional technology may appear in this quadrant soon after their products become generally available. But, more typically, vendors with unique or exceptional technology appear in this quadrant once their products have been generally available for several quarters. The Visionaries quadrant is often populated by new entrants with new architectures and functionalities that are unproven in the market.

To qualify as Visionaries, vendors must demonstrate that they have customers in production, in order to prove the value of their functionality and/or architecture. Our requirements for production customers and general availability for at least a year mean that Visionaries must be more than just startups with a good idea. Frequently, Visionaries will drive other vendors and products in this market toward new concepts and engineering enhancements. In 2010, the Visionaries quadrant was thinly populated with vendors meeting demand from some market segments for aggressive strategies for specific functions, such as the use of MapReduce for large-scale data analytics and massive process scaling in heterogeneous hardware environments.

Niche Players

Niche Players generally deliver a highly specialized product with limited market appeal. Frequently, a Niche Player provides an exceptional data warehouse DBMS product, but is isolated or limited to a specific end-user community, region or industry. Although the solution itself may not have limitations, adoption is limited. This quadrant contains vendors in several categories:

• Those with data warehouse DBMS products that lack a strong or a large customer base.

• Those with a data warehouse DBMS that lacks the functionality of those of the Leaders.

• Those with new data warehouse DBMS products that lack general customer acceptance or the proven functionality to move beyond niche status.

Niche Players typically offer smaller, specialized solutions that are used for specific data warehouse applications, depending on the client’s needs.

Context

The apparent backward movement of nearly all the vendors does not specifically indicate a change in their relative ability to compete against each other. In fact, at times a niche vendor will compete against a Leader and win, due to the use case and customer demands. This is a function of new and different demands in the market from end users who are implementing and using products. The data warehouse market is redefining itself, and we fully expect the vendors to adjust to these new demands.

There is more on all of these trends in the Market Overview section of this document. It is certain that the information management market, and thus the data warehouse DBMS market, is undergoing significant changes; the result will be the refinement of this Magic Quadrant’s evaluation criteria in coming years, reflecting the divergence from the tradition of the last 20 years toward future market demands. It is highly probable that traditional execution will remain important while market understanding of traditional practices becomes less important for Visionaries and Leaders.

Over the past 20 years, the data warehouse market has exhibited a pattern wherein visionary architectures are adopted by less than 20% of organizations as much as five to seven years before they become more widely accepted. In 2011, the warehouse began a visionary metamorphosis away from repository-only solutions toward a coordinated information processing and delivery semantic. Gartner calls this new archetype the “Logical Data Warehouse” (LDW) and it has a significant impact on the vision expectations in the 2012 Magic Quadrant. The LDW is only one potential new set of best practices for warehousing and analytics data management. The warehouse remains mission-critical, even more so from 2012 forward (see Note 4).

Others include, for example, wholesale replacement of the warehouse with search, content analytics and MapReduce clusters, while dropping the centralized repository completely. Gartner strongly asserts the position that the LDW is the correct vision for the new wave of analytics best practices that is now emerging. However, current adoption is embryonic and anticipated uptake will be incremental for the next two years. It is very likely that traditional data warehouse solutions will begin to be eclipsed by this new approach within the next two years. The result will be a radical shift in how this particular Magic Quadrant analysis is presented by 2014. It is worth noting that vendors such as EMC/Greenplum, HP/Vertica, ParAccel and SAP/Sybase continue to apply pressure in the market to move visionary concepts forward. The leading vendors are demonstrating differing levels of response to this pressure, and how they respond will factor significantly into their ability to maintain leadership in visionary delivery to the market.

The most significant execution change in the 2011 market was the rapid rise in demand for “big data,” which was seen as visionary in 2010. During 2011, we also saw increased demand for professional services and for more specific data warehouse skills in the customer base. The result is a much larger playing field, meaning the overall represented area for this Magic Quadrant analysis is much larger in 2012 and will challenge almost every vendor in both vision and execution. Some vendors responded more aggressively than their market counterparts to the LDW, to big data market demands, and to integrating their professional services organizations with product delivery. Appliance adoption is continuing.

The DBMS market continued its growth in 2011, with niche vendors pushing into challenger and visionary positions. (Gartner tracks DBMS market share and it is possible to create approximations of the data warehouse market size based on informed analysis, but here we are referring to DBMS and not data warehouse market share.)

Acquisitions continued in 2011 with the major vendors adding big data solutions and the smaller vendors finding new opportunities with new partners. In 2011, the role of Sybase as part of SAP became more obvious and EMC established a clearer road map of how Greenplum factors into its larger customer base.

Aster Data (a significant Visionary in 2010 and 2011) was acquired by Teradata, and Vertica was added to the HP portfolio. At the same time, longtime megavendor rivals IBM and Oracle continued to expand their offerings, with IBM focusing on improved solution selling for complete data warehouse and analytics delivery, and Oracle expanding its appliance-based, ready-to-run offerings to maintain its strong ability to execute in the market.

Throughout 2011, Microsoft refocused on its vision for the data warehouse market and the market is responding with guarded optimism, so good execution in 2012 is essential. It is important to remember that almost any data warehouse DBMS can deliver batch/bulk load support, basic reporting and essential OLAP. It is the addition of data mining, operational BI and real-time components that creates complexity. As data warehouse workload “mixes” differ between organizations, and the adoption of warehouse capability also varies, it is imperative to perform an appropriate assessment of needs and requirements to determine the best platform for any implementation.

Many trends, such as poor data warehouse performance, heavy competition and widely disparate marketing claims, will continue through 2012 and beyond. In 2011, we saw a surge in the desire to explore and pilot big data analytics and processing, mistakenly isolated to MapReduce solutions in most inquiries. However, this surge in interest is also driving the use of MapReduce technologies into Gartner’s “Trough of Disillusionment.” As a result, data warehouse DBMS vendors must participate in the marketplace to support big data solutions, and how well vendors’ solutions executed throughout 2011 had a significant impact on their execution ratings.

Most importantly, the solution can take any form of execution: a separate product, integration with current products, or even delivery via a partner (see “Data Warehousing Trends for the CIO, 2011-2012” for new and continuing trends in this research). In 2011, we saw vendor consolidation, with the larger vendors acquiring many of the innovators and adding their visionary capabilities to more traditional market delivery. This had the effect of causing a return to a linear trajectory in the analysis results, which means that innovations in delivery models, big data beyond volume management and distributed analytics could be a wide-open, visionary space in 2012.

This Magic Quadrant analysis deals with the key capabilities of your business analytics framework and your information capabilities framework. As such, it should interest anyone involved in defining, purchasing, building and/or managing a data warehouse environment – notably CIOs, chief technology officers, members of business intelligence (BI) competency centers, infrastructure, database and data warehouse architects, DBAs and IT purchasing departments.

Gartner’s Magic Quadrant process also includes contacts and research regarding vendors that ultimately are not included in this analysis. This process adds new vendors to the research on a periodic basis (for example, Exasol in 2012). For 2013, we are planning a section in the Magic Quadrant analysis on vendors considered but not included.

Market Overview

In 2011, the economic “new normal” became better understood, and organizations in nearly every vertical market began to target more holistic and comprehensive efforts to leverage the information available to them as a means of differentiating their business performance.

In 2010, revenue in the relational DBMS market was up almost 10% over 2009, at $20.7 billion. There are many factors contributing to the DBMS market growth, only one of which is the implementation of data warehouses supporting analytics. However, the combination of consumerized information management with consumer-driven analytics makes a strong case for asserting that data warehouse implementations were a significant contributor to market growth in 2011.

DBMS licenses can be implemented for any information management use case (for example, analytics, OLTP, metadata management and master data management), which means that the size of the data warehouse market can only be estimated rather than reported as actual revenue.

The LDW demand in the marketplace is significant, but the approach is being pursued primarily by organizations that lead in analytics architecture. The LDW incorporates a combined infrastructure of repositories, data virtualization, distributed processing, system auditing metadata, end-user service-level declarations and a decision engine that determines which of the available data solutions best satisfies the negotiation between the SLAs and the system auditing results.
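To make the decision-engine idea concrete, the following is a minimal, purely illustrative Python sketch of how such a component might weigh SLA declarations against system auditing results; the class, field names, thresholds and figures are hypothetical assumptions, not any vendor’s design.

    from dataclasses import dataclass

    @dataclass
    class DataSolution:
        name: str                  # e.g. a repository, a virtualized view, a cluster
        audited_latency_ms: float  # observed query latency from system auditing
        freshness_minutes: float   # observed data latency (load to availability)

    def choose_solution(solutions, max_latency_ms, max_freshness_minutes):
        """Pick the solution that meets both SLA declarations, preferring the
        fastest; return None if nothing satisfies the SLA."""
        eligible = [
            s for s in solutions
            if s.audited_latency_ms <= max_latency_ms
            and s.freshness_minutes <= max_freshness_minutes
        ]
        return min(eligible, key=lambda s: s.audited_latency_ms, default=None)

    solutions = [
        DataSolution("central_repository", audited_latency_ms=900, freshness_minutes=60),
        DataSolution("virtualized_sources", audited_latency_ms=2500, freshness_minutes=1),
        DataSolution("distributed_cluster", audited_latency_ms=400, freshness_minutes=240),
    ]

    # A dashboard query declaring a 1-second latency SLA and tolerating hour-old data:
    print(choose_solution(solutions, max_latency_ms=1000, max_freshness_minutes=60))
    # -> central_repository (the cluster is faster, but its data is too stale)

In a real LDW, the auditing results would be continuously refreshed and the negotiation would cover many more dimensions than the two used here.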

During the period from September 2010 through November 2011, Gartner inquiries mentioning some aspect of the LDW design increased from virtually nil to approximately 15% of data warehouse inquiries. We anticipate that inquiries regarding the LDW hybrid environment will increase at a faster rate, with some aspect of the approach appearing in 25% (or slightly higher) of data warehousing advice inquiries by the end of 2012. However, actual market adoption will be uneven and very few (if any) fully deployed LDWs will exist by the end of 2012. We anticipate that it will remain a strong component of the vision evaluation criteria for some time.

A more subtle aspect of the LDW is that it completely changes the definition of the “size” of a data warehouse, away from repository concepts and toward access and performance. Performance, and information asset value defined by ease of access and the ability to apply information to use cases, will become the new and most important value metrics. Even with early adoption, the impact on this year’s Magic Quadrant is primarily related to vendor vision. In “Does the 21st Century ‘Big Data’ Warehouse Mean the End of the Enterprise Data Warehouse?” Gartner released the LDW concept after nearly 20 months of tracking the phenomenon.

The ability to address the volume, variety, velocity and complexity issues that constitute the quantitative side of big data made up a significant portion of the Ability to Execute in 2011, and we anticipate that its importance will increase by the end of 2012. In “’Big Data’ Is Only the Beginning of Extreme Information Management” Gartner defined a 12-dimensional representation of big data solution design, which we call extreme information management issues. Prior to the complete vision, Gartner expressed the quantitative aspects of extreme information management (big data), and first identified this trend by stating: “Many organizations that are creating large amounts of data that need to be analyzed and used are turning to MapReduce-enabled DBMSs to gain performance by processing these large sets of data in a parallel environment.”
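As a concrete illustration of the parallel-processing pattern that quote refers to, here is a minimal, hypothetical map/reduce sketch using only Python’s standard library; the partitioning scheme and record fields are illustrative assumptions and do not represent any vendor’s MapReduce-enabled DBMS.

    from collections import Counter
    from functools import reduce
    from multiprocessing import Pool

    def map_partition(records):
        # Map step: count events per customer within one partition of the data.
        return Counter(record["customer_id"] for record in records)

    def merge_counts(left, right):
        # Reduce step: merge two partial counts into one.
        left.update(right)
        return left

    if __name__ == "__main__":
        # Hypothetical partitions of clickstream records, spread across workers.
        partitions = [
            [{"customer_id": "c1"}, {"customer_id": "c2"}],
            [{"customer_id": "c1"}, {"customer_id": "c3"}],
        ]
        with Pool() as pool:
            partials = pool.map(map_partition, partitions)   # map runs in parallel
        totals = reduce(merge_counts, partials, Counter())   # reduce combines results
        print(totals)  # Counter({'c1': 2, 'c2': 1, 'c3': 1})

The appeal for large data sets is that the map step scales out across many machines or cores, while the reduce step only has to combine small partial results.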

As part of the answer to a troubled economic environment, the data warehouse has become a central element in information management and analytics for organizations in differentiating their performance relative to their peers. Organizations expect architecture and implementation leadership from vendors’ professional services and support organizations and/or their partner and distribution channels.

Gartner first identified this trend as part of overall data warehouse delivery in its Magic Quadrant analysis in 2010 (for 2009 market research), where we stated that vendors had placed “significant and appropriate emphasis on the formalization of professional services to support data warehouse delivery in 2009. Some have purchased consultancy organizations, others have introduced formal approaches for identifying best practices from their existing field delivery teams and are creating standards of delivery based on those experiences.” In our 2011 Magic Quadrant analysis, this became an important execution evaluation criterion, and under a solution-selling model throughout 2011, organizations implementing warehouses attributed significant positive effects to the presence of qualified professional services teams.

Appliances remain popular, and most data warehouse environments will eventually include an appliance somewhere. However, the market has not yet determined the acceptable threshold for how much of the environment should be appliance-driven. It is important to note that while appliances continued to be popular in 2011, the No. 1 complaint is inflexibility regarding hardware. Further, after 30 years of Teradata, seven years of Netezza and three years of Oracle Exadata, the appliance market (even if all of Teradata, Exadata, IBM/Netezza and others are included) constitutes less than 15% of the delivered units in the total data warehouse market. Given that most large data warehouses undergo a major revision and retrofit between years five and seven of their life cycle, the timing indicates that 2012 to 2013 could see an acceleration of appliance adoption.


In 2011, the draw toward analytics provided significant new opportunities for entrants into the market, or simply new opportunities for some struggling vendors already in it. The NoSQL movement (which really means “not only SQL”) has opened the door to information repositories that more closely resemble content systems than relational databases.

The market also changed in another way. Many of the previous Visionaries were acquired by the megavendors (IBM/Netezza, HP/Vertica, SAP/Sybase and EMC/Greenplum), and hardware and infrastructure vendors found themselves searching for less threatening partners with which to share and build their own channels. As a result, opportunities for the smaller data warehouse DBMS vendors abound as the market builds a new set of options for configured infrastructures and ecosystem partners.

At the same time, some challenges have emerged for the traditional leaders. Oracle has more than three years’ experience in the market with Exadata, which is an inflection point for managing and scaling most data warehouses, making 2012 a bellwether year for determining whether Oracle’s appliance strategy will continue to grow or pause.

Additionally, IBM has begun to leverage the Netezza acquisition by gaining significant new customers (especially relative to Linux). Teradata’s appliance strategy for its 2600s and 1600s series of products has resulted in both an “on ramp” for more Teradata customers and a perimeter defense against the incursions of other appliance vendors.

The market demand is clear: more data mining is competing with more reporting and more basic analytics, in a manner that is approaching a 24-hour/day operational window. Some vendors have embraced a dual strategy by developing or acquiring fast replication and synchronization technology between two identical “warehouses,” while others advise their customers to scale a single warehouse with more processing capacity and memory as well as load balancing; most of the Leaders offer multiple alternatives.

Gartner clients reported an increasing number of “dual” warehouses in 2011. Sometimes these warehouses are two-tiered, with a base warehouse underneath and a query-optimized second warehouse in production above it (these are complete copies of the warehouse, simply stored differently). These are sometimes referred to as side-by-side operations. However, regardless of what it is named, this is an optimization strategy based on separating physical workloads – usually isolating loads and basic reporting or basic OLAP from the more data-intensive data mining efforts.

Organizations are also seeking alternatives to the traditional model in which they own the software licenses, servers and storage. The managed services warehouse is gaining market traction, with alternatives offered by IBM’s managed services, HP (via EDS) and even Cognizant (a professional services vendor). Data warehouse database as a service (dbaaS) vendors such as Kognitio and 1010data offer DBMS implementations hosted off-site on behalf of their customers.

We continue our stance from 2011 into 2012 that POCs are not only mandatory for evaluating implementation options, but should also comprise comprehensive examples of each of the workload types that regularly occur in your own data warehouse.

Organizations executing POCs using their own data at their own sites have reported experiences different from common market experiences with their chosen vendor. Additionally, Gartner clients report that one of the most important results of POCs is simply assessing how quickly a solution can be deployed and configured for operations, even though the vendor’s POC team can overtly influence this experience. For the same reason, while lab-based POCs are acceptable for examining workload mix and performance metrics in general, they do not give specific information about your actual time to delivery.

The data warehousing solution space now exhibits two highly distinct populations: traditional data warehouses, and hybrid-enabled warehouses combining structured data and content (either in one management system or via DBMS-enabled functionality such as UDFs, managed external processing and so on). The traditional data warehouse solution continues to pursue high-performance, integrated data analysis, primarily for structured or tabular data. The performance demands in this space continue to rise.

The hybrid warehouse takes many forms, but in general, the market is demanding repository, virtualization and distributed processing capability, managed by a single system and able to respond to various use cases, which is another incarnation of the logical data warehouse.

In addition, we believe the data warehouse DBMS market will continue to change in 2012 to fulfill the demand for high speed, lower latency and large volumes of data brought about by new high-value applications.

As stated in the previous version of this Magic Quadrant analysis, we believe vendors have begun to establish their positions in preparation for a major battle over the data management role in the enterprise. Vendors that do not differentiate their offerings will either leave the market by choice or be forced out by economic necessity.


Once vendors have established their positions, the major tussle will begin toward the end of 2013. It is becoming clearer that this will represent a major upheaval in the market, one that the larger vendors need to prepare for and that will give smaller vendors a market opportunity.

The new analytics infrastructure is a combination of services, platforms, repositories, metadata and optimization techniques that all work in concert. The “data warehouse” will become “data warehousing” – again. The concept of a single grand repository managing all the information for analytics use cases will be increasingly challenged, and by around 2017 a new infrastructure of highly distributed processes and information assets will have emerged.

As described in “The State of Data Warehousing in 2011,” several aspects of this battle are emerging:

• The combination of repositories, data virtualization and data buses is now possible, given the state of hardware technology.

• The reduced influence of BI platform optimization, in favor of DBMS optimization.

• The increasing influence of master data management and data quality.

• The demand for cloud solutions.

• The rising demand for combining structured and content information.

Organizations have expressed an interest in technical solutions that is starting to erode the 2009 effect, whereby everyone sought vendor financial viability. A spirit of experimentation, pilot schemes and prototypes has re-emerged. Organizations are reminded to closely align their analytics strategies with vendor road maps when choosing vendors.

The data warehouse DBMS market is complex, with a mix of mature and new products. Its complexity reflects many factors, such as:

• The need for DBMSs to support database sizes ranging from 2 terabytes to more than 1 petabyte.

• The complexity of data in data warehouses, not only in terms of interrelationships, but also of desired data types.

• The fact that data warehouses are built on many different hardware platforms and operating systems, which a DBMS needs to support.

• The growing and regularly changing variety of operations performed in data warehouses, which requires continuous management of the DBMS.

• A DBMS has to support workloads ranging from simple to complex and to manage mixed workloads in many different combinations.

• Users are getting better at creating specific SLAs, and the implications of not meeting them are more serious.

Note 1
Definition of a Data Warehouse Appliance

A prepackaged or preconfigured, balanced set of hardware (servers, memory, storage and input/output channels), software (OS, DBMS and management software), service and support, sold as a unit with built-in redundancy for high availability and positioned as a platform for data warehousing. Further, it must be sold on the basis of the amount of SSED (“raw data”) to be stored in the data warehouse, and not on the basis of configuration (for example, the number of servers or storage spindles). Our performance criteria have some flexibility to accommodate vendors that have several variations, based on desired performance SLAs and the type of workload intended for the appliance. Our primary concern is that the user (buyer) cannot change the configuration for budgetary reasons and thereby adversely affect the performance of the appliance.

The data warehouse DBMS has evolved from being an information store, to supporting reporting and traditional BI platforms, and now into a broader analytics infrastructure supporting operational analytics, performance management and other new applications and uses, such as operational BI, real-time fraud detection, consumer experience personalization and operational technologies (technologies that stream data from devices such as smart meters). Organizations are adding workloads with OLTP access, and data loading latency is falling toward near-continuous loading.

There are many other aspects to the data warehouse DBMS market, such as pricing models, geographic reach, partner channels, third-party software partnerships and data warehouse services (see “The State of Data Warehousing, 2012” [forthcoming] and “Data Warehousing Trends for the CIO, 2011-2012” for further information on these trends).

Source: Gartner Research, G00219281, Mark A. Beyer, Donald Feinberg, Merv Adrian, Roxane Edjlali, 6 February 2012


Note 2
Definition of Mixed Workload

The modern complex mixed workload consists of:

• Continuous (near-real-time) data loading, similar to an OLTP workload (due to the updating of indexes and other optimization structures in the data warehouse), which creates issues for summary and aggregate management to support dashboards and prebuilt reports.

• Batch data loading, which persists as the market matures and starts to realize that not all data is required for “right time” latency and that some information, being less volatile, does not need to be refreshed as often as more dynamic real-time data elements.

• Large numbers of standard reports, thousands per day, requiring SQL tuning, index creation, new types of storage partitioning and other types of optimization structure in the data warehouse.

• Tactical business analytics, in which business process professionals with limited query language experience use prebuilt analytic data objects with aggregated data (prejoins) and designated dimensional drill-downs (summaries). They rely on a BI architect to develop commonly used cubes or tables.

• An increasing number of truly ad hoc query users (data miners) with random, unpredictable uses of data, which implies a lack of ability to tune specifically for these queries.

• The use of analytics and BI-oriented functionality in OLTP applications, creating a highly tactical use of the data warehouse as a source of information for OLTP applications requiring high-performance queries. This is one force driving the requirement for high availability in the data warehouse.

Note 3
Definition of Extreme Information

Issues of “extreme information” arise from the simultaneous and persistent interaction of extreme volume, diversity of data format, velocity of record creation, variable latencies and the complexity of individual data types within formats. Big data is another popular term for this concept, but it encourages a focus on a single aspect (volume), creating definitional issues, which will remain unresolved in the market (see “’Big Data’ Is Only the Beginning of Extreme Information Management”).

Note 4
Definition of Mission-Critical Systems

Mission-critical systems are systems that support business processes and the generation of revenue and that, if absent for a period of time determined by the organization and its service-level agreements, must be replaced by manual procedures to prevent loss of revenue or unacceptable increases in business costs. Normally, mission-critical systems require high-availability systems and disaster recovery sites. We include the use of a DBMS as a data warehouse engine in the mission-critical systems category, as we believe that many, if not most, data warehouses in use today fit the definition of mission-critical.


Evaluation Criteria Definitions

Ability to Execute

Product/Service: Core goods and services offered by the vendor that compete in/serve the defined market. This includes current product/service capabilities, quality, feature sets, skills, etc., whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.

Overall Viability (Business Unit, Financial, Strategy, Organization): Viability includes an assessment of the overall organization’s financial health, the financial and practical success of the business unit, and the likelihood of the individual business unit to continue investing in the product, to continue offering the product and to advance the state of the art within the organization’s portfolio of products.

Sales Execution/Pricing: The vendor’s capabilities in all pre-sales activities and the structure that supports them. This includes deal management, pricing and negotiation, pre-sales support and the overall effectiveness of the sales channel.

Market Responsiveness and Track Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor’s history of responsiveness.

Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization’s message in order to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This “mind share” can be driven by a combination of publicity, promotional, thought leadership, word-of-mouth and sales activities.

Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements, etc.

Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.

Completeness of Vision

Market Understanding: Ability of the vendor to understand buyers’ wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen and understand buyers’ wants and needs, and can shape or enhance those with their added vision.

Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.

Sales Strategy: The strategy for selling product that uses the appropriate network of direct and indirect sales, marketing, service and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.

Offering (Product) Strategy: The vendor’s approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature set as they map to current and future requirements.

Business Model: The soundness and logic of the vendor’s underlying business proposition.

Vertical/Industry Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including verticals.

Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the “home” or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.


About Sybase

Sybase, an SAP company, is the industry leader in delivering enterprise and mobile software to manage, analyze and mobilize information. We are recognized globally as a performance leader, proven in the most data-intensive industries and across all major systems, networks and devices. Our information management, analytics and enterprise mobility solutions have powered the world’s most mission-critical systems in financial services, telecommunications, manufacturing and government. For more information, visit www.sybase.com. Read Sybase blogs: blogs.sybase.com. Follow us on Twitter at @Sybase.