The Database Report July 2013

With the second quarter of 2013 now behind us, it can only mean one thing: it must be time to review what happened in the database systems market during the past quarter. The quarter brought the typical news of acquisitions and quarterly financial results. But it also brought some big news in terms of new technology announcements, as well as the publication of the annual Data Breach Investigations Report from Verizon.

So let’s dig into this quarter’s edition of The Database Report and examine the news and happenings in the world of data and database systems during the second quarter of 2013.

Verizon Data Breach Investigations Report

Let’s start off by highlighting some of the findings of the Verizon Data Breach Investigations Report (DBIR), which was unleashed in April. Verizon has been publishing the DBIR annually since 2009.

The number of data records compromised in 2012 seems to have diminished. The DBIR reports that 44.8 million records were compromised in 2012, as compared with 174.5 million in 2011. So shouldn’t we all be celebrating? Well, not so fast. The report goes on to clarify that 44.8 million should be considered only a lower bound on the true number of compromised records, because an accurate count of compromised assets could not be determined in 85 percent of the breaches. The other interesting aspect of the total is that most of the 44.8 million records can be traced back to just a few very large data breaches – a trend that continues from past years.

So what type of data was being breached last year? The DBIR places large-scale financial cybercrime (75 percent) and state-affiliated espionage (20 percent) at the top of the list. Nevertheless, 2012’s data breach victims span a wide range of industries, from financial organizations to retailers, restaurants, manufacturing, transportation, utilities, professional services, and so on. It would seem that, whereas some industries are magnets for data breaches, no particular industry is immune from having its data compromised.

The most important takeaway from the DBIR, in my opinion, is that organizations are not doing enough to identify and prevent data breaches. The report contains details on the difficulty of initial compromise, with a surprisingly large percentage of “very low” and “low” marks. But even more troubling is that the difficulties of subsequent actions (after the initial compromise) are also reported as very low to low (although, to be fair, the difficulty does increase at this stage).

Adding to the problem is that once data is breached, organizations often do not discover the breach until much, much later. The percentage of data breaches that remained undiscovered for months or more stood at 66 percent for 2012.

So how does data get breached? That is, what methods of attack are deployed to get at the data? The leaders were weak or stolen user names and passwords (76 percent), followed by malware (40 percent), physical attacks (35 percent) and social tactics, such as phishing (29 percent). So protect your login credentials, okay?!?!

A full copy of the Verizon DBIR can be downloaded – free of charge – over the web at: http://verizonenterprise.com/DBIR/2013/.

Finance and Numbers

Let’s turn our attention now to the quarterly financial announcements of the major data and database organizations: IBM, Microsoft, Oracle, and SAP. Starting with IBM…

IBM’s Fourth Quarter

IBM reported fourth quarter 2012 earnings of $5.13 per share, compared with earnings of $4.62 per share in the fourth quarter of 2011, an increase of 11 percent. Net income for the fourth quarter was $5.8 billion compared with $5.5 billion in the fourth quarter of 2011, an increase of 6 percent. Total revenues for the fourth quarter of 2012 of $29.3 billion decreased 1 percent (flat adjusting for currency) from the fourth quarter of 2011.

“We achieved record profit, earnings per share and free cash flow in 2012. Our performance in the fourth quarter and for the full year was driven by our strategic growth initiatives — growth markets, analytics, cloud computing, Smarter Planet solutions – which support our continued shift to higher-value businesses,” said Ginni Rometty, IBM chairman, president and chief executive officer.

For the full year of 2012, net income was up 5 percent to $16.6 billion as compared with $15.9 billion in 2011. Revenues for 2012 came in at $104.5 billion, a decrease of 2 percent as compared with $106.9 billion in 2011.

IBM proudly touted its mainframe (System z) revenue for the quarter, which increased 56 percent as compared with the same quarter last year. Delivery of System z computing power, as measured in MIPS (millions of instructions per second), increased 66 percent versus the same quarter in 2011, representing the biggest quarter in terms of MIPS shipped in the company’s history. New workload specialty engines, including Linux, represented one-half of the MIPS shipped.

The mainframe growth is remarkable given the image it has in the marketplace, but evidently mainframe customers continue to invest in the platform as a secure, stable, high-performance platform. This comes as no surprise to those who work with mainframe technology every day, but perhaps it surprises the Linux and Windows crowd… And the mainframe is gaining new customers, not just churning the existing base. Since the third quarter of 2010, IBM claims to have added 200 new System z accounts. Additionally, 90 new ISVs delivered products for System z in 2012 alone.

To put the mainframe success into perspective, let’s take a peek at IBM’s other hardware lines of business. Revenues from Power Systems decreased 19 percent as compared with the same quarter in 2011; System x revenues decreased 2 percent and storage revenues decreased 5 percent.

Software did well this quarter for IBM, too. Software revenues were $7.9 billion, an increase of 3 percent over the fourth quarter of 2011. Revenues from IBM’s key middleware products, which include WebSphere, Information Management, Tivoli, Lotus and Rational products, were $5.5 billion, an increase of 5 percent as compared to the fourth quarter of 2011. For the Information Management segment, which includes DB2, software revenues increased 2 percent over the same quarter last year. In other IBM software: operating systems revenues were flat at $709 million, WebSphere revenues increased 11 percent year over year, Tivoli (systems management) revenues increased 4 percent, Lotus increased 9 percent and Rational increased 12 percent.

IBM said that it expects to deliver full-year 2013 earnings per share of at least $15.53.

Later, toward the end of April, the IBM Board of Directors approved a 12 percent increase in the quarterly cash dividend it pays to shareholders. The increase will be $0.10 per share on top of the prior quarterly dividend of $0.85 per share. After paying the June 10 dividend, IBM will have paid consecutive quarterly dividends every year since 1916.

At the same time, IBM’s board authorized $5 billion to be used in the company’s stock repurchase program. This amount is in addition to approximately $6.2 billion remaining at the end of March from a prior buyback authorization. With this new authorization, IBM will have approximately $11.2 billion for its stock repurchase program. IBM expects to request additional share repurchase authorization at the October 2013 board meeting. IBM has reduced its share count by a third since the beginning of 2000.

Ginni Rometty, IBM chairman, president and chief executive officer said, “IBM’s business model focused on higher value and continuous transformation continues to generate strong profit and cash flow. This enables the company to deliver value to our shareholders.”

Microsoft’s Third Quarter
In mid-April Microsoft announced its third quarter revenue, which came in at $20.49 billion for the quarter ended March 31, 2013. Operating income, net income, and diluted earnings per share for the quarter were $7.61 billion, $6.06 billion, and $0.72, respectively.

“The bold bets we made on cloud services are paying off as people increasingly choose Microsoft services including Office 365, Windows Azure, Xbox LIVE, and Skype,” said Steve Ballmer, chief executive officer at Microsoft. “While there is still work to do, we are optimistic that the bets we’ve made on Windows devices position us well for the long-term.”

The Microsoft Business Division posted $6.32 billion of revenue, an 8 percent increase from the same period of the prior year. The Server & Tools business reported $5.04 billion of revenue, an 11 percent increase from the prior year period. Microsoft reported that the results for this division were driven by double digit percentage revenue growth in SQL Server and System Center.

Microsoft’s other business divisions performed very well, too. The Windows Division posted revenue of $5.70 billion, a 23 percent increase from the prior year period; the Online Services Division reported revenue of $832 million, an 18 percent increase; and the Entertainment and Devices Division posted revenue of $2.53 billion, an increase of 56 percent.

Microsoft provided revised operating expense guidance, adjusting downward to a range of $30.2 billion to $30.5 billion for the full year ending June 30, 2013. The adjustment is due to the European Commission fine (€561 million) imposed on Microsoft in early March for failing to respect an antitrust settlement with regulators. Microsoft had agreed to give users who purchased new computers in Europe a ballot screen that would allow them to easily download other browsers from the Internet and to turn off Microsoft Internet Explorer. And the EU allowed Microsoft to monitor its own compliance with the deal. Evidently, as this fine shows, Microsoft’s self-monitoring wasn’t good enough! Microsoft issued the following statement:

“We take full responsibility for the technical error that caused this problem and have apologized for it. We provided the Commission with a complete and candid assessment of the situation, and we have taken steps to strengthen our software development and other processes to help avoid this mistake – or anything similar – in the future.”

Microsoft also provided preliminary fiscal year 2014 operating expense guidance of $31.6 billion to $32.2 billion, representing growth ranging from 4 percent to 6 percent.

Microsoft continues to pay dividends on its stock, too. In March, the company announced that its board of directors declared a quarterly dividend of $0.23 per share, payable June 13, 2013 to shareholders of record on May 16, 2013.

Oracle’s Third Quarter
Things were not quite as rosy at Oracle, as the company missed its third quarter numbers. In late March the company announced that fiscal 2013 Q3 total revenues were down 1 percent to $9.0 billion. New software licenses and cloud software subscriptions revenues were down 2 percent to $2.3 billion. Software license updates and product support revenues were up 7 percent to $4.3 billion. Hardware systems products revenues were $671 million.

The press release did not specify the percentage decline or advance for hardware systems, which is indicative of just how bad that number was – down 23 percent from a year ago.

In what may be related news, it was reported in early May by Business Insider that Oracle had laid off employees in the legacy hardware business units… that is, the units acquired in the Sun Microsystems acquisition (circa 2010). However, the same source indicates that the layoff was tiny and involves perhaps a few dozen people. If interested, more details are available at http://www.businessinsider.com/oracle-laid-off-a-few-more-people-today-sources-say-2013-5#ixzz2TKfuXZmm.

So what is the trouble at Oracle? Well, in my opinion, Oracle is facing a lot of competition in its core businesses while struggling to become a significant hardware player. Its flagship database software is being challenged by the Hadoop/NoSQL crowd (as well as by the traditional IBM and Microsoft competition). SAP is pushing HANA for its packaged applications, and much of that business went to Oracle databases in the past. The SaaS/cloud business has strong competition from the likes of Salesforce.com. And its hardware division is viewed as aging. Oracle seems to be more interested in combining its software and hardware into packaged solutions like Exadata… and that is a good strategy. It just can’t be the only strategy!

“This month we will begin deliveries of servers based on our new SPARC T5 microprocessor: the fastest microprocessor in the world,” said Oracle CEO, Larry Ellison. “The new T5 servers can have up to eight microprocessors while our new M5 system can be configured with up to thirty-two microprocessors. The M5 runs the Oracle database 10 times faster than the M9000 it replaces.”

Will this be enough to bolster Oracle’s sagging hardware business? We will have to wait and see.

SAP’s Fourth Quarter
SAP also experienced a very successful fourth quarter and full year 2012. The company announced that revenue grew 14 percent year-on-year and exceeded 16 billion Euros. Software and software-related service revenue grew 17 percent year-on-year to 13.2 billion Euros. This marked the 12th consecutive quarter of double digit software and software-related service revenue growth. Software and cloud subscriptions revenue grew 21 percent year-on-year to 5 billion Euros.

“In 2012, SAP … invested in our flagship innovation SAP HANA and strengthened the industry’s best portfolio in the cloud. We delivered industry-specific solutions, accessible anywhere on the mobile device,” said SAP Co-CEOs Bill McDermott and Jim Hagemann Snabe. “Our momentum has never been stronger. We are very well positioned to achieve our 2015 goals.”

SAP HANA growth continued during the fourth quarter with nearly 200 million Euros in software revenue in the fourth quarter and almost 400 million Euros for the full year.

In June, the SAP board of directors and its shareholders approved a dividend of 0.85 Euros per share. This represents an increase of 0.10 Euros, or 13 percent, compared with last year’s regular dividend of 0.75 Euros.

The Quarterly Acquisition Roundup

As usual, the major database vendors made their share of acquisitions during the past quarter, and we will take a look at each of the data-related acquisitions that occurred. Additionally, a major independent software vendor in the database tools market announced that it was to be acquired by a private investor group. So we’ll examine the ramifications of that one, too. But let’s start, as usual, with Oracle.

On March 13, 2013, Oracle announced it had agreed to acquire Nimbula, a provider of private cloud infrastructure management software. Terms of the deal were not disclosed. Nimbula’s technology helps companies manage infrastructure resources to deliver service quality and availability, as well as workloads in private and hybrid cloud environments. Nimbula’s product is expected to be integrated with Oracle’s cloud offerings. But is this too little, too late? Perhaps Nimbula can bolster Oracle’s cloud expertise, but will it help to bolster Oracle’s sales in the cloud arena?

Then, a few weeks later on March 25, 2013, Oracle announced that it would acquire Tekelec, a provider of network signaling, policy control, and subscriber data management solutions for communications networks. Terms of the agreement were not disclosed. Tekelec’s solutions are deployed by more than 300 service providers in over 100 countries. Oracle plans to make Tekelec’s network signaling, policy control and subscriber data management solutions a part of its Oracle Communications portfolio to help service providers efficiently allocate network resources and monetize personalized communications services.

“As connected devices and applications become ubiquitous, intelligent network and service control technologies are required to enable service providers to efficiently deploy all-IP networks, and deliver and monetize innovative communication services,” said Bhaskar Gorti, senior vice president and general manager, Oracle Communications. “The combination of Oracle and Tekelec will provide service providers with the most complete solution to manage their businesses across customer engagement, business and network operations, service delivery and end user applications.”

It would appear that now is the time for consolidation in the cloud world, as IBM agreed to acquire SoftLayer Technologies, a cloud computing company, in early June. Estimated revenue for the company is about $400 million a year. Although the financial terms of the deal were not disclosed, industry scuttlebutt pegged the price at around $2 billion. Now I’m just guessing (not really), but that is probably a lot more than Oracle paid for Nimbula. But IBM gets a lot more for the price.

“As businesses add public cloud capabilities to their on-premise IT systems, they need enterprise-grade reliability, security and management. To address this opportunity, IBM has built a portfolio of high-value private, public and hybrid cloud offerings, as well as software-as-a-service business solutions,” said Erich Clementi, Senior Vice President, IBM Global Technology Services. “With SoftLayer, IBM will accelerate the build-out of our public cloud infrastructure to give clients the broadest choice of cloud offerings to drive business innovation.”

Headquartered in Dallas, Texas, SoftLayer serves approximately 21,000 customers with a global cloud infrastructure platform spanning 13 data centers in the United States, Asia and Europe. Among its many cloud infrastructure services, SoftLayer allows clients to buy enterprise-class cloud services on dedicated or shared servers, offering clients a choice of where to deploy their applications. These clients should benefit from new enterprise grade functionality from IBM as it gets integrated into the SoftLayer services.

At the same time as this announcement, IBM also announced the formation of a new Cloud Services division. Following the close of the acquisition of SoftLayer, which is expected in 3Q 2013, this new division will combine SoftLayer with IBM SmartCloud into a global platform. The new division will be headed by Erich Clementi, Senior Vice President, IBM Global Technology Services.

Earlier in the quarter, in late April, IBM announced the acquisition of UrbanCode Inc. Based in Cleveland, Ohio, UrbanCode automates the delivery of software, helping businesses quickly release and update mobile, social, big data and cloud applications. Financial terms were not disclosed. The UrbanCode offerings have been added to IBM’s Rational business unit.

IBM believes UrbanCode’s software to be a natural extension of its DevOps strategy, designed to simplify and speed the entire software development and delivery process for businesses. The new capabilities also enhance IBM SmartCloud and IBM MobileFirst initiatives by making it easier and faster for clients to deliver software through those channels. The UrbanCode solution also works with traditional applications including middleware, databases and business intelligence.

“Companies that master effective software development and delivery in rapidly changing environments such as cloud, mobile and social will have a significant competitive advantage,” said Kristof Kloeckner, general manager, IBM Rational Software. “With the acquisition of UrbanCode, IBM is uniquely positioned to help businesses from every industry accelerate delivery of their products and services to better meet client demands.”

Microsoft also remained acquisitive when in late March the company bought NetBreeze, a social analytics provider based in Switzerland. Terms of the deal were not disclosed.

NetBreeze’s claim to fame is modern Natural Language Processing (NLP), which it uses along with data mining techniques to monitor a large number of social networks, including Facebook, YouTube, and Twitter, along with 6,000 online news sites, 18 million blogs and 500,000 message boards. Microsoft claims that its intention is to deliver social monitoring and analytics features as an integral component of the user experience and to provide these capabilities to all roles and functions in an organization. The NetBreeze solutions can be deployed by Microsoft to accelerate that vision.

SAP participated in the acquisition game by announcing plans to acquire hybris, a provider of enterprise software and on-demand solutions for multi-channel commerce, master data management and order management. Terms of the deal were not disclosed.

Founded in 1997 with headquarters in Zug, Switzerland, hybris provides what it calls a complete omni-channel commerce platform incorporating Web, mobile, call center and store solutions. Upon completion of the transaction, expected in the third quarter of 2013, hybris will operate as an independent business unit and will retain its existing management team, led by Ariel Lüdi, CEO, and Carsten Thoma, president and co-founder.

SAP co-CEOs Bill McDermott and Jim Hagemann Snabe said “hybris puts SAP on the leading edge of the consumer economy. With hybris, SAP has made a decisive move to raise the stakes in customer relationship management and define the next generation customer experience.”

SAP believes that the acquisition of hybris will position it to deliver a next-generation e-commerce platform, with the choice of on-premise or cloud deployment. By combining SAP’s enterprise solutions with the agile commerce solutions of hybris, SAP will attempt to flexibly support enterprises with the enhanced data and tools necessary to optimize margins and customer loyalty.

Another small but interesting item on the acquisition radar arose in late April, as Actian Corporation purchased ParAccel. Actian is slowly but surely building itself into a company to be reckoned with in the Big Data space. The company purchased Pervasive Software last quarter, and completed its acquisition of Versant Corporation late in 2012.

By purchasing ParAccel, a provider of high-performance analytics, Actian adds some high profile customers to its portfolio including Amazon, The Royal Bank of Scotland, and OfficeMax.

“Today’s software will fail to cope with complexities of big data,” said Steve Shine, CEO of Actian. “The Actian software portfolio, which is specifically designed to fully exploit modern hardware architectures, arms organizations with an unmatched single platform to connect to any data, analyze it at scale for relevance and take action to turn big data into business value.”

And finally, at least in terms of M&A, we have the acquisition of systems management juggernaut BMC Software by a private investor group led by Bain Capital and Golden Gate Capital, together with GIC Special Investments Pte. Ltd. and Insight Venture Partners. BMC Software was founded in 1980 by Scott Boulette, John Moores and Dan Cloer – the initials of their last names supplying the company with its moniker. Moores was BMC’s first CEO, and the company went public in 1988.

Under the terms of the agreement, affiliates of the Investor Group will acquire all outstanding BMC common stock for $46.25 per share in cash, or approximately $6.9 billion. The agreement was approved by unanimous vote of those directors present. For a period of 30 calendar days, BMC may solicit alternative proposals from third parties. Personally, I don’t foresee any alternative proposals forthcoming – at least not any that would not warrant close scrutiny by the US Department of Justice.

“After a thorough review of strategic alternatives, the BMC board of directors is pleased to reach this agreement, which provides shareholders with immediate and substantial cash value, as well as a premium to our unaffected share price,” said Bob Beauchamp, chairman and chief executive officer at BMC.

This one will be interesting to watch, especially given the history of past Bain Capital acquisitions. Firms like Bain have a reputation for raising money from outside investors, using that money to buy up struggling companies, and then restructuring them. That usually means layoffs, reduced benefits for workers, and maybe even selling off pieces of the business. The thought process is that, in the end, the company will be leaner and more efficient. And then the firm will try to sell the company at a profit. Who knows what the future actually holds for BMC Software, but if I were a gambling man I’d wager that interesting times are ahead for the company, its employees and its customers.

Legally Speaking

It has been a couple of quarters since we delved into the courtroom expeditions of the major DBMS vendors, so let’s briefly recap. If you are a regular reader, you know that HP and Oracle have been battling in court for the last couple of years. Issues have included intellectual property, executive job hopping and software support for a hardware chip (Itanium).

The last of these lawsuits, regarding Oracle’s support of the Itanium processor used in HP’s servers, was won by HP. The judge ruled that Oracle had to continue to make its database products available for Itanium. The next phase is to determine what, if any, economic damage was caused to HP by Oracle’s statement of intent to stop supporting Itanium.

Earlier in 2013, Oracle tried to appeal the decision and end the Hewlett-Packard lawsuit. The judge disagreed with Oracle’s position and required Oracle to continue to support Itanium servers and to pay damages for losses incurred. The amount of said damages, however, has yet to be fixed by a jury.
Of course, HP claims the decision caused significant damage, while Oracle claims the opposite. It remains to be seen what HP will be awarded, but the company has been reported as seeking anywhere from $500 million to $4 billion in damages. That is quite a range!

An economist who testified on HP’s behalf claimed that the uncertainty caused by Oracle’s decision caused customers to abandon HP. The ongoing battle over damages is now in the hands of the court, and Oracle can file no further appeals until the trial is over.

And In Technology News

The quarter was not all about acquiring technology; there were also a significant number of data- and database-related technology announcements.

IBM made the loudest news on the technology front this quarter with its DB2 BLU Acceleration announcement in early April. Basically, BLU Acceleration adds a column store capability to DB2 10.5 for LUW (with the promise that it will come to z/OS soon). A column store physically stores data as sections of columns rather than as rows of data. By doing so, it can optimize data warehouse queries, customer relationship management (CRM) workloads, and other types of ad hoc queries where aggregates are computed over large numbers of similar data items.
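
To make the distinction concrete, here is a minimal Python sketch (the table and data are invented for illustration) of why a column layout lets an aggregate read only the bytes it needs:

    # A tiny "table" of sales records, stored two ways.
    rows = [
        ("2012-01-05", "EAST", 100.0),
        ("2012-01-06", "WEST", 250.0),
        ("2012-01-07", "EAST", 175.0),
    ]

    # Row store: each record is kept together, so summing one column
    # still drags every field of every row through the scan.
    total_row_store = sum(r[2] for r in rows)

    # Column store: each column is kept together, so the same aggregate
    # touches only the one column it needs.
    columns = {
        "sale_date": ["2012-01-05", "2012-01-06", "2012-01-07"],
        "region":    ["EAST", "WEST", "EAST"],
        "amount":    [100.0, 250.0, 175.0],
    }
    total_column_store = sum(columns["amount"])  # dates and regions untouched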

But BLU Acceleration is not just a column store. IBM delivered three additional capabilities and improvements with BLU Acceleration. The first is called “actionable compression,” which can deliver up to 10x storage space savings. Some of the beta customers are getting 90 to 95 percent data compression for their large data warehouse tables. But why is it called “actionable”? There are two key ingredients that make the compression actionable: (1) new algorithms enable many predicates to be evaluated without having to decompress the data, and (2) the most frequently occurring values are compressed the most, thereby saving the greatest amount of storage space.
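
IBM has not published the algorithms themselves, so here is only a rough Python sketch of the general idea behind evaluating predicates on compressed data, using a simplified frequency-ordered dictionary encoding (not IBM’s actual scheme):

    from collections import Counter

    states = ["NY", "NY", "CA", "NY", "TX", "CA", "NY"]

    # Ingredient (2): give the most frequent values the smallest codes
    # (a real scheme would use variable-length bit codes).
    by_frequency = [value for value, _ in Counter(states).most_common()]
    encode = {value: code for code, value in enumerate(by_frequency)}
    compressed = [encode[v] for v in states]   # [0, 0, 1, 0, 2, 1, 0]

    # Ingredient (1): evaluate WHERE state = 'NY' directly against the
    # codes -- no value is ever decompressed.
    target = encode["NY"]
    matching_rows = [i for i, code in enumerate(compressed) if code == target]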

The second new feature of BLU Acceleration comes via the exploitation of the SIMD (Single Instruction Multiple Data) capabilities of modern CPUs. The basic idea behind SIMD is the ability for a single instruction to act upon multiple data items at the same time, which obviously can speed up processing.
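
DB2’s SIMD exploitation happens deep inside the engine, but the “one instruction, many data items” idea can be approximated in Python with NumPy, which dispatches to vectorized machine code under the covers (the column and predicate here are invented):

    import numpy as np

    amounts = np.array([100.0, 250.0, 175.0, 80.0, 310.0])

    # Scalar thinking: one comparison per loop iteration.
    matches_scalar = [a > 150.0 for a in amounts]

    # SIMD thinking: one vectorized operation evaluates the predicate
    # across many packed values at once.
    matches_vector = amounts > 150.0   # [False, True, True, False, True]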

And finally, BLU Acceleration adds data skipping technology. You can probably guess what this does, but let’s explain it a little bit anyway. The basic idea is to skip over data that is not required in order to deliver an answer set for a query. Metadata is stored for sets of data records that can be accessed by DB2 to determine whether that particular set of data holds anything of interest. If not, it can be skipped over.
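
A short Python sketch of the idea (per-block min/max summaries are a common way to implement data skipping; IBM’s metadata format is its own, so treat this as illustrative only):

    # Each block of records carries lightweight metadata: the minimum
    # and maximum value of a column within that block.
    blocks = [
        {"min": 1,   "max": 90,  "values": [1, 45, 90, 12]},
        {"min": 100, "max": 250, "values": [100, 175, 250]},
        {"min": 5,   "max": 60,  "values": [5, 33, 60]},
    ]

    # Query: WHERE amount > 150. A block whose max is at most 150
    # cannot hold a qualifying row, so it is skipped without being read.
    qualifying = []
    for block in blocks:
        if block["max"] <= 150:
            continue  # nothing of interest here -- skip the whole block
        qualifying.extend(v for v in block["values"] if v > 150)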

BLU Acceleration is being delivered first in DB2 10.5 for LUW. And best of all, it is simple to use. All that is necessary is to specify ORGANIZE BY COLUMN in the DDL for a table to make it BLU.
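
For instance, creating a column-organized table from a Python script might look like the following sketch, using the ibm_db driver (the connection string and table are hypothetical, and a reachable DB2 10.5 instance is assumed):

    import ibm_db

    conn = ibm_db.connect(
        "DATABASE=testdb;HOSTNAME=dbhost;PORT=50000;PROTOCOL=TCPIP;"
        "UID=db2admin;PWD=secret", "", "")

    # ORGANIZE BY COLUMN is the one clause needed to make the table BLU.
    ddl = """
    CREATE TABLE sales (
        sale_date DATE,
        region    VARCHAR(10),
        amount    DECIMAL(12,2)
    ) ORGANIZE BY COLUMN
    """
    ibm_db.exec_immediate(conn, ddl)
    ibm_db.close(conn)

Loads and queries then proceed through plain SQL; DB2 handles the columnar storage, compression, and skipping under the covers.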

So what, you may ask? In my opinion, BLU Acceleration is a very significant milestone in the history of DB2. It brings a column store capability that can be implemented right inside of DB2, without any additional product or technology. So you can implement a multi-workload database implementation for the Big Data era using nothing more than DB2 software. BLU Acceleration provides blazing speed and can act upon large amounts of analytical data. And that is something we all should consider when embarking on our Big Data projects.

In other IBM technology news, the company launched its Flash Storage Appliance initiative with a $1 billion investment in mid-April. “We’re announcing three things,” said Ed Walsh, vice president of storage systems marketing and strategy at IBM. “We’re announcing a $1 billion investment in software and systems across IBM, 12 new Centers of Competency around flash and a new line of products called IBM FlashSystem.”

Flash is a highly efficient re-writable memory that can speed the response times of information gathering in servers and storage systems from milliseconds to microseconds. IBM announced the availability of the IBM FlashSystem line of all-Flash storage appliances, which are based on technology acquired from Texas Memory Systems. The IBM FlashSystem provides organizations instant access to the benefits of Flash. The IBM FlashSystem 820, for example, is the size of a pizza box, is 20 times faster than spinning hard drives, and can store up to 24 terabytes of data – more than twice the amount of printed information stored in the U.S. Library of Congress.

This past April IBM also released Informix Warehouse Accelerator Version 12.10, which brings with it several useful new features, among other new capabilities:

  • Two new methods to refresh data in an existing data mart without having to fully reload the complete data set (automatic partition refresh and trickle feed)
  • Integration of time series data
  • Enhanced SQL support, including UNION/UNION ALL
  • Enhanced privilege control for data mart administration

IBM was busy this quarter, as it also unveiled the new IBM PureData System for Hadoop. Recall that the PureData line is IBM’s family of database appliance products. With the PureData System for Hadoop appliance, IBM aims to significantly reduce the time needed to ramp up and adopt enterprise-class Hadoop systems and applications.

“Big data is about using all data in context at the point of impact,” said Bob Picciano, general manager, IBM Information Management. “With the innovations we are delivering, now every organization can realize value quickly by leveraging existing skills as well as adopt new capabilities for speed and exploration to improve business outcomes.”

Oracle offered up some new technology this quarter, too. In early April the company announced the availability of the Oracle Big Data Appliance X3-2 Starter Rack and Oracle Big Data Appliance X3-2 In-Rack Expansion. The Oracle Big Data Appliance X3-2 Starter Rack contains six Oracle Sun servers within a full-sized rack with redundant Infiniband switches and power distribution units. The Oracle Big Data Appliance X3-2 In-Rack Expansion includes a pack of six additional servers to expand the above configuration to 12 nodes and then to a full rack of 18 nodes.

Both new systems include the existing software stack for Oracle Big Data Appliance X3-2, which consists of Oracle Linux, Oracle Hotspot Java Virtual Machine, Cloudera’s Distribution Including Apache Hadoop (CDH), Cloudera Manager and the Oracle NoSQL Database.

Oracle also unveiled Oracle SQL Developer Data Modeler v3.3, the latest and greatest version of their free data modeling product. New features include improved search capabilities, export and import to Microsoft Excel, better control over DDL generation, and improved reporting functionality, among others. You can download the tool on the Oracle Technology Network product page at http://www.oracle.com/technetwork/developer-tools/datamodeler/downloads/index.html

Meanwhile, this June Microsoft was making news at its annual TechEd conference in New Orleans. The biggest announcement for database folks came during the keynote session when Quentin Clark, Corporate Vice President, Data Platform Group, announced Microsoft SQL Server 2014. Coming just 14 months after SQL Server 2012 shipped, SQL Server 2014 will provide some intriguing new features, including in-the-box delivery of Hekaton in-memory OLTP. By “in the box” Microsoft means that customers will not need to buy specialized hardware or software to use it. As such, customers can migrate their existing applications easily and thereby benefit from the expected performance improvements. Appropriately enough, SQL Server 2014 is currently slated to become generally available early in 2014.
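
Details were still emerging at TechEd, but based on the announced design, opting a table in to Hekaton should be a matter of DDL. Here is a hypothetical sketch using the pyodbc driver (the connection details and table are made up, the database is assumed to already have a memory-optimized filegroup, and the final syntax could change before general availability):

    import pyodbc

    # autocommit avoids wrapping the DDL in an explicit transaction.
    conn = pyodbc.connect(
        "DRIVER={SQL Server Native Client 11.0};SERVER=sqlhost;"
        "DATABASE=testdb;UID=sa;PWD=secret", autocommit=True)

    # A table opts in to in-memory OLTP via the MEMORY_OPTIMIZED option --
    # no specialized hardware or add-on software required.
    ddl = """
    CREATE TABLE dbo.orders (
        order_id INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        amount   DECIMAL(12,2) NOT NULL
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)
    """
    conn.cursor().execute(ddl)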

Microsoft is billing SQL Server 2014 as being designed and developed with cloud-first principles in mind, including built-in in-memory capabilities, new hybrid cloud scenarios and capable of delivering even faster data insights. SQL Server 2014 will include improved high-availability technologies including simplified cloud backup, cloud disaster recovery and easy migration to Windows Azure Virtual Machines. It also will improve upon existing AlwaysOn features with support for new scenarios, scale of deployment and ease of adoption.

Of course, these are only a few of the features touted for inclusion in SQL Server 2014. You can learn more about SQL Server 2014 and download information and new white papers at http://www.microsoft.com/en-us/sqlserver/sql-server-2014.aspx?WT.mc_id=Blog_SSQL_TechEdNA_SQL2014. And you can sign up to participate in the upcoming SQL Server 2014 CTP 1 at http://technet.microsoft.com/en-us/evalcenter/dn205292?WT.mc_id=Blog_SSQL_TechEdNA_SQL2014.

Microsoft also introduced Visual Studio 2013 and demonstrated new capabilities for improving the application lifecycle, both on-premises and in the cloud. A preview of Visual Studio 2013, with its new enhancements for agile portfolio planning, developer productivity, team collaboration, quality enablement and DevOps, is slated for release in the coming weeks.

And a lot of the Microsoft TechEd conference focused on Azure and the cloud. Microsoft introduced upcoming releases of its key enterprise IT solutions for hybrid cloud: Windows Server 2012 R2, System Center 2012 R2 as well as the aforementioned SQL Server 2014. These products are designed to break down boundaries between customer data centers, service provider data centers and Windows Azure. Using them, enterprises can make IT services and applications available across clouds and scale them up or down according to business needs.

Not to be outdone in the big data world by its largest database competitors, Teradata bolstered its Hadoop offerings this quarter with Teradata Enterprise Access for Hadoop. A part of the Teradata Unified Data Architecture, this technology enables business analysts to reach through Teradata directly into Hadoop to find new business value from the analysis of big, diverse data.

Teradata Enterprise Access for Hadoop includes two new features that make access to data in Hadoop easy and secure for business analysts across the enterprise:

  1. Teradata Smart Loader for Hadoop, which enables searching and moving data between Teradata environments and Hadoop
  2. Teradata SQL-H, which can be used to allow analytics to run in-memory on Teradata as data is pulled from Hadoop.

The Teradata Unified Data Architecture, which was previously announced, brings together Teradata, Teradata Aster, and Hadoop technology as well as partner tools to create a cohesive architecture.

And finally, SAP delivered a fix pack for the HANA platform that added Big Data and spatial processing capabilities. On the Big Data front, smart data access is the name for the new data virtualization technology in SAP HANA that enables dynamic data queries across heterogeneous relational and non-relational database systems such as Hadoop, Sybase Adaptive Server Enterprise, and other third-party data warehouses. The smart data access functionality will help HANA customers to build real-time Big Data applications with fast and secure query access to data across their business networks.

The new spatial data processing capabilities enable HANA customers to combine geospatial data with business data, adding a new dimension to real-time business applications. Customers and ISVs will be able to process combinations of spatial, predictive and text analysis results within a single SQL statement to simplify the development of intelligent, intuitive location-based solutions.

Additional capabilities in the SAP HANA service pack will include, among other new features:

  • Extended natural language processing
  • Application function modeler for model-driven access to native functions and predictive algorithms
  • Integrated data provisioning and modeling
  • Enhanced data services, including native extract, transform, load (ETL) with increased performance
  • Enhanced disaster recovery

Summary

And that brings us to the end of another edition of The Database Report. If you are interested in keeping tabs on the data and database system market but find that you do not have the time to invest in following the daily news, then be sure to check in with The Database Report every quarter to catch up.

See you next quarter!


Craig Mullins

Craig S. Mullins is a data management strategist and principal consultant for Mullins Consulting, Inc. He has three decades of experience in the field of database management, including working with DB2 for z/OS since Version 1. Craig is also an IBM Information Champion and is the author of two books: DB2 Developer’s Guide and Database Administration: The Complete Guide to Practices and Procedures. You can contact Craig via his website.
