Among all the cloud news this week, the bulk of attention went to dueling price cuts by Google and Amazon across their major services. But equally, or perhaps more, important was that Amazon continued to acknowledge the existence of private cloud and on-premises deployments.
Amazon SVP Andy Jassy in his keynote, and later other Amazon execs in separate conversations, acknowledged that there are some corporate jobs that will not and should not run on the public cloud (e.g. Amazon Web Services) any time soon. Jassy talked up Amazon tools that can foster that on-premises/public cloud coexistence, such as Amazon's Virtual Private Cloud (VPC) and Direct Connect. Later, Matt Wood, GM of Data Science for AWS, told me that companies running big, decades-old jobs on Digital Alpha boxes are pretty much best off keeping them there.
Digital Alpha is some serious legacy IT, but don't laugh. This is a big admission coming from Amazon. In November at AWS Re:invent, the terms "private cloud" and "on-premises deployment" were pretty much dirty words. The public transformation began a month later when Amazon CEO Jeff Bezos acknowledged on 60 Minutes that the Amazon-built CIA cloud will in fact be a private cloud.
The shift reflects how much Amazon, the de facto cloud of choice for small startups, also wants to win enterprise jobs. One of the most common comments coming out of AWS Summit in San Francisco this week was how similar the event was to something IBM or Gartner would do. This was not meant as a compliment, but if you want to be the big boys' cloud, you have to act like a big boy.
Both Amazon and Google, which launched huge price cuts and a raft of new services on Tuesday, clearly have the scale and resources to vacuum up a lot more corporate jobs into their respective clouds. What both lack is a full hybrid cloud strategy that would let businesses balance private and public cloud deployment as they see fit. And that public-private cloud coexistence message is the one legacy IT players like VMware and Microsoft have been pounding home.
But the public cloud companies are starting to get there. Google announced much faster data ingestion for its BigQuery database service, which can now take in 100,000 rows per second, up from 1,000 before.
And it announced new connectors that will enable companies to pipe their own data into Google BigQuery directly from their on-site databases, or other clouds, for processing, and pull the results back out again. “The connectors use public APIs so if you run Hadoop on premises on your Oracle or other database but want to run your analytics on BigQuery you can do so,” said William Vambenepe, senior product manager for Big Data at Google.
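Google didn't detail the connector internals, but streaming rows into an ingestion API like BigQuery's typically means batching: splitting a row stream into fixed-size requests that stay under a per-request limit. A minimal sketch of that pattern (the 500-row batch size and the `send` callable are illustrative assumptions, not Google's published connector code):

```python
# Illustrative sketch of the batching a streaming connector might do
# before calling an ingestion API such as BigQuery's tabledata.insertAll.
# The batch size and the send() callable are assumptions for illustration.

def batch_rows(rows, batch_size=500):
    """Split an iterable of rows into fixed-size batches."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush any leftover partial batch
        yield batch

def stream_rows(rows, send, batch_size=500):
    """Push rows to a streaming endpoint one batch at a time.

    `send` stands in for the real API call (e.g. an HTTP POST to the
    ingestion endpoint); here it is just any callable taking a batch.
    Returns the total number of rows sent.
    """
    sent = 0
    for batch in batch_rows(rows, batch_size):
        send(batch)
        sent += len(batch)
    return sent

if __name__ == "__main__":
    received = []
    rows = [{"id": i} for i in range(1200)]
    total = stream_rows(rows, received.append)
    print(total, [len(b) for b in received])  # 1200 [500, 500, 200]
```

In a real connector, `send` would handle authentication, retries, and the API's quota errors; the chunking logic itself stays this simple.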
The ability to pump data into and out of a public cloud, whether from another public cloud, a private cloud, or an on-premises database, is a huge deal for enterprise accounts, and they want it to be easy so they can avoid cloud lock-in.
The fact that AWS won the CIA contract even after overbidding IBM on price shows that Amazon may be comfortable with not always being the low-cost supplier going forward. Its acceptance of the private cloud reality, even if that's not its ideal cloud environment, signals a maturity that the industry overall should welcome.
This article was originally posted on GigaOm.