Nasdaq Picks Amazon's Public Cloud for Record Storage

Nasdaq is launching a cloud-based system hosted by Amazon Web Services (AWS) designed to let broker-dealers store critical records and meet compliance requirements, it announced this week.

Nasdaq wouldn't be the first stock exchange to turn to the cloud -- the New York Stock Exchange launched a community cloud last year -- but its decision will test the limits of using a major public cloud service to host sensitive data.

Specifically, Nasdaq's new offering, called FinQloud, will consist of two systems for broker-dealers: Regulatory Records Retention (R3), a storage and retrieval system, and Self Service Reporting (SSR), which will let firms query and analyze stored trading records on demand.

Ironically, compliance has been a major showstopper for many firms in the financial services industry, as well as in other vertical industries where a data security breach or outage could land a company in hot water. But Nasdaq, for decades an early adopter of new technology, appears to believe that a breach or outage is no more likely to happen in the cloud than in its own datacenter.

Eric Noll, executive VP of transaction services for Nasdaq's U.S. and U.K. operations, told CNBC on Tuesday that the economics of using the cloud are too compelling to pass up. Broker-dealers running their records management systems in Nasdaq's FinQloud will be able to reduce their costs by 80 percent compared with operating them in their own datacenters, Noll said.

By law, financial services firms are required to save records, including e-mails, for seven years. "In today's complicated market with more and more electronic communications, those storage costs have grown and grown and grown," Noll said. "By partnering with Amazon, what we think we are able to do is offer a lower-cost solution to reduce the cost for broker-dealers for their storage and their retention requirements for the regulators."

As Nasdaq's systems outage in May demonstrated -- during the first hour of trading on the day of Facebook's initial public offering, no less -- an internally run system is subject to the same material failures as those that have occurred in the public cloud. Nasdaq has worked with Amazon for six years, and Noll believes that by using usage-based compute and storage, FinQloud can generate revenue for the exchange without capital investment or fixed operational costs.

Moreover, in his CNBC interview Noll expressed confidence that Amazon can keep the data secure. At the same time, he said Nasdaq is applying its own layer of encryption.

"There's always going to be concerns about data security," Noll said. "Cyber attacks are a reality of today's modern society -- we're going to have to deal with them. I think there are attacks on standalone units as well as other sources of data out there, but what we're going to be working with Amazon on is not only taking their info security protections, but we're adding layers on it through Nasdaq as well and we will preserve the sanctity of this information for the users of the cloud and with us."

AWS Senior VP Andy Jassy, who was present with Noll during the CNBC interview, said financial services companies have used Amazon's public cloud services for years, and expressed confidence that usage will grow exponentially. Jassy said he sees a day, perhaps 10 to 20 years from now, when AWS generates as much revenue as Amazon's $40 billion online retail business. "AWS has hundreds of thousands of customers at this point in over 190 countries and it's growing very, very dramatically," he said.

Yet many companies are loath to put sensitive and mission-critical data in the cloud. Nasdaq's decision to do so will no doubt make FinQloud a closely watched public cloud deployment.

Posted by Jeffrey Schwartz on September 27, 2012


Salesforce.com Plans Expansion into Other Businesses -- Next Year

Salesforce.com CEO Marc Benioff is trying to dispel the notion that his company is a one-trick pony.

At the annual Dreamforce conference in San Francisco last week, Benioff told 90,000 attendees and the thousands more who tuned in to his keynote via a Facebook feed that he plans to expand his company's Software as a Service (SaaS) applications well beyond customer relationship management -- the foundation of Salesforce.com's business.

While this is not the first time Salesforce.com has veered from its charter of offering tools aimed at helping organizations better interact with their customers, it's arguably the broadest extension of its service offerings into lines of business it hadn't previously touched, such as document sharing, marketing automation and human resources.

In a sign that Salesforce.com is looking to stake an early claim in these new areas, many of the services announced last week won't be available until the second half of next year. That's out of character for Salesforce.com, which typically doesn't pre-announce services well before their availability, said R. "Ray" Wang, CEO of Constellation Research.

"Marc did not announce things in general availability other than the marketing apps, which was unusual," Wang said. "They did a lot of forward-marketing, which is not like Salesforce. I think he's afraid other people are going to jump into this market, so he's announcing things in development that are not released as code."

Wang, who attended the Dreamforce conference, said it was telling how many HR, finance and marketing executives were there, upstaging those from IT organizations. That's no coincidence. Citing a Gartner projection that chief marketing officers will spend more on IT than CIOs by 2017, Benioff said, "Now we're inviting them into the cockpit into this incredible new marketing cloud."

The offering is the Salesforce Marketing Cloud, which combines social network monitoring, advertising, business measurement and workflow based on technology acquired from Buddy Media and Radian6. Those CMOs are its key audience.

Benioff planted the seeds for extending Salesforce.com's reach two years ago with the launch of Chatter, its social media tool best described as a version of Facebook designed for use in enterprise or extranet scenarios.

As noted by my colleague John K. Waters, Salesforce.com used Dreamforce to jump into the HR field -- or, as it's called these days, the "human capital management" (HCM) field -- with the launch of Work.com, a system designed to let managers and HR benchmark and reward employee performance in a social context.

Through an expanded partnership with leading HCM SaaS provider Workday, Salesforce.com will provide integration of Work.com with Workday's HCM offerings. Salesforce.com's push into HCM comes as key rivals have moved into this rapidly growing segment. Oracle last year acquired Taleo, SAP picked up SuccessFactors, and IBM just announced it has acquired Kenexa for $1.3 billion.

Another area where Salesforce.com is spreading its wings is document management, with a move aimed at offering an alternative to the likes of Box and Dropbox. The company described the new Salesforce Chatterbox service as the "Dropbox for the enterprise." But with so many services such as Dropbox and Box.net already out there, is Salesforce.com moving too far afield, and is it likely to gain a foothold in document sharing?

"Although Box.net has gained lots of attention and many users, it hasn't established a firm hold on the B2B and enterprise markets," said Jeffrey Kaplan, managing director of Thinkstrategies, a consulting firm focused on cloud computing. "And there are no dominant players in the other areas."

Likewise, Kaplan, who also attended Dreamforce, said Salesforce.com will raise the profile of the other new functional fields it's entering. "Salesforce.com will bring greater legitimacy to each of these areas, in the same way it has championed the idea of social networking in the enterprise with Chatter," he said.

Perhaps the biggest challenge facing Salesforce.com's ambitions to widen its footprint is the number of companies it has acquired in recent years, said Joshua Greenbaum, principal analyst of Enterprise Applications Consulting.

"Now that they have all these assets, they need to do a better job of integrating them," Greenbaum said. "They need to focus on allowing developers and customers to integrate all this functionality and stop creating silos of functionality that are as problematic as any legacy silo is."

Posted by Jeffrey Schwartz on September 24, 2012


Racemi Launches Server Image Migration Service

Racemi, a company that offers software to move Windows and Linux images from one bare-metal server to another, recently added cloud migrations to its portfolio. The company's new Cloud Path is a software as a service (SaaS) offering that lets administrators move server images to public cloud services.

Admins can use Cloud Path from a Web browser to migrate physical and virtual servers to infrastructure as a service (IaaS) cloud providers. Atlanta-based Racemi claims its migrations will cost on average $800 less than manually re-imaging servers.

Pricing follows a usage-based model, and the service reduces the reliance on experienced administrators: server images can be moved without templates or scripts. In addition to moving workloads from in-house systems to cloud-based servers, Cloud Path lets customers migrate cloud instances between supported cloud providers.

Racemi charges $299 per successful migration, which includes an initial 20 GB of free storage. Customers can currently migrate Windows Server 2008 R2, Red Hat Enterprise Linux and CentOS systems to cloud services provided by Amazon Web Services, GoGrid, Rackspace and Verizon's Terremark. Racemi said it plans to support additional server OSes and cloud service providers over time.
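
Those two figures make the cost claim easy to sanity-check. The snippet below is nothing more than back-of-the-envelope arithmetic using the numbers Racemi cites; the 25-migration project size is made up for illustration.

```python
# Back-of-the-envelope check of Racemi's pricing claims (illustrative only).
PER_MIGRATION = 299          # dollars per successful migration
CLAIMED_SAVINGS = 800        # average savings vs. manual re-imaging, per Racemi

migrations = 25              # hypothetical project size
cloud_path_cost = PER_MIGRATION * migrations
manual_cost = (PER_MIGRATION + CLAIMED_SAVINGS) * migrations

print(f"Cloud Path: ${cloud_path_cost:,}")   # $7,475
print(f"Manual:     ${manual_cost:,}")       # $27,475
```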

Posted by Jeffrey Schwartz on September 12, 2012


What Makes Windows Server 2012 a 'Cloud OS'

When Microsoft announced the general availability of Windows Server 2012 last week, it described the release as heralding the start of the company's cloud OS era. In a carefully scripted, pre-recorded webcast, Microsoft executives illuminated the cloud characteristics of its new server OS.

Marketing hype aside, the release of Windows Server 2012 culminates a four-year engineering effort to build a common architecture for IT to develop and deploy applications on private, hybrid and public clouds. Microsoft has talked up the combination of Windows Server and System Center as a platform for private and hybrid clouds for some time.

Dubbing Windows Server 2012 a "cloud OS" simply advances Microsoft's messaging and lets IT decision makers compare it to a slew of other cloud infrastructure OSes: Eucalyptus, the open source distributions delivered by OpenStack-aligned vendors including Rackspace and Red Hat, and VMware's proprietary virtualization and cloud software portfolio.

With the emergence of so-called "modern apps" -- those designed for various computing and mobile device types -- Microsoft wants developers to build server-side applications on its new .NET Framework 4.5 using Visual Studio 2012, then deploy them on Windows Server 2012 in their own datacenters, with third-party cloud providers such as Hostway and Rackspace, in Microsoft's Windows Azure public cloud, or any combination of the three.

"We built Windows Server 2012 with the cloud in mind," said Bill Laing corporate VP of Microsoft's server and cloud division, who led the engineering team that developed Windows Server 2012. "It's the deepest and broadest release ever built for companies of all sizes, whether you run a single server connected to the cloud, or a large datacenter."

In the webcast, Laing pointed to four key attributes that make Windows Server 2012 a cloud OS:

  1. Scalable and elastic: The latest generation of Hyper-V lets a customer scale from one to thousands of virtual machines as workloads dictate. It supports up to 320 logical processors and 4 TB of RAM per server, and Laing said it can virtualize 99 percent of all SQL Server databases. The new OS can run large virtual machines with up to 64 virtual processors and 1 TB of memory per VM. So far, Laing said, it has scaled to 8,000 VMs per cluster.

  2. Shared resources: Windows Server 2012 is architected for multi-tenancy, critical to ensure the workloads of a given group, organizational unit or customer don't impact others. Combined with System Center 2012 SP1, Windows Server 2012 enables software defined networking, or SDN, which means "you can easily and dynamically provision isolated virtual networks running on the same physical fabric," explained Jeff Woolsey, a principal program manager for Windows Server and Cloud at Microsoft.

  3. Always-On: A feature called Live Migration provides VM mobility, which facilitates the movement of virtual machines from one physical server to another locally or over a wide area network. This cluster-aware feature is designed to provide continuous availability during patches, upgrades or failures.

  4. Automation and self-service: Users in departments can self-provision compute and storage resources. Windows Server 2012 enables automation with over 2,400 new PowerShell cmdlets, designed to eliminate manual tasks and allow IT to manage large fleets of servers (see the sketch after this list). Combined with System Center 2012, Windows Server 2012 offers automation via user-defined policies.
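
To make the automation point concrete, here's a minimal sketch of driving those cmdlets remotely, in this case from Python over WinRM rather than from PowerShell itself. The host name, credentials and choice of Hyper-V cmdlets are my own placeholders, not anything Microsoft demonstrated in the webcast.

```python
# Hedged sketch: invoking Windows Server 2012 PowerShell cmdlets remotely
# over WinRM. Host, credentials and the chosen cmdlets are illustrative.
import winrm

session = winrm.Session("hyperv-host.example.com", auth=("admin", "secret"))

# Hyper-V cmdlets new in Windows Server 2012: find stopped VMs and start them.
result = session.run_ps("Get-VM | Where-Object State -eq 'Off' | Start-VM")
print(result.status_code)
print(result.std_out.decode())
```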

Microsoft makes a good case for Windows Server as a cloud OS, and it should appeal to its installed base as customers build new apps for the cloud. But customers will determine whether Windows Server 2012, combined with System Center, is a viable alternative to VMware's cloud stack and to the open source and Amazon Web Services-compatible options.

Posted by Jeffrey Schwartz on September 10, 2012


A Cloud Boom, Despite Security Worries

Security concerns might be the top inhibitor to using public cloud services, but the horses may have already left the barn.

A global survey of 4,000 executives and IT managers found nearly half (49 percent) already use cloud services to store sensitive or confidential information, and another 33 percent plan to do so over the next two years. Only 19 percent said they don't, according to the survey, conducted by research consultancy Ponemon Institute and commissioned by Thales, a Microsoft ISV that provides security and encryption software and services.

The findings piqued my interest, given that public cloud services are a non-starter for many large organizations, especially those with regulatory or compliance restrictions. However, the Ponemon study canvassed enterprises of all sizes, including small and mid-sized organizations, explained Ponemon Institute chairman and founder Larry Ponemon.

I pointed Ponemon to the findings by the Open Data Center Alliance (ODCA), in which 40 percent of its membership said security was the key barrier to using public cloud services. "Even organizations that say security is an inhibitor still seem to be using cloud services," Ponemon remarked.

The findings also showed that 44 percent believe cloud providers are responsible for protecting data in the cloud, while 30 percent felt it was their own responsibility and 24 percent reported it should be shared.

"Like anything else, you need to be careful in selecting your business partners," Ponemon said. "A public cloud provider is a business partner and the fact they have access to your data, and possibly confidential and sensitive information is a big deal, and organizations need to see the cloud as a place that can be very insecure and the source of data breaches and security exploits. Not all cloud providers are the same."

When asked about the impact of cloud services on their organization's security posture, 44 percent said it caused no change and 39 percent said their posture had weakened. Only 10 percent said it had improved, while 7 percent were unsure.

Only a small percentage, 11 percent, said their cloud provider encrypts data for them; the rest assume responsibility for encryption themselves. Of those, 38 percent encrypt data in transit, 35 percent encrypt it before it is transferred to a cloud provider and 16 percent use encryption selectively at the application layer within a cloud environment.

Thirty-six percent of those using encryption handle key management within their organizations, while 22 percent rely on a third party other than the cloud provider. Another 22 percent let the cloud provider manage the keys and 18 percent said it was a combination.
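
The mechanics behind those key-management choices differ more than the percentages suggest, so here's an illustrative contrast, using Amazon S3 only because it's a widely known API (the survey itself is vendor-neutral). The bucket and object names are hypothetical.

```python
# Illustrative contrast of provider-managed vs. customer-managed keys,
# using Amazon S3 as a familiar example. Names are hypothetical.
import os
import boto3

s3 = boto3.client("s3")

# Provider-managed: the cloud service generates, stores and rotates the key.
s3.put_object(Bucket="example-bucket", Key="doc-provider-key", Body=b"data",
              ServerSideEncryption="AES256")

# Customer-managed (SSE-C): the organization supplies its own 256-bit key
# with every request and is responsible for safeguarding it.
customer_key = os.urandom(32)
s3.put_object(Bucket="example-bucket", Key="doc-customer-key", Body=b"data",
              SSECustomerAlgorithm="AES256",
              SSECustomerKey=customer_key)
```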

Posted by Jeffrey Schwartz on August 29, 2012


Rackspace Releases Free Software to Build Private Clouds

Rackspace is now offering free software that lets anyone build private clouds based on the same platform that runs its cloud hosting service.

The company's Rackspace Private Cloud Software, codenamed Alamo, is now available as a free download. The release, issued this week, marks a key milestone in Rackspace's plan to transition its cloud portfolio from its proprietary infrastructure to OpenStack, the open source project the company helped launch with NASA two years ago.

Earlier this month, Rackspace completed the conversion of its server compute infrastructure running its public cloud service to OpenStack.

By offering its OpenStack-based software free of charge, Rackspace is betting that it will seed enterprise deployments of private clouds based on its open source solution. In turn, Rackspace is hoping enterprise customers will subscribe to its support services while also using its public cloud infrastructure for cloudbursting, the deployment model a growing number of datacenter operators employ when they need extra capacity during peak periods.

Jim Curry, general manager of Cloud Builders, Rackspace's private cloud organization, explained that Alamo is geared to those looking to build such clouds who don't have backgrounds in OpenStack. "To date most of the market for OpenStack has been people who were experts in it," Curry said. "We wanted to make it so a systems administrator who doesn't know anything about OpenStack and maybe knows a little bit about cloud, can easily get an OpenStack cloud up and running so they can evaluate and determine if it's a good solution on the same day." Curry said the software can be deployed in an hour.

Customers can opt for additional fee-based services, starting with Escalation Support, which starts at $2,500 plus $100 per physical node per month. At the next level, Rackspace will offer proactive support, which will include monitoring, patching and upgrading. Then sometime next year, Curry said Rackspace plans to offer complete management of OpenStack-based private clouds. The company hasn't set pricing for those latter offerings.

The initial Alamo software consists of the standard Essex release of the OpenStack Nova compute infrastructure services, the Horizon dashboard, the Nova Multi Scheduler, Keystone authentication and the standard APIs. It also includes the Glance image library (a repository of system images), Canonical's Ubuntu distribution of Linux as the host operating system with KVM-based virtualization, and Chef cookbooks from Opscode, which cover various OpenStack configuration scenarios.
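
Because Alamo ships plain-vanilla OpenStack, the standard Python clients should work against it unchanged. The sketch below authenticates against Keystone and boots an instance through the Nova API; the endpoint, credentials and image ID are placeholders, and I'm using the current client libraries rather than the Essex-era ones purely for illustration.

```python
# Hedged sketch: booting a VM on an Alamo-style OpenStack cloud with the
# standard Python clients. Endpoint, credentials and IDs are placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from novaclient import client as nova_client

auth = v3.Password(
    auth_url="http://alamo.example.com:5000/v3",   # Keystone endpoint
    username="admin", password="secret",
    project_name="demo",
    user_domain_id="default", project_domain_id="default",
)
nova = nova_client.Client("2", session=session.Session(auth=auth))

# Nova compute API: pick a flavor and boot an instance from a Glance image.
flavor = nova.flavors.find(name="m1.small")
server = nova.servers.create(name="eval-vm", image="GLANCE_IMAGE_UUID",
                             flavor=flavor)
print(server.id, server.status)
```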

In addition to supporting the Ubuntu distribution of Linux, Rackspace intends to support Red Hat Enterprise Linux along with Red Hat's OpenStack release, which was made available for testing this week; that support will come later in the year. A later release will also add support for Swift-based object storage, according to Curry.

Asked if Windows Server support is in the works, Mike Aeschliman, Rackspace Cloud Builders head of engineering, said not at this point. "To be honest, I think we will stick with Linux for a while because that's what the market is asking of us," Aeschliman said.

As for Rackspace's outreach to its channel of systems integration partners, Curry said they are aware of Alamo but the company hasn't reached out further yet. "We absolutely want to do that," Curry said. Because Rackspace's Alamo software is "plain-vanilla" OpenStack, the company plans to look to its partners to customize, or fork, it, and contribute it back to the community, Curry said.

Rackspace's plan is to leverage its SIs to provide customization services, consulting, application migration and API integration into billing systems, he explained. "These are not things we specialize in," he said. "We don't want to be the guys that do that work. We have great partners to do that."

Posted by Jeffrey Schwartz on August 16, 2012


Red Hat Issues OpenStack Preview

Red Hat Software released the Technology Preview of its OpenStack software targeted at service providers and enterprises looking to build infrastructure as a service (IaaS) clouds based on the open source framework.

An early supporter of the two-year-old OpenStack project, Red Hat has kept a low public profile on its commitment to the effort. In its announcement Monday, the company pointed out it was the #3 contributor to the current Essex release of OpenStack.

"Our current productization efforts are focused around hardening an integrated solution of Red Hat Enterprise Linux and OpenStack to deliver an enterprise-ready solution that enables enterprises worldwide to realize infrastructure clouds," said Brian Stevens, RedHat CTO and vice president for worldwide engineering, in a statement.

Red Hat sees IaaS and OpenStack complementing its Red Hat Enterprise Virtualization software. While the former provides the infrastructure to manage hypervisors in a self-service cloud provisioning model, RHEV targets servers and SANs and provides typical enterprise virtualization functions, the company said in a blog post.

OpenStack is one component of Red Hat's cloud stack. While it addresses IaaS, Red Hat's cloud portfolio also includes Red Hat Enterprise Linux, RHEV, its JBoss application middleware and OpenShift, the company's framework for platform as a service (PaaS).

Red Hat plans to deliver its OpenStack release next year. The Technology Preview is available for download.

Posted by Jeffrey Schwartz on August 16, 2012


Infosys Unveils Cloud Ecosystem Hub

Microsoft systems integrator partner and IT consulting giant Infosys recently launched its Cloud Ecosystem Hub, which is designed to help enterprises build, deploy and manage hybrid clouds.

The Infosys Cloud Ecosystem Hub combines the global company's consulting and development resources with the assets of the over 30 partners currently in the program, including Microsoft, VMware, Amazon Web Services, CA Technologies, Dell, Hitachi Data Systems, HP and IBM.

Enterprises using the Infosys service can implement cloud services up to 40 percent faster, with 30 percent cost savings and a 20 percent increase in productivity versus going it alone, according to the company. Infosys said its self-service catalog lets enterprises procure infrastructure and services to build and manage clouds running either within an enterprise or through a cloud provider, while also enabling them to create a hybrid infrastructure.

In what Infosys calls the "smart brokerage" feature of the Cloud Ecosystem Hub, the system intelligently compares cloud services from its roster of partners, taking into account parameters such as quality-of-service requirements, IT assets, regulatory and compliance restrictions and cost; the hub then procures the appropriate hardware, software and services.

"Our clients are dealing with complexities of a fragmented cloud environment," said Vishnu Bhat, Infosys VP and global head of cloud, in a statement. "The Infosys Cloud Ecosystem Hub provides organizations a unified gateway to build, manage and govern their hybrid cloud ecosystem. This solution allows clients to fully realize the benefits from the long-standing promise of the cloud."

Interarbor principal Dana Gardner described the offering as a "cloud of clouds." In a blog post he said, "I think Infosys's move also shows that one-size-fits-all public clouds will become behind-the-scenes utilities, and that managing services in a business ecosystem context is where the real value will be in cloud adoption."

Posted by Jeffrey Schwartz on August 14, 2012


Woz Fears the Cloud, But You Don't Have To

Steve Wozniak's statement this past weekend that cloud computing could cause "horrible problems" has gone viral, but with all due respect to the visionary Apple co-founder -- take his fears with a grain of salt.

Wozniak made the off-the-cuff remark after a performance of Mike Daisey's theatrical presentation "The Agony and the Ecstasy of Steve Jobs," which exposes the labor conditions at Foxconn, the key manufacturer of Apple products in China. According to PhysOrg.com, a news service covering science and technology, Wozniak revealed his concern over the growing trend of storing data in cloud-based services in response to an audience question after the performance, which took place at the Woolly Mammoth theater in Washington, D.C.

"I really worry about everything going to the cloud," Wozniak reportedly told the audience. "I think it's going to be horrendous. I think there are going to be a lot of horrible problems in the next five years."

Sure, there will be plenty of problems with cloud computing just as there are issues with all forms of computing. We've already seen numerous outages that have raised serious concerns.

Wozniak appeared not only worried about the reliability of cloud services, but argued users risk giving up ownership of their data once they store it in the cloud. "With the cloud, you don't own anything. You already signed it away," he said, referring to the terms of service users agree to when signing up with some cloud services. "I want to feel that I own things," he added.

It seems to me he was referring to social networks like Facebook and photo-sharing services. Facebook raised concerns about data ownership when it changed its terms of service back in 2009, claiming it had rights to your data even after an account is terminated, a move that raised the ire of many critics. While social networks typically run in the cloud, and indeed consumers should be aware of the terms of using such services, that's where the similarities end.

Woz went on to say "a lot of people feel, 'Oh, everything is really on my computer,' but I say the more we transfer everything onto the Web, onto the cloud, the less we're going to have control over it." To that point, it is indeed hard to deny that once data is in the cloud, users have less control over it than if it were on their own premises, but in many cases that gap is narrow. Giving up autonomous control is usually a tradeoff worth making for having data readily available and less vulnerable to permanent loss.

Had Wozniak not chosen the word "horrendous" while suggesting the cloud would cause "horrible problems," his remarks probably would have gone unnoticed. But when someone of his stature predicts Armageddon, it inevitably sparks debate.

Like any emerging technology, cloud computing will go through its fits and starts. But the cloud is not going away. Will Woz one day be able to say "I told you so?" I think not. What do you think? Leave a comment below or e-mail me at [email protected].

Posted by Jeffrey Schwartz on August 09, 2012


Parts of HP's Public Cloud Hit General Availability

After talking up its plans to offer a public cloud service for well over a year, Hewlett-Packard has made the first two components of its HP Cloud generally available.

As of Aug. 1, customers can purchase HP's Cloud Object Storage and CDN services. The company is backing them with a 99.95 percent service level agreement; if it can't meet that SLA, HP will offer customers credits.

Compute services and subscriptions to HP's cloud-based MySQL remain in beta, and while they are not generally available or backed by an SLA, customers can use them for production workloads, said Marc Padovani, director of product management for HP Cloud Services.

"We are still going though updates and hardening of the service," Padovani said. "Sometime later this year it will be at a point where it meets our levels of quality, availability and durability and we will apply the SLA and bring it to general availability status." Any customer can sign up for the compute services beta but the MySQL testing is somewhat more restrictive, Padovani said. HP will contact customers who sign up for the MySQL service beta and help set them up, he said.

As reported back in May, the HP Cloud Block Storage service lets customers add storage volumes of up to 2 TB each. In addition to attaching multiple volumes to HP Cloud Compute instances, customers can take point-in-time snapshots of their volumes, create new volumes from those snapshots and, if needed, back them up to HP Object Storage for added redundancy.

The storage service is based on the OpenStack Swift storage system, a move that will ease portability of data to other OpenStack cloud services. For the content delivery network service, HP has created an interface to Akamai's CDN. Padovani said HP intends to contribute the code for the CDN interface layer it built atop the Swift object storage service to the OpenStack group. According to Padovani, "it eliminates the need for someone to have to go through all the integration work we did with Akamai."
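
Since the service speaks the standard Swift API, the usual OpenStack tooling should apply unchanged. Here's a minimal sketch using the python-swiftclient library; the auth endpoint and credentials are placeholders, not HP's actual values.

```python
# Hedged sketch: uploading to a Swift-compatible object store such as
# HP Cloud Object Storage. Endpoint and credentials are placeholders.
import swiftclient

conn = swiftclient.Connection(
    authurl="https://identity.example-hpcloud.com/v2.0/",  # placeholder endpoint
    user="tenant:username",
    key="secret",
    auth_version="2",
)

conn.put_container("backups")
with open("snapshot-001.img", "rb") as f:
    conn.put_object("backups", "snapshot-001.img", contents=f)

# Verify the upload by reading back the object's metadata.
print(conn.head_object("backups", "snapshot-001.img")["etag"])
```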

Posted by Jeffrey Schwartz on August 08, 2012


VMware May Be Launching Its Own Public Cloud

Another major public cloud could be in the works.

CRN recently reported that VMware plans to build out a public cloud that would compete with Microsoft, Amazon Web Services, Google and Rackspace.

While VMware has said its policy is not to comment on speculation, the report cites multiple unnamed sources who say VMware has acquired significant datacenter facilities in Nevada for what is known as "Project Zephyr." According to the report, VMware has gone this route to light a fire under its hosting partners to build out public cloud services based on vCloud.

The move is surprising to hosting providers that have committed to offering their own public cloud services based on VMware's hypervisor and vCloud platforms. Hosting providers say VMware had given assurances it does not intend to compete with them, setting it apart from the company's arch-rival Microsoft, which is investing heavily in expanding its Windows Azure service.

"They have iterated and re-iterated that they had no plans to go into the cloud and infrastructure as a service themselves," said Aaron Hollobaugh, VP of marketing at Hostway. "It's a big surprise to me, but it's also an inevitable change in their desire to grow within the cloud marketplace because they are not having the traction they want from their service providers."

Hostway is not a VMware partner -- it has aligned itself with Microsoft's cloud platform -- but the hosting provider faces similar competition from Redmond. Nevertheless, Hollobaugh believes Hostway is poised to address customers who require higher levels of support. If the CRN report is true, his onetime employer, Denver-based Hosting.com, may face similar competition from VMware.

"This rumor has been around for a long time and I don't even know if it's real," said Hosting.com CTO Craig McLellan. "There's been no official communications with me. I think that in general, every technology manufacturer has embraced the channel while at the same time competing with the channel. The real emphasis has to be on working well together if this is in fact the case."

Despite VMware's insistence that it wasn't planning to offer a public cloud service, the company has made some moves in the past that could be construed as steps toward doing so. For example, in 2009 VMware acquired a facility in Washington that it ultimately used to build a green datacenter, a move that sparked some scuttlebutt that it might be a front to operate a public cloud.

Another move that raised speculation that VMware may be eyeing its own public cloud came that same year, when the company took a 5 percent stake, worth $20 million, in Terremark, which was later snapped up by Verizon. Though VMware's stake in Terremark, which not surprisingly uses VMware's vCloud platform, was rather small, some wondered whether the company was positioning itself to buy Terremark outright before Verizon came in.

If in fact VMware decides to launch a public cloud, it makes one wonder whether the change of heart stems from parent company EMC, from the management changes at VMware, or from the company's efforts to bolster its cloud infrastructure with the acquisitions of DynamicOps and Nicira.

VMware customers and partners will be anxious to hear the company's public cloud intentions, or lack thereof, at this month's VMworld conference, if not sooner.

Posted by Jeffrey Schwartz on August 06, 2012


How VMware Is Ushering in the Next Phase of the Cloud

VMware's acquisition of software defined networking (SDN) pioneer Nicira caps a string of moves that piece together the company's go-forward mission of automating the datacenter and creating next-generation clouds.

Kicking off what could prove to be key milestones for VMware was its July 2 announcement it is acquiring DynamicOps, a cloud automation provider renowned for its support of multiple virtual environments. Then came last week's shocking news that VMware CEO Paul Maritz, who has led the company through stellar growth during his four-year tenure, is stepping aside to become chief strategist of the company's parent EMC.

Topping off this month's buzz was the Nicira deal, VMware's largest acquisition to date. The $1.26 billion VMware is paying for Nicira is eye-popping. Though Nicira is a hot software defined networking startup with highly regarded engineers and executives, and its Network Virtualization Platform (NVP) software is used by the likes of AT&T, eBay, Fidelity Investments and Rackspace, its product only came to market six months ago.

The premium price tag notwithstanding, VMware's move has raised eyebrows as the company shows its determination not merely to extend further into virtual networking but to lead it. Cisco's stock on Tuesday dropped nearly 6 percent in reaction to the move. Adding insult to injury was Cisco's announcement that it is laying off 1,300 people on top of the more than 6,000 jobs already phased out this year.

Fear among Cisco investors that VMware will marginalize Cisco's hardware by virtualizing it is "way overblown," said Forrester Research analyst Andre Kindness. "The purchase puts another nail in the Cisco-VMware relationship, but networking is more than the datacenter and more than Layer 2 and Layer 3 switches," Kindness said. "VMware has yet to address the Layer 4 to Layer 7 arena."

Kindness said Cisco has been down this road before: first when Juniper Networks entered the carrier routing market and later moved into enterprise switching, then when Hewlett-Packard acquired 3Com, and again when VMware launched vSwitch. But vSwitch alone wasn't enough for VMware to tell a strong virtual networking story, Kindness noted, hence the Nicira deal.

While vSwitch was a worthy start, VMware was held back by its hardware partners' agendas, Kindness said, noting that Cisco, Dell, HP and IBM have all released converged solutions that embed their own virtual networking technologies. Those without the necessary piece-parts are partnering, as with HP's announcement in May that it will use F5 Networks' Application Delivery Networking (ADN) to deliver policy-based automated networking.

"Basically, the big hardware vendors are developing their software and hardware solutions that would be controlled by their own management solutions," Kindness said. "This would minimize VMware's influence on the management stack and open the door to other hypervisors. Thus VMware needed a solution that would ride over any of the hardware vendors who themselves are fighting over virtual transport protocols between switches, between data centers and between data center and public cloud offerings."

Whether VMware can pave its own path remains to be seen, but by acquiring DynamicOps and Nicira in tandem, VMware is taking bold steps to lead the next phase of cloud and datacenter virtualization, evolving from pooling core server resources to incorporating the network gear.

Others that have jumped on the SDN bandwagon say VMware's move validates software defined networking as the next big trend in the evolution of the datacenter and cloud computing infrastructure. "This underscores just how phenomenal the surge is that's powering interest in SDN," said Kelly Herrell, CEO of networking software provider Vyatta, in a blog post. "The simple facts are irrefutable: virtualization and cloud have fundamentally altered compute and storage architectures, and networking now must adapt."

Jaron Burbidge, a business development manager for Microsoft New Zealand's Server and Cloud Platform business, pointed out in a blog post that Windows Server 2012 and System Center 2012 SP1 will allow for the creation of SDN-enabled clouds. "In many ways, VMware's acquisition of Nicira is a late acknowledgement of the importance of SDN as a critical enabler for the cloud," Burbidge noted. "However, I think our approach is substantially more mature, delivers end-to-end capability, and provides an open platform for partners to innovate. Most important, our implementation is available broadly for customers to begin deploying today."

At first glance, one might wonder why VMware needed to shell out so much money for Nicira. After all, as noted by Kindness and his Forrester colleague Dave Bartoletti, VMware has already moved down the road of SDN with vSphere, which provides virtual switching capabilities via vShield Network and Security services and support for the VXLAN protocol. "These go a long way to virtualizing networking hardware and put them under the hypervisor domain," Bartoletti said in a blog post, adding VMware's vSphere Storage Appliance and various array integrations simplify and automate the allocation of storage to virtual workloads.

While vSwitch was an appropriate entrée for VMware, the company until now has had a reputation for focusing on its own world. "The DynamicOps acquisition changed this conversation," Bartoletti added. "DynamicOps already manages non-VMware hypervisors as well as workloads running on open virtualization platforms in multiple clouds. (Read: heterogeneous VM support.) And now with Nicira, VMware owns a software defined networking solution that was designed for heterogeneous hypervisor and cloud environments."

Moreover, Nicira's NVP is a pure network hypervisor, effectively a thin software layer between the physical network and the virtual machine which enables network virtualization to work with existing physical datacenter network devices, IDC said in a research note.

"The Open vSwitch runs natively in the hypervisor and is managed by Nicira. Nicira is attractive in that it is designed to enable any hypervisor on the same logical network, providing a common network experience across the datacenter. The virtual networks bring flexibility and agility to datacenter designs while enabling isolation to support multi-tenancy," the note said.

"VMware clearly recognized the need for more advanced networking years ago and has been actively working with its networking partners to advance the network functionality in the virtual datacenter. To date, however, the company has not been perceived as a leading voice in the broader networking community. The acquisition underscores the fact that VMware can no longer rely on partners for networking expertise. Networking is a critical pillar in private cloud delivery, and Nicira gets VMware closer to having a full solution."

There are still many open questions. For one, what impact will VMware's move ultimately have on its alliance with Cisco and on VCE, the venture the two companies formed with EMC to develop converged cloud infrastructure? Just last week VCE named 19-year Cisco veteran Praveen Akkiraju as its CEO.

Another burning question is whether VMware will make a push into the OpenStack consortium. Nicira is a key contributor to the evolving Quantum networking component of OpenStack. Will VMware support other components of OpenStack?

As these and many other questions play out, a clearer picture of where VMware is headed has unfolded. And VMworld is still a month away.

Posted by Jeffrey Schwartz on July 30, 2012