Zorawar "Biri" Singh, a senior vice president and general manager at Hewlett-Packard and the architect of HP's public cloud effort, has exited the company.
HP confirmed Singh's departure, which was first reported by All Things D, on Thursday. Singh oversaw HP's efforts to build a public Infrastructure as a Service (IaaS) offering that would compete with the likes of Amazon Web Services, Rackspace and his former employer IBM.
I spoke with Singh last year and he was quite bullish about HP's prospects in both competing and partnering with Amazon. I'd say it's too early to write that effort off as a failure, but I've also seen little evidence that it has made strong inroads.
CRN raises the question: Was Singh pushed or did he jump? At this time, it's unclear whether he was poached by a competitor or left due to a reorganization that led to the launch of a consolidated Converged Cloud business that combined HP's various cloud efforts.
Under that reorg, former CTO of networking Saar Gillai, who had reported to Singh, was named general manager of the new cross-divisional organization. Gillai now reports to HP COO Bill Veghte. HP's VP of technology and customer operations for its Cloud Services business will run HP Cloud Services on an interim basis.
Have you bought into HP's public cloud initiative? I'd like to hear how you're using its private and public cloud offerings. Drop me a line at firstname.lastname@example.org or leave a comment below.
Posted by Jeffrey Schwartz on January 18, 2013 at 11:59 AM
CA Technologies on Wednesday said it has appointed former Taleo chief Michael Gregoire as its new CEO in a move to step up its emphasis on cloud computing.
Gregoire will replace existing CEO William McCracken effective Jan. 7, CA said. McCracken, 70, is retiring after a three-year stint as CEO. Like his predecessor, John Swainson, McCracken spent more than three decades at IBM. Though Swainson retired in early 2010, he re-emerged in February as president of Dell's newly formed software group, where he recently engineered the $2.4 billion acquisition of Quest Software, a CA rival.
The choice of Gregoire appears to suggest CA's board wants to accelerate its cloud push. As CEO of Taleo, which provides Software as a Service talent management software, Gregoire helped engineer the sale of Taleo to Oracle this year for $1.9 billion. Gregoire's credentials also include stops at CRM and ERP software supplier PeopleSoft, also acquired by Oracle, and EDS.
Now Gregoire, 46, is moving from Silicon Valley to Long Island, and it will be interesting to see what he has in store for the longtime mainframe systems management vendor founded by Charles Wang. Of course, CA has diversified quite a bit over the past few years and has emphasized management of cloud-based systems and apps with a number of small acquisitions. But overall, while the company is known for a solid though highly diversified line of enterprise systems and cloud management software, it has enjoyed only moderate growth.
The board's decision to tap a Silicon Valley veteran rather than go back to the well of IBM execs could suggest a more aggressive growth posture for CA. The question is, will Gregoire look to let shareholders cash out by selling CA in its entirety or in pieces? Or will he look to build up and diversify CA's cloud portfolio?
"I believe CA Technologies has a compelling value proposition, a strong reputation and a growing relevance for customers, software engineering, and partners," Gregoire said in a statement. "It is clear that CA Technologies is well-positioned to lead the industry as companies find it more critical than ever to manage and secure their IT environments in the cloud and efficiently provide business services that enable them to win in the marketplace."
It stands to reason Gregoire may look at CA with a fresh set of eyes. "He brings a different perspective to CA that they haven't had in a chief executive," Matt Hedberg, an equity analyst with RBC Capital Markets, told Newsday (subscription required). "It's certainly going to help their long-term vision and outlook."
Posted by Jeffrey Schwartz on December 13, 2012 at 11:59 AM
With its planned acquisition of leading mobile device management (MDM) supplier Zenprise, Citrix is making a move to ensure its place in managing employee-owned tablets, PCs and smartphones, as well as cloud-based file sharing.
Terms of the agreement, announced last week, were undisclosed. Citrix did say it plans to integrate the Zenprise MobileManager, Zencloud and Zensuite offerings with its own CloudGateway mobile management solution and its Me@Work portfolio, which includes the GoToMeeting cloud-based conferencing and ShareFile document storage services.
"Consumerization and BYOD have given rise to very difficult challenges for businesses in enabling a productive, mobile workforce while still maintaining tight controls over company information," said Sumit Dhawan, Citrix vice president and general manager of mobile solutions. "Zenprise was a clear choice for Citrix, with its leading MDM product, an experienced team, a history of innovation, and a footprint on more than one million devices. With a complete Citrix enterprise mobility solution, customers have all the necessary pieces to manage and secure mobile apps, content and devices."
When Citrix launched CloudGateway last year, it described it as a tool to manage and securely distribute mobile apps in addition to managing PC, Web and software as a service (SaaS) apps. It remains to be seen how the Zenprise tools will be integrated with CloudGateway, but Zenprise has focused on MDM for nearly a decade and is seen as a leader by analyst firms Forrester and Gartner.
Presumably it means Zenprise will also provide management of Citrix's various cloud offerings, including its new ShareFile. Meanwhile, Zenprise's flagship MobileManager software allows IT to control the deployment, configuration and provisioning of devices based on enterprise policies, as well as their security (including the blocking of data synchronization with public cloud services such as iCloud), monitoring and decommissioning (wiping).
For those who don't want to deploy Zenprise MobileManager on site, the company offers Zencloud, which promises 100 percent uptime SLAs and is housed in an SSAE 16/SOC 1, FISMA Moderate-compliant facility.
Posted by Jeffrey Schwartz on December 10, 2012 at 11:59 AM
Virtualization giant VMware and its corporate parent, EMC, will form a new business entity next year called "Pivotal" to bring together the two companies' respective cloud platforms and products that enable the processing of big data.
It's not clear whether the companies are spinning these assets off outright or how they will structure the new organization. VMware said it will formally announce the business structure next quarter. So far, we do know that the unit will be led by Paul Maritz, the former VMware CEO and current EMC chief strategy officer.
Pivotal's aim will be to provide a common entity for developers to build applications that run on VMware's various platform as a service (PaaS) offerings as well as software that analyzes huge volumes of data. Pivotal is scheduled to begin operations in the second quarter of next year.
Roughly 600 VMware employees and 800 from EMC will be assigned to Pivotal, which will include Pivotal Labs, the provider of agile software development tools VMware acquired in March, whose tools let developers build apps that can scale to cloud infrastructures using big data. Pivotal will also include EMC's Greenplum data warehouse appliance business.
VMware's contribution will include its vFabric (including Spring and Gemfire), Cloud Foundry and Cetas. It appears the vCloud Suite will remain part of VMware, as indicated in a blog post published Tuesday by senior VP of communications Terry Anderson.
"The resulting Pivotal Initiative solutions will be optimized for the VMware vCloud Suite, helping to ensure that customers benefit from the best cloud architecture available, top to bottom," Anderson said. "Simultaneously, VMware will continue to drive application-aware innovations into its core platform, ensuring best-in-class performance of any application when deployed onto the VMware vCloud Suite."
The announcement and the rationale for the move were vague, though numerous reports have speculated that the company has considered some form of spinoff, such as of its open source PaaS venture Cloud Foundry. VMware is not commenting beyond Anderson's blog post.
"There is a significant opportunity for both VMware and EMC to provide thought and technology leadership, not only at the infrastructure level, but across the rapidly growing and fast-moving application development and big data markets," she noted. "Aligning these resources is the best way for the combined companies to leverage this transformational period, and drive more quickly towards the rising opportunities."
Forrester Research analyst Dave Bartoletti said in a blog post that IT pros should welcome the move, which will refocus VMware on the datacenter and on furthering its push into software-defined network virtualization.
"This move helps to end the cloud-washing that's confused customers for years: there's a lot of work left to do to virtualize the entire datacenter stack, from compute to storage and network and apps, and the easy apps, by now, have mostly been virtualized," Bartoletti wrote. "The remaining workloads enterprises seek to virtualize are much harder: they don't naturally benefit from consolidation savings, they are highly performance sensitive and they are much more complex."
Analyst James Staten, Bartoletti's colleague at Forrester, noted in his blog that by moving their cloud platform offerings to a separate business, EMC and VMware will be in a better position to appeal to developers. It's "way too soon to speculate on the end results but this could help EMC play a significant role in cloud development services," Staten said. "Hopefully this new group will focus on cloud-based delivery and not build its business model around on-premise software license sales."
The news came just one day after VMware released new tools to ease the procurement and extend the reach of cloud apps using its vCloud Suite. The company's vFabric Application Director 5, announced earlier this year, gains support for environments beyond traditional VMware ones, notably the Amazon Web Services EC2 public cloud and Microsoft's Hyper-V virtual machine platform.
The latest version of vFabric Application Director is aimed at easing the deployment of hybrid cloud applications via certified VMware-approved templates and various tools including middleware, data management and security software.
"It allows us to take the exact same [infrastructure] blueprint, deploy it onto vSphere, vCloud or Amazon EC2 without having to change anything," explained Shahar Erez, VMware's director of applications management products. "This gives organizations the flexibility to leverage their blueprint across clouds without being locked in."
To provide these various blueprints, reference architectures and OS-loaded templates for vFabric Application Director, VMware launched its Cloud Applications Marketplace. Erez said the marketplace already hosts 100 downloadable solutions from 30 ISVs and systems integrators. Among them are solutions from Bluelock, Cognizant, Couchbase, Jaspersoft, Puppet Labs, Radware, Riverbed and SugarCRM.
"You would find load balancers, firewalls, WAN accelerators, SSL accelerators, application servers, databases, message queues and memcaches," he said, as well as "applications for blogging, content management, bug tracking and cloud provisioning blueprints available with a single click."
VMware this week also released its vCenter Operations Management Suite 5.6, which adds application performance and configuration management capabilities and is designed for rapid configuration of virtual machines, Erez said. VMware is offering the performance management capabilities of the Operations Management Suite as a separate download with all versions of VMware vSphere.
Posted by Jeffrey Schwartz on December 06, 2012 at 11:59 AM
Less than seven years ago, Amazon Web Services disrupted traditional datacenter computing with its cloud-based infrastructure services, allowing enterprise customers to provision compute and storage and pay based on usage without having to make capital outlays for hardware or software. Many who have moved to this model of paying for IT infrastructure as an operational expense have enjoyed considerable reductions in capital expenditures.
Now, Amazon is looking to similarly upend the way organizations deploy data warehouses.
Kicking off its first-ever partner and customer conference on Wednesday, Amazon launched a cloud-based data warehousing service called Redshift. Amazon says the service will substantially reduce the cost of deploying data warehouses by eliminating the need to acquire conventional software and hardware provided by the likes of EMC, Hewlett-Packard, IBM, Microsoft, Oracle, SAP and Teradata.
In his opening keynote address at the company's re:Invent conference in Las Vegas, Amazon Web Services Senior VP Andy Jassy cited an IBM-commissioned report that found typical data warehouse installations can cost anywhere from $19,000 to $25,000 per terabyte per year. Using reserved data warehouse instances on Amazon's forthcoming Redshift, the average annual cost will amount to less than $1,000 per terabyte, according to Jassy.
"It allows you to easily and rapidly analyze petabytes of data. It's about a tenth of the cost of traditional data warehouse solutions. It automates the deployment and it works with the popular business intelligence tools," Jassy told 6,000 attendees present at re:Invent and 12,000 registered viewers (including yours truly) of the live webcast.
Customers can choose from either 16 TB or 2 TB nodes and can configure clusters of up to 100 nodes, scaling to 1.6 petabytes, with pricing starting at 85 cents per hour for a 2 TB node. The data is stored in columnar format, Jassy said, which means the I/O moves much more quickly and queries render much faster than with a typical data warehouse solution. The service supports queries via standard SQL, JDBC and ODBC, he noted.
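For readers unfamiliar with columnar storage, here's a quick sketch of why that matters (my own illustration, not Redshift code): an aggregate query over one column only has to read that column, rather than every field of every row.

```python
# Conceptual sketch: row-oriented vs. column-oriented I/O for an analytic query.
# Toy dataset of 1,000 orders with three fields each.
rows = [
    {"order_id": i, "region": "east" if i % 2 else "west", "sales": i * 10}
    for i in range(1000)
]

# Row store: answering "total sales" touches every field of every row.
row_fields_read = sum(len(r) for r in rows)   # 3 fields x 1,000 rows = 3,000

# Column store: the sales values sit contiguously, so the query touches
# only that one column.
sales_column = [r["sales"] for r in rows]     # laid out once at load time
col_fields_read = len(sales_column)           # 1,000

total_sales = sum(sales_column)
print(row_fields_read, col_fields_read, total_sales)
```

The same principle is what lets a columnar engine skip the bulk of the disk I/O on wide tables, which is where Jassy's speed claims come from.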
Amazon.com, the retail parent of AWS, has been testing Redshift for several months. Jassy said the group took 2 billion rows of data and ran six of its most complex queries, typically performed in its existing Netezza (now part of IBM) data warehouse. On two 16-terabyte Redshift nodes, the workload cost $3.65 per hour, equating to roughly $32,000 per year. "Instead of spending millions of dollars, they spent $32,000 a year and ended up with 10 times faster queries," Jassy said.
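Those figures check out arithmetically. Here's my own back-of-the-envelope sketch, assuming round-the-clock on-demand usage at the rate Jassy quoted:

```python
# Back-of-the-envelope check of the Redshift cost figures cited in the keynote.
hourly_rate = 3.65            # $/hour for two 16 TB Redshift nodes, per Jassy
hours_per_year = 24 * 365     # 8,760 hours of continuous operation

annual_cost = hourly_rate * hours_per_year
print(round(annual_cost))     # ~31,974, i.e. roughly the $32,000/year cited

# Spread over the 32 TB of capacity in those two nodes, that works out to
# about $999 per terabyte per year, consistent with the sub-$1,000 claim.
per_tb_year = annual_cost / 32
print(round(per_tb_year))
```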
"Some multi-hour queries finish in under an hour, and some queries that took five to 19 minutes on our current data warehouse are now returning in seconds with Amazon Redshift," said Erik Selberg, manager of Amazon.com's data warehouse team, in a statement.
Redshift's underlying data warehouse engine is powered by ParAccel, a venture-backed company with a deep bench of data warehousing veterans that offers its own high-performance analytic database. Initially, Redshift will support BI tools from Jaspersoft and MicroStrategy, but Jassy said it will also support other leading tools, including Cognos from IBM and BusinessObjects from SAP. Early customers already participating in a private beta include Flipboard, NASA's Jet Propulsion Laboratory, Netflix and Schumacher Group.
The service is available now for a limited preview but Amazon is targeting early next year to make Redshift commercially available.
So will Redshift take a bite out of the traditional data warehousing business? That remains to be seen, but if Amazon delivers the price-performance it's promising, it'll offer a compelling alternative, particularly for organizations that need to analyze large amounts of information but can't afford a traditional data warehouse today.
"It doesn't necessarily mean customers are going to chuck the data warehouses they've already got," said Forrester Research analyst James Staten in a telephone interview. "If you've already got one, you've already sunk that cost in. But if you're going to have to double or triple that data warehouse in size, it's really going to be hard to justify the cost of keeping them on premises."
That said, despite the rapid growth of the cloud business of Amazon and other providers, many organizations remain reluctant to move mission-critical or sensitive data off premises, and that could certainly affect how quickly data warehousing and big data analytics move to the cloud.
Jassy made clear Amazon will continue its path to be a disruptive force in datacenter computing and I'll spell that out in a post on Thursday following Amazon CTO Werner Vogels' keynote.
Posted by Jeffrey Schwartz on November 28, 2012 at 11:59 AM
Cisco on Thursday agreed to acquire cloud upstart Cloupia for $125 million in the company's latest bid to unify converged datacenters with cloud infrastructure and services.
Santa Clara, Calif.-based Cloupia, founded in 2009, offers software that lets organizations administer traditional datacenter infrastructure with cloud-based infrastructure. Cloupia's software provides a common interface to manage and monitor infrastructure across physical, virtual and cloud environments and is aligned with key providers.
In addition to an existing alliance with Cisco, Cloupia has partnerships with Amazon Web Services (AWS), EMC, Hewlett-Packard, NetApp, Rackspace and the Virtual Compute Environment, a company formed by Cisco and EMC. Cloupia's flagship product, the Unified Infrastructure Controller, lets organizations build private clouds and manage hybrid infrastructures.
Hilton Romanski, Cisco's vice president of business development, said in a blog post that the move builds on the company's effort to enable enterprise customers to manage Cisco's Unified Computing System and Nexus switches, as well as third-party cloud infrastructure and services. Romanski explains how the deal accomplishes that goal:
Cisco's acquisition of Cloupia benefits Cisco's Data Center strategy by providing single "pane-of-glass" management across Cisco and partner solutions including FlexPod, VSPEX, and Vblock. Cloupia's products will integrate into the Cisco data center portfolio through UCS Manager, UCS Central, and Nexus 1000V, strengthening Cisco's overall ecosystem strategy by providing open APIs for integration with a broad community of developers and partners.
Similar to previous acquisitions in cloud management, such as Tidal, LineSider and NewScale, the acquisition of Cloupia also complements Cisco's Intelligent Automation for Cloud (IAC) solution.
In short, he concludes the acquisition of Cloupia will help Cisco provide intelligent network orchestration and management by bridging traditional datacenters and cloud infrastructure. It aims to offer the benefits of cloud automation by providing a view of an organization's entire compute, network, storage, VM and operating system resources.
Posted by Jeffrey Schwartz on November 15, 2012 at 11:59 AM
Like millions of people in the Northeast, I am hunkering down as Hurricane Sandy is living up to its promise as the worst storm to hit this region in decades.
We have been told to expect power outages of anywhere from seven to 10 days -- not a prospect I am looking forward to, if that prediction comes true. Call me a prima donna, but I'm not one who enjoys roughing it, even though as a child I did my share of camping. But that was a long time ago.
I have done everything I can to prepare for this storm. I stocked up early on batteries (you can't find D batteries anywhere now to save your life), water, non-perishable food and an extra bottle of wine. And because business doesn't stop, of course I did whatever I could to ensure I could work, presuming we are otherwise safe.
First, I purchased a myCharge device, which will let us charge our cell phones up to three times without using the car charger. Then, of course, I made sure my data was backed up, both on a portable flash drive that I will carry with me and in the cloud. I also purchased a converter that will let me charge my netbook via the car's battery.
To ensure my data is available, I backed it up to two personal cloud sites, Dropbox and Microsoft's SkyDrive. That is an approach I wouldn't have taken in the past, but given the number of highly publicized outages that Amazon Web Services (which had one just last week), Microsoft, Google and others have experienced, I believe redundancy greatly increases the likelihood of data availability.
Many are still reluctant to use cloud services to back up their personal files and I admit I have had my reservations. But I have come to the conclusion that the risk of anyone accessing my data is far less probable than the threat of losing files and photos if a catastrophe were to strike. And businesses need to think in the same way, while taking the appropriate measures to secure sensitive data.
How has Hurricane Sandy changed your thinking on or use of cloud services, either personally or for business-critical data? Comment below or e-mail me at email@example.com.
Posted by Jeffrey Schwartz on October 30, 2012 at 11:59 AM
IBM spent much of last week's Cloud Innovation Forum talking up Platform as a Service (PaaS) to the roughly 100 customers and 200 other stakeholders, including business partners, in attendance.
Big Blue's PaaS offering, called SmartCloud Application Services (SCAS), is available in pre-release form for customers of IBM's existing infrastructure as a service (IaaS) offering, SmartCloud Enterprise (SCE), and it will be generally available later this quarter.
Only a small but incrementally growing percentage of enterprises have started using PaaS for production-oriented applications, according to industry analysts. Moreover, there are a number of companies with various PaaS offerings and shops are evenly split between the various services they envision using, according to IDC senior VP and chief analyst Frank Gens, who gave a presentation at the IBM event. Among the widely preferred PaaS services are Microsoft's Windows Azure, Google App Engine, IBM's SmartCloud PaaS, VMware Cloud Foundry, Salesforce.com's Force.com, NetSuite SuiteCloud, the Intuit Partner Platform and Red Hat OpenShift, according to an IDC study.
IBM used its event to release the results of a study it conducted based on its own survey of 1,500 IT decision makers in 18 countries (both mature and growth markets), which found that the need to manage proliferating big data is a key driver for organizations that are either looking at or going all-in on PaaS.
Driving big data are social media, mobile device usage, and data analytics and integration. Customers "are starting to think about how they move big data activity onto the cloud and the applications needed to manage that," said Kevin Thompson, manager of IBM's Center for Applied Insights, who presented the results of the survey to a group of journalists.
IBM concluded there are four types of PaaS users:
- Pioneers (16 percent): those who are creating apps that enable new business capabilities using PaaS
- Experimenters (12 percent): shops that are dipping their toe into PaaS by attempting to take an existing app or business process and move it to the cloud
- Preparers (12 percent): those who plan to start using PaaS
- Observers (39 percent): those sitting on the sidelines for now
I asked Jim Comfort, VP of IBM's SmartCloud Strategy, what percentage of IBM shops he believes are using PaaS.
"This is a multi-year journey," Comfort responded. "I think almost every client has one or two teams or projects that are experimenting within this new category. Everyone is trying to find ways to play. As far as the mainstream, I would expect 25 percent over the next several years will shift their development [toward PaaS] once they find a set of tools that do what they really want to do. That's just my guess."
IBM's new PaaS services consist of application patterns, layers that run on top of its IaaS cloud, and existing customers can use those accounts to develop and deploy apps on SCAS via the SCE portal. IBM describes these patterns as predefined software components designed to expedite the development and deployment of cloud-based apps, based on multiple predefined architectures drawn from decades of customer and partner engagements.
"It provides the tools to greatly simplify the development of Web applications using the infrastructure to hide all the complexity," Comfort said. The initial offering is targeted at Web application services. In the pipeline are others including database services, mobile and analytic services, he said.
The first services made available on the new offering are Collaborative Lifecycle Management and Workload Service. The Collaborative Lifecycle Management service lets development managers add users and roles through the portal, which allows for monitoring and ensuring data availability. It uses IBM's Rational developer tool suite, enabling the application development lifecycle of tracking, designing, implementing, building, testing and deploying apps. The service supports team-based application lifecycle management via Rational Team Concert, Rational Requirements Composer and Rational Quality Manager. Pricing will be announced at general availability, but IBM will bill customers on a per-user, per-month basis.
Workload Service is aimed at replacing premises-based middleware, notably WebSphere in IBM shops. It provides policy-based automated scaling and app management via the Workload Deployer, which IBM calls the "brains" of the service. Developers can deploy either virtual systems, the traditional model for deploying and connecting VMs, or the newer virtual applications, in which the developer deploys the actual apps.
Like IBM's core applications, it is designed for Java-based apps, though IBM's Comfort said the company plans to support other popular languages including Ruby, PHP and Microsoft's .NET. The new PaaS offering also offers Web Applications Services (WAS), virtual databases, virtual system patterns and Java app platforms. Customers will pay a usage-based hourly or monthly rate.
Comfort emphasized with IBM's approach to PaaS, compute and app infrastructure run in the cloud while data remains on premises. That matters, he said, "because applications and data prove to be some of the most strategic assets a company has."
Posted by Jeffrey Schwartz on October 22, 2012 at 11:59 AM
Microsoft inked a deal on Tuesday to acquire StorSimple for an undisclosed amount, filling a key hole in Redmond's effort to push its so-called "Cloud OS" to enterprises.
StorSimple, a three-year-old provider of cloud integrated storage (CIS) appliances, lets those who manage datacenters add public cloud services to the storage tier of an enterprise network. By using cloud services for data storage, StorSimple argues, companies can achieve improved disaster recovery while lowering the total cost of storage ownership by 60 to 80 percent.
The CIS storage appliances extend SAN snapshots, primary storage, backup and archive data to cloud-based services from Amazon Web Services, Google, IBM, Nirvanix and EMC's Atmos storage running in AT&T's public cloud, as well as OpenStack services from Dell, HP, and IBM and, of course, Microsoft's Windows Azure.
"A lot of mainstream enterprise IT customers are choosing Windows Azure with StorSimple," said co-founder and CEO Ursheet Parikh in a short pre-recorded video discussion embedded in a blog post by Michael Park, corporate VP for Microsoft's server and tools business.
Does that mean that once StorSimple becomes part of Microsoft, it will only use Windows Azure as a cloud target? A Microsoft spokeswoman would only say, "As a result of this announcement nothing changes. We have no additional information to share at this time."
StorSimple says its redundant disk controller ensures high availability and no single point of failure, while enabling non-disruptive software upgrades. The appliances include an application optimization plug-in architecture that provides plug-ins for individual files, virtual machine libraries and client devices, as well as SharePoint and Exchange. It's also certified for Windows Server and VMware infrastructures.
Posted by Jeffrey Schwartz on October 18, 2012 at 11:59 AM
At this week's BoxWorks conference in San Francisco, Box.com announced it has inked new pacts to address security and compliance issues that are often associated with cloud storage offerings.
On the compliance side, the company has released a new reporting API that will enable third-party business intelligence partners, such as GoodData, to create dashboards that will let administrators detect unusual activity like an individual downloading an excessive number of files.
On the security side, Box has a pact with Proofpoint to use its "security as a service" offering to add a layer of protection over documents shared via the Box service. Using the Box API with Proofpoint's service, the combined offering can recognize when content is uploaded in violation of a security policy and notify an admin of the suspicious activity.
Security is one barrier, but making content accessible is another key issue among would-be users of software as a service (SaaS) offerings. To that end, Box said it is making it easier for customers to access data from various other SaaS-based services.
The company's new Box Embed is a framework based on HTML5 that allows customers to access features from Box such as the ability to preview files, add or view comments and incorporate Box-based search into applications developed in house or by partners. Box revealed 10 partners that are using Box Embed within their cloud-based apps: Concur, Cornerstone OnDemand, DocuSign, Eloqua, FuzeBox, Jive, NetSuite, Oracle, SugarCRM and Zendesk.
Box Embed will let users securely and centrally store content, though they can access it from partner apps. For example, in the case of Jive, Box users will be able to incorporate enterprise social networking into their document collaboration.
Customers can also use Box Embed to embed content from the service to their Web sites, intranets or services provided by third parties. NetSuite and SugarCRM launched Box Embed-supported features this week and Box said others will follow suit in the next quarter.
Posted by Jeffrey Schwartz on October 11, 2012 at 11:59 AM
IBM and AT&T are teaming up to offer their shared customers cloud services that use Big Blue's compute and storage infrastructure and network connectivity provided by the telecommunications carrier.
According to Tuesday's announcement, both companies will jointly sell the combined offering to their respective enterprise customers -- it's targeted at giants in the Fortune 1000 -- as an alternative to traditional infrastructure as a service (IaaS) cloud services, which often use standard Internet connections.
The offering, to be released early next year, will use IBM's SmartCloud Enterprise+, the company's public IaaS-based cloud service, along with new virtual private network technology developed by AT&T Labs. The new VPN capability is designed to ensure secure connections and more reliable performance than Internet links provide, said Dennis Quan, vice president of IBM's Smart Cloud infrastructure.
"We feel it will give clients a lot more control over security, privacy and performance and we think will resolve some of the issues enterprises have with adopting cloud computing," Quan said in a telephone interview. Quan added the service is suitable for development and testing, as well as to run enterprise applications and even transaction-oriented Web sites.
Given the target audience of Fortune 1000 customers and the implied, though undisclosed, improvement in performance, this so-called "breakthrough" new VPN capability will come with a price premium, though the companies aren't saying. Also worth noting is the fact that both companies are members of the OpenStack effort. It is unclear when or if this service will support the OpenStack networking protocols. Quan would only say IBM is "deeply committed" to OpenStack.
The new VPN technology from AT&T automatically allocates network services to compute infrastructure, according to the announcement. This automation lets customers scale resources on demand much faster than if provided manually, both companies said. The companies said they will offer service-level agreements, over 70 security functions and "high levels" of security embedded on both wired and wireless devices authenticated to a customer's VPN.
The announcement had me wondering whether AT&T will scale back its own public cloud infrastructure and platform services in favor of reselling IBM's. An AT&T spokeswoman addressed the question in an e-mailed statement:
"This is AT&T's most recent step in executing its strategy to deliver cloud services that meet the needs of a wide variety of users including large and medium enterprises, developers, and internet-centric businesses. We recognize that one size does not fit all when it comes to cloud, and see the opportunity to provide a managed alternative to AT&T Compute as a Service that pairs AT&T's virtual private network technology with IBM's SmartCloud Enterprise+ infrastructure to deliver a highly secure and flexible cloud offer to businesses."
While these companies compete to some extent, both stand to benefit from working together. AT&T can provide direct links from private clouds and premises-based datacenters to IBM SmartCloud Enterprise+, filling a gap in Big Blue's portfolio while giving AT&T another way to deliver IaaS, even if the underlying infrastructure isn't its own.
It is not clear how big AT&T's enterprise public cloud business is, but IBM's is presumably bigger: IBM expects its cloud revenue to hit $7 billion by 2015, and while it hasn't disclosed its cloud revenue to date, it said that revenue doubled in 2011 over 2010.
What's your take on this pairing arrangement? Comment below or drop me a line at firstname.lastname@example.org.
Posted by Jeffrey Schwartz on October 10, 2012 at 11:59 AM
Once again, cloud computing is front and center at the Oracle OpenWorld conference in San Francisco.
Oracle spotlighted the launch of new IaaS, PaaS and SaaS offerings at this year's event. The company has added new online storage services, cloud-based asynchronous message queuing, cloud-based tools to build sites on social networks and in the Oracle Public Cloud, and a variety of new SaaS-based line-of-business tools, including financial planning and analytics capabilities.
The continued emphasis on cloud follows last year's OpenWorld, when the company made its big cloud push with the launch of Oracle Public Cloud and the release of a slew of SaaS applications. As with Microsoft, HP, IBM, Dell and others, cloud computing is the focus of everything Oracle is talking about now.
It's always interesting to hear Oracle CEO Larry Ellison sing the praises of cloud computing these days, years after shrugging it off. In 2008, Ellison famously described cloud computing as the fashion du jour. "What the hell is cloud computing?" Ellison said to financial analysts four years ago. "I'm not going to fight this thing. I don't understand what we would do differently in the light of cloud computing other than changing the wording on some of our ads. It's crazy. That's my view."
Well, Ellison has since refined his view. During his keynotes at OpenWorld this year, Ellison described Oracle as the only company addressing private and public cloud computing at the IaaS, PaaS and SaaS layers. Microsoft and others might beg to differ, but Oracle has clearly launched a broad portfolio of SaaS apps as well as substantial infrastructure and platform services.
Ellison said, in effect, that everything Oracle offers will be available on-premises in traditional datacenters, in private clouds and in the Oracle Public Cloud. Moreover, customers can host their apps on dedicated hardware in Oracle datacenters or run them on shared infrastructure. And finally, much of the Oracle Applications suite is now available as SaaS, with social networking hooks, built on the same Java-based application infrastructure as the on-premises versions of its software.
One claim many will take issue with, though, is Ellison's assertion that Oracle's cloud is standards-based. "Industry standards are extremely important," Ellison said, pointing out that all of Oracle's cloud apps, databases and middleware are based on Java, Linux and the Xen hypervisor. Yet Oracle is among the few major IT vendors that have not joined the OpenStack initiative (Microsoft, Amazon and Google are other notable holdouts), while Cisco, Dell, HP, IBM, Rackspace, Red Hat and VMware are all on board, among nearly 200 other players.
If you're not concerned about portability, that won't matter. And if you're already committed to Oracle, its cloud offerings may present some compelling options.
And that's going to be the key focus for Oracle in the foreseeable future. Prior to his Wednesday afternoon keynote, Ellison told CNBC's Maria Bartiromo, who spent two days broadcasting her show live from OpenWorld, that the cloud will take priority over major acquisitions in the coming years. "We are not planning any major acquisitions right now," Ellison said. "We are really focused on the fact that over the last seven or eight years, we re-engineered our applications for the cloud. We think that's a huge opportunity for organic growth."
While this isn't the first time Ellison has claimed Oracle spent that many years re-engineering its apps for the cloud, you might wonder how that squares with his dismissal of cloud computing as a fashion trend in 2008. More than likely, Oracle was watching Salesforce.com's growth and hedging its bets on the SaaS model.
At OpenWorld, Oracle announced the following new cloud offerings are available for preview:
- Oracle Planning and Budgeting Cloud Service, a subscription-based version of its Hyperion Planning app
- Oracle Financial Reporting Cloud Service for creating financial statements
- Oracle Data and Insight Cloud Service for self-service analytics
- Oracle Social Sites Cloud Service, which lets non-technical users create sites on social networks such as Facebook
- Oracle Developer Cloud Service for developers who want to build their apps using a public cloud service
- Oracle Storage Cloud Service for providing object storage content linked to existing Oracle Cloud services
- Oracle Messaging Cloud Service, an asynchronous message queuing service to link data between disparate sources.
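Oracle hasn't published the Messaging Cloud Service's API details here, but the asynchronous message-queuing pattern it embodies is straightforward: a producer enqueues messages and moves on, while a consumer drains the queue at its own pace, decoupling the two systems. Here's a minimal, generic sketch of that pattern using only Python's standard library -- the queue and message names are illustrative, not Oracle's API:

```python
import queue
import threading

# A message queue decouples producers from consumers: the producer
# returns immediately after enqueueing, and the consumer processes
# messages whenever it gets to them -- the core idea behind an
# asynchronous messaging service linking disparate sources.
msgq = queue.Queue()

def producer():
    # Enqueue messages from one "source" without waiting for processing.
    for i in range(3):
        msgq.put({"id": i, "body": "update-%d" % i})
    msgq.put(None)  # sentinel: no more messages

received = []

def consumer():
    # Drain the queue on a separate thread, simulating a downstream system.
    while True:
        msg = msgq.get()
        if msg is None:
            break
        received.append(msg["body"])

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(received)  # FIFO order: ['update-0', 'update-1', 'update-2']
```

A hosted service like Oracle's would put that queue behind the network so applications in different datacenters (or in the Oracle Public Cloud) can exchange messages the same way.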
What's your take on Oracle's cloud strategy? Feel free to comment below or drop me a line at email@example.com.
Posted by Jeffrey Schwartz on October 03, 2012 at 11:59 AM