Acquisitions and organic product development will both be part of the strategic mix as Riverbed Technology seeks to grow beyond its WAN optimization roots into a future in the software-defined era.
"We've done three acquisitions in the last 18 months, and we'll continue to do more," said Chairman and CEO Jerry M. Kennelly on Wednesday in a keynote at the Riverbed Partner Summit 2017 in Santa Barbara, Calif.
Kennelly's comments came as he and other senior executives were explaining the company's newly heavy emphasis on cloud and urging the 160 partners in attendance to commit to joining Riverbed in focusing on cloud networking, digital experience monitoring and service lifecycle management.
Riverbed used the conference to highlight its pivot from a business and partner model focused on fulfillment of Steelhead WAN optimization appliances to solutions that are more dependent on services leveraging newer products like SteelConnect for cloud networking and software-defined-WAN and SteelCentral for application and network performance monitoring.
"We are...a different company than we were two years ago, and so much of this comes from our commitment to the cloud," said Karl Meulema, senior vice president of Worldwide Channels at Riverbed, from the stage at a conference themed "Disrupt."
Kennelly did not indicate what technology sectors Riverbed is looking at for future acquisitions, but the three acquisitions he referred to -- Xirrus in April, Aternity last July and Ocedo last January -- all brought important components that are fitting into the San Francisco-based company's evolving strategy.
Last week, Riverbed unveiled its definitive agreement to acquire Xirrus, a provider of high-density Wi-Fi solutions that are known for serving up connectivity at high-profile sporting events and large-scale conferences. More important for Riverbed and its partners is Xirrus' technology for managing Wi-Fi networks from the cloud, which is critical for the company's cloud networking ambitions.
Several Riverbed executives described Xirrus on stage as a Cisco Meraki killer for Riverbed and its partners. Meulema spelled out the thinking in a one-on-one interview Wednesday. "Xirrus fills a hole for those customers who are looking to integrate LAN and WAN from the same point. Our SD-WAN offering was well accepted by our customers and our partners, but Meraki had an edge and we've taken that edge away," he said.
The Aternity acquisition from last July filled an important piece of Riverbed's digital experience monitoring puzzle, according to Mike Sargent, senior vice president and general manager of the SteelCentral business unit at Riverbed.
"The CIO is more blind than ever. They have all the accountability and less and less of the control," Sargent said of the need for visibility across the network, applications code, servers and public cloud services for diagnosing and fixing problems with networks and mission-critical applications. "You can't manage what you can't see."
Aternity added to the solution by bringing end-user devices into the SteelCentral monitoring picture. "It gave us that broad net to really see what's going on," Sargent said.
The Ocedo acquisition, meanwhile, was "absolutely instrumental" to Riverbed's SD-WAN play, Meulema said. "We recognized that we were behind on the controller side. It just gave us that one year or more jump in time," he said.
At the same time, Riverbed has placed internal investment bets in several areas. The company launched SteelConnect into the SD-WAN market a year ago and revved it with a 2.0 version in September. Along the way, the product has brought what Riverbed calls "one-click" connectivity first for Amazon Web Services and later for Microsoft Azure.
According to Meulema, the company's biggest organic investment is a project under development called the Service Delivery Platform (SDP).
During the conference keynote, Phil Harris, who runs Riverbed's service provider business, described the SDP as an effort to help Riverbed partners bring the speed of the software industry and the service repeatability of the managed service provider (MSP) market to the business of spinning up infrastructure and application service lines. The platform leverages Riverbed's existing technology and combines it with APIs, templates and consumption analysis tools, and it is being designed to be multi-tenant and capable of orchestrating elements from multiple vendors and services, not just Riverbed gear and software.
"In the same metaphor that the software industry has moved to a DevOps model, which is all around those ideas of continuous integration, continuous development, continuous release, much more accelerated value for the cloud, why can't we bring that value to the overall IT market space?" Harris said. He said the first version of the platform will be available in the third quarter.
Amid all the product change, Riverbed has been whittling its channel down from about 2,700 partners a few years ago to under 1,000 today in an effort to engage more deeply with partners who can help with the partner-led service strategy.
"Their focus on services and enabling and motivating their partners to strengthen and develop their own partner-branded services is absolutely the right thing to be doing," said Kevin Rhone, an analyst with Enterprise Strategy Group who attended the conference. "It's not easy helping partners transform their business models. I think they'll end up with a somewhat smaller but stronger partner network because not all of their traditional reselling partners will be able to develop a deep consultative service business."
Posted by Scott Bekker on April 28, 2017 at 9:39 AM
The first of the Intel Optane memory modules designed to improve desktop PC performance are available Monday at computer component retailers.
The two add-in components are a 16GB memory module with an MSRP of $44 and a 32GB module with an MSRP of $77.
Optane memory works with 7th Generation Intel Core processors. Through an Intel Optane memory-ready program launched in early 2017, more than 130 motherboards were Intel Optane memory-ready as of late March.
While gaming is a major focus of Optane memory, Intel touts business application benefits for the new memory modules. According to Intel, Optane memory can double computer power-on speed, speed the launch of Microsoft Outlook by six times and speed the launch of the Chrome browser by five times.
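Taken at face value, those multipliers are easy to translate into wall-clock terms. A minimal sketch, where the multipliers come from Intel's claims but the baseline timings are purely hypothetical:

```python
# Intel's claimed Optane speedups (from the post); baseline times are assumed.
speedups = {"power-on": 2.0, "Outlook launch": 6.0, "Chrome launch": 5.0}
baseline_seconds = {"power-on": 30.0, "Outlook launch": 12.0, "Chrome launch": 5.0}

for task, factor in speedups.items():
    before = baseline_seconds[task]
    after = before / factor  # claimed time with Optane caching in place
    print(f"{task}: {before:.0f}s -> {after:.1f}s")
```

With those assumed baselines, a 30-second power-on drops to 15 seconds and a 12-second Outlook launch to 2 seconds; the real-world baselines will vary by machine.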
The add-in modules are one of the first steps in the rollout of Optane memory. According to Intel, OEM systems with Optane memory pre-installed will be available later this year.
Posted by Scott Bekker on April 24, 2017 at 10:44 AM
Citrix Systems this week rolled out a Services Delivery Program based on converting the experience and intellectual property of Citrix Consulting into packaged, repeatable offerings for partners serving midmarket customers or those operating in developing or under-served markets.
The core of the program is a set of Services Delivery Kits to allow any partner to ramp up a services portfolio in areas including infrastructure assessments, design, core build, issue resolution and pilot development.
"Each kit is a one-stop shop that has everything you need to execute a specific customer engagement," said Hector Lima, Citrix vice president of worldwide professional services, in a video introducing the program.
Kits include project scope documents; tools, such as automation scripts; templates for things from sales slide decks to design matrices; and project-specific instructional videos. Each kit also includes two hours of remote assistance from Citrix Consulting Services. That consultant access is available 24 hours a day from Monday through Friday.
Unlike many elements of Citrix's programs for partners, even relatively elite tiers, the kits carry a price tag, and partners are required to buy a kit each time they use one with a customer.
A second tier of the program is a new set of e-learning training courses and certification exams leading to Services Delivery Certification for a partner. Once completed, Citrix partners with that certification will qualify for lower pricing on kits, preferential promotion and eligibility to join the Global Citrix Virtual Bench, in which the partner acts as Citrix Consulting staff.
Posted by Scott Bekker on April 21, 2017 at 2:26 PM
Microsoft unveiled a new type of container this week at DockerCon 2017 -- a Linux container that runs on Windows Server 2016.
The move would break down a fundamental wall in deployment scenarios for containers to date. For now, Linux containers can only run on Linux host operating systems and Windows containers can only run on Windows host operating systems. While that's more of a problem for Windows, which is the newcomer to the container phenomenon, a key benefit of containers is portability. The easier it is to deploy a container regardless of the underlying infrastructure, the closer the ideal comes to being realized.
Microsoft is partially solving the issue for its user base with the funky Hyper-V containers that it released to some industry head-scratching with Windows Server 2016. (Why add the management and processing overhead of virtualization to containers?) The rest of the solution is coming from Docker and from Linux distributors, who are committing to building lightweight Linux kernels that will run inside the Hyper-V containers.
John Gossman, Microsoft Azure lead architect and Linux Foundation board member, took the stage at DockerCon in Austin, Texas, on Tuesday to demonstrate a Linux container running inside a Hyper-V container inside a Windows Server.
Mike Schutz, general manager of product marketing in the Cloud + Enterprise division at Microsoft, described the significance of the moment in a blog post Wednesday. "Yesterday we showed for the first time, a Linux container running natively on Windows Server using the Hyper-V isolation technology currently available only to Windows Server Containers," Schutz wrote. (See here for a primer on how containers of different types work across the Microsoft stack.)
Now that the Linux-in-Hyper-V approach is formally unveiled, Gossman presented the Linux support as a logical next step in Hyper-V containers. "When we announced and launched Hyper-V Containers it was because some customers desired additional, hardware-based isolation for multi-tenant workloads, and to support cases where customers may want a different kernel than what the container host is using -- for example different versions. We are now extending this same Hyper-V isolation technology to deliver Linux containers on Windows Server. This will give the same isolation and management experience for Windows Server Containers and Linux containers on the same host, side by side," he said in a post.
The premiere of the Linux container on Windows coincided with Docker's big reveal this week around the Moby Project and LinuxKit -- with an emphasis on creating lightweight and secure Linux kernels from Docker and others to run inside containers. One of the problems that approach solves is providing a Linux kernel inside containers on non-Linux platforms, such as Windows Servers, Windows clients or Apple Macs. Those Linux kernels, which themselves are built from swappable container parts, can take up as little as 35MB.
In an official blog post this week, Justin Cormack, a software engineer at Docker, mentioned the LinuxKit work in the context of the Docker-Microsoft relationship that dates to 2014: "The next step in that collaboration...is that all Windows Server and Windows 10 customers will get access to Linux containers and we will be working together on how to integrate LinuxKit with Hyper-V isolation."
Gossman's Microsoft post provides a hint at how excited the open source community is about the opportunity to spread Linux-based containers across the global installed base of Windows Servers. Senior executives at Canonical, Intel, Red Hat and SUSE all provided statements about how they will be working over the next few months with Microsoft's open source integration code to create Linux container OS images for Hyper-V containers.
Posted by Scott Bekker on April 20, 2017 at 11:48 AM
Ingram Micro, a key distribution partner in Microsoft's Cloud Solution Provider (CSP) business model, on Thursday revealed upgrades to its platforms for both 1-Tier and 2-Tier Microsoft CSPs.
The CSP-related upgrades are part of a raft of platform upgrades being rolled out at the Ingram Micro Cloud Summit in Phoenix this week.
Microsoft's CSP program relies primarily on three types of partners using two business models. The 1-Tier CSPs, generally very large Microsoft partners, buy cloud subscriptions directly from Microsoft and resell them in bundled packages to customers. The bulk of Microsoft's CSP partners are 2-Tier CSPs, who obtain their Microsoft subscriptions from an intermediary 2-Tier Distributor, like Ingram.
Ingram plays a role in both business models, however, due to its 2015 acquisition of the Odin Service Automation platform from Parallels Holdings. Odin provides cloud marketplace technology that 1-Tier CSPs can use for storefront infrastructure, sales, billing, provisioning and management of the cloud services that they bundle for their customers.
For 1-Tier CSPs and other hosters, a new version of Odin Automation Essentials entered general availability. That version already supported the CSP's own services, many third-party cloud services and Microsoft services such as Office 365, Dynamics 365 and Enterprise Mobility + Security. The latest release adds Azure and Windows 10 Enterprise licenses to the Microsoft cloud services mix, along with the ability to provide shared and virtual private server hosting services.
For Ingram partners who are 2-Tier CSPs or who otherwise use Ingram's Cloud Marketplace, a new feature coming this quarter will allow them to conduct orchestration of Infrastructure as a Service (IaaS) for customers.
The Ingram Micro Cloud Orchestrator will enable automation and orchestration of the deployment and management of private, public or hybrid cloud workloads by partners on behalf of customers. Supported services include Microsoft Azure, Amazon Web Services, IBM Bluemix and VMware.
Posted by Scott Bekker on April 20, 2017 at 11:48 AM
Microsoft on Wednesday released a production-ready community technology preview (CTP) 2.0 of SQL Server v.Next, which the company also confirmed will officially be called SQL Server 2017.
Scott Guthrie, executive vice president of Cloud and Enterprise at Microsoft, revealed the SQL version and naming news, along with a raft of data platform announcements during a new online event called Microsoft Data Amp.
On track to ship roughly a year after SQL Server 2016, this new version of Microsoft's flagship database is highly anticipated for bringing Linux support to SQL Server. The CTP 2.0 is production-ready on both Windows and Linux. To underscore that the Linux versions are ready for prime time, a demo during the Data Amp event involved running SQL Server on Linux via Docker from an Apple Mac, and Microsoft also revealed a record TPC-H data warehousing benchmark conducted with SQL Server 2017 on Red Hat Enterprise Linux (RHEL) and HPE ProLiant server hardware.
Supported Linux platforms for CTP 2.0 include RHEL 7.3, SUSE Linux Enterprise Server v12 SP2, and Ubuntu 16.04 and 16.10. As demoed, SQL Server 2017 is also available as a Docker image, which can run on Linux, Windows or Mac.
There's parity between the Windows and Linux products on most of the new features in the CTP. The new preview brings to Linux a few features that were previously only available on Windows, such as some SQL Server Agent capabilities and a listener for Always On availability groups.
New for both Windows and Linux in CTP 2.0 are support for storing and analyzing graph data relationships, resumable online index rebuilds and improvements to the automatic processes for keeping database queries running efficiently.
One major new feature available only for Windows is the ability to use the Python language in the database to run advanced analytics. The new capability is called Microsoft Machine Learning Services, and Microsoft positions it as enabling the database to scale and accelerate machine learning, predictive analytics and data science scripts. The in-database use of Python joins the existing capabilities around the R language as a Windows-only feature of SQL Server.
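In shipping versions of SQL Server's external-script support, the documented entry point is the sp_execute_external_script stored procedure with @language = N'Python'. A hedged sketch of what an in-database Python call could look like -- the table name and query below are hypothetical, and the connection details are placeholders:

```python
# T-SQL that invokes in-database Python via Machine Learning Services.
# sp_execute_external_script and the InputDataSet/OutputDataSet variable
# names are the documented conventions; dbo.Sales is a hypothetical table.
tsql = """
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'
import pandas as pd
OutputDataSet = InputDataSet.describe().reset_index()
',
    @input_data_1 = N'SELECT amount FROM dbo.Sales';
"""

# A driver such as pyodbc would submit this batch, e.g.:
# import pyodbc
# conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=...")
# rows = conn.execute(tsql).fetchall()
print("sp_execute_external_script" in tsql)
```

The appeal of the design is that the data never leaves the database engine; only the script and the result set cross the wire.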
Microsoft's agenda for data covers much more than on-premises database servers, and the SQL Server 2017 updates at Data Amp were accompanied on Wednesday by several major announcements involving other servers, services and Azure:
- Microsoft R Server 9.1, an incremental release of the Big Data analytics server that hit version 9.0 in early December, is now available. Enhancements to the 9.1 version include supporting the new Python capabilities in SQL Server 2017 CTP 2.0.
- Several Microsoft Cognitive Services graduated from preview stage to general availability in the Azure Portal. They include the Face API for detecting, comparing and grouping faces; the Computer Vision API for automatically contextualizing the contents of images, tagging objects, landmarks, people and actions, and providing a description of the contents in a coherent sentence; and the Content Moderator for text and images, which is a mix of algorithm-based and human-review tools.
- Azure Analysis Services, a cloud tool based on the on-premises Microsoft SQL Server Analysis Services, reached general availability.
- A new Azure Database Migration Service entered the preview stage. The tool automates the migration of on-premises SQL Server, Oracle and MySQL databases to Azure SQL Database.
Posted by Scott Bekker on April 19, 2017 at 11:40 AM
We're putting the wraps on our second annual RCP 200 list of the top U.S. Microsoft partners, but we wanted to give loyal RCPmag.com readers one more chance to throw their hats into the ring.
This is a qualitative list of the Microsoft solution provider companies that demonstrate a laser focus on Microsoft technology and a strong commitment to providing great value for their customers. There are a few requirements -- companies that get listed must belong to the Microsoft Partner Network, must have major end-user service operations in the United States and should have at least one Microsoft Gold Competency.
Beyond that, it's subjective. We're not just looking for the biggest companies or the broadest coverage of Microsoft technologies. Some winners are niche providers focused on a narrow part of the Microsoft stack; others have a great regional reputation; still others are regular Microsoft regional award winners.
Think your company has what it takes? Fill out the application here by May 9.
Posted by Scott Bekker on April 17, 2017 at 11:44 AM
BitTitan is taking another big step in offering prescriptive guidance for Microsoft partners and other IT service providers with the release this week of an overhauled MSPComplete.
The Seattle-area company's suite is shifting from suggesting potential upselling opportunities to offering a full set of more than 100 customizable playbooks.
Those playbooks, or runbooks, are preconfigured sets of standard operating procedures within BitTitan's platform that a partner's service team can step through to perform many kinds of IT service projects, even ones that the engineers may not be completely familiar with at the start. The idea is to expand an MSP's revenue-generating services and improve the quality level and reliability of what engineers do on behalf of the MSP.
BitTitan first released MSPComplete in 2015 as a suite of products to help MSPs transition customers from on-premises servers to cloud services, so the suite included BitTitan tools for e-mail migration, document migration, configuration of Outlook, Azure assessments and other functions.
"Our next version is a platform that orchestrates end-to-end delivery of a project," said Geeman Yip, CEO of BitTitan.
With its close associations to Microsoft, many of those projects are related to Office 365, FastTrack, Azure, OneDrive and other Microsoft products and tools. But the new MSPComplete Service Library also involves migration and turnkey services for Dropbox Business, Amazon, Box and Google G Suite, among others.
Some examples of the out-of-the-box service playbooks available immediately include On-premise Exchange to Office 365, HealthCheck for Office 365, Troubleshoot Restrictions and Limits, Configure Microsoft Outlook, Set up SharePoint Online sites, Perform an Azure ROI Assessment, Perform Public Folder Migration, Generate a Software Inventory Report, Discover Azure VM Utilization, SaaS App Integration with Azure Active Directory Marketplace, Apply a Legal Hold to an Office 365 User, and Office 365 Mailbox Migration.
The suite also permits customization, allowing companies to add their own intellectual property to the workflows that their engineers will step through. For example, one of BitTitan's MSP customers might create a playbook of "FastTrack Onboarding Guidance" or an "Employee Onboarding Checklist" or "Install Office Mobile Apps."
Because an MSP's engineers perform their work through MSPComplete, the platform can provide other services useful to that MSP's management. Managers can gather metrics from the tool about how long each engineer spent on each step in a playbook to get a better sense of how long different parts of projects take or for employee training. Managers can also gain visibility into precisely where each customer's project stands. Finally, by entering information about how much each employee costs per hour, managers can determine how much certain services are costing them to deliver.
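BitTitan's internals aren't public here, so the following is only an illustrative sketch of the bookkeeping described above: per-step timings plus per-engineer hourly costs rolled up into a delivery-cost estimate. All names and numbers are invented:

```python
from collections import defaultdict

# Hypothetical log of (engineer, playbook step, minutes spent).
step_log = [
    ("alice", "Assess mailboxes", 45),
    ("alice", "Configure Outlook", 30),
    ("bob",   "Migrate mailboxes", 120),
]
hourly_cost = {"alice": 60.0, "bob": 45.0}  # assumed fully loaded $/hour

per_step = defaultdict(int)  # total minutes spent on each step
total_cost = 0.0             # cost to deliver this playbook run
for engineer, step, minutes in step_log:
    per_step[step] += minutes
    total_cost += (minutes / 60.0) * hourly_cost[engineer]

print(per_step["Migrate mailboxes"])  # 120 -- the longest step
print(round(total_cost, 2))           # 165.0
```

The same per-step totals double as training data: a step that consistently runs long points at either a process problem or a skills gap.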
Christopher Hertz, former president of two-time Microsoft U.S. Partner of the Year New Signature, consulted with BitTitan on development of this update to MSPComplete and sees two main groups of partners who could benefit from the approach.
One group is smaller partner companies that aren't yet offering managed services but are getting pressure from Microsoft to transition into MSPs, with help that comes in the form of overwhelming, 200-page guides.
"If I don't have managed services today for Office 365, and I'm worried about how do I price it, and how do I talk to my customers about it, and marry that with the ability for my team to actually deliver against that, this does all those things," Hertz said.
The other group is existing MSPs who want more reliable delivery but struggle to maintain up-to-date processes.
"It was always expensive to maintain playbooks or checklists because the world shifts pretty rapidly in cloud services. One problem is those were expressed in Word documents. A few things would happen. They would inevitably go out of date," Hertz said. "If you're a small MSP or even a midsize one, trying to maintain a comprehensive list of these best practices and putting them into standard operating procedure guidance is almost impossible to do when you think about how broad and how deep and how complex it is today with cloud services. You can get scale in that way with BitTitan in a way you just can't do as an individual partner."
Posted by Scott Bekker on April 06, 2017 at 1:30 PM
A year after unveiling its first software release, Santa Clara, Calif.-based startup Uila Inc. on Tuesday connected the application-aware infrastructure performance monitoring dots by bringing end users into the process.
Designed for datacenters running mission-critical applications, Uila (pronounced wee-la) relies on small virtual machines on physical hosts to listen to network traffic and send metadata to a controller on-premises or in the cloud. With intelligence allowing it to auto-discover more than 4,000 applications, the company's tool aims to create a performance dashboard for the entire stack, including application response times for compute, storage and the network.
The new end-user component helps with one of the main use cases of the whole toolset -- quick troubleshooting for IT operations staff in complex environments.
"When the end user complains, you don't know where to start," said Uila CEO Chia-Chee Kuan in an interview. "This is not a new problem, it's just made worse by everything being highly virtualized, plus cloud, as well. Applications are getting a lot more complicated than before with multi-tier, and it's not only just within your datacenter. A lot of times the application pulls in data from the Internet, as well -- for example Salesforce integration for some business applications."
The new end-user experience monitoring provides proactive alerts that would ideally identify and allow IT operations teams to address performance degradation before the user notices. Per-client transaction histories help IT dig into the root causes of issues. Additionally, IP addresses can retroactively be grouped into sites, allowing IT to define, visualize and compare application performance at different locations even after a performance issue arises.
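As an illustration of the site-grouping idea (not Uila's actual implementation), mapping client IPs into named sites and comparing response times per site might look like the following, with made-up CIDR ranges and latency samples:

```python
from ipaddress import ip_address, ip_network

# Site definitions as CIDR ranges -- hypothetical.
sites = {
    "HQ": ip_network("10.1.0.0/16"),
    "Branch": ip_network("10.2.0.0/16"),
}
# Observed (client IP, application response time in ms) samples.
samples = [("10.1.4.7", 120), ("10.2.9.3", 480), ("10.2.1.1", 510)]

by_site = {}
for ip, ms in samples:
    addr = ip_address(ip)
    for name, net in sites.items():
        if addr in net:  # membership test groups the client into a site
            by_site.setdefault(name, []).append(ms)

avg = {name: sum(v) / len(v) for name, v in by_site.items()}
print(avg)  # Branch's higher average points at a site-level problem
```

Because the grouping is applied to stored per-client histories, sites can be defined after the fact, which matches the retroactive analysis the product describes.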
With a $5 million Series A funding round completed in mid-March, the 4-year-old company thus far has raised $8.3 million to enter a crowded field. Yet Kuan positions Uila as a key complement to existing enterprise performance management tools for datacenters, applications, virtual infrastructure and the network.
"How we're different from them is we add application visibility and bridge the virtualization and physical and networking into their tools' visibility. We don't want to replace them. You use us as a catch-all solution. We catch everything in troubleshooting, and a lot of times we show exactly the root cause," he said.
With about two dozen named customers and 10,000 production application servers under monitoring, Uila plans to use its new funding for continued product development and reseller recruitment.
Posted by Scott Bekker on April 04, 2017 at 3:04 PM
When it comes to overall Internet access across all device types worldwide, Microsoft's long reign as the dominant operating system has come to an end, according to researchers at StatCounter.
The independent Web analytics company, which tracks OS usage, on Monday reported that Android overtook Windows in March for the first time as the world's most popular operating system.
For now, Android's lead is very slight. Android's worldwide OS Internet usage market share hit 37.93 percent compared with 37.91 percent for Windows.
How stable is a lead based on two hundredths of a percentage point? Based on current trends, it's not likely that Microsoft will grab the worldwide title back any time soon on the metric, which includes mobile phones and tablets along with desktops and laptops, according to a statement from StatCounter CEO Aodhan Cullen.
"This is a milestone in technology history and the end of an era," Cullen pronounced. "It marks the end of Microsoft's leadership worldwide of the OS market which it has held since the 1980s. It also represents a major breakthrough for Android which held just 2.4% of global internet usage share only five years ago."
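The arithmetic behind both claims is simple to check directly from the figures in the post:

```python
# StatCounter's March 2017 worldwide Internet-usage shares (percent).
android, windows = 37.93, 37.91
gap = round(android - windows, 2)  # rounding avoids float noise

# Android held just 2.4% five years earlier, per Cullen.
growth = android / 2.4

print(gap)               # 0.02 -- two hundredths of a point
print(round(growth, 1))  # 15.8 -- roughly a 16x rise in five years
```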
By segment and geography, there are several areas where Microsoft retains its lead. First is worldwide PC and laptop Internet access, where Microsoft built its dominant position and which Windows still leads by a healthy margin, with an 84 percent share in March, according to StatCounter.
Then there is the lucrative North America market, where Windows still holds a lead across all platforms (40 percent share), and where Android (21 percent) has yet to surpass iOS (26 percent). Microsoft's advantage remains even more pronounced in Europe, with Windows leading Android 52 percent to 24 percent.
But alongside incremental gains in those important markets, Android has built a dominant position of its own in fast-growing Asian markets. On that continent, Android is a 52 percent to 29 percent leader over Windows.
Posted by Scott Bekker on April 03, 2017 at 9:53 AM
Dell refreshed its Wyse thin client line on Tuesday with a new device that sports the first quad-core processor in one of its entry-level devices for virtual workspace environments.
The Wyse 3040 has a lot of the specs you'd expect in a refreshed line -- lower weight at just over half a pound, about a quarter less power usage than previous models and a smaller overall size.
But Jeff McNaught, vice president of marketing and chief strategy officer for the Cloud Client Computing Unit at Dell, says the quad-core processor was a key enhancement to ensure lifetime value and flexibility for the little box.
"We selected a quad-core processor because one of the big tenets that we have with thin clients is delivering a device that can work for 8-10 years," McNaught said in an interview. "What we had found in the past with some other designs was that customers were [in] great [shape] as long as they didn't change their software strategy."
Specifically, the Intel Atom X5 1.44GHz quad-core processor with as much as 2GB DDR3 RAM and 8GB flash will help the 3040 support light multimedia activity and local application processing.
The Dell team focused on a future Skype for Business use case, which is emerging as part of more and more business scenarios.
"With that quad-core processor, we licensed some key technology from Citrix and from Microsoft to be able to handle products like Skype for Business very effectively. Virtual environments have challenges -- called the trombone effect and the back-on-itself effect, where you have multiple video and audio streams operating simultaneously. This product is designed to be able to use the server to set up the call; but then once the call is set up, it operates with the target in a peer-to-peer manner, and that reduces the amount of bandwidth that's needed by a lot," McNaught said. "We needed to use a processor that had a tremendous amount of performance, and we needed to be able to have encode and decode handled separately from everything else you see on that screen delivered by Citrix, Microsoft or VMware."
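The bandwidth saving McNaught describes can be sketched with back-of-envelope numbers; the stream bitrate below is an assumption, not a Dell figure, but the two-WAN-legs-versus-one structure is the point:

```python
# Rough model of the "trombone effect": hairpinning call media through a
# central server makes each stream cross the WAN twice, while peer-to-peer
# media takes one direct path.
stream_kbps = 1200    # assumed bitrate of one video+audio stream
streams_per_call = 2  # send + receive

hairpinned = stream_kbps * streams_per_call * 2  # two WAN legs per stream
peer_to_peer = stream_kbps * streams_per_call    # one direct crossing

print(hairpinned, peer_to_peer)  # 4800 2400 -- half the WAN bandwidth
```

The server still brokers call setup in this model; only the media streams bypass it, which is why the signaling load stays centralized while the bandwidth bill drops.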
With a starting price of $329, it's in the same price range as Wyse's current starter entries -- the single-core 3010 and the dual-core 3020 and 3030. The 3030 will remain on the market.
The Wyse 3040 will initially ship with the Wyse ThinOS software. Starting in June, Wyse will offer a new thin client option called Wyse ThinLinux. McNaught described ThinLinux as a thin-client-focused and hardened version of SUSE Linux that expands the use case for a thin client: "[You will] be able to have a local browser and to embed specific Linux applications right into that thin client. A lot of our customers like to do that because they might be replacing a PC with that thin client."
Posted by Scott Bekker on March 28, 2017 at 10:30 AM
A refrain among channel advocates musing about IT's future in the cloud is that even customers interested in assembling best-of-breed collections of cloud services still want a trusted adviser to put it all together for them.
The other way of looking at it is that customers want a single throat to choke when things don't work.
With a study it commissioned from 451 Research, Microsoft added some survey-based buttressing for that argument. The company released the hosting and cloud study, "Digital Transformation Opportunity for Service Providers: Beyond Infrastructures," during its Microsoft Cloud and Hosting Summit last week.
Aziz Benmalek, vice president for Worldwide Hosting & Managed Service Providers at Microsoft, blogged about the 1,700-respondent survey:
Half of all organizations surveyed consider service providers as vital for future digital transformation projects. Even better, 60% of those would be willing to pay twice as much as they currently spend to have a single trusted advisor solution to manage all their digital transformation-related sourcing, implementation, and management needs.
Additionally, the new research shows that 62% of cloud/hosting infrastructure spending comes bundled with value-added services, rising to 84% for the next hosting/cloud infrastructure engagement. By owning the customer relationship end to end it provides the perfect platform for partners to build value-added services for their customers that will create stickiness and differentiation from the competition.
Posted by Scott Bekker on March 27, 2017 at 11:01 AM