Why service providers should not ignore cloud

For many service providers, cloud computing seems to be disruptive for their business model. Especially in the outsourcing business, many service providers are reluctant to offer cloud-based services to their clients. There are two main reasons behind this scepticism:

  • Cloud computing gives agility to the client to subscribe or unsubscribe to services quickly. Although this is a big advantage for the client, it brings challenges to the provider. Smaller service providers have difficulty estimating the required capacity correctly and they risk keeping expensive resources underutilized.
  • Larger providers fear losing their clients once the clients have cloud services with more flexible, standardized contracts. In today’s outsourcing world, it is difficult for a client to switch providers. Services and contracts are not standardized and relatively complex relationships exist between the client’s and service providers’ organizations. The cloud business model breaks this all up by standardizing service descriptions and consumption.

But in any case, ignoring the new trends on the horizon cannot be the solution.

The example of Kodak shows what it means to answer a disruptive business change by simply ignoring it. Kodak, inventor of modern photography and one of the market leaders in the early 1990s, decided not to move towards digital photography because it was considered disruptive for their photo development labs all over the world. By the time they finally accepted the industry trend, they had already lost ground to their competition – and you could read in the news what happened to them in 2012.

The point is, if you are not offering your clients what they want, someone else will.

So, there are good reasons not to ignore cloud if you want to keep your current clients, but there are even better reasons to think about cloud to win clients you have not thought about before.

Cloud computing enables small and medium businesses (SMBs) to leverage services that were only affordable for large enterprises before. Just consider a professional CRM application before Salesforce.com: such solutions were out of reach for SMB clients. The same is true for professional platforms like an Oracle database server, whose initial license costs do not fit into the business plans of many startup companies. Receiving database services from the cloud, together with others from a shared system, turns this into a positive business case. But to be able to share the infrastructure and costs with others, service providers are required.

Summary:

The new business and delivery model of cloud computing brings risks to service providers. Clients are less bound by loyalty and might change providers more frequently. But it also brings opportunities for new business that should not be underestimated!

The private cloud is the new datacenter

Most data centers have grown historically and contain a number of different, heterogeneous systems. Depending on the age of the data center, you can see the evolutionary steps of the IT industry. At older companies, you find mainframe computers, large midrange systems, and a number of rack-based Intel servers all together on one data floor. Looking at younger companies (less than 10 years old), you won’t see this variety of platforms. They rather rely on a larger number of similar machines, highly virtualized to achieve the required flexibility and to be able to run a large variety of workloads.

Virtualization certainly was the industry trend of the last decade.

But what’s next? Looking ahead 10 years, what will be the trend of the next decade? I predict it will be the private cloud!

From a technology point of view, the private cloud is less of a revolution than virtualization was. I see it more as a logical next step. Virtualization changed the way users perceive servers; with cloud computing, users now perceive them as a service.

It is also not a very big effort to add private cloud capabilities to today’s data centers. Every virtualized server farm can be equipped with a cloud computing layer that handles user interaction and the provisioning and deprovisioning of virtual servers. So it is quite easy to adopt this new technology.

Another reason private clouds will conquer more and more data center space is cloud computing in general: workloads are no longer limited to servers in a specific data center of a company. In the coming years, we will see more workloads put on public clouds. These remote workloads still require some degree of management and a central control point for provisioning and deprovisioning. Building this control point for consuming remote public cloud services enables the local private cloud layer to hook in and be managed from the same infrastructure in a hybrid cloud setup.

I mentioned that with cloud computing, IT is perceived more as a service than just technology; this is exactly what users outside the IT department will expect in the future. The cloud computing delivery model has existed in the consumer market for quite some time now. People are used to visiting app stores to install their application software. They understand video on demand and software as a service from their private day-to-day IT usage. In the very near future, users will expect the same at their workplaces.

Summary

If a new data center is designed today, or an existing one is expanded to a larger extent, there are very good reasons to think about a cloud layer right from the start. At least, there are no good reasons not to think about it!

The 10 biggest myths about desktop cloud

The biggest myths are as follows:

  1. Desktop cloud is cheaper than traditional PCs.
    As I stated in my other blog post “Motivations for moving the desktop to the cloud,” if the only driver for a desktop cloud initiative is cost savings, the project might not succeed. There are many parameters to take into account that can make a desktop cloud solution cheap – or expensive.
  2. You can’t run multimedia applications on a virtual PC.
    You can run multimedia applications on VDI environments. All known vendors of VDI products have solutions available. For lightweight multimedia applications, such as audio or YouTube videos, state-of-the-art protocols such as HDX (Citrix) or PCoIP (VMware) can handle them and provide decent results.
  3. You can’t run CAD applications on a virtual PC.
    Solutions are on the market that can provide high-end graphics in a VDI environment. Most of them are able to access a GPU built into the server for rendering. However, whether such a solution makes sense needs to be carefully evaluated on a case-by-case basis.
  4. You can access your desktop from anywhere at any time.
    Although you can access your virtual desktop as soon as you have a working Internet connection, whether you can work with it depends on a few more parameters such as latency and bandwidth. Latency greater than 100 ms makes a remote desktop feel heavy; latency greater than 250 ms can be annoying; and if latency exceeds 500 ms, it is almost impossible to work with.
  5. You can’t use locally attached devices such as printers or scanners.
    You can use locally attached devices very well today. It’s more a question of whether all the necessary drivers are installed in the virtual desktop or terminal server and whether the user is entitled to use them.
  6. You can equip 100% of your user population with virtual PCs.
    Even with very high ambitions, you will only be able to transfer a certain percentage of your users to a virtual desktop. For highly standardized clients, an average of 80% is a good number.
  7. You cannot install additional software or device drivers to your virtual PC.
    Usually, this is true. Especially for installing device drivers, administrative privileges are required. Although, from a technical point of view, it would be possible to grant normal users admin rights for their virtual PCs, that is usually not the case in reality. For applications, it might be a different story. Using application virtualization, users can be entitled to access and locally install new applications based on their profile.
  8. You don’t need on site support any more.
    Even with traditional PCs, on-site support is not mandatory. Only about 5–10% of all problem tickets are hardware-related. The usual problem is related to software or configuration, which can be solved remotely, too. However, users prefer to discuss a problem with someone from the support team in person – and that’s not changing with a virtual PC.
  9. It takes the same effort to patch the OS or distribute new versions of software to all workstations.
    Having all virtual PCs and data in a central data center makes patching them much easier. The whole electronic software distribution and patch management infrastructure is much less complex because it does not require fan-out servers or WAN links.
  10. Desktop cloud does not change anything for the user, so the user gladly accepts the new workstation.
    Don’t underestimate the cultural change when you replace a user’s physical PC with a virtual PC in a cloud. It is like stealing something out of the user’s pocket!
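The latency thresholds from myth 4 can be sketched as a small classification helper. This is purely illustrative: only the millisecond thresholds come from the text above, while the function name and category labels are made up.

```python
def remote_desktop_experience(latency_ms: float) -> str:
    """Classify remote desktop usability by round-trip latency (ms).

    Thresholds follow the rule of thumb from myth 4; the labels are
    hypothetical, not taken from any vendor documentation.
    """
    if latency_ms <= 100:
        return "responsive"        # works well
    elif latency_ms <= 250:
        return "heavy"             # noticeably sluggish
    elif latency_ms <= 500:
        return "annoying"          # hard to work with
    else:
        return "almost unusable"   # practically impossible to work with
```

For example, a user on a 200 ms satellite link would land in the "heavy" category, while a LAN user at 10 ms stays "responsive".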

Considerations when defining a desktop cloud solution

In the previous blog posts of this series, we discussed motivations for moving desktops to the cloud and also desktop cloud technologies. Now let’s bring all of them together!

Let’s look at a desktop cloud solution from an architecture perspective. First, we need to define the users and user groups that will actually use our solution. The users will require some hardware, a thin client or other device, to access their virtual desktop. On their virtual desktops, they run applications and work with data.

Simplified desktop cloud architecture

So, users will access desktops. But which desktop technology fits a specific user? To answer this question, we need to define our user groups and their requirements. Typical user groups are task workers, travelers, or developers. All of them have different requirements for their desktops, and perhaps not all of them can be met by a single technology! Trying to cover all users with a desktop cloud solution is a very ambitious goal and almost impossible to reach. A better approach is to identify the user groups that would benefit most from a desktop cloud, or that would bring the most benefit to the company, and start with those.

Mapping technologies to user groups

The next step is to think of the applications. Designing a desktop cloud solution is a perfect opportunity to review your application landscape and identify potential for consolidation. There are also a number of ways to provide applications to users. Applications can be published on terminal servers, streamed using application streaming, or provided purely from the web. Ideally, applications are either web-based or at least support a number of distribution technologies. Application selection principles and development guidelines help clean up the application landscape in the long term.

Moving further up in the architectural hierarchy, we should discuss user data. By introducing a desktop cloud, I might also be required to redesign my user data concept. Locally stored data might not fit the purpose any more when I want to access the data from any device at any time. Technologies such as central data stores, web-enabled data, or synchronization mechanisms come into consideration.

Designing a desktop cloud solution is not trivial, especially because it directly affects the way users access IT resources. Design steps need to be taken carefully, always keeping the full picture in mind to ensure success!

Desktop cloud technologies

Today, a desktop cloud can consist of various technologies. There are different technologies for delivering the actual desktop, providing the applications, or organizing the underlying infrastructure such as storage. A good desktop cloud solution is a well-designed combination of those technologies that supports the given requirements. In today’s article, I want to briefly discuss the various technologies and explain what they can and cannot do.

Let’s start with a user’s desktop and how it can be provided.

Shared desktop

A shared desktop today is what used to be called a terminal server. Basically, all users of a terminal server share the server hardware and the operating system instance. To ensure that users are separated from each other, they are granted only limited rights.

Pro:

  • Use of the hardware is very efficient because there is only one operating system instance.
  • Software distribution and patch management is easy because it only needs to be performed once per server.

Con:

  • Applications need to be terminal server-ready.
  • If the operating system hangs, all users on that server are affected.
  • If a single user consumes too many resources, all other users on the same server experience performance issues.
  • Users might not accept the degree of limitation caused by their restricted user rights.

Virtual PC

A virtual PC is a virtual machine hosting the user’s desktop and operating system. Compared to the shared desktop, the virtual PC can be perceived as a full PC including a private instance of the operating system for every user. Therefore, users can theoretically gain administrative rights on their virtual PCs.

Pro:

  • Users can have more rights, up to administrative privileges.
  • Users can run any software, just as on their traditional PCs.
  • If the operating system of one user fails, the other users on that server are not affected.

Con:

  • Because every user has his or her own operating system instance, the overhead is higher.
  • Each operating system instance needs to be patched and managed.

Streaming

Streaming tries to combine the performance and response time of a traditional PC with the central manageability and accessibility of a desktop cloud. The main difference between a server-based desktop solution and streaming is that the desktop is sent from a central storage down to the user’s device and then actually runs on the user’s computer. After the user finishes working, the changes are sent back to the central storage.

Pro

  • Offers good performance.
  • Works offline with a local cache.
  • Desktops can be patched and updated centrally.

Con

  • It is complex to set up and maintain.
  • The sync process requires high network bandwidth.
  • Conflict management is required if the local cache and the central master are out of sync.

Client hypervisor

Bare-metal (type 1) and type 2 hypervisors running on client computers must support requirements that differ from those of server hypervisors. On a client, it is crucial to support 3D graphics acceleration, WiFi networks, and all types of USB-attached hardware such as printers and scanners, whereas supporting the latest SCSI adapter is not that important. So, what is the point of having a client hypervisor at all? One purpose of a hypervisor is to separate the operating system (and its included desktop) from the underlying hardware. This approach makes the OS hardware-independent and reduces the hassle with differing driver requirements. An additional benefit is that the desktop can flexibly be moved from one physical machine to another, for example when the physical PC or laptop breaks, or, in combination with streaming, from a data center server to a local PC and vice versa.

Pro

  • Desktop can be moved from one physical hardware device to another.
  • Multiple desktops with different purposes can be used simultaneously.

Con

  • Hardware support for WiFi and 3D graphics is not mature today.
  • Additional overhead exists because of the hypervisor.

Golden image (copy-on-write) and non-persistent desktop

Non-persistent desktops are virtual machines that are reset to their original state during reboot and therefore lose all changes made while they were online. A non-persistent client setup is usually combined with a persistent data partition, so that users can store documents and files that are not deleted when rebooting. However, all changes made to the operating system itself vanish. As anyone can imagine, this setup is very robust and ensures a working desktop at any time.

Pro:

  • Storage requirements are low because only one copy of the system partition is required.
  • Offers easy patching and software distribution. After the master image is patched, all rebooted virtual machines are automatically patched.
  • It is a very robust solution, because any misconfigured desktop only needs to be rebooted to be operational again.

Con:

  • User acceptance is low.
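A back-of-the-envelope calculation illustrates the storage saving of the golden-image approach. All sizes below are assumptions chosen for illustration, not figures from any specific product:

```python
def storage_gb(users: int, image_gb: int, delta_gb: int, shared_image: bool) -> int:
    """Rough storage need for a virtual desktop farm.

    Persistent desktops each keep a full private image; copy-on-write
    desktops share one golden image plus a small writable delta per user.
    """
    if shared_image:
        return image_gb + users * delta_gb
    return users * image_gb

# Hypothetical example: 1,000 users, a 40 GB system image, 2 GB delta each
persistent = storage_gb(1000, 40, 2, shared_image=False)  # 40,000 GB
golden = storage_gb(1000, 40, 2, shared_image=True)       # 2,040 GB
```

Under these assumed numbers, the golden-image setup needs roughly a twentieth of the storage, which is why the first pro in the list above matters so much at scale.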

Offline patching

As discussed above, the drawback of persistent virtual PCs is the need to patch each and every machine, just as with traditional client computers. However, there is still one big advantage over distributed PCs: while traditional desktop and laptop computers are carried around, left as spare devices in cupboards and drawers, or simply turned off during a software distribution phase – and are therefore not reachable – virtual PCs always reside in the data center, even if they are offline (virtually turned off).

But in any case, they must be virtually turned on, patched, and turned off again – unless an offline patching technology is used. Offline patching patches the actual image files of virtual PCs while they are offline and therefore ensures that they get the software updates they require.

Summary

To keep this blog post at a reasonable length, these technologies are only a subset of what is available today, but the descriptions should provide a good overview of the main aspects that need to be considered when thinking about a desktop cloud solution.

In the next blog of my desktop cloud series, I will discuss best practices of how to map technologies to client requirements.

Motivations for moving the desktop to the cloud

When you ask CIOs or CTOs who plan to introduce a desktop cloud within their enterprises about their motivation, they tell you about all sorts of value they expect to gain from the project. However, if you dig a little deeper into their goals, it turns out that their main driver is cost reduction.

But, is a desktop cloud really cheaper than managing traditional notebooks and PCs? Well, as always in IT, the answer is not that simple: it depends.

To find out if something new is cheaper than what I have today, I need to understand my current costs – and, my current service quality. If I operate a low-cost solution today, and there are no mandatory requirements to upgrade my current service levels, I will very likely end up with much higher costs when I move my desktops into the cloud. The reason is that even a low-cost desktop cloud provides a set of advantages over a traditional notebook and desktop environment, such as better scalability, higher agility, better availability, improved accessibility, and most probably higher data safety and security. All those advantages require support from the underlying infrastructure, such as capable servers, data-center floor space, storage devices, proper network connections, and a suitable stack of management software.

I recently talked to a large retail company about desktop cloud. Their current model is simple and cheap: Each store operates one to three desktop computers. There are no central services except on-site support on a call-out basis with no guaranteed reaction time (usually five to ten days). So, in fact, the shops are responsible for their PCs themselves; their employees store data locally and may or may not patch their operating systems. However, their business is not affected if a PC is not operational for several days. The store manager uses the PCs only for emailing, writing letters, and occasionally filling in a spreadsheet to report the store’s performance to headquarters. This client will never match its current costs with a desktop cloud, unless the client decides that its poor service levels need to be raised to stay competitive in the future.

To answer our initial question of whether a desktop cloud is a viable solution for a specific client, we need to look at the full picture and determine how important the other motivations for a move to the cloud are, and whether the client is willing to pay the price for them.

Scalability

Because a desktop cloud is located in a central data center, its capacity and performance can easily be scaled simply by adding new hardware.

Agility

The deployment of new desktops is a matter of seconds, because they are created virtually in the data center’s infrastructure. If a company tends to grow through acquisitions and frequently requires a high number of new users to be equipped with the standard desktop platform, agility can be a valuable advantage over a traditional PC rollout.

Availability

The desktop cloud infrastructure is based on server hardware and usually runs in a data center, leveraging fault-tolerant components and systems management. Thin clients used as end-user devices carry no configuration and no locally stored data; if they fail, they can be exchanged quickly and easily.

Accessibility

A central desktop in a desktop cloud can be accessed from almost any device and from almost anywhere in the world. The only requirement is a capable network connection. And not only the desktop can be accessed from anywhere, but also the user’s personal and corporate data.

Performance

As mentioned earlier, virtual desktops run on server hardware. In normal operation, a number of users share the same server hardware. Depending on the ratio of concurrent users to server hardware, performance can be controlled and adapted as required. Access to data is also usually much faster, because desktop and data are both located in the data center, connected through a high-performance data-center network.
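The user-to-server ratio mentioned above also drives capacity planning. A minimal sizing sketch, assuming a hypothetical consolidation ratio of 60 virtual desktops per server (this figure is an illustration, not a vendor recommendation):

```python
import math

def servers_needed(concurrent_users: int, desktops_per_server: int) -> int:
    """Number of servers required for a given consolidation ratio,
    rounding up so that every concurrent user gets a desktop slot."""
    return math.ceil(concurrent_users / desktops_per_server)

# e.g. 900 concurrent users at 60 virtual desktops per server
print(servers_needed(900, 60))  # -> 15
```

Lowering the ratio (fewer desktops per server) improves per-user performance at the cost of more hardware, which is exactly the adjustment lever described above.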

Data safety and security

Compared to backing up the many local hard drives of traditional PCs, data held centrally can be backed up easily. Regarding data security, by keeping the data on storage devices in the data center, the data is protected by the data center’s security mechanisms. The user, regardless of the country that user sits in, can only view and edit the data, but not copy it.

Summary

If reducing costs is the only motivation for considering a desktop cloud solution, showing a positive business case will be difficult. Moving PCs into a desktop cloud brings many benefits. How valuable these benefits are for a specific enterprise needs to be analyzed on a case-by-case basis and put in relation to the existing environment and its provided and required services.