Microsoft versus OpenOffice: Not the battle of the future!

Anyone remember Super Audio CD (SACD)? Or DVD-Audio? Those formats once fought over which would succeed the Audio CD as the primary medium for audio content. However, after the advent of MP3 and its wide adoption, disc formats became obsolete.

When Blu-ray Disc and HD DVD fought a similar battle over which would become the next primary video medium, experts were already calling it a war with nothing to win for anybody. It was predicted that the Internet would become that primary medium and that physical media such as discs or tapes would no longer be of any importance. Well, Blu-ray Discs did gain some market share, but, looking back today, only temporarily. Download portals, IPTV, and video-on-demand offerings are slowly arriving and will certainly get their pieces of the cake.

LibreOffice 3.5.1 came out recently, and Apache OpenOffice, with the help of IBM, will release a new version (4.0) around the end of the year. These two alternative office suites are mounting another attack on Microsoft’s dominance of the office software market. But is this really a battle worth fighting?

First of all, this battle is almost impossible to win. Not only is Microsoft’s dominant market share a de facto industry standard, but office documents are also mainly based on Microsoft’s proprietary document formats, which are often undisclosed and therefore hard for non-Microsoft applications to interpret correctly. I think the success of any alternative office suite rises and falls with its ability to import and export Microsoft formats properly.

But, is this really the battlefield of the future? I don’t think so.

The actual battle over the future of software is being fought in the cloud!

As Andreas Groth (@andreasgroth) and I mentioned in several earlier blog posts, the final goal of software evolution is to be web-based. There are several reasons for this: web-based applications are easy to access (from any device), they are cheap to maintain, and they support our new requirements for collaboration and content sharing more easily than any locally installed app does.

With regard to office software, all vendors had to restart their development almost from zero, which makes this race so interesting. Whether we look at Microsoft’s Office 365, IBM Docs (formerly known as LotusLive Symphony), or Google Docs, they all have in common that they were more or less developed from scratch. But beside the big three, there is strong momentum in this area toward making applications accessible from a simple browser. Examples are VMware AppBlast, Guacamole, and LibreOffice, all of which use technology based on HTML5.

But what will be the criteria to succeed in the cloud?

There is no doubt that any office software needs to cover the productivity basics. I don’t think that cloud-based software must implement all the fancy features of Office 2010, but it must enable users to handle their day-to-day tasks, including the ability to import and export office documents, display them properly, and run macros.

In terms of collaboration, cloud-based software needs to provide added value over any desktop-based application. It should be easy to share and exchange documents with coworkers.

But the most important factor will be integration capabilities. Desktop and office workloads will not move into the cloud from one day to the next. There will be a period in which the use of cloud-based applications grows while the majority of people still use locally installed applications. Being well integrated, both with locally installed software and with server-based collaboration tools, will be the key success factor. This is why I see Microsoft in a far better position than Google, although Google Docs has been around for quite some time and has started to become interesting, feature-wise.

IBM seems to be on the right track. Its integration of IBM Docs into IBM Connections and IBM SmartCloud for Social Business (formerly known as LotusLive), which can be tested in the IBM Greenhouse, looks very promising.

Summary

The new battlefield will be in the cloud, and although Microsoft has done its homework, the productivity software market is changing. There are more serious solutions and vendors available than in years past. If they play their cards right and provide good integration together with an attractive license model and collaboration features, they could get their share of (Microsoft’s) cake.

Windows 8 – the last one?

Will Windows 8 be the last client operating system out of Redmond? Probably not, but we need to ask ourselves what value a new client operating system (OS) version can bring to the desktop.

Until now, functionality was required on the desktop because the desktop was the platform for all the various applications. Each new operating system version brought new features and enabled us, the users, to do things we couldn’t do with an older version.

But, things have changed.

Now, more and more applications are moving away from the desktop. We still use a desktop, but increasingly only to run a web browser and use web applications instead of locally installed ones (Gmail, for example, taught us that a local email client is no longer required). So, the desktop itself is becoming little more than a platform for our preferred Internet browser, and in that role, the functionality of the operating system matters less and less.

But if additional operating system features no longer provide value, what motivation do users have to upgrade to the next OS version? We already see this problem at the enterprise level today. The main reason for most companies to migrate from Windows XP to Windows 7 is the simple fact that support for Windows XP will end in 2014 – not the new features of Windows 7.

So, what will the future look like? Well, Google gives us a preview with Chrome OS, an operating system, not based on Windows, whose main task is to run a web browser with the best possible performance. As functionality moves to the web, the capabilities of the web become more important. No wonder HTML5 addresses a lot of these new requirements. HTML5 not only enables a new kind of user experience for web applications, it also provides a foundation for new technologies like AppBlast that bring traditional desktop applications to the web.

OK, but what will Redmond do?

Well, Redmond is reaching out in other areas. It seems that Microsoft understands the challenge very well. Its focus is moving away from the traditional PC toward new devices such as tablets and smartphones. Windows 8 is designed more for tablets than for PCs, and Windows Phone 7 is a pure smartphone OS.

Coming back to our original question: will Windows 8 be the last operating system from Microsoft? Definitely not, but whether it is the last one for PCs, I don’t know…

The 10 biggest myths about desktop cloud

The biggest myths are as follows:

  1. Desktop cloud is cheaper than traditional PCs.
    As I stated in my other blog post “Motivations for moving the desktop to the cloud,” if the only driver for a desktop cloud initiative is cost savings, the project might not succeed. There are many parameters to take into account that can make a desktop cloud solution cheap – or expensive.
  2. You can’t run multimedia applications on a virtual PC.
    You can run multimedia applications in VDI environments; all major VDI vendors have solutions available. Lightweight multimedia content, such as audio or YouTube videos, can be handled by state-of-the-art protocols such as HDX (Citrix) or PCoIP (VMware) with decent results.
  3. You can’t run CAD applications on a virtual PC.
    Solutions on the market can provide high-end graphics in a VDI environment; most of them can use a GPU built into the server for rendering. However, whether such a solution makes sense needs to be carefully evaluated on a case-by-case basis.
  4. You can access your desktop from anywhere at any time.
    Although you can access your virtual desktop as soon as you have a working Internet connection, whether you can actually work with it depends on a few more parameters, such as latency and bandwidth. Latency above 100 ms makes a remote desktop feel sluggish; above 250 ms it becomes annoying; and beyond 500 ms, working is almost impossible (see the latency sketch after this list).
  5. You can’t use locally attached devices such as printers or scanners.
    You can use locally attached devices very well today. It is more a question of whether all the necessary drivers are installed on the virtual desktop or terminal server and whether the user is entitled to use them.
  6. You can equip 100% of your user population with virtual PCs.
    Even with very high ambitions, you will only be able to transfer a certain percentage of your users to a virtual desktop. For highly standardized clients, an average of 80% is a good number.
  7. You cannot install additional software or device drivers on your virtual PC.
    Usually, this is true. Installing device drivers, in particular, requires administrative privileges. Although it would be technically possible to grant normal users admin rights on their virtual PCs, that rarely happens in practice. For applications, it can be a different story: using application virtualization, users can be entitled to access and locally install new applications based on their profile.
  8. You don’t need on-site support anymore.
    Even with traditional PCs, on-site support is not mandatory. Only about 5 to 10% of all problem tickets are hardware-related; the typical problem concerns software or configuration, which can be solved remotely, too. However, users prefer to have someone from the support team present in person when discussing a problem – and that does not change with a virtual PC.
  9. It is the same effort to patch the OS or distribute new software versions to all workstations.
    Having all virtual PCs and data in a central data center makes patching them much easier. The entire electronic software distribution and patch management infrastructure is far less complex because it does not require fan-out servers or WAN links.
  10. Desktop cloud does not change anything for the user, so the user gladly accepts the new workstation.
    Don’t underestimate the cultural change when you replace a user’s physical PC with a virtual PC in a cloud. It is like stealing something out of the user’s pocket!
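
To make the latency thresholds from myth #4 tangible, here is a minimal Python sketch that estimates round-trip latency to a VDI gateway via TCP connect times and classifies the result against those thresholds. The gateway host name is a hypothetical placeholder, and TCP connect time is only a rough proxy for display-protocol latency.

```python
import socket
import time

# Thresholds (ms) from myth #4: up to 100 ms is fine, up to 250 ms
# feels sluggish, up to 500 ms is annoying, beyond that is unusable.
THRESHOLDS = [(100, "fine"), (250, "sluggish"), (500, "annoying")]

def classify(latency_ms: float) -> str:
    for limit, label in THRESHOLDS:
        if latency_ms <= limit:
            return label
    return "almost impossible to work with"

def tcp_latency_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Average TCP connect time as a rough latency estimate."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2.0):
            pass  # connection established; we only time the handshake
        total += (time.perf_counter() - start) * 1000.0
    return total / samples

if __name__ == "__main__":
    host = "vdi-gateway.example.com"  # hypothetical gateway address
    latency = tcp_latency_ms(host)
    print(f"{host}: {latency:.0f} ms -> {classify(latency)}")
```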

Considerations when defining a desktop cloud solution

In the previous blog posts of this series, we discussed motivations for moving desktops to the cloud and also desktop cloud technologies. Now let’s bring all of them together!

Let’s look at a desktop cloud solution from an architecture perspective. First, we need to define the users and user groups that will actually use our solution. The users will require some hardware, a thin client or other device, to access their virtual desktops. On their virtual desktops, they run applications and process data.

Simplified desktop cloud architecture

So, users will access desktops. But which desktop technology fits a specific user? To answer this question, we need to define our user groups and their requirements. Typical user groups are task workers, travelers, or developers. All of them have different requirements for their desktops, and perhaps not all of these requirements can be met by a single technology! Trying to cover all users with a desktop cloud solution is a very ambitious goal and almost impossible to reach. A better approach is to identify the user groups that would benefit most from a desktop cloud, or that would bring the most benefit to the company, and start with those; a small sketch of such a mapping follows below.

Mapping technologies to user groups
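
To make the idea of such a mapping concrete, here is a minimal Python sketch of a requirements-to-technology mapping. The groups, requirement flags, and decision rules are illustrative assumptions, not a definitive methodology.

```python
# Hypothetical user groups and a few coarse requirement flags.
USER_GROUPS = {
    "task worker":  {"offline": False, "admin_rights": False, "3d_graphics": False},
    "traveler":     {"offline": True,  "admin_rights": False, "3d_graphics": False},
    "developer":    {"offline": False, "admin_rights": True,  "3d_graphics": False},
    "cad engineer": {"offline": False, "admin_rights": False, "3d_graphics": True},
}

def pick_technology(req: dict) -> str:
    """Very rough decision rules, in the spirit of this series."""
    if req["offline"]:
        return "streaming / client hypervisor"  # needs a local copy
    if req["3d_graphics"]:
        return "virtual PC with server-side GPU"
    if req["admin_rights"]:
        return "persistent virtual PC"
    return "shared desktop (terminal server)"

for group, req in USER_GROUPS.items():
    print(f"{group:12s} -> {pick_technology(req)}")
```

Whatever form such a mapping takes, the point is the same: the technology follows from the user group’s requirements, not the other way around.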

The next step is to think about the applications. Designing a desktop cloud solution is a perfect opportunity to review your application landscape and identify potential for consolidation. There are also a number of ways to provide applications to users: they can be published on terminal servers, streamed using application streaming, or provided purely from the web. Ideally, applications are either web-based or at least support a number of distribution technologies. Application selection principles and development guidelines help clean up the application landscape in the long term.

Moving further up the architectural hierarchy, we should discuss user data. By introducing a desktop cloud, I might also be required to redesign my user data concept. Locally stored data might no longer fit the purpose when I want to access my data from any device at any time. Technologies such as central data stores, web-enabled data, or synchronization mechanisms come into consideration.

Designing a desktop cloud solution is not trivial, especially because it directly changes the way users access IT resources. Design steps need to be taken carefully, always keeping the full picture in mind, to ensure success!

Desktop cloud technologies

Today, a desktop cloud can consist of various technologies. There are different technologies for delivering the actual desktop, providing the applications, and organizing the underlying infrastructure, such as storage. A good desktop cloud solution is a well-designed combination of these technologies to support the given requirements. In today’s article, I want to briefly discuss the various technologies and explain what they can and cannot do.

Let’s start with a user’s desktop and how it can be provided.

Shared desktop

A shared desktop today is what used to be called a terminal server. Basically, all users of a terminal server share the server hardware and the operating system instance. To ensure that users are separated from each other, they are granted only limited rights.

Pro:

  • Use of the hardware is very efficient because there is only one operating system instance.
  • Software distribution and patch management is easy because it only needs to be performed once per server.

Con:

  • Applications need to be terminal server-ready.
  • If the operating system hangs, all users on that server are affected.
  • If a single user consumes too many resources, all other users on the same server experience performance issues.
  • Users might not accept the degree of limitation caused by their low user rights.

Virtual PC

A virtual PC is a virtual machine hosting the user’s desktop and operating system. Compared to the shared desktop, a virtual PC can be perceived as a full PC, including a private operating system instance for every user. Therefore, users can theoretically be granted administrative rights on their virtual PCs.

Pro:

  • Users can have more rights, up to administrative privileges.
  • They can run any software, just as on their traditional PC.
  • If the operating system of one user fails, the other users on that server are not affected.

Con:

  • Because every user has his or her own operating system instance, the overhead is higher.
  • Each operating system instance needs to be patched and managed.

Streaming

Streaming tries to combine the performance and response time of a traditional PC with the central manageability and accessibility of a desktop cloud. The main difference between a server-based desktop solution and streaming is that the desktop is sent from central storage down to the user’s device and then actually runs on the user’s computer. When the user finishes working, the changes are sent back to central storage.

Pro:

  • Offers good performance.
  • Works offline with a local cache.
  • Desktops can be patched and updated centrally.

Con:

  • It is complex to set up and maintain.
  • The sync process requires high network bandwidth.
  • Conflict management is required if the local cache and the central master diverge (see the sketch below).
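
Here is a minimal Python sketch of this check-out/check-in cycle, including the conflict check that the last point alludes to. The paths are hypothetical, and hashing whole image files is a simplification; real streaming products track block-level deltas.

```python
import hashlib
import shutil
from pathlib import Path

CENTRAL = Path("/srv/desktops/alice.img")      # central master (assumed path)
LOCAL = Path("/var/cache/desktops/alice.img")  # local cache (assumed path)

def digest(path: Path) -> str:
    """SHA-256 over the file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def check_out() -> str:
    """Stream the desktop down; remember the master's state at check-out."""
    shutil.copy2(CENTRAL, LOCAL)
    return digest(CENTRAL)

def check_in(master_digest_at_checkout: str) -> None:
    """Send local changes back, unless the master changed in the meantime."""
    if digest(CENTRAL) != master_digest_at_checkout:
        raise RuntimeError("Conflict: central master changed; merge required")
    shutil.copy2(LOCAL, CENTRAL)
```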

Client hypervisor

Bare-metal (type 1) and type 2 hypervisors running on client computers must support requirements that differ from those of server hypervisors. On a client, it is crucial to support 3D graphics acceleration, WiFi networking, and all types of USB-attached hardware, such as printers and scanners, whereas supporting the latest SCSI adapter is not that important. So, what is the point of having a client hypervisor at all? One purpose of a hypervisor is to separate the operating system (and its desktop) from the underlying hardware. This approach makes the OS hardware-independent and reduces the hassle of differing driver requirements. An additional benefit is that the desktop can be moved flexibly from one physical machine to another, for example if the physical PC or laptop breaks, or, in combination with streaming, from a data-center server to a local PC and vice versa.

Pro:

  • Desktop can be moved from one physical hardware device to another.
  • Multiple desktops with different purposes can be used simultaneously.

Con:

  • Hardware support for WiFi and 3D graphics is not mature today.
  • Additional overhead exists because of the hypervisor.

Golden image (copy-on-write) and non-persistent desktop

Non-persistent desktops are virtual machines that are reset to their original state on reboot and therefore lose all changes made while they were online. A non-persistent client setup is usually combined with a persistent data partition, so that users can store documents and files that are not deleted on reboot. However, all changes made to the operating system itself vanish. As anyone can imagine, this setup is very robust and ensures a working desktop at any time; a small sketch of the mechanism follows the pro/con list below.

Pro:

  • Storage requirement is low because the system partition is required only once.
  • Offers easy patching and software distribution. After the master image is patched, all rebooted virtual machines are automatically patched.
  • It is a very robust solution, because any misconfigured desktop only needs to be rebooted to be operational again.

Con:

  • It has a low user acceptance.
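
To make the copy-on-write mechanism concrete, here is a sketch using qemu-img, assuming QCOW2 images and hypothetical paths; commercial VDI products implement the same idea with their own tooling. Each desktop boots from a thin overlay that records only its changes, and a “reboot” simply discards and recreates that overlay.

```python
import subprocess
from pathlib import Path

GOLDEN = Path("/srv/images/golden.qcow2")  # master image, patched centrally

def fresh_overlay(user: str) -> Path:
    """(Re)create a copy-on-write overlay backed by the golden image.

    Deleting and recreating the overlay is the "reset on reboot":
    all OS changes live in the overlay and vanish with it.
    """
    overlay = Path(f"/srv/images/{user}.qcow2")
    overlay.unlink(missing_ok=True)
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-b", str(GOLDEN), "-F", "qcow2", str(overlay)],
        check=True,
    )
    return overlay

# A persistent data disk is attached separately, so user documents
# survive the reset while the system partition does not.
```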

Offline patching

As discussed above, the drawback of persistent virtual PCs is the need to patch each and every machine, just as with traditional client computers. However, there is still one big advantage over distributed PCs: while traditional desktop and laptop computers are carried around, left as spare devices in cupboards and drawers, or simply turned off during a software distribution phase – and are therefore unreachable – virtual PCs always reside in the data center, even when they are offline (virtually turned off).

But they must still be virtually turned on, patched, and turned off again – unless an offline patching technology is used. Offline patching modifies the actual image files of virtual PCs while they are powered off and thereby ensures that they get the software updates they require.
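
As a rough sketch of what offline patching can look like with open source tooling: the virt-customize tool from libguestfs can mount a powered-off image and run the guest’s package manager against it. The image directory here is a hypothetical placeholder, and commercial VDI suites ship their own offline-patching mechanisms.

```python
import subprocess
from pathlib import Path

IMAGE_DIR = Path("/srv/images/desktops")  # hypothetical image store

def patch_offline(image: Path) -> None:
    """Update packages inside a powered-off VM image via libguestfs."""
    subprocess.run(
        ["virt-customize", "-a", str(image), "--update"],
        check=True,
    )

# Patch every image in the store without booting a single VM.
for image in sorted(IMAGE_DIR.glob("*.qcow2")):
    patch_offline(image)
```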

Summary

For the sake of the length of this blog, these technologies are only a subset of what is available today, but the descriptions should provide a good overview of the main aspects to consider when thinking about a desktop cloud solution.

In the next blog of my desktop cloud series, I will discuss best practices of how to map technologies to client requirements.

Motivations for moving the desktop to the cloud

When you ask CIOs or CTOs who plan to introduce a desktop cloud within their enterprises about their motivation, they tell you about all sorts of value they expect to gain from the project. However, if you dig a little deeper into their goals, it turns out that their main driver is cost reduction.

But, is a desktop cloud really cheaper than managing traditional notebooks and PCs? Well, as always in IT, the answer is not that simple: it depends.

To find out whether something new is cheaper than what I have today, I need to understand my current costs – and my current service quality. If I operate a low-cost solution today, and there are no mandatory requirements to upgrade my current service levels, I will very likely end up with much higher costs when I move my desktops into the cloud. The reason is that even a low-cost desktop cloud provides a set of advantages over a traditional notebook and desktop environment, such as better scalability, higher agility, better availability, improved accessibility, and most probably higher data safety and security. All these advantages require support from the underlying infrastructure, such as capable servers, data-center floor space, storage devices, proper network connections, and a suitable stack of management software.

I recently talked to a large retail company about desktop cloud. Their current model is simple and cheap: each store operates one to three desktop computers. There are no central services except on-site support on a call-out basis with no guaranteed response time (usually five to ten days). So, in fact, the stores are responsible for their PCs themselves; their employees store data locally and may or may not patch their operating systems. However, the business is not affected if a PC is not operational for several days. The store manager uses the PCs only for emailing, writing letters, and occasionally filling in a spreadsheet to report the store’s performance to headquarters. This client will never match its current costs with a desktop cloud, unless it decides that its poor service levels need to be raised to stay competitive in the future.

To answer our initial question of whether a desktop cloud is a viable solution for a specific client, we need to look at the full picture and determine how important the other motivations for a move to the cloud are, and whether the client is willing to pay the price for them.

Scalability

Because a desktop cloud is located in a central data center, I can easily scale its capacity and performance simply by adding new hardware.

Agility

Deploying new desktops is a matter of seconds, because they are created virtually in the data center’s infrastructure. If a company tends to grow through acquisitions and frequently needs to equip a high number of new users with the standard desktop platform, agility can be a valuable advantage over a traditional PC rollout.

Availability

The desktop cloud infrastructure is based on server hardware and usually runs in a data center, leveraging fault-tolerant components and systems management. Thin clients used as end-user devices hold no configuration and no locally stored data; if they fail, they can be exchanged quickly and easily.

Accessibility

A central desktop in a desktop cloud can be accessed from almost any device and from almost anywhere in the world; the only requirement is a capable network connection. And not only the desktop can be accessed from anywhere, but also the user’s personal and corporate data.

Performance

As mentioned earlier, virtual desktops run on server hardware. In normal operation, a number of users share a given server. Depending on the ratio of concurrent users to server hardware, performance can be controlled and adapted as required. Access to data is also usually much faster, because desktop and data are both located in the data center, connected through a high-performance data-center network.
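
As a back-of-the-envelope illustration of that ratio, here is a small worked example; every figure in it is an assumption for the sake of the arithmetic, not a sizing recommendation.

```python
# Illustrative consolidation-ratio arithmetic (all numbers assumed).
server_ram_gb = 192        # RAM per host
server_cores = 16          # physical cores per host
ram_per_desktop_gb = 2     # assumed per virtual desktop
desktops_per_core = 6      # assumed CPU oversubscription factor
concurrency = 0.7          # share of users logged on at peak

by_ram = server_ram_gb // ram_per_desktop_gb           # 96 desktops
by_cpu = server_cores * desktops_per_core              # 96 desktops
desktops_per_host = min(by_ram, by_cpu)                # 96 concurrent
users_per_host = int(desktops_per_host / concurrency)  # ~137 provisioned

print(f"{desktops_per_host} concurrent desktops -> "
      f"~{users_per_host} provisioned users per host")
```

Lowering the ratio (fewer desktops per host) buys performance at a higher price per seat, which is exactly the control knob described above.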

Data safety and security

Compared to backing up the many local hard drives in traditional PCs, data in a desktop cloud can be backed up easily. Regarding data security, keeping the data on storage devices in the data center means it is protected by the data center’s security mechanisms. The user, regardless of the country that user sits in, can only view and edit the data, but not copy it.

Summary

If reducing costs is the only motivation for considering a desktop cloud solution, showing a positive business case will be difficult. Moving PCs into a desktop cloud brings many benefits; how valuable these benefits are for a specific enterprise needs to be analyzed on a case-by-case basis and weighed against the existing environment and the services it provides and requires.