Our CTO and President, Vinod Jeyachandran, speaks with Jennifer Zarate of CRN TV about Anunta DesktopReady, the modern Desktop-as-a-Service solution that brings enterprise-grade virtual desktops to businesses.

DesktopReady uses Microsoft technology to build, scale and operate easy-to-use cloud desktops specifically for SMBs. It is a Windows 10 desktop built on the Azure platform and powered by Microsoft’s Windows Virtual Desktop.

It completely abstracts the process of building and maintaining the desktop environment, and allows businesses to scale up at any time and to any size.

Watch the video:

Learn more about DesktopReady: Visit our website

We came across an interesting blog post that discusses performance management on the cloud and the trade-off between public and private clouds. You can read it here:

Why Performance Management is Easier in Public than On-Premise Clouds — Performance is one of the major concerns in the cloud. But the question should not really be whether or not the cloud performs, but whether the application in question can and does perform in the cloud.

The main problem here is that application performance is either not managed at all or managed incorrectly and therefore this question often remains unanswered. Now granted, performance management in cloud environments is harder than in physical ones, but it can be argued that it is easier in public clouds than in on-premise clouds or even a large virtualized environment.

How do I come to that conclusion? Before answering that, let’s look at the unique challenges that virtualization in general, and clouds in particular, pose to the realm of APM.

We believe performance management will actually become easier in private clouds than in public ones. This is mainly because the different groups who manage infrastructure in public clouds can also be siloed, and this can result in a number of performance problems for end-users. So whether public or private, it is critical that all the dependent factors are woven together and proactively monitored.

I believe the basis of performance management has to start with end-user experience management. Unfortunately, most approaches to monitoring are inward-focused and don’t really look at what effect breaches of various system thresholds have on end-users. Admittedly, it’s not easy to put in place a system which consistently ensures end-user performance is measured, but it’s also not that complex if attempted through proper process charts. We have repeatedly seen customers with non-integrated performance management, which ends up as a reactive system because what is being monitored has no relation to what is being delivered to end-users.

My recommendation: whether private or public cloud, start your performance management from the end-user and move up the ladder to the data center. Connect all the points, identify dependencies, define relative thresholds (relating them to the kind of impact a breach will have on the end-user) and create an agentless system to monitor end-user experience. This has proven effective for us in both private and public cloud application usage. It can become a lot easier in a private cloud, where a single integrated system can connect them all; in a public cloud we may be restricted by the different methods different vendors employ to measure performance.
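The "relative thresholds" idea above can be sketched in a few lines. This is a hypothetical illustration, not Anunta's actual tooling; the metric names, threshold values and impact descriptions are invented for the example:

```python
from dataclasses import dataclass

# Each threshold is expressed in terms of what the end-user would feel
# if it were breached, so alerts stay tied to user experience.
@dataclass
class Threshold:
    name: str          # e.g. a metric such as "app_response_ms"
    warn: float        # level beyond which experience degrades
    user_impact: str   # what the end-user actually notices

def evaluate(samples: dict, thresholds: list) -> list:
    """Return human-readable alerts only for breaches that affect users."""
    alerts = []
    for t in thresholds:
        value = samples.get(t.name)
        if value is not None and value > t.warn:
            alerts.append(f"{t.name}={value} exceeds {t.warn}: {t.user_impact}")
    return alerts

thresholds = [
    Threshold("app_response_ms", 2000, "login screen feels sluggish"),
    Threshold("wan_latency_ms", 150, "typing lag inside the session"),
]
print(evaluate({"app_response_ms": 2600, "wan_latency_ms": 90}, thresholds))
```

Walking the dependency chain from the end-user down means an alert fires only when a breach maps to a felt impact, rather than on every internal counter.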


With mobility being the new employee mantra, IT teams are struggling to keep in step with the challenges that the mobile workforce brings. According to Gartner, by 2017 nearly 38% of organizations will embrace BYOD and stop providing devices to their employees. Our conversations with Indian CTOs tell us that mobility is a top concern while BYOD is still some way off, though it is being embraced in niche areas such as agency workforces in insurance or the sales force in FMCG. Overall, though, desktop-based systems aren’t disappearing anytime soon, and IT heads still have a huge inventory of PCs that they regularly need to refresh to ensure productivity levels. So many CTOs may find themselves wondering: is it better to refresh my desktops, or should I look at virtual desktop solutions and consider thin clients? The answer, as always, is: it depends! We discuss three of the most common cases below.

Most organisations follow a 5-year hardware refresh cycle, but in India it is not uncommon to come across enterprises that will stretch that to 7 years or beyond! Essentially, it’s a case of “if it ain’t broke, don’t fix it”. In such cases, can IT expect to establish a case for virtualized desktops rather than invest in new PCs? On the face of it, you don’t have to be a genius to say no! But these are precisely the cases where business is driving IT to go beyond replacement. So can a business case be built? How do you compare a 35,000 PC price to the initial investment required to bring in the IT efficiency that virtual desktop solutions deliver? The point is that you need to compare apples to apples. Even if a PC costs you just 35,000, what does support cost? How much power does the PC consume? How much does a data breach cost? And so on. We have found that if IT can think business savings rather than IT savings, a business case can easily be built for replacing 500 desktops with virtualized desktops.
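To make the "apples to apples" comparison concrete, here is a toy total-cost-of-ownership calculation. All figures other than the 35,000 hardware price mentioned above are hypothetical placeholders, not real pricing:

```python
# Illustrative TCO comparison over a 5-year refresh cycle.
# Support, power and infrastructure figures are assumed for the example.
YEARS = 5

def pc_tco(hardware=35_000, support_per_year=6_000, power_per_year=2_500):
    """Total cost of owning one PC over the refresh cycle."""
    return hardware + YEARS * (support_per_year + power_per_year)

def vdi_tco(infra_share=20_000, thin_client=12_000,
            support_per_year=2_000, power_per_year=500):
    """Per-seat cost of a virtual desktop: a share of the central
    infrastructure, a thin client, and lower support and power costs."""
    return infra_share + thin_client + YEARS * (support_per_year + power_per_year)

print(pc_tco(), vdi_tco())  # compare full cost, not just the hardware price
```

Under these assumed numbers the PC’s lifetime cost is more than double its sticker price, which is exactly why comparing the 35,000 hardware figure alone understates the case for virtualization.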

With support for Windows XP coming to an end, enterprises are saddled with multiple systems spread across the company that are now vulnerable to data loss and security breaches. Lack of support for XP may also mean issues with software compatibility, which can lead to user dissatisfaction and productivity loss. In such cases, CTOs face the question of whether to move to Windows 7 or to use that as a trigger to transform delivery. As always, in many cases it will be driven by hard numbers. Migrating to Windows 7 means investing in Windows 7 licenses and, frequently, hardware upgrades. Assuming you are ready to spend on both, the residual life of the existing desktop is worth considering. We have found that the additional investment in licenses and hardware upgrades, on an already sweated asset, makes little sense, especially when all that investment can come to naught if the PC itself starts to wear out. The same amount, when directed towards a virtual desktop solution initiative, creates an opportunity to benefit from the IT efficiencies while postponing the need to change the PC. As and when the PC wears out, it can be replaced with a lower-cost thin client, with the benefits of virtualization gained from the start.

Many visionary CTOs have used business expansion as a trigger to transform. The business case here is not unlike that in the PC refresh case, except that you need to factor in your organisation’s refresh cycle and attendant financials against those of a virtualized environment. In addition to the economics, this is the “perfect” case to test a new and better technology. It also gives you a clean slate as far as end-users are concerned, so governance and culture around the use of desktops and applications can be laid down with no baggage or comparisons to unlimited storage and downloads. With the right partner, IT heads are quickly able to demonstrate the many advantages that virtual desktop solutions can deliver, and in our experience they never look back.

Every CTO has to find a way to balance being a pragmatist and a visionary. Most CTOs understand that newer technologies such as desktop virtualization have benefits, but convincing non-technical management and boards means justifying the investment with a “business case”. Unfortunately, in most cases the term business case is used to convey only the economic case while business value is ignored. Perhaps that is the key to getting support for your desktop virtualization initiative: emphasize the business value while showing you have done enough hard work on the economic case.

End-user performance management is critical to making VDI a successful initiative. From an end-user standpoint, the user is looking for maximum efficiency and is not concerned about HOW that is achieved or WHAT technology is used, just as a mobile phone user does not care whether the phone uses GSM or CDMA technology as long as it serves its intended purpose.

Frequently, business heads and teams resist VDI simply because the familiar box near them has been taken away. We saw a lot of resistance when we rolled out VDI a couple of years ago, but we found a way to prove and measure its performance. Eventually, we made these performance metrics available for all to see, so that new users who challenge VDI have reliable data to refer to.

The approach we have adopted is a combination of technology and processes. Our monitoring architecture started from the end-user application metrics and moved up the layers to the actual VDI in the data center (contrary to the traditional approach of just looking at performance counters). With this approach we were able to easily relate application performance at the end-user level to the dependent parameters of the central infrastructure. We created business views which brought all the dependent infrastructure together, but still faced the challenge of simulating actual end-user experience.

We then developed application simulators which could schedule application access at certain periods of the hour and feed back the performance numbers (equivalent to typical use-case scenarios and keystrokes of real users). This was interlinked to the various system thresholds, such as network, WAN, SAN I/O and the virtual platform, ending with final VDI session performance tracking. Any deviation in a threshold would highlight the possible causes, which are monitored 24/7 by the NOC team. With this we have been able to consistently achieve user satisfaction as well as start delivering application performance guarantees to our customers, and free business heads and end-users of their VDI-related fears in the process.
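The simulator pattern described above, scheduled synthetic transactions checked against an agreed session threshold, can be sketched roughly as follows. This is an invented illustration of the general idea, not the actual system; the threshold, interval and timings are assumptions:

```python
import random

# Assumed values for the sketch: an acceptable session response time
# and a probe schedule of one synthetic run every 15 minutes.
THRESHOLD_MS = 1500
INTERVAL_S = 900

def simulate_transaction() -> float:
    """Stand-in for replaying a recorded user transaction (keystrokes,
    application access) inside a VDI session; returns milliseconds."""
    return random.uniform(400, 2200)

def run_probe(samples: int = 4) -> list:
    """Run the synthetic probe and flag any breach for the NOC to chase."""
    results = []
    for _ in range(samples):
        elapsed = simulate_transaction()
        results.append((elapsed, elapsed > THRESHOLD_MS))
        # in production: sleep INTERVAL_S between scheduled runs
    return results

for elapsed, breached in run_probe():
    print(f"{elapsed:7.1f} ms  {'BREACH -> alert NOC' if breached else 'ok'}")
```

In a real deployment the breach flag would be correlated with the network, WAN, SAN I/O and virtual-platform thresholds to point at the likely cause.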

Visit our website to learn more about our latest End-User Computing offerings.

According to Brian Madden, VDI is not the silver bullet folks expect it to be. The two major misconceptions highlighted are:

  • With desktop virtualization one can avoid managing Windows desktops
  • With desktop virtualization, you virtualize the apps and the user environment, and then there’s nothing left to manage

Brian further explains how desktop virtualization is inextricably linked to Windows 7.

A lot has been said about the challenges and myths of VDI, and conclusions are being drawn on that basis. While these discussions kindle constructive thought, they also scare away new users by detailing one complexity after another. Here’s our take on them:

First of all, the organization should be ready for a real transformation if VDI is to be adopted. If the intention is to manage everything as it is being managed currently, then most of the challenges being talked about on blogs and online forums will hold true. The fundamental change is that VDI moves control from the end-point to the datacenter.

Traditionally, a lot of discipline has been adopted in datacenter management, as most of the control lies with the IT team. A few years ago several blogs spoke about how the virtual server concept would fail and never take off. Questions were raised about hardware being shared, driver issues, memory allocation, storage, etc. Today, nobody questions server virtualization capabilities; almost every organization has attempted it or is using it on a large scale. That said, comparing the speed of adoption of server virtualization to that of desktop virtualization is incorrect. Desktops are tightly integrated with end-users. More than technology, it’s a perception play, and organizations should be ready to embrace it.

When we adopted this solution early on, we faced questions about the cost effectiveness of VDI (which was not seen as optimal), ease of management, etc., but realized that we were trying to compare VDI to the bottommost layer of the desktop and not looking at it as a broader solution which can deliver much more than the existing desktop could. Speaking of compliance and security, many desktop IT teams are struggling to manage tough compliance requirements, facing audit after audit that forces them to streamline the end-point solution and protect critical data on desktops with complicated policies and scripts. The way out is either stop-gap solutions or deploying complex enterprise-wide applications which end up addressing only about 5-10% of the issues they were supposed to take care of.

The effort and investment needed for these are not attributed to desktop costs; rather, they all become part of the information security budget. Isn’t it logical to say that the current desktop is not capable of protecting itself and hence we need to look for solutions? If yes, then why are these costs not attributed to desktop costs? By contrast, migrating to VDI delivers about 70-80% of compliance without the intervention of any additional application or technology. Are we consciously crediting VDI for this? Great desktop management tools and solutions do exist today. But even then, the need to manage each endpoint still persists. Accurate patching, standardization of hardware/software configuration and application rollout are not easy tasks for desktop engineers. VDI brings down this complexity, masks the hardware variation and provides a completely standardized application layer. While patching is still needed in VDI, the right templates greatly reduce the volume of patching.

VDI management is about managing 1 desktop vis-à-vis 500 desktops. If enough time is spent on design and planning, the manageability of VDI can be a lot simpler than that of actual desktops. At times, IT teams are challenged about a so-called obsession with VDI and trying to make it work in whatever form. The answer is ‘no’, because the audience they face is end-users, and they are smart enough to know what works best for them. The concept of VDI is not new; the logic behind sharing a common infrastructure platform has been around for many years. The evolution of technologies like client-server architecture, terminal services, application virtualization, etc., has been driving the single-point agenda of how effectively one can deliver applications to end-users.

We should continue to look at solutions that deliver applications to end-users using various methods and tools. Also, VDI shouldn’t be compared merely to replacing a desktop, but to the complete chain of things that contribute to end-user experience management (EUEM).


What is Virtual Desktop Infrastructure (VDI)?

In the past few years, we have witnessed a virtual explosion in the number of devices, a dramatic change in the way end-users consume IT, and the rise of a workforce that is mobile and prefers to work from locations of its convenience. This has led enterprises to devise strategies that accommodate the growing needs of the mobile workforce. Enterprises are now faced with the challenge of striking a balance between Windows-only and platform-agnostic applications, enterprise networks and the public internet, enterprise-owned devices and user-owned devices. Achieving that balance while addressing the needs of end-users is a challenging and complex process.

Enterprises are increasingly looking at virtualization technologies to provide the right answers to these issues. A recent Research and Markets report states that “the global desktop virtualization market is projected to reach US$28.345 billion in 2022, increasing from US$6.276 billion in 2016 at a CAGR of 28.57%.”

What is Virtual Desktop Infrastructure?

As opposed to a standard desktop infrastructure that stores data and runs applications locally on your machine, virtual desktop infrastructure is a technology that delivers virtual desktops in the end-user computing environment. These virtual machines/desktops are relayed to thin or zero clients as images of data and applications which are stored remotely on a central server in a public, private or hybrid cloud.

The VDI environment segregates the desktop operating system, applications and data from the hardware, and offers a robust, secure, flexible yet affordable desktop solution. The virtualization software creates desktop images and enables access for end-users over a network.

In the VDI environment, virtual desktops are provisioned based upon user-group profiling, which ensures that end-users have uninterrupted access to role-specific applications and data via a customized interface. Since data and applications reside on a central server, management becomes simple. All bug fixes, policy changes and software upgrades are applied only to the centrally stored golden images, ensuring security and compliance adherence for all end-users. The ability to provide consistently high application availability anytime, anywhere and across any device improves end-user experience and productivity.
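A minimal sketch can show why golden images plus user-group profiling simplify management; the group names, image names and application lists below are invented for illustration:

```python
# Each user group maps to one golden image, so a patch applied to the
# image reaches every user provisioned from it. Names are hypothetical.
GOLDEN_IMAGES = {
    "finance": {"base": "win10-finance-v12", "apps": ["erp", "spreadsheet"]},
    "sales":   {"base": "win10-sales-v07",   "apps": ["crm", "email"]},
}

def provision(user: str, group: str) -> dict:
    """Clone the group's golden image into a per-user virtual desktop,
    inheriting the role-specific application set."""
    image = GOLDEN_IMAGES[group]
    return {"user": user, "image": image["base"], "apps": list(image["apps"])}

desktop = provision("asha", "finance")
print(desktop)  # patching win10-finance-v12 later updates all finance users
```

The management win is that IT maintains two images here instead of hundreds of individual desktops, which is the centralization argument made above.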

Why should enterprises adopt Virtual Desktop Infrastructure?

Enterprises should adopt VDI for the following reasons.

  • Anytime, anywhere, any-device access to data and applications – As applications and data are stored on central servers, access is independent of time, place and device, letting end-users enjoy flexible yet secure application availability.
  • Simplified management – Centralized storage of data and applications enables IT to manage desktops extremely efficiently. Since user profiles are also stored centrally, rolling out OS upgrades, applications and bug fixes is simplified.
  • Secure access – End-users only have access to applications and data specific to their roles, as a result of role-based user profile creation and policy adherence. This also enables enterprises to provide contract employees and partners with secure access to relevant data and applications.
  • Rapid disaster recovery and business continuity – In the event of a disaster, virtual desktops can be provisioned from the centralized backup, thereby ensuring business continuity.
  • Reduced costs – If properly designed and implemented, VDI is often lower in cost than PCs on a total cost of ownership (TCO) basis.

Some of the use cases and business requirements where Virtual Desktop Infrastructure is particularly well suited:

  • Supporting remote users who need access to core applications
  • Temporary/makeshift offices
  • Flexibility to rapidly ramp up and ramp down
  • Simplifying licensing and compliance
  • Looking for next-generation infrastructure at the PC/hardware refresh cycle
  • Implementing a BYOD policy
  • Providing uninterrupted connectivity to remote branch offices with poor bandwidth
  • Needing a versatile setup for developers, contractors and training requirements
  • Needing high security to protect customer data or proprietary intellectual property
  • Not wanting to invest in a large internal IT team with specialized skills

The versatility and flexibility of virtual desktop infrastructure help enterprises address the demands of their end-users while ensuring security, compliance and ease of management. Some of the technologies used require specialized skills, and enterprises often work with specialized partners who can design, deploy and manage virtual desktops and deliver a high-quality end-user experience.

As a recognized specialist in cloud and virtualization technologies, Anunta helps enterprises address today’s application delivery challenges by migrating them from traditional client-server architecture to a unified desktop and application services environment. Our solutions are focused on simplifying IT and maximizing performance and availability at the user end, at an optimum cost. Our decades of expertise in virtualization technologies and a track record in transforming and managing over 80,000 endpoints for more than 120,000 users, make us the virtualization solutions provider of choice for leading enterprises, for large, complex transformations across industry verticals.

Learn more about our virtualization solutions and domain expertise on our website.

Remote working is no longer optional, as the ongoing COVID-19 crisis has pushed global organizations to quickly adopt new ways of working. As the focus of work shifts from the office cubicle to the comforts of home, employee productivity, seamless communication and collaboration, accessibility of enterprise applications, and data security emerge as the primary concerns for most organizations. Organizations will have to choose the right technology that is sustainable, secure and scalable to adapt to the changing needs of businesses in the long term. Here’s an infographic that shows the major trends in workplace technologies.


With remote working becoming the new normal, IT and security teams are under pressure to ensure a secure work environment. We have put together this infographic to help understand the areas of vulnerabilities and how Cloud Desktops can keep your remote work environment secure.

