
Virtualization: A Promising and Justifiable Investment

Exclusive Q&A with Bala Murugan, Chief Architect, eG Innovations

"There is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization," says Bala Murugan, Chief Architect at eG Innovations, in this Exclusive Q&A with SYS-CON's Virtualization Journal. Overall, Murugan maintains, virtualization is "a promising and justifiable investment, particularly in the current economic downturn."

Virtualization Journal: Do you agree with the view that Virtualization is one of the most promising technology investments in the current economic downturn?

Bala Murugan: Virtualization, when done right, has been proven to deliver significant reductions in direct cost. It also helps with indirect costs by improving your IT’s performance, reliability, and capacity management. So yes, I would say that it is a promising and justifiable investment, particularly in the current economic downturn.


Virtualization Journal:
How about your concept of “Virtualization 2.0” – doesn’t it implicitly suggest that Virtualization 1.0 has been deficient?

Murugan: On the contrary, it is more a reference to the evolution of the virtualization industry. Virtualization 1.0 was a revelation; it introduced virtualization to the world, proved its power, and showed everyone how much they could benefit from it. Virtualization 2.0, which is already here, is about accepting virtualization as reality and moving on to how to do it right and how to get the most out of it. Essentially, there is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization.

To be successful in Virtualization 2.0, organizations have to focus on technology that helps them manage their virtualization deployments better. As a monitoring technology provider, we understand the complexities of monitoring in Virtualization 2.0 and are well positioned to help these companies realize the full potential of their virtualized infrastructures.


Virtualization Journal:
Are you concerned at all that the “2.0” label might detract from the overall value proposition, given that it seems to be going down with the USS Economy? ;-)

Murugan: We view Virtualization 2.0 as an evolution, the next phase, not as a radical revamp of current virtualization deployments. In Virtualization 2.0, the focus is on how to make virtualization deployments more cost-effective and how to gain maximum benefit from them. So this will actually make virtualization a mandatory technology for most organizations that are dealing with tight budgets in the economic slowdown.


Virtualization Journal:
How about interoperability: how important do you think it is for the industry? What barriers persist?

Murugan: We live in an age of diverse infrastructures. Even before virtualization, the success of n-tier architectures and open systems made it impossible to have a homogeneous environment. Data centers today comprise diverse technologies that have to co-exist to deliver IT services. Virtualization has taken this another step down the evolutionary road; now we are talking about adding a couple more tiers to n-tier applications by separating the hardware from the OS. At this juncture, we believe that interoperability is not a “nice to have.” It is a “must have.”

In terms of barriers, the ones that still exist are mostly technological, and people are working to overcome them. In principle, I believe everyone agrees interoperability is a must-have. Organizations have to deal not only with a mix of virtual and non-virtual infrastructures, but also with different types of virtualization from different vendors. The key, we have found, is to provide a unified, consistent view across this diverse landscape, which makes management that much easier for the end user.
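
As a rough illustration of what such a unified view involves, here is a minimal sketch in Python. It is not eG's implementation; the adapter classes, metric fields, and canned data are hypothetical stand-ins for calls to each vendor's management API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class GuestMetrics:
    """Platform-neutral view of a single guest VM."""
    name: str
    cpu_pct: float   # CPU utilization, 0-100
    mem_pct: float   # memory utilization, 0-100
    powered_on: bool

class HypervisorAdapter(ABC):
    """Normalizes vendor-specific APIs into one metric model."""
    @abstractmethod
    def guests(self) -> list[GuestMetrics]: ...

class VMwareAdapter(HypervisorAdapter):
    def guests(self) -> list[GuestMetrics]:
        # A real adapter would call the vSphere API here;
        # canned data keeps the sketch self-contained.
        return [GuestMetrics("web01", 72.0, 55.0, True)]

class XenAdapter(HypervisorAdapter):
    def guests(self) -> list[GuestMetrics]:
        return [GuestMetrics("db01", 12.0, 80.0, True)]

def unified_inventory(adapters: list[HypervisorAdapter]) -> list[GuestMetrics]:
    """One consistent guest list, regardless of the underlying platform."""
    return [g for a in adapters for g in a.guests()]

if __name__ == "__main__":
    for g in unified_inventory([VMwareAdapter(), XenAdapter()]):
        print(f"{g.name}: cpu={g.cpu_pct}% mem={g.mem_pct}%")
```

The design point is that everything above the adapter layer (dashboards, alerting, capacity reports) sees one schema, so adding a new virtualization platform means writing one new adapter rather than reworking the management stack.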


Virtualization Journal:
Do you think VMware needs to fear Microsoft’s belated entry into the virtualization marketplace?

Murugan: History has shown that Microsoft can be a significant threat in any endeavor it puts its mind to. They will have good technology, and they will resort to their favorite ploy, their licensing model, to make virtualization more of a commodity than it already is.

VMware itself has recognized that the hypervisor is no longer going to be the differentiator, and that technologies that enable the effective use of virtualization (e.g., manageability), new application deployment models (like virtual desktops), and so on will be key to retaining its leadership position.

Competition in this space can only be good: innovation will be faster, and there is certainly room for multiple vendors in this fast-growing market.


Virtualization Journal:
How about eG Innovations, what’s the background story to the company’s formation and growth to date?

Murugan: eG Innovations was founded by Srinivas Ramanathan, who is also our president and CEO. Prior to eG, he was a research scientist at HP and the chief architect of Firehunter, an ISP performance monitoring solution. His years at HP gave him a ringside seat to the real pain points customers have with monitoring their environments, and with the monitoring tools themselves. In 2000, he left HP to build the proverbial “better mousetrap,” and assembled a strong team, including myself, to take the concept from the ground up. That was the genesis of eG Innovations.

Our focus was on monitoring n-tier architectures by looking at them as business services as opposed to a collection of servers, networks, and applications. Our key benefit to the customer was our ability to proactively identify the right problem, the true root cause of poor performance, in their IT infrastructures. As a result, customers spent less time firefighting and finger-pointing, and more time improving their overall service levels. It took a couple of years to roll out the finished product, and we got VC funding from Singapore. Then we opened up the US market in 2002 and found a receptive audience for the technology. We quickly became the premier Citrix monitoring solution, since Citrix environments had all the classic n-tier architecture issues. We won many awards and saw the company grow across the globe.

We saw the opportunity in the virtualization space quite early and started working with early virtualization adopters to better understand their needs and to strengthen our technology. Our mastery of thin-client computing and shared-access technologies (Citrix, Microsoft Terminal Services, etc.) helped, because a virtualization ecosystem (one box, multiple OSs) is similar to a Citrix ecosystem (one OS, multiple users). More awards later, we are now recognized as one of the industry leaders in the virtualization monitoring space, with support for different virtualization platforms including VMware, Citrix Xen, Solaris Containers/LDOMs, and more.


Virtualization Journal:
What are the main pain points that bring customers to you in search of a monitoring solution?

Murugan: The biggest single pain point is probably problem isolation. When there is a problem in your n-tier IT infrastructure, it is usually pretty hard to distinguish between the true root cause and the effects. With systems being interdependent, a single problem generally causes a ripple effect that flows through the entire environment, leading you to chase effects as opposed to pinpointing the root cause. In simple terms, this means you are wasting valuable IT resources in fire-fighting mode fixing effects, which leads to finger pointing inside the organization. Meanwhile, your customers are still facing the problem. Virtualization only increases the complexity of your n-tier IT delivery, which makes problem isolation even more difficult.
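
To make the ripple effect concrete, here is a toy sketch in Python. It is not eG's correlation algorithm; the topology, alarm set, and suppression rule are invented for illustration. The idea is simply that an alarm explainable by an alarming dependency is likely an effect, not a cause.

```python
# Illustrative only: suppress "effect" alarms that can be explained
# by a failing dependency, leaving likely root-cause candidates.

# component -> components it depends on (hypothetical topology)
DEPENDS_ON = {
    "web": ["app"],
    "app": ["db", "vm_host"],
    "db": ["vm_host"],
    "vm_host": [],
}

def root_cause_candidates(alarming: set[str]) -> set[str]:
    """Keep only alarms with no alarming dependency, direct or transitive."""
    def has_alarming_dependency(node: str, seen: set[str]) -> bool:
        for dep in DEPENDS_ON.get(node, []):
            if dep in seen:
                continue
            seen.add(dep)
            if dep in alarming or has_alarming_dependency(dep, seen):
                return True
        return False

    return {n for n in alarming if not has_alarming_dependency(n, set())}

# A problem on the VM host ripples upward: db, app, and web all alarm,
# but only the host survives the filter as the root-cause candidate.
print(root_cause_candidates({"web", "app", "db", "vm_host"}))  # {'vm_host'}
```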

Another key pain point that we see customers face is lack of visibility into their IT infrastructures. Even though it sounds simple enough, more often than not customers today don’t have total visibility into what is going on within their virtualized infrastructures. When you are managing a virtualized environment you definitely need answers to questions like: “How many guests are you running?” “How many guests are just consuming resources without being used?” “Where are the bottlenecks in the environment?” “Where do you stand on capacity?” “How do applications running inside VMs compare to ones running on physical servers?” “Is VMotion happening? If so, why?” and so on. When it comes to virtual environments, what you don’t know can hurt you badly.
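
A couple of those questions reduce to very small computations once the raw inventory is in hand. The sketch below uses a made-up snapshot and an arbitrary idle threshold; a real monitor would pull the equivalent data from the platform's management API.

```python
# Hypothetical inventory snapshot; names and numbers are invented.
guests = [
    {"name": "web01", "powered_on": True, "cpu_pct": 65.0},
    {"name": "test07", "powered_on": True, "cpu_pct": 0.4},
    {"name": "old-build", "powered_on": True, "cpu_pct": 0.1},
    {"name": "archive", "powered_on": False, "cpu_pct": 0.0},
]

IDLE_THRESHOLD = 1.0  # percent CPU; below this a running guest looks unused

running = [g for g in guests if g["powered_on"]]
idle = [g for g in running if g["cpu_pct"] < IDLE_THRESHOLD]

print(f"guests running: {len(running)}")
print(f"likely unused (consuming resources): {[g['name'] for g in idle]}")
```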

Another common problem is the classic disconnect between business services and the IT infrastructure. For example, business users say they can’t process orders or that things are too slow, while the IT side says the servers are running fine on CPU. Both are right from their own perspective, but they are not on the same page, not even in the same book. This comes from the traditional IT view of looking at boxes and servers as opposed to the actual quality of the services being delivered.


Virtualization Journal:
What are two of your favorite customer success stories?

Murugan: There are many, but a classic one was when we got called in by a customer who was deploying a new project with Citrix technologies in a heterogeneous infrastructure with physical and virtual servers. Their new service was not taking off. Users were complaining about severe slowdowns, and they had already spent weeks on the problem with no results. Before they came to us, they had changed the server hardware, the application software, and the client terminals and software, all to no effect. Within a couple of days of getting involved, we were able to pinpoint the source of the problem: network packet retransmissions between servers, due to some issues with the way network teaming had been set up. We had been working with the application and server teams, and these teams had no visibility into the network; all they had to go by was what the network team was telling them. Hence, when a problem happened, they assumed it was a server or application issue, and spent weeks chasing it. Without any kind of instrumentation on the network, our eG Enterprise solution was able to determine that the root cause of the problem was in the network, not in the VMs, Citrix, or other applications. This was a classic case of having to work with limited visibility into some domains, working with different silos of the infrastructure, and yet being able to effectively troubleshoot problems. In the end, it took us just minutes to review the collected metrics and identify the root cause. Even after hundreds of customer installations, this remains a great example of a customer success.
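
Packet retransmissions, the culprit in this story, are one signal you can spot-check even without a full monitoring suite. Below is a minimal Linux-only sketch (not eG's method) that estimates the TCP retransmission ratio from the kernel's /proc/net/snmp counters; the 2% alert threshold is purely illustrative.

```python
# Linux-only: estimate the TCP retransmission ratio from /proc/net/snmp.
# /proc/net/snmp holds paired "Tcp:" lines: a header row of counter
# names followed by a row of values.

def tcp_retransmission_ratio(path: str = "/proc/net/snmp") -> float:
    with open(path) as f:
        tcp_lines = [line.split() for line in f if line.startswith("Tcp:")]
    header, values = tcp_lines[0], tcp_lines[1]
    stats = dict(zip(header[1:], (int(v) for v in values[1:])))
    out_segs = stats["OutSegs"]
    return stats["RetransSegs"] / out_segs if out_segs else 0.0

if __name__ == "__main__":
    ratio = tcp_retransmission_ratio()
    print(f"TCP retransmission ratio: {ratio:.4%}")
    if ratio > 0.02:  # illustrative threshold, not a universal rule
        print("Sustained retransmissions; check NIC teaming and duplex settings.")
```

Note these are cumulative counters since boot, so a production check would sample twice and compare deltas rather than trust the lifetime ratio.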

Another very good example was a large financial institution where our technologies have delivered immense value. Before we got involved, they were very silo-based in their day-to-day firefighting and operations. We helped them streamline their operations, providing the helpdesk with end-to-end visibility into key business services. As a result, when a problem occurs, the helpdesk knows exactly which expert to call to resolve it. This produced a significant improvement in service uptime and more effective use of their operations staff.


Virtualization Journal:
What do you think the future holds for VDI?

Murugan: VDI and its various technology cousins are definitely here to stay. The idea of a centralized desktop with the power of a localized desktop is extremely attractive. Some of the largest implementations have been VDI-related. Currently we are seeing Fortune 100 companies leading the way on this, and I believe it will soon be commonplace even in mid-size companies. As a technology, it has not yet fully matured, but once it does, we see it becoming a much bigger market than server-based virtualization initiatives. It may become the de facto desktop platform in the near future.


Virtualization Journal: Do you agree that we are entering a new age of infrastructure – one in which it is back on the agenda of C-level execs (and not only the CTO)?

Murugan: I believe infrastructure has always been on the agenda of C-level execs, but with the success of virtualization there are definitely more conversations at the C-level about how to do this right.


Virtualization Journal: You were responsible for the design and development of one of the earliest J2EE portals in the late 90s; what role does Java play today in the enterprise technology landscape?

Murugan: The platform independence provided by Java was one of the key drivers that enabled a slew of web-facing, service-oriented applications over the last decade. Java and its sister technologies remain among the backbone technologies of web-based applications.

More Stories By Jeremy Geelan

Jeremy Geelan is Chairman & CEO of the 21st Century Internet Group, Inc. and an Executive Academy Member of the International Academy of Digital Arts & Sciences. Formerly he was President & COO at Cloud Expo, Inc. and Conference Chair of the worldwide Cloud Expo series. He appears regularly at conferences and trade shows, speaking to technology audiences across six continents. You can follow him on Twitter: @jg21.



Most Recent Comments
rcjay2 01/23/09 01:38:00 PM EST

This is a great article and gives you insight into one of the leaders in enterprise monitoring solutions. I am a user who has had the pleasure of working with Bala and the folks at eG for some time now. I can honestly say that the product is amazing. It works in all environments, across all OSes, and the monitoring/reporting capabilities are extensive. Out of the box it monitors everything you can throw at it, and if you need to implement a custom monitoring solution for something not covered, it is easy to include custom scripts that eG can run and report on. Currently, I have the eG suite monitoring two complete virtual environments with XenServer 5 and ESX Infrastructure 3. Within each virtual environment I have multiple hosts with a range of operating systems. Everything from Solaris and Fedora Core to all versions of Windows (2003/2008) is running and fully monitored. Not to mention all the network devices (Cisco, Dell, and Linksys) and printers can be monitored via SNMP.

Furthermore, one of the key points is that with the newest version, eG is now able to monitor the Solaris Sun Ray environment. All the information surrounding DTU connectivity is readily available. I have found it easy to install and configure, and in the case of a disaster it is easy to get a backup up and running. One final note: support from the people at eG is second to none. I have spoken with them on numerous occasions and have never encountered anything but a genuine offer of help and a willingness to understand and pinpoint the issue until a resolution is discovered.

Rob Jaudon
Promptu Technologies
