How Data Virtualization Improves Business Agility – Part 2

Accelerate value with a streamlined, iterative approach that evolves easily

Business Agility Requires Multiple Approaches
Businesses build agility through a combination of business decision agility, time-to-solution agility and resource agility.

This article addresses how data virtualization delivers time-to-solution agility. Part 1 addressed business decision agility and Part 3 will address resource agility.

Time-To-Solution Agility = Business Value
When responding to new information needs, delivering the solution rapidly is critically important and often yields significant bottom-line benefits.

Substantial time-to-solution improvements, proven time and again across multiple industries, can be seen in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Consider This Example: If the business wants to enter a new market, it must first financially justify the investment, including any new IT requirements. Thus, only the highest-ROI projects are approved and funded. Once the effort is approved, accelerating delivery of the IT solution also accelerates realization of the business benefits and ROI.

Therefore, if incremental revenues from the new market are $2 million per month, then the business gains an additional $2 million for every month IT can shave off the time needed to deliver the solution.
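As a rough, hypothetical illustration of this arithmetic (only the $2 million figure comes from the example above; the delivery timelines are invented), the value of acceleration is simply the months saved multiplied by the monthly incremental revenue:

```python
# Back-of-the-envelope value of faster delivery. The $2M/month figure comes
# from the example above; the delivery timelines are hypothetical.
incremental_revenue_per_month = 2_000_000   # $2M/month from the new market

traditional_delivery_months = 9             # hypothetical warehouse/ETL timeline
virtualized_delivery_months = 3             # hypothetical data virtualization timeline

months_saved = traditional_delivery_months - virtualized_delivery_months
additional_revenue = months_saved * incremental_revenue_per_month

print(f"Months saved: {months_saved}")                          # 6
print(f"Additional revenue captured: ${additional_revenue:,}")  # $12,000,000
```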

Streamlined Approach to Data Integration
Data virtualization is significantly more agile and responsive than traditional data consolidation and ETL-based integration approaches because it uses a highly streamlined architecture and development process to build and deploy data integration solutions.

This approach greatly reduces complexity and minimizes or eliminates the need for data replication and movement. As numerous data virtualization case studies demonstrate, this elegance of design and architecture makes it far easier and faster to develop and deploy data integration solutions on a data virtualization platform. The ultimate result is faster realization of business benefits.

To better understand the difference, let's contrast these methods. In both the traditional data warehouse/ETL approach and data virtualization, understanding the information requirements and reporting schema is the common first step.

Traditional Data Integration Has Many Moving Parts
Using the traditional approach, IT then models and implements the data warehouse schema. ETL development follows to create the links between the sources and the warehouse. Finally, the ETL scripts are run to populate the warehouse. The metadata, data models/schemas and development tools used in each of these activities are unique to that activity.

This diverse environment of different metadata, data models/schemas and development tools is not only complex but also creates the need to coordinate and synchronize efforts and objects across these activities.

Experienced BI and data integration practitioners readily acknowledge the long development times that result from this complexity, as does Forrester Research in its 2011 report Data Virtualization Reaches Critical Mass:

"Extract, transform, and load (ETL) approaches require one or more copies of data staged along the physical integration process flow. Creating, storing, and manipulating these copies can be complex and error prone."

Data Virtualization Has Fewer Moving Parts
Data virtualization uses a more streamlined architecture that simplifies development. Once the information requirements and reporting schema are understood, the next step is to develop the objects (views and data services) used to both model and query the required data.

These virtual equivalents of the warehouse schema and ETL routines and scripts are created within a single view or data service object using a unified data virtualization development environment. This approach leverages the same metadata, data models/schemas and tools.

Not only is it easier to build the data integration layer using data virtualization, but there are also fewer "moving parts," which reduces the need for coordination and synchronization activities. With data virtualization, there is no need to physically migrate data from the sources to a warehouse. The only data that is moved is the data delivered directly from the source to the consumer on-demand. These result sets persist in the data virtualization server's memory for only a short interval.
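For contrast, a minimal sketch of the data virtualization pattern, with invented source connections and column names: a single view object federates the query to the sources at request time, and only the result set passes briefly through memory.

```python
# Minimal sketch of a virtual view: data is fetched from the sources on demand,
# joined in memory, and returned directly to the consumer -- no warehouse copy.
# Source connections and column names are invented for illustration.
import sqlite3

def customer_orders_view(crm_conn: sqlite3.Connection,
                         erp_conn: sqlite3.Connection,
                         region: str):
    """Virtual 'customer_orders' view: joins CRM customers with ERP orders."""
    customers = dict(crm_conn.execute(
        "SELECT customer_id, name FROM customers WHERE region = ?", (region,)
    ).fetchall())
    orders = erp_conn.execute(
        "SELECT customer_id, order_total FROM orders"
    ).fetchall()
    # The joined result set exists only for the duration of the request.
    return [(cid, customers[cid], total)
            for cid, total in orders if cid in customers]
```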

Avoiding data warehouse loads, reloads and updates further simplifies and streamlines solution deployment and thereby improves time-to-solution agility.

Iterative Development Process Is Better for Business Users
Another way data virtualization improves time-to-solution agility is through support for a fast, iterative development approach. Here, business users and IT collaborate to quickly define the initial solution requirements followed by an iterative "develop, get feedback and refine" process until the solution meets the user need.

Most users prefer this type of development process. Because building views of existing data is simple and fast, IT can provide business users with prospective versions of new data sets in just a few hours; users don't have to wait months for results while detailed requirements are specified and a full solution is developed. Business users can then react to these data sets and refine their requirements based on tangible insights, and IT can adjust the views and present the refined data sets.
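A hypothetical illustration of that iteration loop: after user feedback, only the view definition changes; there is no schema migration or ETL rework to schedule. The SQL and column names below are invented.

```python
# Iteration 1: the first cut IT can show business users within hours.
CUSTOMER_ORDERS_VIEW_V1 = """
SELECT customer_id, order_total
FROM orders
"""

# Iteration 2: users asked to see the region and to exclude cancelled orders --
# only this view definition changes.
CUSTOMER_ORDERS_VIEW_V2 = """
SELECT o.customer_id, c.region, o.order_total
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
WHERE o.status <> 'CANCELLED'
"""

def run_view(conn, view_sql):
    # Execute the current view definition on demand and return the result set.
    return conn.execute(view_sql).fetchall()
```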

This iterative development approach enables the business and IT to home in on and deliver the needed information much faster than traditional integration methods.

Even in cases where a data warehouse solution is mandated by specific analytic needs, data virtualization can be used to support rapid prototyping of the solution. The initial solution is built using data virtualization's iterative development approach, with migration to the data warehouse approach once the business is fully satisfied with the information delivered.

In contrast, developing a new information solution using traditional data integration architecture is inherently more complex. Typically, business users must fully and accurately specify their information requirements prior to any development, with little change tolerated. Not only does the development process take longer, but there is a real risk that the resulting solution will not be what the users actually need and want.

Data virtualization offers significant value, along with reduced risk and cost, by enabling IT to quickly deliver iterative results that help users truly understand their real information needs and obtain a solution that meets them.

Data Virtualization's Ease of Change Keeps Pace with Business Change
The third way data virtualization improves time-to-solution agility is ease of change. Information needs evolve. So do the associated source systems and consuming applications. Data virtualization allows a more loosely coupled architecture between sources, consumers and the data virtualization objects and middleware that integrate them.

This level of independence makes it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change. In fact, changing an existing view, adding a new source or migrating from one source to another is often completed in hours or days, versus weeks or months in the traditional approach.
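A hypothetical sketch of that loose coupling: consumers call the view, and the mapping from the view to the physical source can change, or migrate entirely, without touching the consumers. The source functions and the SaaS client below are invented for illustration.

```python
# Loose coupling sketch: the view hides which physical source is in use, so
# migrating sources is a change inside the integration layer only.

def orders_from_legacy_db(conn):
    # Original physical source (invented schema).
    return conn.execute("SELECT customer_id, order_total FROM orders").fetchall()

def orders_from_new_saas_api(client):
    # Replacement source; 'client' is a hypothetical SaaS API client.
    return [(o["customer_id"], o["total"]) for o in client.list_orders()]

def customer_orders_view(source, fetch=orders_from_legacy_db):
    # Consumers always call this view; swapping 'fetch' to the new source
    # requires no change in the consuming reports or applications.
    return fetch(source)
```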

Conclusion
Data virtualization reduces complexity, data replication and data movement. Business users and IT collaborate to quickly define the initial solution requirements, followed by an iterative "develop, get feedback and refine" delivery process. Further, its loosely coupled layers make it significantly easier to extend and adapt existing data virtualization solutions as business requirements or the associated source and consumer systems change.

These time-to-solution accelerators, as numerous data virtualization case studies demonstrate, make it far easier and faster to develop and deploy data integration solutions using a data virtualization platform than other approaches. The result is faster realization of business benefits.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
