
CIOs' Top Priority: Analytics and BI

How to Deal with the Data Integration Bottleneck

Whether as a driver for growth, a means to attract and retain customers, or a way to drive innovation and reduce costs, the business value of analytics and business intelligence has never been higher.

Gartner's Amplifying the Enterprise: The 2012 CIO Agenda as well as IBM's Global CIO Study 2011 confirm this point, with analytics and BI sitting atop CIOs' technology priorities in both reports.

Data Integration Is the Biggest Bottleneck
Providing analytics and BI solutions with the data required has always been difficult, with data integration long considered the biggest bottleneck in any analytics or BI project.

Complex data landscapes, diverse data types, and new sources such as big data and the cloud are but a few of the well-known barriers.

For the past two decades, the default solution has been to first consolidate the data into a data warehouse, and then provide users with tools to analyze and report on this consolidated data.
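The consolidate-then-report pattern can be sketched in a few lines. The following is a hedged illustration only, using SQLite in-memory databases as stand-ins for production sources and the warehouse; all table and column names are invented for the example and do not come from any real system.

```python
# Minimal sketch of the traditional consolidate-then-report pattern:
# extract rows from two hypothetical source systems, load them into a
# warehouse table, then report on the consolidated physical copy.
import sqlite3

# Two in-memory "source" databases standing in for production systems.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, region TEXT, revenue REAL)")
crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "EMEA", 100.0), (2, "APAC", 250.0)])

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE accounts (id INTEGER, region TEXT, revenue REAL)")
erp.executemany("INSERT INTO accounts VALUES (?, ?, ?)",
                [(3, "EMEA", 75.0)])

# The "warehouse": a third store that physically duplicates the source data.
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE fact_revenue (id INTEGER, region TEXT, revenue REAL)")

# ETL step: every new source means another extract/load job to build,
# schedule, and keep synchronized -- the "moving parts" that extend lead times.
for source, table in [(crm, "customers"), (erp, "accounts")]:
    rows = source.execute(f"SELECT id, region, revenue FROM {table}").fetchall()
    warehouse.executemany("INSERT INTO fact_revenue VALUES (?, ?, ?)", rows)

# The BI query runs against the consolidated copy, not the live sources.
for region, total in warehouse.execute(
        "SELECT region, SUM(revenue) FROM fact_revenue "
        "GROUP BY region ORDER BY region"):
    print(region, total)
```

Note that the reporting query never touches the source systems; its freshness and scope depend entirely on the extract/load jobs above it.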

However, data integration based on these traditional replication and consolidation approaches has numerous moving parts that must be kept synchronized. Doing this right extends lead times.

The Data Warehousing Institute confirms this lack of agility. Its recent studies found that the average time needed to add a new data source to an existing BI application was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011, and that 33% of organizations needed more than three months to add a new data source.

Data Virtualization Brings Agility to Analytics and BI
According to Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, data virtualization significantly accelerates data integration agility. Key to this success has been data virtualization's ability to provide:

  • A more streamlined data integration approach
  • A more iterative development process
  • A more adaptable change management process

Using data virtualization as a complement to existing data integration approaches, the ten organizations profiled in the book cut analytics and BI project times in half or more.

This agility allowed the same teams to double their number of analytics and BI projects, significantly accelerating business benefits.

For more insights on data virtualization and business agility, check out my earlier articles on this topic.

Simplify to Overcome Historical IT Complexity

Data virtualization's simplified information access and faster time-to-solution are especially useful as enablers of more agile analytics and BI.

Is Data Virtualization the Fast Path to BI Agility? describes how the architectures of most business intelligence systems are based on a complex chain of data stores starting with production databases, data staging areas, a data warehouse, dependent data marts, and personal data stores. Simply maintaining this complexity is overwhelming IT today.

These classic BI architectures served business well for the last twenty years. However, considering the need for more agility, they have some disadvantages:

  • Duplication of data
  • Non-shared metadata specifications
  • Limited flexibility
  • Decreased data quality
  • Limited support for operational reporting
  • Limited support for reporting on unstructured and external data

From a different point of view, SOA World's Zettabytes of Data and Beyond describes the challenges of force-fitting development methods that were appropriate for earlier times when less data complexity was the norm.

In addition, the proliferation of fit-for-purpose data stores including data warehouse appliances, Hadoop-based file systems, and a range of NoSQL data stores is breaking the hegemony of the traditional data warehouse as the "best" solution to the enterprise-level data integration problem. The business and IT impact of these new approaches can be explored in the Virtualization Magazine article NoSQL and Data Virtualization - Soon to Be Best Friends.

Self-Service Analytics and BI are Important Too!
Responding to constantly changing business demands for analytics and BI is a daunting effort.

Mergers and acquisitions and evolving supply chains require new comparisons and aggregations. The explosion of social media drives demand for new customer insights. Mobile computing changes form factors. And self-service BI puts users in the driver's seat.

Business Taking Charge of Analytics and BI

In true Darwinian fashion, the business side of most organizations is now taking greater responsibility for fulfilling its own information needs rather than depending solely on already-burdened IT resources.

For example, in a 2011 survey of over 625 business and IT professionals, Self-Service Business Intelligence: TDWI Best Practices Report (July 2011), The Data Warehousing Institute (TDWI) identified the following top five factors driving businesses toward self-service business intelligence:

  • Constantly changing business needs (65%)
  • IT's inability to satisfy new requests in a timely manner (57%)
  • The need to be a more analytics-driven organization (54%)
  • Slow and untimely access to information (47%)
  • Business user dissatisfaction with IT-delivered BI capabilities (34%)

In the same survey report, authors Claudia Imhoff and Colin White suggest that IT's focus shifts toward making it easier for business users "to access the growing number of dispersed data sources that exist in most organizations."

Examples Imhoff and White cite include:

  • providing friendlier business views of source data
  • improving on-demand access to data across multiple data sources
  • enabling data discovery and search functions
  • supporting access to other types of data, such as unstructured documents, and more.

Data Virtualization to the Self-Service Rescue

In the TDWI survey, 60% of respondents rated business views of source data as "very important," and 44% said on-demand access to multiple data sources using data federation technologies was "very important."

According to Imhoff and White, "Data virtualization and associated data federation technologies enable BI/DW builders to build shared business views of multiple data sources so that the users do not have to be concerned about the physical location or structure of the data.

These views are sometimes known as virtual business views because, from an application perspective, the data appears to be consolidated in a single logical data store. In fact, it may be managed in multiple physical data structures on several different servers."

Data virtualization platforms such as the Composite Data Virtualization Platform support access to different types of data sources, including relational databases, non-relational systems, application package databases, flat files, Web data feeds, and Web services.
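The virtual-business-view idea above can be sketched in miniature: one query interface over data that physically lives in different stores. This is a hedged, toy illustration only; a SQLite table and an in-memory CSV feed stand in for the diverse sources a real data virtualization platform would federate, and all names are invented for the example.

```python
# Sketch of a "virtual business view": callers query one logical table
# and never see where each row physically lives.
import csv
import io
import sqlite3

# Source 1: a relational database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("acme", 120.0), ("globex", 80.0)])

# Source 2: a flat-file (CSV) data feed.
csv_feed = io.StringIO("customer,amount\nacme,30.0\ninitech,55.0\n")

def virtual_orders_view():
    """Yield rows from all sources as if they were one logical table."""
    for customer, amount in db.execute("SELECT customer, amount FROM orders"):
        yield customer, amount
    csv_feed.seek(0)
    for row in csv.DictReader(csv_feed):
        yield row["customer"], float(row["amount"])

# A "BI" aggregation against the virtual view: no data was copied or
# staged into a warehouse beforehand.
totals = {}
for customer, amount in virtual_orders_view():
    totals[customer] = totals.get(customer, 0.0) + amount
print(totals)
```

The design point is that the aggregation code depends only on the view's shape (customer, amount), so a new source can be added inside the view without touching any consumer, which is the agility argument the article makes.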

To Achieve Self-Service BI, Consider Using Data Virtualization provides additional insights on how data virtualization enables self-service analytics and BI.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
