A Sixth Thing CIOs Should Know About Big Data

Are you prepared?

Amid the slew of articles offering advice on Big Data, Joab Jackson's "Five Things CIOs Should Know About Big Data" stood out because it was absolutely spot on.

The five points he makes nearly always come up in our conversations with customers and prospects:

  1. You will need to think about big data. What we're seeing now is that the price of entry to big data, at least from a CapEx standpoint, is pretty low. Open source tools like Hadoop, Cassandra, MongoDB, MapReduce and others, combined with the relatively low price of cloud computing, mean that organizations that may not have been inclined to collect, store and analyze their data volumes are now more willing and able to do so.
  2. Useful data can come from anywhere. Data that used to be "dropped on the floor" is one way to categorize big data. Gazzang CEO Larry Warnock likens big data to a giant fishing net trolling the ocean floor. What we're hearing from customers is that big data is often a combination of innocuous machine exhaust, customer transaction histories, geolocation, and some personally identifiable information like health records and bank account data. How you use those disparate pieces of data to enhance your business or advance a project is what big data is all about.
  3. You will need new expertise for big data. Could big data be the next growth industry? We certainly think and hope so.
  4. Big data doesn't require organization beforehand. Here we have the analogy of big data as a "dumping ground." Poor big data. In just the last three paragraphs, we've referred to it as stuff you drop on the floor, a fishing net scooping up debris and a dumping ground. If big data were a kid, he'd be in therapy right now. The point is valid nonetheless: big data allows you to ingest what you want, and worry about how you're going to use it later. This is how sensitive information often winds up in a big data environment.

  5. Big data is not only about Hadoop. There are a number of really popular tools out there to help you make sense of your massive volumes of data. Joab mentions Splunk, HPCC Systems and MarkLogic. We have customers also using MongoDB, Ironfan from Infochimps and Chef for cloud infrastructure automation; a short sketch of the kind of question these tools answer appears after this list.
    In the next few weeks, Gazzang will bring to market a new big data monitoring and diagnostics tool called zOps. Stay tuned for news on the newest member of the Gazzang product family.
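
    To make that "more than Hadoop" point concrete, here is a minimal sketch of the kind of question these tools answer over large event volumes, using MongoDB through the pymongo driver. The connection string, database, collection and field names are hypothetical placeholders for illustration, not anything from Joab's article or from a specific customer deployment.

    # pip install pymongo
    from pymongo import MongoClient

    # Hypothetical connection and names, for illustration only.
    client = MongoClient("mongodb://localhost:27017")
    events = client["telemetry"]["events"]

    # Count events per device and surface the ten noisiest devices,
    # the sort of "machine exhaust" question big data tools answer quickly.
    pipeline = [
        {"$group": {"_id": "$device_id", "count": {"$sum": 1}}},
        {"$sort": {"count": -1}},
        {"$limit": 10},
    ]

    for doc in events.aggregate(pipeline):
        print(doc["_id"], doc["count"])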

    Finally, I wanted to add a sixth and final piece of advice to Joab's article.

  6. Think about security before you start. Too often, we hear from companies that left data unprotected in a big data environment, only to realize later that usernames and passwords, credit card data or health records were at risk of exposure. Fortunately, this hasn't come back to bite anyone yet (that we know of), but it's likely only a matter of time. Retrofitting security into an existing big data cluster, which may contain thousands of nodes, is challenging. It also takes time to understand what data is being collected and whether it's even worth protecting.

    Data encryption and key management can act as a last line of defense against unauthorized access or attack. It's relatively inexpensive and won't noticeably impact the performance or availability of your big data environment. So our advice to customers is: if you think you might have some sensitive data in your environment, it's better to be safe than sorry.
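
    As a rough illustration of what "encrypt before you ingest" can look like at the application layer, here is a minimal Python sketch using the open source cryptography package's Fernet recipe. The record layout and field names are made up for the example, and a real deployment would pull the key from a dedicated key-management service rather than generating it in the application; this is the general idea, not a description of any particular product.

    # pip install cryptography
    from cryptography.fernet import Fernet

    # In production the key would come from a key-management service and
    # would never be generated or stored alongside the data it protects.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    record = {
        "device_id": "sensor-0042",       # innocuous machine exhaust
        "reading": 72.4,
        "account_number": "123-456-789",  # sensitive: encrypt before ingest
    }

    # Encrypt only the sensitive field; analytics can still run on the rest.
    record["account_number"] = cipher.encrypt(
        record["account_number"].encode("utf-8")
    ).decode("utf-8")

    # The record is now safe to land in the cluster; authorized consumers
    # holding the key can recover the original value.
    original = cipher.decrypt(record["account_number"].encode("utf-8")).decode("utf-8")
    assert original == "123-456-789"

    Field-level encryption like this keeps the bulk of the data usable for analytics while limiting what anyone reading the raw cluster can see.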

More Stories By David Tishgart

David Tishgart is a Director of Product Marketing at Cloudera, focused on the company's cloud products, strategy, and partnerships. Prior to joining Cloudera, he ran business development and marketing at Gazzang, an enterprise security software company that was eventually acquired by Cloudera. He brings nearly two decades of experience in enterprise software, hardware, and services marketing to Cloudera. He holds a bachelor's degree in journalism from the University of Texas at Austin.
