By Ted Alford and Gwen Morton
October 26, 2009 02:00 AM EDT
Government Cloud Computing on Ulitzer
The President's budget for fiscal year 2010 (FY10) includes $75.8B in information technology (IT) spending, which is a 7-percent increase from FY09. Of this, at least $20B will be spent on IT infrastructure investments.  The FY11 budget for IT is projected to be nearly $88B. The government is actively seeking ways to reduce IT costs, and the FY10 budget request highlights opportunities for the federal government to achieve significant long-term cost savings through the adoption of cloud computing technologies:
"Of the investments that will involve up-front costs to be recouped in outyear savings, cloud-computing is a prime case in point. The Federal Government will transform its Information Technology Infrastructure by virtualizing data centers, consolidating data centers and operations, and ultimately adopting a cloud-computing business model. Initial pilots conducted in collaboration with Federal agencies will serve as test beds to demonstrate capabilities, including appropriate security and privacy protection at or exceeding current best practices, developing standards, gathering data, and benchmarking costs and performance. The pilots will evolve into migrations of major agency capabilities from agency computing platforms to base agency IT processes and data in the cloud. Expected savings in the outyears, as more agencies reduce their costs of hosting systems in their own data centers, should be many times the original investment in this area." 
The language in the budget makes three key points: (1) up-front investment will be made in cloud computing, (2) long-term savings are expected, and (3) the savings are expected to be significantly greater than the investment costs.
Booz Allen Hamilton has created a detailed cost model that can create life-cycle cost (LCC) estimates of public, private, and hybrid clouds. We used this model, and our extensive experience in economic analysis of IT programs, to arrive at a first-order estimate of each of the three key points in the President's budget. Overall, it appears likely that the expectations highlighted in the budget can be met, but several factors could affect the overall degree of economic benefit.
The government's adoption of this new IT model warrants careful consideration of the model's broad economic implications, including the potential long-term benefits in terms of cost savings and avoidance as well as the near-term costs and other impacts of a transition from the current environment. Factors such as the number and rate of federal agencies adopting cloud computing, the length of their transitions to cloud computing, and the cloud computing deployment model (public, private, or hybrid) all will affect the total costs, potential benefits, and time required for the expected benefits to offset the investment costs.
Booz Allen developed a first-order economic analysis by considering how agencies might migrate to a cloud-based environment and what the costs and potential savings might be under a variety of scenarios. Specifically, given long-standing efforts to protect the privacy and security of the federal government's data and systems, a key variable will be whether agencies take advantage of public clouds, build their own private clouds, or adopt a hybrid approach. We focused on cloud computing infrastructure services because these tend to represent a relatively consistent set of costs, investments, and operating requirements across agencies. We made some high-level, simplifying assumptions in our initial analysis:
- An existing data center (or centers) is currently operational and serves as the baseline for the economic comparison with a migration to a cloud environment.
- Existing application software will migrate with the infrastructure to the cloud. Application software support costs remain out of scope.
- Migration decisions will be made at the department or agency (rather than bureau) level in order to aggregate demand and drive scale efficiencies.
- The perceived sensitivity of an agency's mission and data will be a primary factor (though by no means the only factor) driving its decision on which path to follow.
Next, we developed three high-level scenarios that represent potential migration paths. The three scenarios are as follows:
Scenario 1: Public Cloud Adopters
Key Agency Characteristic: Migrates low-sensitivity data to an existing public cloud.
Assumptions: Transition to the new cloud environment will occur steadily over 3 years; workload remains constant (i.e., no increase in capacity demand).
Scenario 2: Hybrid Cloud Adopters
Key Agency Characteristic: Uses a private cloud solution to handle the majority of its IT workload; also uses a public cloud solution to provide "surge" support and/or support for low-sensitivity data.
Assumptions: Seventy-five percent of the IT server workload will migrate to a private cloud, and the remaining 25 percent will transition to a public cloud; transition to the new cloud environments will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).
Scenario 3: Private Cloud Adopters
Key Agency Characteristic: Builds its own private cloud solution or participates in an interagency cloud solution (i.e., community cloud). Broad mission sensitivity results in the need to maintain control of infrastructure and data.
Assumptions: Transition to the new cloud environment will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).
Agencies publicly report only their "consolidated" IT infrastructure expenditures, which include end-user support systems (e.g., desktops, laptops) and telecommunications. Additional spending on application-specific IT infrastructure is typically rolled up into individual IT investments. In an effort to isolate data center costs, we extrapolated findings based on our experience with actual federal data centers. Specifically, we developed a "representative" agency data center profile that serves as a useful proxy for other agencies and enables us to explore the potential savings of a migration to cloud computing under the scenarios described above. Although agencies of similar size can have very different IT infrastructure profiles, we modeled an agency with a classic standards-based web application infrastructure. For our representative agency, we began with an assumption that a Status Quo (SQ) data center containing 1,000 servers with no virtualization is already operational.  The results at different scales are shown in our analysis.
Using a Booz Allen proprietary cloud computing cost and economic model that employs data collected internally, data from industry, and parametric estimating techniques, we estimated the LCCs for our representative agency to migrate its IT infrastructure (i.e., its server hardware and software) to the cloud under each of the three scenarios described above. We compared these costs to the LCCs of the SQ scenario (i.e., no cloud migration).  We also calculated three common metrics to analyze each scenario's potential economic benefits. These metrics allowed us to evaluate the three elements of the business case in the President's budget and estimate the absolute and relative benefits, as well as the time over which the outyear savings will pay back the investment costs.
The three key metrics in our analysis are as follows (a brief computational sketch follows the list):
- Net Present Value (NPV) is calculated as each cloud scenario's discounted net benefits (i.e., the cloud scenario's reduced operations and support [O&S] costs relative to the SQ environment's O&S costs) minus the cloud's discounted one-time investment costs. A positive dollar figure indicates a positive economic benefit versus the SQ environment. NPV is an absolute economic metric.
- Benefit-to-Cost Ratio (BCR) is calculated as each cloud scenario's discounted net benefits divided by its discounted investment costs. A number greater than 1.0 indicates a positive economic benefit versus the SQ environment. BCR is a relative economic metric.
- Discounted Payback Period (DPP) reflects the number of years (from FY10) it takes for each scenario's accumulated annual benefits to equal its total investment costs.
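To make these definitions concrete, here is a minimal sketch of the three calculations in Python. The 2.7 percent real discount rate and the cash-flow conventions are illustrative assumptions for this sketch, not the actual parameters of our cost model.

```python
# Minimal sketch of the NPV, BCR, and DPP calculations defined above.
# The discount rate and cash-flow conventions are assumptions for
# illustration only.

def discount(values, rate):
    """Discount a stream of annual values to present value (index 0 = FY10)."""
    return [v / (1 + rate) ** t for t, v in enumerate(values)]

def evaluate_scenario(invest, benefits, rate=0.027):
    """Return (NPV, BCR, DPP) for one cloud scenario.

    invest   -- one-time investment cost per year, FY10 first
    benefits -- net annual benefit per year (the SQ environment's O&S
                cost minus the cloud scenario's O&S cost), same indexing
    """
    d_invest = discount(invest, rate)
    d_benefit = discount(benefits, rate)
    npv = sum(d_benefit) - sum(d_invest)   # absolute economic metric
    bcr = sum(d_benefit) / sum(d_invest)   # relative economic metric
    # DPP: first year in which cumulative discounted benefits cover the
    # total discounted investment.
    cum, total_invest = 0.0, sum(d_invest)
    for year, b in enumerate(d_benefit):
        cum += b
        if cum >= total_invest:
            return npv, bcr, year + 1      # years from FY10
    return npv, bcr, None                  # no payback within the life cycle
```

These functions are reused in the transition-time sensitivity sketch later in the article.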
The top portion of Exhibit 1 shows the analysis results. This exhibit presents the one-time investment phase costs as well as the recurring O&S phase costs for each scenario with a 13-year life cycle (3-year investment phase and 10-year steady-state O&S phase) from FY10 through FY22.
Assuming a 3-year transition period for each scenario, investment costs are expected to be incurred from FY10 to FY12 and include (depending on the scenario) hardware procurement and commercial off-the-shelf (COTS) software license fees; contractor labor required for installation, configuration, and testing; and technical and planning support (i.e., system engineering and program management costs) before and during the cloud migration. Because the SQ reflects an operational steady state, no investment costs are estimated for that scenario. Although the public cloud scenario does not present any up-front investment costs for hardware or software procurement, it does require program planning and technical support, support for porting applications over to the new cloud environment, and testing support to ensure programs and applications are working correctly in the new environment.
Recurring O&S costs "ramp up" for all cloud scenarios beginning in FY10 and enter steady state in FY13, continuing through FY22. For private clouds, these costs include hardware and software maintenance, periodic replacement/license renewal costs, system operations labor support costs, and IT power and cooling costs. For hybrid clouds, the O&S costs include the same items as the private cloud (albeit on a reduced scale), as well as the unit consumption costs of IT services procured from the public cloud. For public cloud scenarios, the O&S costs are the unit costs of services procured from the cloud provider and a small amount of IT support labor for the cloud provider to communicate any service changes or problems. In all three cloud scenarios, a significant portion of the O&S costs is incurred while phasing out the SQ environment during the transition. The SQ phase-out costs "ramp down" from FY10 to FY12, dovetailing with the ramp-up of the new clouds' O&S costs. Not surprisingly, the total LCCs are lowest for the public cloud scenario and highest for the private cloud scenario, with the hybrid cloud scenario's LCCs falling in the middle.
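The phasing just described can be sketched as a simple cross-fade between the SQ environment and the cloud. The linear ramp profile and the dollar figures below are simplifying assumptions, not outputs of our model:

```python
# Hypothetical sketch of the O&S ramp-down/ramp-up described above,
# modeled as a linear cross-fade over the transition period.

def transition_os_costs(sq_annual, cloud_annual, transition_years, total_years):
    """Annual O&S cost stream as workload shifts from the SQ data center
    to the cloud. A longer transition keeps more of the expensive SQ
    environment running in parallel for longer."""
    costs = []
    for year in range(total_years):
        migrated = min(1.0, (year + 1) / transition_years)  # share now on the cloud
        costs.append((1 - migrated) * sq_annual + migrated * cloud_annual)
    return costs

# e.g., $20M/year of SQ O&S falling to $5M/year of cloud O&S (hypothetical
# figures) over the 3-year transition of a 13-year life cycle:
print(transition_os_costs(20.0, 5.0, 3, 13))  # -> [15.0, 10.0, 5.0, 5.0, ...]
```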
The economic analysis confirms that the projected NPV and BCR for all three scenarios are significant relative to the SQ environment. Once the cloud migrations are completed, our model estimates annual O&S savings in the 65-85 percent range, with the lower end corresponding to the private cloud scenario and the upper end corresponding to the public cloud scenario. These percentages can be applied to overall federal IT spending for data centers to estimate the potential absolute savings across the federal government. (As part of the Information Technology Infrastructure Line of Business [ITI LoB] initiative, the General Services Administration [GSA] is coordinating a benchmarking effort across the government. If those figures are made public, a total dollar savings estimate will be possible.)
Our model shows that the net benefits and payback periods for agencies adopting the hybrid cloud scenario are closer to those for the private cloud than the public cloud. This variation is largely a result of our assumption that 75 percent of the current server workload would migrate to a private cloud and only 25 percent would transition to the public cloud. If we were instead to assume the opposite mix (i.e., 25 percent of the workload migrating to a private cloud and 75 percent to a public cloud), the hybrid scenario economic results would be closer to the public cloud results.
We conducted a sensitivity analysis on several of the variables in our cost model to determine the major drivers for cloud economics. The two most influential factors driving the economic benefits are (1) the reduction in hardware as a smaller number of virtualized servers in the cloud replace physical servers in the SQ data center and (2) the length of the cloud migration schedule. Exhibits 2, 3, and 4 show the results of varying these factors.
In practice, several factors could cause agencies to realize lower economic benefits than our estimates suggest. One factor is underestimation of the costs associated with the investment or O&S phase for the cloud scenarios. Another factor is server utilization rates (both in the current environment and the new cloud environment). Our analysis assumes an average utilization rate of 12 percent of available CPU capacity in the SQ environment and 60 percent in the virtualized cloud scenarios. This difference in server utilization, in turn, enables a large reduction in the number of servers (and their associated support costs) required in a cloud environment to process the same workload relative to the SQ environment. Agencies with server utilization rates that are already relatively high should expect lower potential savings from a virtualized cloud environment.
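The arithmetic behind this utilization driver is easy to sketch. Assuming, as a simplification, that workload scales linearly with utilized CPU capacity, our representative agency's stated assumptions imply an 80 percent reduction in server count:

```python
import math

# Sketch of the server-reduction arithmetic, assuming workload scales
# linearly with utilized CPU capacity (a simplification).
def virtualized_server_count(sq_servers, sq_util, cloud_util):
    workload = sq_servers * sq_util          # utilized-CPU equivalents
    return math.ceil(workload / cloud_util)  # servers needed at target utilization

# The representative agency: 1,000 SQ servers at 12% average utilization
# versus 60% utilization in the virtualized cloud environment.
print(virtualized_server_count(1000, 0.12, 0.60))  # -> 200, an 80% reduction
```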
The charts indicate two key takeaways:
- Scale is important: The economic benefit increases as virtualized servers replace larger numbers of underutilized servers.
- Time is money: Because of the cost of parallel IT operations (i.e., cloud and non-cloud), the shorter the server migration schedule, the greater the economic benefits.
These findings, in turn, lead us to the following recommendations for agencies and policymakers contemplating a cloud migration:
- It is more cost-effective to group smaller existing data centers together into as large a cloud as possible, rather than creating several smaller clouds.
- To reduce the cost of running parallel operations, organizations should properly plan for, and then execute, the migration to the new cloud environment as quickly as possible. The three lines in Exhibit 5 show (in this case, for the public cloud) that the BCR falls rapidly and the DPP increases as the transition time lengthens; the sketch below illustrates this sensitivity.
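Combining the two sketches above gives a rough feel for this sensitivity. All dollar figures are hypothetical (in $M), and spreading the investment cost evenly across the transition years is our assumption:

```python
# Illustrative sensitivity of the metrics to transition length, reusing
# evaluate_scenario() and transition_os_costs() from the earlier sketches.
# All figures are hypothetical ($M); the model's real inputs differ.
sq_annual, cloud_annual, total_invest = 20.0, 5.0, 30.0
for years in (3, 4, 6):
    os_costs = transition_os_costs(sq_annual, cloud_annual, years, 13)
    benefits = [sq_annual - c for c in os_costs]   # savings vs. staying in SQ
    invest = [total_invest / years] * years + [0.0] * (13 - years)
    npv, bcr, dpp = evaluate_scenario(invest, benefits)
    print(f"{years}-yr transition: NPV={npv:.1f}  BCR={bcr:.2f}  DPP={dpp} yrs")
```

Consistent with Exhibit 5, the BCR declines and the payback period grows as the transition stretches from 3 to 6 years.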
A few agencies are already moving quickly to explore cloud computing solutions and are even redirecting existing funds to begin implementations. However, for most of the federal government, the timeframe for redirecting IT funding to support cloud migrations is likely to be at least 1-2 years, given that agencies formulate budgets 18 months before receiving appropriations.
Specifically, an agency develops IT investment requests each spring and submits them to OMB in September, along with the agency's program budget request, for the following government fiscal year. OMB reviews agency submissions in the fall and can implement funding changes via passback decisions (generally in late November) before submitting the President's budget to the Congress in February. Theoretically, the earliest opportunity for OMB to push agencies to revise their IT budgets to support a transition to the cloud will be fall 2009; however, agencies typically only have about 1 month to incorporate changes to their IT portfolios during passback. To give GSA and OMB time to develop more detailed guidance, as well as necessary procurement mechanisms and vehicles, it is more likely that OMB will direct or encourage agencies to plan for cloud migrations during the FY12 budget cycle (starting in the spring of 2010).
Other Considerations with Potential Economic Effects
When deciding whether to move to the cloud, agencies need to consider some additional technical aspects of cloud computing and their potential organizational impact. Such areas include, but are not limited to, data security, software migration, technical architectures, and the skill set of the IT workforce.
Data Security
All government organizations struggle with ensuring that the data they hold remains secure and adheres to current policies and regulations. Because data security is such a critical issue, cloud providers will be required to address it in their products and services, and should be able to tailor the level of security to meet demand. Additionally, by centralizing data and servers, a cloud environment will allow for easier detection and investigation of incidents, enabling IT staff to replicate and address them efficiently.
However, there are currently no security standards for cloud computing; until such standards have been developed and used effectively to measure provider services and enforce accountability, any failures will fall on the agency's in-house IT organization. Given this reality, organizations should be careful about putting mission-critical and core processes into a public cloud, and private cloud architectures should be designed to minimize security concerns while realizing the benefits of cloud optimization.
Service Oriented Architecture
As the government moves toward embracing Service-Oriented Architecture (SOA), cloud computing will optimize the benefits of those investments. Cloud computing is inherently service oriented, and implementing private clouds will provide greater control over data, security, and privacy.
Migration of Applications to the Cloud
This article focuses on the financial benefits of migrating IT infrastructure to the cloud; as noted in our assumptions, existing application software is assumed to migrate with that infrastructure, and application support costs remain out of scope.
IT Workforce Skills
Cloud architectures and service delivery models will change the technical skills that agencies' IT workforces need. CIOs will need to plan to conduct or refresh workforce assessments and training, and to set aside the necessary funding, to ensure technical staff are trained in cloud architecture, implementation, and operations.
Economic Influence on Policy
From an economic perspective, GSA and OMB can take a number of steps to maximize the probability that the cloud computing business model can work in the federal government; i.e., that it can achieve its objective of enabling significant cost savings. These steps promote information sharing and transparency in the realistic costs and benefits of various cloud models, as well as establishing the necessary policy and contracting frameworks. Because scale is a key variable affecting both costs and benefits, policy guidance regarding scale considerations will be particularly critical (e.g., determining how much flexibility, if any, agencies and departments have to create private clouds at the bureau and/or interagency level).
As a cloud "storefront," GSA should conduct due diligence reviews to establish that public cloud providers, once identified, indeed offer highly efficient, highly scalable (both up and down) usage-based pricing beyond traditional managed services (e.g., by comparing proposed rates against commercial benchmarks). GSA should also work with potential providers to ensure agencies can readily understand service definitions, service levels, terms, conditions, and pricing. These steps will provide transparency to facilitate agencies' ability to compare potential provider pricing against their legacy operations costs, an essential component of building a credible business case for any type of cloud migration. In earlier shared services initiatives, such as financial management, the lack of such standardized information on pricing and service levels in the first few years proved a major impediment to progress, as agencies faced decisions about alternative solutions that were often based on unreliable cost data from potential vendors.
Finally, GSA will need to establish and communicate its own schedule for cloud services, grounded in the pricing negotiated with different cloud vendors.
Summary of Key Observations
Although cloud computing offers potentially significant savings to federal agencies by reducing their expenditures on server hardware and associated support costs, chief information officers, policymakers, and other interested parties should bear in mind a number of practical considerations:
- It will take, on average, 18-24 months for most agencies to redirect funding to support this transition, given the budget process.
- Some up-front investment will be required, even for agencies seeking to take advantage of public cloud options.
- Implementations may take several years, depending on the size of the agency and the complexity of the cloud model it selects (i.e., public, private, or hybrid).
- It could take as long as 4 years for the accumulated savings from agency investments in cloud computing to offset the initial investment costs; this timeframe could be longer if implementations are improperly planned or inefficiently executed.
Given these observations, we offer the following recommendations:
- OMB, GSA, and other organizations, such as the National Institute of Standards and Technology (NIST), should provide timely, well-coordinated support, in the form of necessary standards, guidance, policy decisions, and issue resolution, to ensure agencies have the necessary tools to efficiently plan and carry out migrations to cloud environments. As the length of the migration period increases, the potential economic benefits of the migration decrease.
- OMB and GSA should seek to identify those agencies with the highest near-term IT costs and expedite their migration to the cloud.
- To encourage steady progress, OMB should establish a combination of incentives and disincentives; e.g., consider allowing agencies to retain a small percentage of any savings realized from cloud computing for investments in future initiatives. To monitor progress and heighten transparency and accountability, OMB could incorporate cloud-related metrics into the new government-wide IT dashboard.
- Agencies should consider which of the high-level scenarios described in this article best suits their needs, with the understanding that regardless of scenario chosen, proper planning and efficient execution are critical success factors from an economic perspective.
- Given the significant impact of scale efficiencies, agencies selecting a private cloud approach should fully explore the potential for interdepartmental and interagency collaboration and investment (consistent with emerging OMB and GSA guidance). This, in effect, leads to the fourth cloud deployment model-the community cloud. A community cloud is a collaboration between private cloud operators to share resources and services.
- Agencies should identify the aspects of their current IT workload that can be transitioned to the cloud in the near term to yield "early wins" to help build momentum and support for the migration to cloud computing.
Cloud computing has received executive backing and offers clear opportunities for agencies to significantly reduce their growing data center and IT hardware expenditures. However, for the government to achieve the envisioned savings, organizations charged with oversight, such as OMB, NIST, and GSA, will have to facilitate progress, and departments and agencies will have to carefully select and plan for future cloud scenarios that yield the best tradeoffs among their respective costs, benefits, and risks.
- Figures from INPUT data for the FY10 President's budget; of the $20B in expenditures categorized as office automation and IT infrastructure spending, about $12.2B is spent on major IT investments, with the remainder on non-majors. Additional expenditures on application-specific IT infrastructure are typically reported as part of individual IT investments.
- President's budget, FY10 (Analytical Perspectives).
- The 1,000 servers are broken down in our cost model by server processing capacity (small, medium, and large) based on proportions consistent with our experience.
- Our model focuses on the costs that a cloud migration will most likely directly affect; i.e., costs for server hardware (and associated support hardware, such as internal routers and switches, rack hardware, cabling, etc.), basic server software (OS software, standard backup management, and security software), associated contractor labor for engineering and planning support during the transition phase, hardware and software maintenance, IT operations labor, and IT power/cooling costs. It does not address other costs that would be less likely to vary significantly between cloud scenarios, such as storage, application software, telecommunications, or WAN/LAN. In addition, it does not include costs for government staff. Further, for simplicity, we removed facilities costs from the analysis.
Reader Comments
jhbeil 10/21/09 03:51:00 PM EDT:
so when is "cloudonomics" going to hit the bookshelves?
Phillip Hallam-Baker 10/20/09 09:30:00 PM EDT:
Looking at the numbers in the article a little further, it is assumed that the utilization rate will increase from 16% to 60% and that the reduction in the number of machines is the reason for the purported 60% cost saving.
The only way I can make those numbers work is if it is assumed that 80% of the costs in a data center are driven by nothing more than the number of machines in the data center that are powered.
This seems to be an absurdly high assumption to me.
Phillip Hallam-Baker 10/19/09 05:07:00 PM EDT:
I found the basic assumptions in this article to be unsupported. It is really easy to assume 65% savings from an infrastructure change if you ignore most of the costs of making the change.
I examine this in more detail on my blog.
I think this type of article will do great damage to cloud computing as it sets out claims that are simply ludicrous and will not be believed. It is entirely credible that newly deployed software services will be cheaper when designed for cloud deployment. It is not credible that anyone should expect to save a single dollar by taking a deployed application that does not otherwise need changing and throwing it into the cloud.
Once hardware costs are sunk, they are sunk. Thus there are no savings to be won through 'migration' if you are a large corporation or a government agency. There will be real savings, but they will be modest and come gradually.
The savings from cloud computing will be for the smaller enterprise right down to the small business which does not even have a machine room let alone a data center. There the savings are real and dramatic. But let's not get cloud computing dismissed as hype with unsupported claims.