Thursday, October 10, 2013

Should organizations address the gap between the "rigor" of traditional data management and the "flexibility" demanded by new big data technologies and practices? If so, how?

Overview – Balancing "rigor" with "flexibility" will create the most leverage for your information management programs. Taking advantage of Big Data technology does not mean abandoning governance.

Key findings – Organizations are rushing to embrace the new technologies of Big Data, and hard-fought data management rigor is often set aside in the quest. These technologies can keep the cost of "failures" low, so the cost of "experimentation" and the barriers to entry are low as well. Rigor can be injected after business value is established.

Recommendations – Maintain focus on the rigor of effective governance. In fact, redouble your efforts while reaping the benefits of new Big Data technology and processes. There is no absolution for those who neglect quality, compliance, security, and privacy. Prioritize a review of your information governance plans and processes to ensure that you are maintaining and increasing compliance in the headwaters and wake of emerging Big Data programs. Take advantage of the flexibility to experiment, then apply rigor to operationalize once the business value is established.

What are the best ways to measure the success of a Big Data program?

Overview – Determining the metrics and KPIs that align your Big Data program with intended outcomes and costs is key to success.

Key findings – Organizations tend to wrestle with defining success for most programs, and Big Data is no exception. The key is to ascertain the measurable potential business outcomes, both positive and negative, and align those with the organization's strategy. The intended approach, both technology and process, may be influenced as the costs are balanced with the desired outcomes.

Recommendations – The metrics for positive outcomes, such as monetization of information assets, must be weighed against the costs associated with producing results.

Pros and cons of Big Data and the Cloud. When to use them, and how?

Overview – The source, audience and sensitivity of data will be determining factors in how the Cloud can be leveraged. Combining disparate data from disparate sources for disparate audiences can yield innovation and information value.

Key findings – Public clouds can provide the ubiquity of storage, compute and network resources to make collecting, sharing and analyzing data readily available globally. Communities can be fostered around Big Data programs in the Cloud, creating opportunities for collaboration that can lead to new insights and outcomes. Security and privacy are major considerations in many cases. The transmission of data also remains a significant consideration, depending on the sources and the transmission methods available; that said, Big Data doesn't necessarily have to be big to provide value.

Recommendations – Consider the data sources, security, privacy and transmission methods available to assess the viability of Cloud storage. Review the community or audience and opportunities for collaboration. "Fail" fast with minimal investment, adjust and try again.

What will be the three most important trends/developments in Big Data Technologies through 2020?

Overview – Through 2020, three key trends will impact Big Data: 1) monetization, 2) mobility, and 3) privacy. The pervasiveness and pace of communications, collaboration, and transactions will dramatically accelerate all aspects of Big Data, driven by mobility, creating challenges in security and privacy and opportunities to monetize.

Key findings – Mobility is accelerating the pace of data and the opportunities for capitalizing on the flow of information. Exponential increases in volume, velocity and variety of data will propel the market, creating challenges and opportunities especially for organizations that move to monetize the information assets. The frequency of security breaches will drive the adoption of increasingly strict privacy regulations. 

Recommendations – Assess the information assets of the organization and consider opportunities for monetization, along with channels to market. Consider carefully the implications of security and privacy for these assets. Prepare for the glut of data and the onslaught of emerging technologies.

Friday, October 22, 2010

Are you insinuating that Amazon is shorting you, or that you enjoy expanded space and time?

On the cusp of the GA version of the Amazon AWS connector for CA Process Automation, a gift from…

The new tier has limits, but offers the chance to experience Amazon Web Services provisioning.

From the article, I noticed something odd…

“The allocations announced included 750 hours per month of free EC2 and Elastic Load Balancer usage, which the company claimed was equivalent to continuous running.”

Last I checked, the longest month of any given year has 744 hours (31 days × 24), or 745 once you account for the Daylight Saving "fall back" adjustment. So the insinuation of "claimed" implies that somehow AWS was shorting us on our free (as in free beer) hours, when in fact 750 hours per month more than covers continuous running.
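The arithmetic is easy enough to check with a few lines of Python (a quick sketch; the `hours_in_month` helper is my own, and the extra DST hour only applies in time zones where the "fall back" lands in a 31-day month):

```python
import calendar

def hours_in_month(year: int, month: int, dst_fall_back: bool = False) -> int:
    """Wall-clock hours in a calendar month.

    Adds one hour if the Daylight Saving 'fall back' occurs in that
    month (clocks are set back, so the month contains an extra hour).
    """
    days = calendar.monthrange(year, month)[1]  # number of days in the month
    return days * 24 + (1 if dst_fall_back else 0)

# Longest month without any DST adjustment: a 31-day month.
longest = max(hours_in_month(2010, m) for m in range(1, 13))
print(longest)                                   # 744

# With the fall-back hour, the longest possible month:
print(hours_in_month(2010, 10, dst_fall_back=True))  # 745
```

Either way, 750 free hours exceeds the hours in any month, so "continuous running" is covered with room to spare.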