By: Rich Clayton, VP Business Analytics Product Group at Oracle
Thomas Edison, inventor of the incandescent lightbulb, created the right environment for innovation: a laboratory containing what he saw as all the key materials for making change. Each of his complexes in Menlo Park and Newark, New Jersey housed all the necessary tools, machines, materials, and skilled personnel. This allowed him to have small teams working on several problems at once, streamlining the process of disruption.
This ‘laboratory’ approach works as well in modern times as it did 150 years ago: to use data to disrupt, you must be able to see and understand all the signals faster than your competitors, so you can predict faster and disrupt rather than be disrupted.
The ‘lab’ approach
What you really need is a fundamentally new approach, on offer through a concept becoming known as the big data lab.
Like Edison’s laboratory, this lab becomes the access point to all data, external as well as internal, transcending silos. Rather than delivering a system of record, it is a system of innovation. It enables users to see all the relevant data in one place, understand its potential, enrich and improve the data, and thereby both unlock insights and share their value, all in a highly visual and intuitive manner.
Putting data in the hands of non-technical people accelerates both its usefulness and its accessibility, potentially enhancing its value. Users can start to experiment with the data and test out hypotheses quickly, easily, and in a very low-risk environment.
The tools of the past built bias into the data, because it had to be forced into the rigid, predefined structure of the relational database. Today’s big data technologies don’t impose that rigidity of structure, and facilitate open-ended, “what about?” questions. This can be a very scary change for some.
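To make the contrast concrete, here is a minimal Python sketch of the two styles: schema-on-write, where a record that doesn’t fit the predefined structure is rejected up front, versus schema-on-read, where raw records are kept as-is and structure is only interpreted when a question is asked. The viewing records and field names here are hypothetical, invented purely for illustration.

```python
import json

# Hypothetical raw viewing events -- note they do NOT share one rigid schema:
# one record has an extra "device" field, another has no "channel" at all.
raw_events = [
    '{"user": "a", "channel": "news", "minutes": 12}',
    '{"user": "b", "channel": "sport", "minutes": 30, "device": "set-top-box"}',
    '{"user": "c", "minutes": 5}',
]

# Schema-on-read: just parse the records and keep everything.
events = [json.loads(line) for line in raw_events]

# An open-ended "what about?" question, decided only at read time:
# minutes watched per channel, treating a missing channel as "unknown"
# instead of rejecting the record as a schema violation.
minutes_by_channel = {}
for event in events:
    channel = event.get("channel", "unknown")
    minutes_by_channel[channel] = minutes_by_channel.get(channel, 0) + event["minutes"]

print(minutes_by_channel)  # {'news': 12, 'sport': 30, 'unknown': 5}
```

A schema-on-write system would have forced a decision about the `device` field, and the channel-less record, before any data was loaded; here those decisions are deferred until someone actually asks the question.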
Here’s an example of what a data lab approach can mean to a media and communications company. Say it currently gets half of its multi-billion-dollar revenue the old-fashioned way: from ads. To keep that spend up, and potentially increase it, the company needs to offer its advertisers the best value, and to do that it has to be able to target right down to the individual set-top box. By combining and enriching set-top-box data with customer data to build better models, the organisation can better understand customer preferences and experiment with ways to provide additional value. That can translate into more revenue from network partners, happier subscribers, and innovation through monetising data as a new revenue stream.
Contemporary excuses for not harnessing data driven decision-making in business are fading fast. Workers are becoming increasingly data literate. And the advent of powerful data visualisation tools is putting insight into the hands of everyone, and fast.
In fact, the economics of Cloud for BI are staggering. Look at just one dimension, time to provision: it takes approximately six months to set up an enterprise environment for on-premises analytic applications. With Cloud, that can be as little as six hours. In terms of effort, provisioning in the cloud can be as little as a three-step process, compared with 84 steps in your own environment.
This is enabling us to re-imagine how analytics and data play in the world today, and is empowering a completely different approach.
Powering an innovation sprint
I believe this is going to power a really interesting time of change. Why? Because what we see is that innovation, which is so essential in business today, happens in short sprints, not long marathons.
Gartner refers to this as “bimodal IT”: traditional, stable, efficiency-driven IT running alongside experimental, agile environments that are tightly aligned with business units. What this enables is two speeds within the business: a sprint and a marathon.
It empowers the non-techie business person to sprint, doing something quickly to see if there’s value. That learning can then be taken and put into the marathon that moves from experimentation to implementation. That’s a hard thing to get right, and many companies still don’t realise they need to change their processes and culture, as well as upgrade their skills, as they evaluate what new big data technology can do for them.
It’s also essential that these discovery projects are tightly connected into the broader organisation before any experimentation goes too far and can’t be commercialised, whether because there’s no budget or corporate appetite, or because the pilot simply can’t be scaled to production effectively. There are few success stories from organisations that use standalone visualisation tools with little or no overriding governance, security, integrity, consistency, or ability to scale.
We are the first generation to use these tools, and undoubtedly we will learn and get better at it. For those at the forefront of experimenting with data, the key thing to remember is that innovation happens in short sprints, and so what we need to make this happen is more sprinters than marathon runners. In addition, the technology itself will continue to evolve: increasingly it will be the machines asking the questions or spotting the needle in the haystack. This is going to be a really interesting time of change.