Up next – AaaS (Analytics-as-a-Service)

Written on July 28, 2016 in Opinion with 2 Comments

I, and most other industry journalists, have been harping on for years about the growing value of data – especially big data. Much of it has been sitting around for years in databases, repositories or warehouses, mainly to be taken advantage of by marketing departments and bean counters.

But all that changed not so long ago when we discovered we could do so much more with it. Suddenly, collecting data and analysing it in near real-time to ‘enhance’ the ‘customer experience’ became all the rage. The fact that Google and Amazon, to name just two digital players, had been doing that quietly for many years was overlooked.

Then came the app explosion. Those clever little applications we downloaded by the billions turned out to be little more than data collection tools tracking our every move, every browse, every call, every photo and even our own body functions thanks to the addition of wearables to the mix.

Apps fill the gaps for collectors of our personal data. Because our smartphones are personal devices, collectors can be fairly sure the data generated from them belongs to an individual. The company behind the wildly popular, and totally invasive, game Pokemon GO – Niantic, spun out of Google – managed to fill in the few remaining gaps that Google Maps and other apps were missing. And we fell for it – hook, line and Magikarp!

Other enterprises are now joining the data collection and analysis throng, and if they can’t collect it themselves they can always find someone willing to sell it to them. That’s all well and good, but do they really know what they can do with all that data and are they using it sensibly to get the best return on their investment?

The data comes to them in multiple formats, is often duplicated or overlapping, and is rarely verified for authenticity: did it come from a reputable source, is it original, has it been tampered with, or is it a cover for fraudulent activity?

Companies employ ‘data scientists’ these days to perform magical analytical functions, but although they can access the data and present it in a format the business wants, they don’t always have the experience to ensure they are getting all the relevant data, and that it is ‘clean’.

I’m surprised to hear some telecoms operators stating that they are using their big data caches to perform audit functions, track operational KPIs, do revenue assurance and even uncover fraudulent activities on their networks. But what risks are they taking?

The last two, in particular, have been the realm of expert software developers and of systems designed, and constantly upgraded, by RA and fraud experts who know what they are doing and what to look out for. RA and fraud departments rely on these tools to certify and disseminate the relevant data and to warn them in real time of any anomalies.

Sure, this could be replicated in-house with data scientists, but would they have the knowledge and skills to keep ahead of the crooks? Other business departments face the very same dilemma: they know they can utilise big data but are unsure how best to do so. And how many data scientists and business intelligence analysts would you need to cover all facets of the business?

That’s why we are starting to see growth not just in cloud and virtual data storage but also in services associated with the extraction and presentation of the relevant analytics. There’s Analytics-as-a-Service (AaaS), which allows enterprises to outsource business analytics processing tools and platforms, and then you have Big Data-as-a-Service (BDaaS), in which a service provider hosts your big data sets in the cloud and crunches them there.

The main sticking point for such services has been convincing businesses to unlock the ‘crown jewels’, stick them in the cloud and entrust them to an outsider. On the other hand, more and more enterprises are embracing the cloud – IDC predicts more than two-thirds of IT organizations in Asia-Pacific will commit to hybrid cloud architectures by 2017. So judging from the growth of Everything-Else-as-a-Service, it’s probably just a matter of time before cloud-based analytics services take off.

About the Author

Tony is a freelance writer, regular speaker, MC and chairman for the telecoms and digital services industries worldwide. He has founded and managed software and services companies, acts as a market strategist and is now Editor of DisruptiveViews. In June 2011, Tony was recognized as one of the 25 most influential people in telecom software worldwide.

2 Reader Comments

  1. Martin Chesbrough says:

    Tony, good article. I think analytics has been “in the cloud” for quite a while. Anything to do with Google, with Amazon, with wearables is there already. The “as a service” term is not particularly new but what I think does change the game is:

    1. Combining internal data (like CRM, billing, ERP, sales data) with external sources (social, IoT, web, open data) – you refer to this obliquely in your article.

    2. Using advanced data science techniques, like NLP, to develop patterns and insights from the combined data sets.

    3. Being able to automate the resulting actions, which requires analytics systems to become operational (with all that entails).
