Seven decades of evolution to reach today’s digital revolution

Written on January 12, 2015 in Guest Blog

One year from now it will be seven decades since ENIAC, the "Giant Brain," was introduced. If this can be regarded as the starting point of the computerized technological evolution, why are we today talking about a digital revolution? If this question were asked of a million individuals, I guess a million different answers would be given.

Some weeks ago I was listening to a radio interview with an Indian professor arguing that our university-level education system is obsolete. The long and short of his argument, as I took it, was that information is readily available to everybody at the touch of a button, so there is no logical need for information to be stored in the human brain for instant retrieval. It is much more important for humans to learn how to connect the dots and reach consensus and conclusions based on the collective relevant information available in today's "Giant Brain," i.e. the cloud and analytics drawn from big data sources.

Thinking about it: we put our trust in individuals who have stored large amounts of information in their brains, often segregated by specialization (engineering, finance, health care…), and we end up regarding them as subject matter experts. Because they have proven through tests and exams that they can store and retrieve information, we believe they are also the best qualified to act on that information, and the best fit to provide guidance, often in isolation from others.

There is an old saying that: “Information is power.”

Is that still true?

Is the revolution about digital technology, or is it about a totally different way in which elements (human to human, human to machine, machine to human, and machine to machine) interact, made possible by technological evolution and the coming together of technological components?


Technology is today fundamental to our individual, family and business lifestyles. As such, we relate to and adopt what is made available to us and what we regard as adding value. Through technology we have instant access to information as and when we need it. Consider a car salesman or saleswoman: in the past they were required to attend corporate training in order to obtain knowledge about the product they were selling. Today customers often know more about a product's features than most sales staff do. Hence consider this:

  1. Is it cost-effective to send, for example, car sales personnel on curriculum-based training to learn how to position a car's specifications and functions?
  2. Do individuals really need to visit a shop to make a car purchase, when they know 80-90 percent of the car’s characteristics?

Expanding on this: can we ask similar questions in other industries, and what happens if we consider colleagues from other functional areas?

Jay Cross of the Internet Time Alliance claims that corporate training is broken, and what worked 20 years ago doesn’t work well in the social, always-on, networked world of business we now inhabit.

This is true regardless of industry: banking and finance, insurance, health care, communications, media or entertainment.

Old Habits Die Hard

For decades corporate IT has adopted a project-oriented approach to implementing technology. In a nutshell, the project approach is based on:

  • Gathering Functional Requirements
  • Design
  • Implementation
  • Testing
  • Change orders, and
  • Launch

While organizational development has been determining:

  • Business Direction
  • Strategic Objective
  • Business Capabilities
  • Business Goals

However, the above always creates a gap between the two: business strategy and technological execution.

In short, with technological evolution and changing individual and corporate lifestyles giving every individual within an organization potential access to most or all of the information needed to form qualified opinions, don't we need to seriously reconsider our approach to:

  • Aligning business strategies and technological execution;
  • Adopting collaborative learning methods;
  • Perceiving technology as an integral part of organizational development, with a focus on business-enabled technology;
  • Assuming individuals will reach conclusions (most often dissimilar) based on experience, background and responsibilities?

Or is this too disruptive/revolutionary?


Many industry events point out the need for innovation. More often than not, innovation does not take place simply because the organization itself does not foster or motivate it. Such organizations live by the definition of conservatism: "Agreed, something needs to change, but not right now. We need to be practical."

What if we took advantage of the easy access to information and the informed employees or individuals?

What if we accepted that employees or individuals who hold the exact same information reach different conclusions?

…and created a platform for a collaborative learning environment? Could the result be more innovation and less bickering, fewer protective or confrontational views?

Technological evolution, I believe, has given us the means to revolutionize the way we interact and drive innovation.


About the Author

Cato is an industry veteran and subject matter expert bridging business architecture and support architecture, with more than two decades of hands-on leadership and management experience delivering complex business solutions projects. Cato also regularly speaks at industry events. He has experience working with vendors, service providers, management-for-hire and consulting.

