Wednesday 26 October 2016

Big Data, Data Analytics and AI

Image source: livemint.com
Big Data, Data Analytics and AI are topics and trends that I’ve been keeping a "layman's" eye on for several years, mainly because I don't like surprises. If I'm going to be replaced at some point by a machine, I'd like to see it coming from a distance rather than have it sneak up behind me!

One of the issues I have with Big Data is just that – the term “Big Data”. It’s fairly abstract and defies a precise definition. I’m guessing the name began as a marketing invention, and we’ve been stuck with it ever since. I’m a registered user of IBM’s Watson Analytical Engine, and their free plan has a dataset limit of 500MB. So is that ‘Big Data’? In reality it’s all relative. To a small accountancy firm of 20 staff, their payroll spreadsheet is probably big data, whereas the CERN research laboratory in Switzerland probably works in units of terabytes.
Eric Schmidt (Google) was famously quoted in 2010 as saying “There were 5 exabytes of information created between the dawn of civilisation through 2003, but that much information is now created in 2 days”. We probably don’t need to understand what an ‘exabyte’ is, but we can get a sense that it’s very big. What’s more, we begin to get a sense of the velocity of information: according to Schmidt, we now create as much information every 2 days as humanity did in all of history up to 2003, and the rate has almost certainly grown further in the 6 years since his original statement.
It probably won’t come as a surprise to anyone that most organisations still don’t know what data they actually have, and what they’re creating and storing on a daily basis. Some are beginning to realise that these massive archives of data might hold useful information that could potentially deliver business value. But it takes time to access, analyse, interpret and apply actions resulting from this analysis, and in the meantime, the world has moved on.
According to the “Global Databerg Report” by Veritas Technologies, 55% of all information is considered to be ‘Dark’ – in other words, of unknown value. The report goes on to say that where information has been analysed, 33% is considered to be “ROT” – redundant, obsolete or trivial. Hence the ‘credibility’ gap between the rate at which information is being created, and our ability to process and extract value from this information before it becomes “ROT”.
But the good news is that more organisations are recognising that there is some potential value in the data and information that they create and store, with growing investment in people and systems that can make use of this information.
The PwC Global Data & Analytics Survey 2016 emphasises the need for companies to establish a data-driven innovation culture – but there is still some way to go. Those using data and analytics are focused on the past, looking back with descriptive (27%) or diagnostic (28%) methods. The more sophisticated organisations (a minority at present) use a forward-looking predictive and prescriptive approach to data.
What is becoming increasingly apparent is that C-suite executives who have traditionally relied on instinct and experience to make decisions now have the opportunity to use decision support systems driven by massive amounts of data. Sophisticated machine learning can complement experience and intuition. Today’s business environment is not just about automating business processes – it’s about automating thought processes. Decisions need to be made faster in order to keep pace with a rapidly changing business environment. So decision making based on a mix of mind and machine is now coming into play.
One of the most interesting by-products of this Big Data era is ‘Machine Learning‘ – mentioned above. Machine learning’s ability to scale across the broad spectrum of contract management, customer service, finance, legal, sales, pricing and production is attributable to its ability to continually learn and improve. Machine learning algorithms are iterative in nature, constantly learning and seeking to optimise outcomes. Every time a miscalculation is made, machine learning algorithms correct the error and begin another iteration of the data analysis. These calculations happen in milliseconds, which makes machine learning exceptionally efficient at optimising decisions and predicting outcomes.
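That iterative “measure the error, correct, repeat” loop can be sketched in a few lines of plain Python. This is a minimal, illustrative example (gradient descent fitting a one-variable line); the data and learning rate are invented for the sketch, not taken from any system mentioned above:

```python
# Minimal illustration of iterative machine learning: fit y = w * x
# by repeatedly measuring the prediction error and correcting the
# weight (gradient descent). All numbers here are illustrative.

def fit_slope(xs, ys, lr=0.01, iterations=1000):
    w = 0.0
    for _ in range(iterations):
        # The "miscalculation": how far off are the current predictions?
        errors = [(w * x - y) for x, y in zip(xs, ys)]
        # The "correction": nudge the weight to reduce the squared error
        gradient = sum(2 * e * x for e, x in zip(errors, xs)) / len(xs)
        w -= lr * gradient
    return w

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]            # true relationship: y = 2x
print(round(fit_slope(xs, ys), 3))   # converges towards 2.0
```

Each pass through the loop is one “iteration of the data analysis” in the paragraph above; real systems do the same thing at vastly greater scale and speed.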
So, where is all of this headed over the next few years? I can’t recall the provenance of the quote “never make predictions, especially about the future”, so treat these predictions with caution:
  1. Power to business users: Driven by a shortage of big data talent and the ongoing gap between needing business information and unlocking it from the analysts and data scientists, there will be more tools and features that expose information directly to the people who use it. (Source: Information Week 2016)
  2. Machine-generated content: Content based on data and analytical information will be turned into natural-language writing by technologies that can proactively assemble and deliver information through automated composition engines. Documents currently written by people, such as shareholder reports, legal documents, market reports, press releases and white papers, are prime candidates for these tools. (Source: Gartner 2016)
  3. Embedding intelligence: On a mass scale, Gartner identifies “autonomous agents and things” as one of the up-and-coming trends, which is already marking the arrival of robots, autonomous vehicles, virtual personal assistants, and smart advisers. (Source: Gartner 2016)
  4. Shortage of talent: Business consultancy A.T. Kearney found that 72% of market-leading global companies have a hard time hiring data science talent. (Source: A.T. Kearney 2016)
  5. Machine learning: Gartner said that an advanced form of machine learning called deep neural nets will create systems that can autonomously learn to perceive the world on their own. (Source: Ovum 2016)
  6. Data as a service: IBM’s acquisition of the Weather Company — with all its data, data streams, and predictive analytics — highlighted the emerging market for data sold as a service. (Source: Forrester 2016)
  7. Real-time insights: The window for turning data into action is narrowing. The next 12 months will be about distributed streaming alternatives built on open source projects like Kafka and Spark. (Source: Forrester 2016)
  8. Robo-boss: Some performance measurements can be consumed more swiftly by smart machine managers, aka “robo-bosses,” who will perform supervisory duties and make decisions about staffing or management incentives. (Source: Gartner 2016)
  9. Algorithm markets: Firms will recognize that many algorithms can be acquired rather than developed. “Just add data”. Examples of services available today include Algorithmia, DataXu, and Kaggle. (Source: Forrester 2016)
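The “real-time insights” prediction above rests on one core operation: aggregating events over a sliding time window as they arrive, rather than batch-processing an archive later. Here is a toy plain-Python sketch of that idea; it is not the Kafka or Spark API, and the event timestamps are invented for illustration:

```python
from collections import deque

def windowed_counts(events, window=3):
    """Yield, after each event, the count of events whose timestamp
    falls within the last `window` time units. This mimics the
    sliding-window aggregation at the heart of streaming analytics."""
    buffer = deque()
    for timestamp in events:
        buffer.append(timestamp)
        # Evict events that have slid out of the time window
        while buffer and buffer[0] <= timestamp - window:
            buffer.popleft()
        # The aggregate is available immediately, event by event
        yield timestamp, len(buffer)

stream = [1, 2, 2, 5, 6, 6, 7]    # illustrative event timestamps
for ts, count in windowed_counts(stream):
    print(ts, count)
```

The point of the sketch is the shape of the computation: each incoming event updates the answer at once, which is what lets streaming engines act inside the narrowing decision window described above.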
The one thing I have taken away from the various reports, papers and blogs I’ve read as part of this research is that you can’t think about Big Data in isolation. It has to be coupled with cognitive technologies – AI, machine learning or whatever label you want to give it. Information is being created at an ever-increasing velocity. The window is getting ever narrower for decision making. These demands can only be met by coupling Big Data and Data Analytics with AI.
A summary of all the above is included in these slides

Tuesday 18 October 2016

What modern organisations can learn from the Bletchley Park code-breakers



A few weeks ago I wrote about my eye-opening visit to Bletchley Park in a post called 'Huts and Silos'. This inspired me to arrange a KIN Site Visit to the home of WW2 codebreaking. The idea was to see what modern organisations could learn from Bletchley Park's innovation, collaboration and organisational set-up. On Friday, 13 of us had an inspiring tour of the site; this is the result of our reflections at the end of the day.



Participant observations of the Bletchley Park operation, and possible lessons for modern organisations:

  1. Observation: Diversity of backgrounds and professions represented. Unusually, class distinctions were immaterial.
     Lesson: Different perspectives and backgrounds mean a higher likelihood of finding solutions to problems. Complementary skill sets.
  2. Observation: ‘Silo’ working at Bletchley Park was a necessity for security reasons.
     Lesson: Sometimes there is a good reason for clear separation of operations, for example Chinese Walls for financial operations.
  3. Observation: Despite much of the work being tedious and the workers conscripted, morale was high and ambitious targets were achieved.
     Lesson: Intrinsic motivation (having a goal that workers believe in, and work that plays to strengths) can compensate for difficult circumstances. It’s not all about pay and rations (literally!).
  4. Observation: Socialisation and relaxation were seen by senior management as important factors in managing stress and keeping productivity high, e.g. tennis, dances, beer!
     Lesson: Informal spaces to relax and converse with co-workers are vital in building relationships, trust and the exchange of ideas (clearly the latter didn’t apply at Bletchley Park).
  5. Observation: Unusually for the time, female staff at Bletchley Park (two-thirds of the total) received equal pay to men. Note: we are unsure whether this applied just to the code-breakers, or to all female staff.
     Lesson: One hopes that equal pay is no longer an issue, but we must be vigilant with regard to biases. The KIN Spring 2017 Workshop will include this issue.
  6. Observation: Individuals with specialist skills were given very specific tasks; they were not asked to be generalists.
     Lesson: Too often, experts are asked to take on generalist roles (such as managing teams or budgets). This can be a distraction, or cause stress or under-performance.
  7. Observation: There were many failed attempts at problem solving. This was anticipated, and processes were in place to understand the root cause of failure. In one instance, the Navy code-breakers endured 9 months of repeated failure before cracking a problem.
     Lesson: We need a defined level of tolerance for failure, and processes in place to take action as a result. ‘Anyone who has not experienced failure has never tried anything new’ – A. Einstein
  8. Observation: The code-breakers had to deal with up to an astonishing 6,000 messages per day. These had to be processed before midnight every day, when the Enigma settings changed. The industrialisation of the processing and analysis may be the first example of Big Data and Data Analytics.
     Lesson: Processes and skills for the analysis of huge volumes of real-time data are becoming ever more important. AI may be a way of understanding hidden patterns and inferences (see KIN Winter Workshop, 7th December).
  9. Observation: The actors in ‘The Imitation Game’ spent time talking directly with Bletchley Park veterans to understand what it was like to work there.
     Lesson: First-hand, verbatim knowledge is vital in understanding context and nuance for handovers and other knowledge-transfer situations.
  10. Observation: Having tough targets and working under critical time constraints can sometimes foster ingenious solutions, for example the ‘cribs’ shortcuts.
     Lesson: Sometimes disturbing the status quo or adopting counter-intuitive approaches can foster innovation.
  11. Observation: A good source of personnel was cryptic crossword fanatics and other critical thinkers.
     Lesson: Do we encourage critical thinking and individualism sufficiently in our education systems?

Monday 3 October 2016

Rhetoric - a much-needed skill for knowledge workers

Rhetoric seems to have negative connotations these days. That's a shame, as Aristotle's approach to 'the art of effective or persuasive speaking or writing' is a skill that anyone effecting change in organizations must have. Female staffers at the White House have proactively employed rhetoric in a very innovative and specific way to get their voices heard.
Whilst we're talking philosophers, the most effective masterclass technique that I train facilitators in is Socratic knowledge transfer. Too often, deep experts reach for their PowerPoint slides and simply impart their wisdom, without demanding critical thinking. Much better to have a dialogue based on seeded 'judgement call questions' and elicit personal insights or experience from all those participating. Getting the 'expert' to hold off imparting their solution or answer until the end of the discussion is tricky!
The process involves a carefully selected and rehearsed case study that gives plenty of context and has two or three decision points that rely on judgement. The 'expert' pauses at the judgement calls and asks, for example, 'what would you do?' or 'what else do we need to know?' or 'what do you think happened next?'. On several occasions an entirely novel approach or solution has emerged that the 'expert' had not considered.
One vital component is getting the right participants. Everyone invited should potentially have something to contribute to the topic. In that way there is not just one 'expert' in the room.
The process is particularly effective in generating new insights and conveying complex ideas, but needs careful coaching and facilitation.