Thursday, October 30, 2014

How Governments use the power of high-performance analytics - or how SAS would encourage them to do so (using SAS tools, maybe?)

A 2012 white paper covering a SAS-led survey of US government agencies and their use of analytics offers some nice insight into government trends with regard to the importance, understanding, and use of web and social media analytics - even if it is two years old.

The white paper: How Governments are Using the Power of High-Performance Analytics

The authors: SAS (pronounced 'Sass', as in the American Southern expression 'don't sass me' - after all, its global headquarters are based in North Carolina's family-friendly answer to Silicon Valley) used to stand for 'statistical analysis system'. It grew out of a system rooted in agricultural analysis to become globally renowned as the go-to company for analytics software for, among others, governments and pharmaceutical companies.

A bit about the authors

SAS is a leader in predictive analytics. Roughly speaking, where most of us react to our analytics in real time - noting a sudden trend and formulating a quick response, in what is known as reactive analytics - predictive analytics means mining our deep pockets of historical and current data to forecast what actions (actions, not reactions) we need to take to avoid, exploit, or even cause the online and offline events we want to see with regard to our brand, policy, communication activity, event, etc.
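
To make the distinction concrete, here is a minimal sketch in Python, using invented weekly mention counts (none of this comes from SAS or the white paper): the reactive analyst responds to this week's change after it happens, while the predictive analyst fits a trend on history and acts on next week's forecast.

```python
import numpy as np

mentions = np.array([120, 135, 128, 150, 170, 165, 190, 210])  # hypothetical weekly counts

# Reactive analytics: react to the latest change after it has happened.
latest_change = mentions[-1] - mentions[-2]
print(f"Reactive: mentions moved by {latest_change} this week")

# Predictive analytics: project the trend forward and act before it happens.
weeks = np.arange(len(mentions))
slope, intercept = np.polyfit(weeks, mentions, 1)  # simple linear trend
forecast = slope * len(mentions) + intercept
print(f"Predictive: expect roughly {forecast:.0f} mentions next week")
```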

SAS uses predictive analytics and its widely dispersed teams of expert analysts for both its business clients and its government clients. An oft-cited report in my field is the UN Global Pulse - SAS study on how English-language forums in Ireland and the USA could be used to correlate online conversation about employment with trends in actual unemployment, i.e. relevant forum discussion among specific individuals could predict that unemployment was on the horizon for a large number of individuals.
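
The mechanics behind that kind of study can be sketched very simply. Below is a toy, hypothetical version (the figures are invented, not taken from the UN Global Pulse study): correlate forum mentions of job loss at month t with the unemployment rate a few months later. A strong correlation at a positive lag suggests the online conversation leads the official statistic.

```python
import numpy as np

forum_mentions = np.array([40, 55, 70, 90, 120, 150, 160, 155])    # monthly counts (hypothetical)
unemployment = np.array([5.0, 5.1, 5.3, 5.6, 6.0, 6.5, 6.8, 6.9])  # % rate (hypothetical)

# Correlate mentions at month t with unemployment at month t + lag.
for lag in range(0, 4):
    if lag == 0:
        r = np.corrcoef(forum_mentions, unemployment)[0, 1]
    else:
        r = np.corrcoef(forum_mentions[:-lag], unemployment[lag:])[0, 1]
    print(f"lag {lag} months: r = {r:.2f}")
```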

A bit about predictive analytics

Not so impressive, huh? Who couldn't see that talking about unemployment probably stems from possible upcoming unemployment?

Well, wait, it gets more interesting. In my early days as an analyst, I worked on predictive software for pharmaceuticals (not for SAS, though I did work with SAS a little later in my career). My work involved building a taxonomy - a list of terms - and tagging each of these terms as positive or negative with regard to a specific subject. For example, 'headache' = bad when found in a sentence or paragraph alongside a specific medication.

Why? Because a headache is an annoying side effect of that medication, so the sentence or paragraph is probably bad news for the medication and the patient taking it. Too specific? Well, imagine feeding these terms and others into an algorithm that learns over time which words are 'good' words when associated with a specific medication or illness and which words are 'bad' words when associated with the same medication or illness. Now release that algorithm on the web to review, catalogue, and even learn (yes, the machines - sorry, the algorithms - are smarter, learn faster, and, importantly, are more objective in their analysis than human beings these days) as much about the public perception of a drug or illness as possible.
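
A stripped-down sketch of that taxonomy approach might look like the following (the drug name 'Painexol', the term list, and the weights are all invented for illustration; the real systems I worked on were far larger and learned their weights over time):

```python
# Tag terms as positive or negative for a given medication, then score
# any passage that mentions it.
TAXONOMY = {
    "headache": -1, "nausea": -1, "rash": -1,        # side effects: bad news
    "relief": +1, "effective": +1, "improved": +1,   # outcomes: good news
}

def score_passage(text: str, drug: str) -> int:
    """Sum taxonomy weights for a passage that mentions the drug."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    if drug.lower() not in words:
        return 0  # passage isn't about our drug; ignore it
    return sum(TAXONOMY.get(w, 0) for w in words)

print(score_passage("Painexol gave me relief but also a headache", "Painexol"))  # -> 0
print(score_passage("Painexol is effective and my sleep improved", "Painexol"))  # -> 2
```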

Suddenly, marketing, public relations, and even research divisions are better informed about which part of the public will spend money on what, and why. Very helpful information when formulating business goals and allocating resources, yes? What will be an easy sale, to whom, and how; where people will pay more attention (and may be willing to spend more money) - it is all right there in the data.

This applies to government too - governments provide services to their citizens, after all. Wouldn't it be good to be able to analyse which citizens wanted what sorts of services and via which channels, where improvements could be made, and where current successes could be promoted - and how? The Barack Obama presidential campaign used simple (and free) Google Analytics to optimise a site aimed at collecting donations and volunteers, and noted a 40% increase in subscriptions and financial donations as a result of keeping an eye on just a few metrics.
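
In spirit, that kind of monitoring boils down to tracking a handful of metrics per page variant and comparing conversion. A hypothetical sketch (the variant names and figures here are invented, not the campaign's actual data):

```python
# Hypothetical visit and sign-up counts for two page variants.
visits = {"long_form": 10_000, "short_form": 10_000}
signups = {"long_form": 380, "short_form": 530}

for variant, n in visits.items():
    rate = signups[variant] / n
    print(f"{variant}: {rate:.1%} conversion")

lift = signups["short_form"] / signups["long_form"] - 1
print(f"Lift from short form: {lift:.0%}")  # ~39% here, by construction
```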

Back to the white paper...

SAS is not without an agenda.

No white paper or report is written without an agenda, and SAS is more upfront than many in pointing out that it sells what it feels US government agencies should use: more high-level analytics modelling to better plan, implement, and achieve their missions.

That said, the benefits of large-scale analytics - and of the data-driven decision-making that, when used properly, such analytics are supposed to enable - are obvious. Machines and algorithms are faster than people, process more data than people, and can detect emerging trends more quickly than people. Plus, people are subject to their personal and societal biases (even top-level universities are subject to this), which means that people can miss important patterns, correlations, trends, and shifts among audiences, users, and stakeholders - thus missing warning signs and/or opportunities that are obvious to the machines. (Though, yes, software can have a cultural bias built into it, but that is a topic for another blog post.)

Given we're all drowning in data, what's not to like about machines designed to move us from information overload to the more easily rectified (we hope) problem of filter failure? Analytics, properly used, are supposed to correct filter failure and help us turn vast swathes of relevant data into usable information.

So how do government agencies in the US federal system use the power of high-performance analytics?

The survey focused on analysts and managers and their perception and use of analytics in decision-making. 

Where did analysts and managers agree?
Both felt that employees tended to lack the relevant skills and training necessary to take advantage of analytics - to turn simplistic metrics and data collection into useful, actionable, insightful information. Both managers and analysts also seemed to agree that existing analytics needed to be more timely and accessible (read: easy to understand and act on) as well as more automated with regard to sourcing and collating (read: less human-resource-intensive). Lastly, both agreed that improved data visualisation of analytics would be a great plus (which kind of combines the two earlier points, I think: the ability to access and act on the data is, to a large degree, dependent on how quickly you as an analyst or manager can see and understand both the data and the data's implications for your business or organisation).

Where did analysts and managers disagree? 
Managers tended to still rely more on prior experience than on data to make decisions. In my experience, this could be due to many reasons, from the data arriving too late to be factored into an urgent decision, to managers who are a bit too self-confident (the 'I know best'/'Mommy manager' syndrome). Not to mention, decisions made by managers, in government and elsewhere, tend to factor in a lot more than just the obvious data trends in a market or among a target audience - office politics, aspirations for promotion, personal relationships, and departmental priorities all impact any decision a manager must make. Factoring all this into a so-called 'data-driven decision' could definitely result in data taking a back seat to prior experience.

Analysts, in a move that would make Google's analytics 'master' Avinash Kaushik proud, preferred data-driven analytics when it came to decision-making. Analysts also thought that managers lacked commitment to making data-driven decisions (and judging by the managers' own responses, analysts aren't far off in that assumption). Finally, it looked like analysts felt that sufficient resources were not being diverted to make true analytically-backed, data-driven decision-making possible (okay, so we're agreed that we need more skills and training - budget for it, why don't you, managers?).

Overall, SAS lauded the managers' and analysts' proactive willingness to use analytics but seemed critical of the overly simplistic metrics upon which most analysts and managers relied. I'm not sure I agree completely - the KISS (Keep It Simple, Stupid) rule is very important when it comes to any sort of web metrics. If people don't feel like they understand the data, they are less likely to trust it (unless it backs up what they already believe, that is, which is not exactly a good thing if you want a data-driven decision-making culture in place of a self-involved, navel-gazing, 'I know best because I know best because I know best...' culture). However, SAS did cite decision-makers' need for more easily accessible data that is easy to understand and act on for their organisations, so maybe my caution is unnecessary.

Maybe missing...

Something that jumped out at me, as a student of AK's Market Motive web analytics class, is that some government departments may simply lack a clear digital marketing and measurement model. Analytics, like any form of data collection, need a framework that corresponds to the overall purpose of the group or organisation running the analytics and collecting the data. Data collected with no clear purpose is, while not useless, not easy to use. If managers are struggling with shifting political agendas and unclear or simply opaque organisational missions and goals, any form of data-driven decision-making (or even just identifying simple metrics to measure) becomes difficult.
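
For illustration, a minimal measurement model - loosely in the spirit of Kaushik's framework, with an objective, KPIs, and targets that are placeholders of my own rather than anything from the white paper - could be as simple as:

```python
# A skeleton measurement model for a hypothetical government department:
# a stated objective, a few KPIs, and explicit targets to evaluate against.
measurement_model = {
    "objective": "Help citizens find and use services online",
    "kpis": {
        "task_completion_rate": {"target": 0.75, "higher_is_better": True},
        "calls_per_1000_visits": {"target": 12, "higher_is_better": False},
    },
}

def on_track(kpi: str, observed: float) -> bool:
    """Compare an observed value against the KPI's stated target."""
    spec = measurement_model["kpis"][kpi]
    if spec["higher_is_better"]:
        return observed >= spec["target"]
    return observed <= spec["target"]

print(on_track("task_completion_rate", 0.68))   # False: behind target
print(on_track("calls_per_1000_visits", 10.5))  # True: on track
```

Even a skeleton like that forces a department to state what success looks like before anyone pulls a report - which is exactly the framework whose absence I suspect here.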
