Getting Started with Analytics: Creating a Path Towards Usable Insights

If you are around public-sector transformation initiatives enough, you will likely have heard a variation of some (if not all) of the following statements:

  • “BIG DATA will transform our organization.”
  • “We will make DATA-DRIVEN decisions.”
  • “We need to collect MORE data on our services/operations/etc.”

Over the past several years we’ve seen an evolution in how federal organizations look to data to facilitate both organizational transformation and ongoing continuous improvement to operations and services. The statements listed above aren’t necessarily wrong (although the hyperbole doesn’t help), but they may be misplaced given the current level of organizational maturity to actually USE data to implement changes. Further, Analytics is not simply reporting on the data you have; it is also reporting on the data you need to make decisions and effectively drive policy, programs and services. The TBS Policy on Results, rooted in the ideas of Deliverology, is introducing instruments (Departmental Results Frameworks and Performance Information Profiles) that are becoming a key part of departmental performance culture and driving discussions around improving analytical capabilities.

This maturity gap isn’t always about a shortfall in data collection or infrastructure, but about a willingness and capability at the program level to analyze and react to what the data is revealing.

Analytics describes the set of systematic approaches to analyzing data across different facets and sources within an organization to identify trends and patterns that can be used to predict and adjust parts of a business or program. In the past, Analytics was commonly referred to only in the context of web or application-based data (i.e. # of hits on a website, or app downloads). This was partly because these sources of data are supported by robust analytics tools that slice and dice the data into easily consumable indicators and visualizations. It was also because business owners for these parts of the organization had been working with data longer and, in some cases, had more experience making frequent changes within their business domain based on what their data revealed. In response to these inherent cultural and maturity-level barriers, we have seen departments start to implement dedicated Performance, Measurement and/or Analytics teams.

There isn’t a one-size-fits-all approach to building an organization’s analytics capabilities (in support of broader business intelligence capabilities, in most cases). Some organizations are dealing with massive amounts of data that require specialized software to process. Others require real-time information to make day-to-day decisions. Regardless of the requirements, here are a few tips for those who are getting started:

  • Start Small: Building a dedicated analytics program within an organization can seem daunting, especially if the task involves improving maturity across the entire organization. If possible, start with a few departmental programs that already have the data you need readily available and are willing to work toward more usable insights.
  • Explore, Prototype and Iterate: By taking a Design Thinking approach to working with data, you can help the organization determine where it actually needs to use Analytics. Mock up the visualizations you intend to build; use those mock-ups to engage stakeholders in the organization and collect the data you need. Explore and analyze the data, build the visualizations, and allow for adjustments and revisions based on your findings.
  • Correlate and Predict: This is the one that many groups tend to miss. For every data set, the two questions you should seek to answer are: “What is this telling me about the past?” and “What is this telling me about the future?” Most data and information is presented in a way that only illustrates what happened in the past (think speed of service over a given time period). To start to leverage Predictive Analytics, organizations need to understand how different pieces of data and information are correlated across time and influence each other (for example: how does the number of customer inquiries today influence the number of permit applications tomorrow? A minimal sketch of this kind of lagged correlation follows this list).
  • Avoid the Search for the Silver Bullet: We are seeing that everyone wants to race towards a specific tool or solution. While this may be a viable end-state, it doesn’t automatically lead to good analytics. Without understanding exactly where data is coming from and what departmental stakeholders need to know to make decisions, the solution can easily turn into a “garbage-in, garbage-out” situation.
  • Start with Existing Tools: Excel is your friend. While not ideal for large-scale data sets or complex algorithms, it can be a powerful tool for starting to analyze most data. As a bonus, most organizations are familiar enough with how to view and move data within Excel that it won’t seem like an IT-driven initiative. As an example, I downloaded the City of Ottawa’s full listing of Service Requests for January 2015 (YAY OPEN DATA!!!), a total of 14,884 individual requests.
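To make the “Correlate and Predict” tip concrete, here is a minimal sketch in Python (pandas). The numbers are made up and the one-day lag is purely an illustrative assumption; the point is the pattern of shifting one series against another:

```python
import pandas as pd

# Hypothetical daily counts -- illustrative only, not real data
df = pd.DataFrame({
    "inquiries": [42, 55, 61, 48, 70, 66, 59, 73, 81, 64],
    "permit_applications": [10, 15, 18, 19, 16, 22, 21, 19, 24, 26],
})

# Shift inquiries forward one day so each row pairs
# "inquiries yesterday" with "permit applications today"
df["inquiries_prev_day"] = df["inquiries"].shift(1)

# Pearson correlation between yesterday's inquiries and today's applications
lagged_corr = df["inquiries_prev_day"].corr(df["permit_applications"])
print(f"Lagged (1-day) correlation: {lagged_corr:.2f}")
```

A strong lagged correlation is a hint (not proof) that one measure can help forecast another. Back to the Ottawa data: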

Through Excel’s built-in Data Analysis Add-in, I used the Histogram function to count how often Service Requests occurred at different times during the day.
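For those who would rather script the same count, here is a minimal sketch in Python (pandas and matplotlib), binning by hour of day rather than Excel’s automatic bins. The file name and the “DATE_RAISED” column are my assumptions; check the headers in the actual CSV you downloaded:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load the City of Ottawa service-request export.
# File name and "DATE_RAISED" column are assumptions -- check the CSV headers.
df = pd.read_csv("ottawa_service_requests_jan2015.csv")

# Parse the request timestamp and extract the hour of day
df["opened"] = pd.to_datetime(df["DATE_RAISED"], errors="coerce")
df["hour"] = df["opened"].dt.hour

# Count requests per hour -- the same tally Excel's Histogram tool produces
hourly_counts = df["hour"].value_counts().sort_index()
print(hourly_counts)

# Quick bar chart of the distribution
hourly_counts.plot(kind="bar")
plt.xlabel("Hour of day")
plt.ylabel("# of Service Requests")
plt.show()
```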

Without knowing too much about the data set, we can already start to analyze the visualization and posit some questions:

  • Why is there a noticeable dip in the # of Service Requests between 11:18AM and 1:41PM? Lunchtime?
  • Why do the # of Service Requests seem to increase slightly past 10:00PM? Noise Complaints?
  • Why is there a steep drop off at 7:00PM? Hours of operation for the call centre?

The above example was completed using Excel in about 20 minutes. A few logical next steps would be to:

  • Compare the above histogram with a similar visualization for each Call Type to see which ones drive different time periods. This could help determine staffing levels for different types of roles (i.e. # of bylaw officers required to respond to late-night noise complaints).
  • Run a Correlation analysis to see which Call Types occur in the same time periods; this could help determine what types of cross-training the call-centre agents need (see the sketch after this list).
  • Load the data into a GIS tool and analyze the geographic distribution of different call types to identify patterns and dispatch maintenance crews more efficiently.
  • Load the data into a visualization and analysis tool like Tableau to more quickly slice and dice the data.
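For the correlation step, a minimal sketch continuing the pandas example above (“df” and the “hour” column come from that sketch; “TYPE” is an assumed name for the Call Type column):

```python
import pandas as pd

# Hourly counts per Call Type: rows = hour of day, columns = Call Types.
# "TYPE" is an assumed column name -- match it to the actual CSV headers.
by_type = pd.crosstab(df["hour"], df["TYPE"])

# Pairwise correlation of Call Types across hours of the day;
# highly correlated pairs tend to spike in the same time periods.
corr_matrix = by_type.corr()
print(corr_matrix.round(2))
```

Call Types that rise and fall together are natural candidates for agent cross-training, which is exactly the staffing question raised above.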

As you can see, there is a lot that government organizations can do to get started with Analytics. While the end goal may be a robust business intelligence/analytics solution (hello Watson!), starting small and prototyping against the organization’s needs will help uncover a host of benefits and work through business requirements in real time.

Happy Analyzing…


