Category: Business Operations


GOC: Why You Need a Community Manager

by Denise Eisner

Every week it seems I hear of another government department that is drafting its social media strategy to “communicate with external and internal audiences.” Once implemented, a member of the Communications team is tasked with overseeing the social media channel, which typically involves getting approvals for messages to be posted in both official languages. The icons are put on the departmental home page, and management is content that they are in the game.

But not so fast. It isn’t a social media strategy that’s needed as much as an engagement strategy, and to engage, departments need specialists who perform that strategic role. That’s the view shared by Kelly Rusk, a consultant with Thornley Fallis here in Ottawa. With a background in PR/communications and stints as a community manager, Kelly has experienced what it takes to put the social into social media. We asked her to share her thoughts on why government needs community managers to oversee their social media efforts.

How did you start?

My first job was at a small e-marketing firm, where I carved out my own role in PR/communications which looked eerily like the role of today’s community manager. I started up and managed our company blog, managed our newsletter, updated the web site, hosted events for our customers and wrote for industry publications. Later I was headhunted by a Montreal-based start-up and hired as community manager, where I managed the company newsletter, blog and Twitter account, and was responsible for finding and retaining members in our online community. I also did media relations and travelled to industry trade shows and conferences to help get our name out and meet our online followers/fans/community members in person.

There are social media managers and community managers. Which role best fits government departments and agencies and why?

Different people define roles differently, but in my eyes the community manager is a strategic role that revolves around building a community of interest, whether that is in a separate online community, through various social media and marketing channels or whatever makes sense for the audience. Building a community also means getting members to speak and interact directly with each other so engagement tactics play a huge role as well. A social media manager is usually a marketing position for someone whose main responsibility is updating social properties and creating content specifically for those properties/channels.

With that definition in mind, I think community manager makes more sense for government departments and agencies because it’s a goal-focused rather than a tool-focused role. Government needs to be adaptable and accountable when it comes to online strategies, and I see the community manager as the more adaptable position. The trend is toward having social media function across an organization rather than sitting in a separate silo, which is how a social media manager position might be perceived. I like and believe in this approach and feel a good community manager is poised to lead this type of change internally, both in private organizations and government departments.

Given the role you identified, what are the key activities that must be managed?

A community manager starts with a plan that feeds a business goal (e.g., increase membership, revenue or awareness) and defines target audiences (customers, partners, employees, stakeholders, etc.). Then she/he must determine the appropriate ways to reach and engage those people. In my experience this can include a mix of the following: a newsletter and/or email list, an online community site, a blog, social networking sites (Twitter, Facebook, LinkedIn), webinars and in-person events (meet-ups, conferences, tradeshows, seminars). Often these activities feed into each other. For example, you can collect email addresses – with permission of course – at events to add to the newsletter list, or place social follow buttons on the blog.

To be in that role, what skill sets are needed?

Excellent communication skills are essential. Often a community manager is the “face and voice” of the organization, so s/he needs to be able to express him/herself professionally, and in a way the organization is comfortable with, at all times. A community manager needs to be forward-thinking and always on the lookout for new industry trends, as things can change quickly online. This person also needs to understand how to measure his/her activity, use web analytics and probably Excel.

If you were hired by a government department to lead the organization’s social media efforts, what would be the first three things you would try to accomplish and why?

I would conduct a social media audit by looking at what’s happening inside and outside the organization with relation to social media, what tools are available, training, etc. Then I would focus on developing internal guidelines and policy. A community manager is most effective when he/she has buy-in and support from the entire organization. The guidelines and policy help make other employees comfortable with potentially using social media at work, and help management understand and buy in to employee usage. And lastly, I would build a strategy – this would outline what I would be doing, how it would be executed, and how it would be measured. It would feed into or be part of a larger communications strategy.

Kelly Rusk is a consultant at Thornley Fallis. Follow Kelly at @krusk.

Denise Eisner is a senior consultant focusing on information architecture, performance measurement and web strategy. Follow Denise at @2denise.

Dashboards with a Difference

by Kathy Roy

I could write about dashboards from many different perspectives: I could talk about the importance of credible data and its merits for decision-makers, or I could talk about their highly visual, easy-to-understand look with funky gauges and brightly coloured circles. But if I did that, I wouldn’t be talking about what will really help your organization measure for success.

You see, many dashboards fall short of expectations, and the reason is simple: not all information is created equal.

Too many dashboard development processes spend more time on the look and feel of the dashboard than on ensuring the right performance measures are being collected. In the age of data overload, it’s not the colour of your gauge that will let you home in on the right performance measures. So how do we identify the right ones?

There are three key characteristics that you can benchmark against when deciding which measures to use:

  1. They must be aligned with the organization’s current direction and pressing priorities. Senior management must identify the performance measures they need to be successful, then support their organization in going out and measuring them.
  2. They must be understood and controllable at some level in the organization. If an organization performs poorly in one area, senior management should be able to ask the tough questions, receive answers, assign fixes, and see the performance improve on future reports.
  3. They must set an expectation for people to strive to achieve. Senior management must align performance measures with their planning processes in order to set targets for attainment.

Dashboards that include the right performance measures are in the privileged position of being able to deliver improved performance results. The shape and form a dashboard takes is about usability; the data and measures underpinning it are about performance improvement. And isn’t that what all organizations are seeking when they implement a dashboard?

Clean out the ROT in your Web Closet

There’s something eerily satisfying about cleaning out a closet, or at least there is for chaosphobes like me who can’t function in a messy space. With less stuff to search through, now organized into neat categories, that closet becomes a place where I know I’ll find what I need and can see whether there’s a gap that has to be filled.

There’s a strong parallel to looking for content on a large Web site. Too much badly organized and irrelevant content makes finding the good stuff harder and more frustrating. But cleaning out the ROT (Redundant, Outdated or Trivial content) is not exactly a priority for most government departments. Putting up content, particularly reports and news releases, seems to dominate the priorities of most Web teams. And so the ROT builds up, until someone says a major overhaul is needed.

A major overhaul is like moving to a new house: expensive, stressful and time-consuming. A regular content audit, however, is like cleaning out a closet every so often: cheaper, manageable and faster. For a site or subsite of less than 5,000 pages, it can be done in approximately two months, barring any other major drains on the team’s time. The steps are straightforward:

  • Designate a project manager to oversee the content audit and track progress.
  • Capture all the existing content information in a spreadsheet (a minimal inventory script is sketched after this list).
  • Determine what you want to know about your content – accuracy, findability in search engines, usefulness to audiences, etc. – and share that methodology with the project team.
  • Engage content owners to identify their content and determine its accuracy, and web specialists to rate the content’s findability and usefulness.
  • Collate the findings to see what content can be archived, refurbished or kept as is.
  • Report recommendations to senior management for action.
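
For teams that want to jump-start the spreadsheet step, the basic inventory can be automated. Below is a minimal sketch rather than a full audit tool: it assumes you already have a list of page URLs (from a sitemap or CMS export, for example) and records each page’s HTTP status, title and last-modified date to a CSV that content owners and web specialists can then annotate for accuracy, findability and usefulness.

```python
# Minimal content-inventory sketch: fetch each page and record basic facts
# to a CSV that the audit team can annotate. The starting URL list and the
# output file name are placeholders.
import csv
from html.parser import HTMLParser

import requests


class TitleParser(HTMLParser):
    """Collect the text inside the page's <title> element."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def inventory(urls, out_path="content_inventory.csv"):
    """Write one row per URL: address, HTTP status, page title, last modified."""
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "status", "title", "last_modified"])
        for url in urls:
            try:
                resp = requests.get(url, timeout=10)
                parser = TitleParser()
                parser.feed(resp.text)
                writer.writerow([url, resp.status_code, parser.title.strip(),
                                 resp.headers.get("Last-Modified", "")])
            except requests.RequestException as exc:
                writer.writerow([url, "error", str(exc), ""])


if __name__ == "__main__":
    # Hypothetical starting list; in practice this comes from a sitemap
    # or CMS export.
    inventory(["https://www.example.gc.ca/reports/index.html"])
```

The resulting spreadsheet is only the skeleton of the audit; the judgment calls about accuracy and usefulness still belong to the content owners and web specialists.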

If nothing has been done to the website in three years, a good content audit should identify at least 50% ROT. One of our clients just completed an audit with 67% ROT. They can now focus on improving the remaining content, and thus make their site more useful to the people who visit it.

With departments starting the planning process for the next fiscal year, now is the time to determine which older sections of the Web site (or the entire site) need a review, identify available project resources and build timelines for a content audit.

“Do Not Track List” Implications for Web Analytics

The U.S. Federal Trade Commission’s recently released report about consumer privacy at first glance makes web analysts like me shudder. It says, not surprisingly, that some companies appear to treat consumer information “in an irresponsible or even reckless manner” and “do not adequately address consumer privacy interests”. The gist is that unless government steps in to corral the unfettered encroachment on individual privacy, consumers may become the hapless victims of “physical security, economic injury, and unwanted intrusions into their daily lives.” Talk of creating a “do not track” list to protect consumers is underway both in the U.S. and now in Canada.

Assessing the performance of a website without some quantitative gauge of what people are doing (or not doing) makes it nearly impossible to know if the web channel is meeting the needs of users and the goals of the organization. I worried that the FTC’s proposed framework for protecting consumers would greatly impact any analyst’s ability to use quantitative tools to measure web performance.

But the FTC does put some common sense into its framework by allowing data collection for what it terms “commonly accepted practices.” These include the obvious, such as collecting an address to ship a purchased product, fraud prevention, first-party marketing (the site you’re visiting offers you free shipping) and, happily, collecting data about visits and click-throughs to improve customer service.

Mind you, quantitative or clickstream tools are only part of the analyst’s arsenal. Real insight comes with a myriad of inputs about user behaviours, be they on-page surveys (“did you find what you were looking for today?”), testing of different calls to action, or other methods that marry the what with the why with respect to how users interact with a Web site.

I agree with the FTC’s assertion that organizations should be more transparent about their data practices, which at a minimum means publishing plain-language policies explaining how individual data is used. The same policy considerations are needed in Canada, allowing both for the privacy of Canadians’ personal information and for recognition that Web performance data helps government, non-profit and for-profit organizations reach their audiences more effectively and efficiently.
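
As an aside on mechanics, honouring such a preference does not have to be complicated. The following is a minimal, hypothetical sketch, not a description of any existing analytics product: a small Flask endpoint that checks for a browser opt-out signal (here assumed to arrive as a “DNT: 1” header) before logging an anonymous page view.

```python
# Hypothetical page-view collector that honours an opt-out signal before
# recording any clickstream data. The endpoint path, log format and the
# "DNT" header convention are illustrative assumptions.
from datetime import datetime, timezone

from flask import Flask, request

app = Flask(__name__)


@app.route("/collect")
def collect():
    # Respect the visitor's preference: if the do-not-track signal is set,
    # record nothing and return immediately.
    if request.headers.get("DNT") == "1":
        return ("", 204)

    # Otherwise log an anonymous page-view record for performance reporting.
    with open("pageviews.log", "a", encoding="utf-8") as log:
        log.write("{}\t{}\t{}\n".format(
            datetime.now(timezone.utc).isoformat(),
            request.args.get("page", "unknown"),
            request.args.get("referrer", ""),
        ))
    return ("", 204)


if __name__ == "__main__":
    app.run(port=8080)
```

The ordering is the point: the preference check happens before any clickstream data is written, which is exactly the kind of practice a plain-language privacy policy can describe.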

Denise Eisner is a senior consultant within the Government Service Excellence practice.

Open Data in D.C.: From Potholes to Apps for Democracy

David Strigel is the Program Manager of the Citywide Data Warehouse at the District of Columbia Government. The Citywide Data Warehouse is the first initiative in the US to make virtually all current district government operational data available to the public in its raw form rather than in static, edited reports. David shared the project’s challenges and successes with Systemscope’s Denise Eisner following his presentation at GTEC.

David, at GTEC you described the effort involved in getting government data sets online. What was the biggest lesson from the early part of the project?

Start small and start now. Washington, DC did not wait to plan out a 10-year program. We started with a single dataset in a single format and put it online as is: service request data, including things like potholes and trash pickup.

Looking back, what might you have done differently?

The District, as one of the first to open its data in such large volume, had to create custom code to build its data catalogue and data connections. With open data initiatives becoming more of a priority in many jurisdictions, many companies have since appeared on the market offering solutions that can help a government launch a program similar to the Citywide Data Warehouse.

We now state in many RFPs that projects must share the data so the vendor and project budget take this effort into account at the planning stages. Adding this language to RFPs from the start would have helped us move along faster.

Can you describe some of the interesting ways in which people are using the city’s data?

A resident has created an intricate website that uses the District’s data feeds to provide economic development information, including permit status and crime statistics. There are also the crime reports I mentioned, and another site called EveryBlock.

The District’s Apps for Democracy contest challenged residents to utilize information available from DC’s Data Feeds to develop consumer-based applications. Local developers produced 47 applications that were conceived, developed, and delivered in 30 days. Winning applications include a DC biking guide, government purchases over $2,500, parking meter locations, community garden sites, and more.

Our clients here in Ottawa always face change management issues with major projects. What are the change management issues you encountered for the warehouse project and how did you mitigate them?

The most prominent organizational obstacle is that some individuals within government do not always welcome the idea of creating opportunities for the public to criticize government operations. Some employees worry that the public will misinterpret raw data, or that the data owner will be unable to influence how the data is used once it resides in the centralized data warehouse. These employees often cite budget cuts, limited production capacity, insufficient technology resources or other top-priority projects as reasons not to participate in data democratization.

How did we convince people? It depends on the objection:

  • We have no money: If our program has the time, we can do the database work for the agency and create a read-only table in the customer’s database for CityDW’s use.
  • This will create opportunities for the public to criticize government operations: we are, in a way, opening ourselves to criticism, but in some cases the public can help the government QA the data, find issues and help us fix them.
  • Limited production capacity: CityDW can connect and get updates in the middle of the night.
  • We don’t have a way to get you the data: CityDW has become very flexible as to how agencies can send and update data. We prefer an Oracle-to-Oracle connection, but as with any government, many different technologies are used across the city. In many cases agencies update an Excel report and send a CSV file on a scheduled basis; CityDW automatically ‘watches’ for the emails and updates the data in the data warehouse (a simplified version of that pattern is sketched after this list).
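
The ‘watch the inbox’ pattern described in the last bullet is easy to illustrate. The sketch below is not the District’s actual code: it assumes a hypothetical mailbox, sender address and staging table, polls for unread messages carrying CSV attachments, and loads the rows into a database (SQLite stands in for the warehouse here).

```python
# Illustrative "watch the inbox" loader: poll a mailbox for CSV attachments
# from an agency feed and load the rows into a staging table. Mailbox,
# sender and table names are hypothetical stand-ins.
import csv
import email
import imaplib
import io
import sqlite3


def load_csv_rows(rows, db_path="warehouse.db", table="service_requests_staging"):
    """Replace the staging table with the latest rows from the agency feed."""
    header, *data = rows
    conn = sqlite3.connect(db_path)
    cols = ", ".join('"{}"'.format(c.strip()) for c in header)
    placeholders = ", ".join("?" for _ in header)
    conn.execute('DROP TABLE IF EXISTS "{}"'.format(table))
    conn.execute('CREATE TABLE "{}" ({})'.format(table, cols))
    conn.executemany('INSERT INTO "{}" VALUES ({})'.format(table, placeholders), data)
    conn.commit()
    conn.close()


def poll_mailbox(host, user, password, sender="agency-feed@example.gov"):
    """Fetch unread messages from the feed sender and ingest any CSV attachments."""
    box = imaplib.IMAP4_SSL(host)
    box.login(user, password)
    box.select("INBOX")
    _, ids = box.search(None, "UNSEEN", "FROM", '"{}"'.format(sender))
    for msg_id in ids[0].split():
        _, data = box.fetch(msg_id, "(RFC822)")
        msg = email.message_from_bytes(data[0][1])
        for part in msg.walk():
            name = part.get_filename()
            if name and name.lower().endswith(".csv"):
                text = part.get_payload(decode=True).decode("utf-8")
                load_csv_rows(list(csv.reader(io.StringIO(text))))
    box.logout()
```

Run on a schedule (cron or a similar timer), this kind of job is what lets updates arrive “in the middle of the night” without any extra load on the agency’s production systems.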

What are the financial realities for executing an open data strategy in government? How do you make a business case for embarking on open data initiatives in these economic times?

You don’t need a massive budget to get started. You can start an open data program with a part-time project manager, developer and DBA, and grow from there once you have users requesting more data. Governments and residents will both benefit from having a centralized location for the data… there will be a second set of eyes on it. Issues may be found in the data that enable the government to save money or identify revenue sources.

What was the desired outcome for your open data strategy and program, and how have you been able to measure its success?

We wanted the residents to be able to take our data and visualize it or create applications that would benefit other residents and create new ways of looking at the data that we never considered. We wanted another set of eyes on our data, looking for issues that need to be corrected.

All our data is available via other applications, dashboards, tools and reports, so in addition to tracking downloads we have to track usage and log-ins of those other applications and dashboards… In fiscal year 2010 we had almost 2 million downloads. We can also measure our success by the number of incoming requests for more data, dashboards and reporting environments.

Do you have any suggestions as to how an open data initiative at the municipal level might be leveraged across all levels of government (city, state/provincial, federal)?

Washington, DC is in a unique situation in that we are a government that has to function as a state, city and county. Building a program that serves as the central location for enterprise-wide data creates many beneficiaries, not just residents and the press. One of the most common users of the data is District employees. They may not know who in the government controls access to a particular dataset, but they do know that the data is published at data.dc.gov. There are no forms to fill out; employees, just like residents, can download agency data so they can do their jobs faster and more efficiently.

A centralized data warehouse allows you to expand beyond open government and data sharing. Once you have the data, reporting environments can be created and used inside the government. Business intelligence applications and dashboards can be built to help executives view the overall ‘health’ of their agency without having to drill down into all of the data line by line.

Denise Eisner is a Senior Consultant in the Government Service Excellence practice.

What do wines and apples have to do with Service Innovation?

I tweeted the title of this blog post recently and left the answer hanging, as it does here. Due to the raucous response of my fans begging me for it, I have decided to satiate their inquisitive appetite and finally post this entry. Ok, so if you replace “the raucous response of my fans begging me for it” with “a few of my friends asking if I’m actually going to post the blog” you’re probably more accurate, but let’s not digress too far.

If you have read any of my posts, you will have recognized an underlying reductionist theme. I often find myself trying to simplify things. My posts about Service Innovation and Transformation carry these reductionist tones as well. My most recent post, “The Fallacy of the Innovator’s Ambition – A Call for Tinkerers”, was very specifically a discussion of how innovators need to simplify their ambitions to avoid being overwhelmed by them.

What does this have to do with wines and apples, you ask?

Well, I recently read a brief article in the New York Times about a restaurant in Atlanta called Bone’s that had integrated Apple’s iPad into the dining experience. They had simplified their view of innovation and added a simple element to their service experience. When patrons are greeted at the entrance for a table, they are handed a menu and an iPad. The restaurant purchased 30 iPads and built an application that houses their wine inventory along with descriptions of the wines and expert ratings to help diners understand and select bottles.

(Pause: for those who haven’t made the connection, “Apple’s iPad” and “Wine Application” and “Bone’s Restaurant in Atlanta” is the answer)

Device inspired moment of truth

Customers to date (it’s been about two months since they have been offering the iPad experience) seem to be pleased by the twist to their dining experience, and if an 11 percent increase in wine purchases per diner is any indication, so is restaurant management.

What was interesting to me about this story was the simplicity of the addition of the iPad to the restaurant experience, the appropriate “fit” for the device at a table, and the utility of the application in a restaurant context. Setting aside those individuals who know a lot about wine and wine pairings, for many people a restaurant wine list represents an opportunity to use random selection as an efficient decision-making tool (hold menu up – close eyes – point index finger somewhere on page – press against page – choose wine). While the iPad and the wine application don’t turn patrons into wine connoisseurs overnight, they do empower them with knowledge, instilling confidence in the choice that is made. This confidence translates into positive emotions, and thus a better dining experience.

The restaurant hasn’t disrupted the restaurant industry business model or invented a new food. They have not redefined the dining experience or significantly altered the value proposition. In fact, their location, staff and menu likely haven’t changed at all. So, have they shown us a great example of Service Innovation? Absolutely.

A hallowed Eaves takes the scary out of open data at GTEC

A public policy entrepreneur, open government activist, negotiation expert and loquacious blogger, David Eaves champions the cause of open data in government. Systemscope’s Denise Eisner spoke to David after his well-received appearance at the GTEC conference in Ottawa.

David, one of the open data issues that emerged during the GTEC conference was the public sector concern regarding risk. Clearly the issue hit home as you blogged about it immediately! How would you advise federal departments trying to balance openness/transparency and risk management?

The only acceptable risk is if you think people are going to do something illegal. The policy infrastructure for dealing with people who do bad things already exists. This is not to say you shouldn’t have a communications strategy about your data. People may find missing or incorrect data, but that’s OK, you can fix that.

At GTEC, you issued an open challenge to the public sector audience to open their data. Why do you think open data in government has yet to take root?

The technology for sharing is relatively new and people in government are not used to sharing. But there’s a huge appetite among a cohort of public servants and they’re trying to get their institutions interested in sharing.

So what “low hanging fruit” should government departments be thinking about when deciding which data sets to make available first?

I would find all the data sets we already share with the public and consolidate them into a single portal. It is going to cost money and it will require people – something like a SWAT team to make data sets ready. Sometimes you have data trapped in proprietary structures and you need to get that into a format that’s usable for the world. Edit your procurement so that any new system you buy has an open data component to it. You have to reshape the vendor market.

The return on investment for open data is not just transparency: it’s an efficiency ROI.
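
On the point about data trapped in proprietary structures, the conversion step itself is often modest. As a rough sketch (the file, sheet and field names here are hypothetical), a spreadsheet export can be republished as CSV and JSON in a few lines of Python using openpyxl:

```python
# Sketch of liberating a dataset from a spreadsheet into open formats.
# The input workbook and output file names are placeholders.
import csv
import json

from openpyxl import load_workbook


def export_open_formats(xlsx_path="permits.xlsx", out_stem="permits"):
    ws = load_workbook(xlsx_path, read_only=True).active
    rows = [list(r) for r in ws.iter_rows(values_only=True)]
    header, data = rows[0], rows[1:]

    # CSV: one flat, machine-readable table.
    with open(out_stem + ".csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)

    # JSON: the same records keyed by column name, handy for web apps.
    records = [dict(zip(header, row)) for row in data]
    with open(out_stem + ".json", "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2, default=str)


if __name__ == "__main__":
    export_open_formats()
```

The hard part, as the rest of this interview suggests, is rarely the format conversion; it is getting the data out of the silo and keeping it current.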

You started an open data Web site, ostensibly to fill a void for accessible government open data. What have you learned from that process?

The thing that surprised me the most was how far some government departments have come in sharing their data, particularly Natural Resources Canada and Environment Canada. As of today there are about 200 sets available.

Why not wait for government to finish their data portal project?

What makes the digital era so exciting is that we don’t have to wait for government to act: we can do it on our own. The datadotgc.ca site creates a safe place where we can model the behaviour we want government to display and then have them copy us. That would be the goal.

Did anything surprise you during your recent visit at GTEC?

In Canada, we’re late to the game. What makes the conversation around open data different in DC or Vancouver is that in those places it’s about how far we’ve come and how we keep going. In Ottawa it’s still a question of how we get past that first hump. I don’t know exactly what’s causing this, other than that big projects across ministries are always incredibly difficult. A notable difference between the UK and Canada is political leadership. Open data and transparency are a real priority in England. On day one of the new Conservative administration, the UK’s prime minister announced he would release more data than the previous administration.

You sound hopeful.

It’s not a huge, onerous task. Other countries have been able to do it; they have been able to move forward. In most cases government has the data and it is extractable. There may be a cost to making it ready. But it’s not a hard issue. What’s hard is shifting the culture.

There are a lot of great people doing this work and we need to support them. One group that I’ve been working with, and that is pushing forward, is Parliament. They’re developing an XML feed for the Hansard, with a launch planned for January. This is an important and huge step forward.

I’m hopeful that the incentives are in place to do open data here in Canada. There are potentially significant savings for government through efficiencies, better vendor agreements and more.

Denise Eisner is a Senior Consultant in the Government Service Excellence practice.