
1.4: User Expectations and Customer Service

Ensure that websites are meeting user expectations and needs and that the customer experience with websites is continually enhanced.

Other Information:

How does your Agency currently ensure that websites are meeting user expectations and needs and that the customer experience with websites is continually enhanced?

Developing Web metrics at DHS is a key goal for the Web governance bodies. The Secretary's 2011 data call and benchmarking report established our baseline on Web metrics: only one component has a key performance indicator program, and of the eight operational components, only three use satisfaction survey data. Furthermore, no one is actively managing search, and we pay for multiple search implementations. For usability testing with human participants, limited efforts are under way, but there is no agency-wide strategy. The Metrics and User Experience Committee is leading the charge to develop a more mature approach to how DHS measures our investment in online communications.

Performance measures for the Web ideally cover five areas:

Usability testing – We are starting a usability program with a best-practice scorecard to evaluate how sites perform on key metrics, including usability heuristics. The scorecard, now in a pilot, measures several factors in a weighted system that scores each site on a 100-point scale. The factors map to those listed on page three of the federal domain survey, so we hope to gain an apples-to-apples comparison of how our sites perform. The scorecard, an Excel workbook with formulas, has an accompanying handbook that explains each factor and how to measure your performance against it, so the scorecard can serve as a self-assessment tool.

Web analytics – Behavior-based data on factors like traffic, page views, bounce rates, and time on site can be captured as key performance indicators, but this requires an enterprise-wide analytics tool. We plan to roll out an agency-wide implementation of Google Analytics in FY12, once we clear the remaining policy hurdles.
Satisfaction surveys – The backbone of any satisfaction survey is three questions: Were you satisfied? Would you come back? Would you recommend us to others? Right now, DHS has five websites across three operational components that use satisfaction surveys. We aim for more uniformity in question sets as part of our improvement plan.

Search – Insights here can help us actively manage search. We are currently limited by the lack of a common search appliance. We are examining the GSA search offering as an opportunity to improve how we execute search and, in turn, how search performs.

Business goals – The Department's Efficiency Review office has identified cost avoidance as a key goal for us to measure, including accounting for expected savings from the shift to cloud computing. Other business goals, which are in development, can build on this foundation.

As we take steps toward turning website management into a data-driven process, the metrics committee's work will give us firm ground for success. The committee has a mandate from the Executive Steering Committee to take a number of steps in FY12 to make improvements. The group will also recommend key performance measures to be used across all of DHS, giving us an apples-to-apples way to talk about how sites are performing. A roadmap to common metrics tools and a monthly metrics dashboard will cap the committee's initial push for change.

Another exciting aspect of the metrics committee's work is the forum where we will take a collective look at our Web customer service standards. The outcome of that discussion will be captured in the Customer Service Plan developed in response to the Executive Order on Customer Service.
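To illustrate the weighted scorecard mechanism described above, the sketch below combines per-factor scores into a single 100-point total. The factor names, weights, and sample scores are illustrative assumptions for this sketch, not the actual DHS scorecard or its Excel formulas:

```python
# Illustrative sketch of a weighted scorecard: each factor has a weight,
# each site gets a raw score (0.0-1.0) per factor, and the weighted sum
# is reported on a 100-point scale. Names and weights are hypothetical.

FACTOR_WEIGHTS = {
    "usability_heuristics": 30,
    "search_quality":       20,
    "content_freshness":    25,
    "accessibility":        25,
}

def scorecard_total(raw_scores):
    """Combine per-factor raw scores (each 0.0-1.0) into a 0-100 total."""
    total_weight = sum(FACTOR_WEIGHTS.values())
    weighted = sum(FACTOR_WEIGHTS[name] * raw_scores[name]
                   for name in FACTOR_WEIGHTS)
    return round(100 * weighted / total_weight, 1)

# Example self-assessment for one hypothetical site
site = {
    "usability_heuristics": 0.8,
    "search_quality":       0.5,
    "content_freshness":    0.9,
    "accessibility":        0.7,
}
print(scorecard_total(site))  # prints 74.0
```

Normalizing by the total weight keeps the scale fixed at 100 points even if factors are added or reweighted, which supports the apples-to-apples comparisons across sites that the scorecard is meant to enable.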

Indicator(s):