Content strategy: getting it and testing it at ASTC (NSW) 2014
I’m attending the annual conference of the Australian Society for Technical Communication (NSW), ASTC (NSW) 2014. These are my session notes from the conference. All credit goes to the presenter, and any mistakes are mine. I’m sharing these notes on my blog, as they may be useful to other technical communicators. Also, I’d like to let others know of the skills and knowledge so generously shared by the presenter.
Carol Barnum’s keynote presentation was titled “Content Strategy: How to get it, how to test it?” Carol introduced content strategy, then shared some work she did with clients that focused on content, showing a success and a failure.
Carol started with the definition of content strategy (CS) from Wikipedia:
Content strategy refers to the planning, development, and management of content—written or in other media. The term is particularly common in web development since the late 1990s. It is recognized as a field in user experience design but also draws interest from adjacent communities such as content management, business analysis, and technical communication.
A definition from the Content Strategy Consortium:
CS is the practice of planning for the creation, delivery and governance of useful, usable content.
The two key points are planning and usability.
Carol gave us a short history of CS, and some reference material and the people involved, showing how new the field is. It’s a good time to get involved! Looking at the authors of the books, they’re mostly technical communicators. Yet somehow, we’re not deeply involved in the core discussions. It’s time for us to take the initiative here in Australia. Carol showed a map showing that the main interest was in the US, followed by Australia, then the UK, Canada and India.
What do content strategists do?
Audit the existing content, figuring out where it resides, when last updated/accessed, what is still relevant, and so on. This provides a high-level overview of the content.
Plan for the creation, publication and governance of useful, usable content. This covers how you’re going to update, remove and create content.
Manage the content creation and revision.
We may already be engaged in the last activity. To move up the ladder, add the earlier steps to what you do. This will help prove our worth and make us more valuable to our organisation.
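To make the audit step concrete (this sketch is my own, not from the talk): a content audit can start as a simple inventory recording where each piece of content lives, when it was last updated, and who owns it. The fields and URLs below are illustrative.

```python
from datetime import date

# A minimal content inventory: each entry records where a piece of content
# lives, when it was last updated, and who owns it. (All values are made up.)
inventory = [
    {"url": "/help/getting-started", "last_updated": date(2014, 3, 1), "owner": "docs"},
    {"url": "/help/legacy-import",   "last_updated": date(2011, 7, 15), "owner": "docs"},
    {"url": "/blog/launch",          "last_updated": date(2013, 11, 2), "owner": "marketing"},
]

# Flag pages untouched for roughly two years as candidates for review,
# rewriting or removal.
cutoff = date(2012, 10, 1)
stale = [page["url"] for page in inventory if page["last_updated"] < cutoff]
print(stale)  # ['/help/legacy-import']
```

Even a spreadsheet with these three columns gives you the high-level overview Carol describes; the point is having the data in one place before you plan.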
Ensuring your content is useful and usable
Be there to see where the users struggle. Especially when there’s no help or user assistance yet, watching users struggle will help guide the content creation.
Top findings in usability testing: People can’t find what they’re looking for. They get stuck navigating around the site. They don’t understand the terminology, or there’s inconsistency in the use of words. The page design is confusing or not accessible. The design of the interface doesn’t match the users’ mental model. The users bring expectations based on previous experience, and a good user experience (UX) design will tie into these expectations.
What about the content specifically? Users need to be able to find the content. Next, they need to understand it and be motivated to take action. And they need to be able to complete the action satisfactorily. Were they happy or frustrated at the end of the process?
Carol talked through a list of techniques on making your content work for users.
Case study: Green Engage
Green Engage is a web app produced by Intercontinental Hotels Group for general managers of their hotels around the world. The app allows the managers to create plans and track progress of projects aiming to make the hotels green, that is, environmentally friendly.
The app is content based. Carol’s team did a round of usability testing with 14 users. Carol recommends a group of 10-15 people (so, a small group) who are deeply engaged in the subject. The customer’s requirements were to test navigation and workflow, defects and system feedback issues, perceptions around content, and training requirements. Carol emphasised specifically the term “perceptions” in this list of requirements, which came from the customer. Carol thought there were problems with this list of requirements, but went with it.
The first round of testing showed that users liked the idea of the system, but found it hard to understand. There was no user assistance at this time. Of the 14 users, 4 people called for help – they called 11 times. In other words, they were good and stuck. The existing content was overwhelming and unhelpful.
Eight months later, round 2 of the usability testing happened after a complete redesign of the system. The system was redesigned, but the content was exactly the same. There were no technical communicators involved. The requirements for the testing were the same, but with a new goal: measure the SUS (System Usability Scale) score improvements. Carol recommends this scale as an excellent measure. The average score is 68 – you can use this as a benchmark for measuring your own results. The next step is to establish your own benchmark, so that you can raise your score over time.
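For reference (my own addition, not something Carol showed): the standard SUS calculation is simple enough to sketch in a few lines. Participants rate ten alternating positive/negative statements on a 1-5 scale; the contributions are normalised and scaled to 0-100.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (1, 3, 5, ...) are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled to 0-100 by multiplying by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

An individual score is averaged across all participants to give the study’s SUS score, which is how figures like 54 and 73 below are produced.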
The SUS score from round 1 was 54. This is bad, and not surprising. Round 2 had a SUS score of 73.
Five months later was round 3. There were only 4 users in this study, so we need to use caution when drawing conclusions. This study happened during a pre-launch of the system. Users loved the new system, in comparison with the old system. They found the homepage helpful and that the app supported their goals. The SUS score was 94! Don’t pay too much attention to that – it was only 4 users and they were comparing the new system to the bad old one.
Carol’s team suggested another round, using new users as well as existing ones, and employing some technical writers to improve the content and add context-sensitive help. So round 4 happened one year later, with 16 users, both new and returning. The focus was on the content, with a goal to find out how well users understand and can act upon the content. The test gave users a specific set of tasks, based on their use of the content and a specific goal to achieve.
Feedback showed that users had a goal (get profit for their hotel) rather than wanting to read content. They wanted a solution, not an idea. On the negative side, users couldn’t find what they wanted. They were confused by the terminology, and there was a lot of content, but it was not useful and not well organised. The SUS score was 50. Carol says this was fantastic, because it shows they had work to do. It was the first time content creators were involved.
Case study: Footsmart
Footsmart is a website for people who have trouble with their feet. Carol did one usability study of a single template. The study compared the original content (A) with the new (B). Users would see one or the other, and did not know which they were seeing. The goals were to verify that the new content (B) was effectively communicating with users, to understand how users responded to the content (e.g. its voice and tone) and to see if they acted upon it.
Carol showed us examples of A and B. The new content was better laid out, and included graphics.
All 6 participants liked the newer content better and found the thing they were looking for more easily (heel cushions). An interesting conclusion from the detailed results is that people asked for more content in the redesigned site (B). In other words, perhaps you should show them a little at first and give the option to find more content for those who want it.
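As a side note (my own calculation, not part of the talk): even with only 6 participants, a unanimous preference carries some weight. A quick two-sided sign test, assuming under the null hypothesis that each participant is equally likely to prefer A or B, can be sketched like this:

```python
from math import comb

def sign_test_p(prefer_b, n):
    """Two-sided exact sign test: probability of a preference split at least
    this extreme if each of n participants were equally likely to prefer A or B."""
    k = max(prefer_b, n - prefer_b)
    # P(X >= k) for X ~ Binomial(n, 0.5), doubled for the two-sided test
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

print(sign_test_p(6, 6))  # 0.03125
```

A 6-of-6 split gives p ≈ 0.03, so the unanimous result is unlikely to be pure chance, though a larger sample would still be needed to quantify *how much* better B is.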
How to start as a content strategist
Plan the content you control. Create an overview of the types of content on the user interface (UI) such as labels, buttons, help text, etc. Think about the tone and treatment of that UI content. Create personas of your users, and collaborate to develop those personas. Create a style guide based on your conclusions from the above steps. And become involved in usability testing, or even run a test yourself.
Thanks Carol for a very good introduction to content strategy. The case studies were fun and interesting, and a good illustration of how usability testing fits into an overall content strategy.