STC Summit day 3 – Mobile app design
Joe has been working with mobile apps for quite a while. He thinks we’re at a really interesting point, one that will change the way we do things. He compared this period with the arrival of the mouse in 1989, and the way that changed what we do.
Mobile means widely varying screen dimensions (mostly small) and touch interfaces. People are interacting with the UI in new ways. We need to look at software in a different way.
The interactions are changing very quickly, and there are very few explicit standards. We need new techniques and processes to deal with proprietary interactions.
We need a different vocabulary. Instead of “click this”, it’s “touch this” or “tap that”. But even more, we need to understand the full environment: the person is often in motion while working with the software.
One technique that may be useful in this area is conditional processing. This is something we must look at in our tools.
Joe predicts that Apple will announce something new to do with touch interfaces in their upcoming conference.
Examples of touch devices:
- Tablets: iPad
Writing help for user interfaces
Joe showed us the Chicago Tribune mobile app, which overlays a layer on the screen showing you how to use the touch interface. A member of the audience mentioned that the USA Today app does the same thing.
Joe thinks that all mobile apps need some form of user assistance. He works for clients that develop mobile apps; one example is a mobile app for working with payrolls on the move. The mobile app is a “sidecar”, in that it provides some, but not all, of the functionality of the desktop app. The users already knew how to use the desktop app, so the challenge was to bridge that existing knowledge and transfer it to the mobile app.
Some interesting facts and tidbits:
- A member of the audience mentioned that you need to be careful when showing pictures of hands, or pointing fingers, as different gestures are considered rude in different countries.
- Tablet dimensions allow more interactivity, which can mean a more complex UI to document.
- An easter egg: go to Speedtest.net on a mobile device and pull the speedometer down, and you see a cat. Keep pulling down to see him in various positions. This is a hidden gesture, just for fun.
The language of gestures
Finding the best word is very important. Work with your thesaurus and test the candidates with your users, to see which words work best for your customers.
It may take several days to settle on four words!
For example, the use of the word “hide” dramatically changed the way users thought of the interaction.
Remember that the choice of words may need to change depending on the mobile device in use.
Joe showed us some examples of gesture language in user assistance:
- AutoCAD gives short textual instructions in a banner across the top of the screen. This is guided help that instructs users while they do the task. It’s effective, but takes a lot of resources to produce.
- Convertbot has a tutorial that shows popup bubbles describing the UI elements. Interestingly, they use the words “press the OK button” instead of “tap OK”. This may be generic language chosen to work across platforms, or they may simply not have thought of changing it.
- eBay have video tutorials, using live fingers. This works, but you need to consider how much you need to show. Also, the device itself tends to be obscured and in the background.
Vocabulary for touch and gestures
See the Touch Gesture Reference Guide. The authors went through a number of pages of documentation for various platforms, to analyse the terminology used for gestures. This is very useful.
Joe led us through a discussion of whether we would use the specific terminology chosen by each device vendor, or generic language. The audience chose generic language. Joe said that sounds sensible, but consider the user: with generic terms, a customer who has chosen a Windows phone may see terminology that is more typical of an iPhone or Android phone. Your documentation may then use a different vocabulary from the rest of the help on that device.
A member of the audience pointed out that this terminology issue means that you must work closely with your translation manager. Joe showed a slide about translation issues for gesture terms. In some languages, you will probably end up with different translations for the same term, depending on the context.
Conditional text will be a big part of our toolset.
Joe works with HTML5, CSS and media queries as a solution: the stylesheet detects the device’s screen dimensions and capabilities, and applies transformations based on what the device requires. He has stylesheets to cater for 4 device types:
- 7-inch tablet (Galaxy and Kindle)
- 10-inch tablet (iPad)
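A minimal sketch of the approach Joe described, using CSS media queries to target the two tablet sizes mentioned above. The class names and breakpoint values here are my own assumptions for illustration, not details from the session:

```css
/* Base styles that apply to every device */
.help-panel {
  font-family: sans-serif;
  padding: 1em;
}

/* 7-inch tablets such as the Galaxy Tab or Kindle Fire
   (the 600px–799px range is an assumed breakpoint) */
@media screen and (min-width: 600px) and (max-width: 799px) {
  .help-panel {
    font-size: 90%;
    columns: 1;
  }
}

/* 10-inch tablets such as the iPad
   (800px and up is an assumed breakpoint) */
@media screen and (min-width: 800px) {
  .help-panel {
    font-size: 100%;
    columns: 2; /* more room allows a two-column layout */
  }
}
```

The device evaluates each `@media` condition itself, so one set of help content can be styled differently per device class without server-side detection.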
We need to consider device-specific controls in our documentation. For example, different Android devices have different button bars. Or a gaming device will have gaming controls instead of those used on a phone.
Things are moving so fast that voice will soon start making user assistance easier. Siri works fairly well, but it’s limited to the apps that Apple wants to support. There is talk that Apple will expose an API for Siri, so that other app developers can use it to give user assistance, either through a spoken response or by taking the person to a particular topic on the app developer’s web server.
Joe showed us an example of asking Siri how to write a cheque in Quicken. You get a link to the Quicken documentation. This doesn’t really work, because there is no API for Siri. Joe hacked the example together to show us that it could happen!
Thanks Joe. This was an inspiring session.
Posted on 24 May 2012, in STC, technical writing and tagged Joe Welinske, mobile UA, STC Summit 2012, stc12, technical communication, technical documentation, technical writing.