Designing remote content testing
Building a pandemic-friendly test plan and the pros and cons of going remote.
I recently had to do some content testing for a user research activity. The content was for a public-facing website targeted at all Australians, and the topic was a complex one. Learning how users read and understood the content was a must.
I decided to run a highlighter testing activity to learn which content participants understood and which content was problematic.
What is highlighter testing?
Highlighter testing involves presenting participants with content. They read through it armed with two differently coloured highlighters. The task: highlight sentences they find easy to read and understand in one colour, and sentences that are difficult to read or understand in the other.
The testing is conducted, findings are analysed, and the content and design are then iterated in line with the user feedback.
The testing allows designers to tailor the content so that it fulfils the business goals and is easily readable and actionable by users.
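If it helps to picture the analysis step, here's a minimal sketch of how a per-sentence tally might look once each participant has allocated a colour to each sentence. The data, names and "rework" threshold are purely illustrative assumptions, not part of the method itself.

```python
from collections import Counter

# Hypothetical results: one list per participant, one colour per sentence
# ("green" = easy to read and understand, "red" = difficult).
results = [
    ["green", "red", "green", "green"],   # participant 1
    ["green", "red", "red", "green"],     # participant 2
    ["green", "green", "red", "green"],   # participant 3
]

# Tally colours for each sentence position across all participants.
for i, colours in enumerate(zip(*results), start=1):
    tally = Counter(colours)
    verdict = "rework" if tally["red"] > tally["green"] else "keep"
    print(f"Sentence {i}: {dict(tally)} -> {verdict}")
```

In practice the same tally is easy to do in a spreadsheet; the point is simply that sentences where "difficult" outweighs "easy" become candidates for rewriting.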
Why highlighter testing?
I picked highlighter testing because it was important to me that I understand what language my target audience uses and finds easy to comprehend.
This type of testing also allows researchers to learn what information users actually need to understand the topic at hand.
The twist for this project was that my test needed participants aged 60+ from all over Australia. And in a world riddled with stay-at-home orders and border restrictions, that made things just the slightest bit more challenging.
In a pre-COVID world I would have done this activity in person. I'd sit with the participant, explain the context and watch them highlight away, asking questions as they went.
The current environment and recent shift to remote work and research meant I needed a new, pandemic-friendly method.
Testing new methods
Take one
Initially, I thought Google Docs would be the best solution to remote highlighter testing. I’d give each participant a link to a Google Doc with the content and be able to watch them highlight in real time.
The interview component could be conducted via Google Meet so I could ask questions if need be.
It seemed like a brilliant solution, but it only worked for two of the participants. The digital literacy required to set up this method was far too high for my target age group. Even the mention of Google Docs was enough to raise eyebrows from some participants.
Back to the drawing board.
Take two
Finding a way to conduct remote highlighter testing with users who aren’t digitally native was a real challenge.
After my Google Docs failure, I switched to sending the content as a Word document attached to an email. I also canned the idea of being on a video call.
Instead, I thought I’d give the participants a phone call and explain the activity. Once they knew what was involved, I’d send them an email with the Word doc attached.
Then I'd ask them to download the document and highlight sentences on their devices while staying on the phone with me. After they'd finished reading the content, I'd get them to save the document and email it back to me.
Participants still struggled with this method. Downloading an attachment from an email and highlighting text in Word were both barriers for my participants.
Again, I had failed to cater for my users in the design of my activity. Back to the drawing board.
Take three
Third time round I stripped the method right back.
I planned to begin the activity with a phone call. Once I had explained the activity, I’d send an email to the participants with the content in the body of the email.
I'd have them read the sentences aloud to me on the phone, allocating each sentence a colour: green for easy to read and understand, red for difficult to read or understand. I would do the highlighting on my end.
I'd finish the activity off with the same follow-up questions as before.
This method was by far the most successful and the one I ended up running with. All participants found the process straightforward and weren't overwhelmed, and the activity ran smoothly.
The outcome of the testing was mostly positive. It outlined the language choices that the participants understood and identified with the most.
It also highlighted to me that public-facing content should avoid terms that assume prior knowledge. I was able to iterate on the content in line with the participants' preferred language and offer simplified explanations for complex terms.
The iterating and testing process for designing a highlighter activity was interesting.
Coming up with a solution that catered for remote participants and had a low digital barrier to entry was a challenge.
To me, it highlighted the importance of considering different user groups when designing research activities and making sure I cater for them appropriately.
I really only resorted to remote testing because COVID-19 was locking down our states and making access to information and services worse for vulnerable people. For this type of testing, and with such a broad range of demographics, in person would still have been my preferred method.
However, adapting the method for remote access allowed me to get valuable data I otherwise wouldn't have had. But running this activity through online channels rather than in person does come with its downsides.
Pros for remote highlighter testing
· Access to more people — in my case, this trumped all. The method I adapted for this activity allowed me to recruit far more participants than I would’ve otherwise had access to.
· Low cost — conducting the testing via channels that participants already had and not having to offer large incentives to take time out of their day to be physically present cut costs.
· Speed — testing can be conducted and iterated on quickly.
Cons for remote highlighter testing
· Digital barriers — Despite stripping the method right back to cater for non-digital natives, there is still a digital barrier present in the final approach I took for the highlighting activity. The testing required users to have an email address and a working phone in order to participate.
· Harder to build relationships — Let's face it, over-the-phone drinks really aren't the same as sitting down with someone over a glass of wine. The same applies to the testing process. It is much harder to build rapport with participants over the phone, and this can affect the findings of the activity, because users may be more honest with researchers they feel comfortable with.
· Harder for participants to be honest — By asking participants to read the sentences aloud and identify challenging words or phrases, we are relying on them being honest about their level of understanding. This can be difficult for some people and may lead to participants altering their responses to avoid embarrassment.
Hopefully this helps you next time you’re doing remote highlighter testing with non-digital natives!