Product
Design
UX
UI
All the most typical culprits of wanting to engage in usability testing or user research [UR]. Or, the [set] of qualitative / quantitative processes that better guide whatever you're designing → a product, a service, a community and so on. User research allows you to measure, basically, and it's an excellent staple of any stack of community metrics. UR can range from a three-question survey every quarter to a fully fledged, months-long deep dive that includes surveys, interviewing and an all hands presentation. I say, start small and work your way into UR comfort.
It can be overwhelming if you let it.
Quantitative research seeks to understand or measure the behavior of your members. It produces data that can be quantified, so you can view and analyze statistical results. Think NPS score, linear scale, or survey.
Qualitative research, alternatively, is more anecdotal than statistical. It encompasses interviews and/or testing to get a deeper view and understanding of the experience of your members. Hard to put into numbers, ya dig?
SO, why in the hazelnuts do you care to introduce this to your community? Oh, let me count the ways!
To ensure a curated and resonant experience for YOUR members, not just one that follows a 'tried and true' model that simply isn't tried or true for you
To finagle seamless processes [e.g. onboarding] and systems [e.g. gamification] that are attractive and memorable
To better understand and, later, relay the impact and value that your community has on the bottom line [there's always a bottom line]
To pave the way for potential leadership levels within your community, such as an advisory board, focus groups, or similar
Design thinking is a great perspective to begin considering how you'll tackle a project involving user research, community style.
Below are two examples of how to set up a quantitative or qualitative UR process. These are high level and thus pretty simple, which allows you the space to expand upon them as much as you deem necessary.
The Survey Example | Quant
WHY is this being created? | Is it a recurring, monthly community health check? Do you want to know why your product-community isn't buying your product? Why engagement dips at certain parts of the year?
Less is more | Why have 15 questions when you can have 10 direct, impactful and prioritized questions? There's no good answer to that, so do less. If necessary, create a little table for yourself and write your proposed questions as well as exactly how and why they will impact the BIG WHY.
Shorter is better | Same, same but different. Your members are filling things out constantly in their personal lives and they may be part of multiple communities doing similar things. Make your surveys short, user friendly and easy to navigate.
Clarity | Work on your own written communication! Clear, direct, culturally aware language gets better results. No double negatives! Get to your point and keep in mind the format of the question.
Formatting | Balance these out. If you have 10 questions: consider 3 open-ended, 2 checkboxes, 2 multiple choice, 1 linear scale, and 2 surprises :) No one wants a full survey of open-endeds, let's face it.
NPS | Always include this linear scale; you'll be happy to have this data over time [there's a quick scoring sketch after this list].
Incentivizing | Only if you feel it's necessary [and it often works well].
Test | Try it on a small batch of people or your teammates/friends to catch errors.
Sorting | Have a place for the responses to go as well as a place for you to do your analysis of the results. Consider giving yourself a task a month in the future, or at the end of a season, to return to the results and work on them for a block of time.
Understanding | Figure out what those next steps are → what have the results spelled out for you? Who sees them, and do they have access? Are you presenting at an all hands call or to stakeholders?
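On the NPS note above: if your responses end up somewhere exportable [a CSV, a spreadsheet], the math is simply the percentage of promoters [scores of 9-10] minus the percentage of detractors [scores of 0-6]. Here's a minimal Python sketch of that calculation; the file name and the "nps" column are hypothetical placeholders for whatever your export actually looks like.

```python
# Minimal NPS sketch: % promoters (9-10) minus % detractors (0-6), on a -100..100 scale.
# "survey_responses.csv" and its "nps" column are hypothetical placeholders.
import csv

def nps_score(scores):
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

with open("survey_responses.csv", newline="") as f:
    scores = [int(row["nps"]) for row in csv.DictReader(f) if row["nps"].strip()]

print(f"NPS: {nps_score(scores)} from {len(scores)} responses")
```

Run something like this against each quarterly export and you've got an NPS trend line for very little effort.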
The Interview Example | Qual
These don't necessarily have to be in the order I've placed them, hence the bullets. This would be great to have in a slide deck or kanban format.
Scope | Another way of saying, WHY → what's the purpose of this research? What do you want to know and how will it impact your bottom line?
Roles | Who's doing what, who are the stakeholders, who do you need to collaborate with? I like using the MOCHA format [Manager, Owner, Consulted, Helper, Approver].
Questions | Create that table from the above example and clearly identify the questions you plan to ask and why those particular ones.
Participants | What's the criterion for their participation? How will you be doing outreach? What's the duration for this part of the project?
A/B Test | Could be interesting to A/B test some facet of your project → for example, if you want to understand wavering engagement, maybe you'll A/B test by sending outreach via your proprietary app AND via the email your member has on file. Or, simply toggle the A/B test button on your email marketing tool [like Klaviyo]. If you'd rather run the split yourself, there's a small sketch after this list.
Personas | Optional portion; but from all the interviewing/surveying, you will be orbiting the truth-telling arena of personaland. This is a whole other ballgame to discuss, but for now, this is the step where you could ponder the creation of member personas.
Key Insights | I particularly enjoy this one! This is where you can review all the results and discern patterns or trends. Fun [not being sarcastic, hard to relay in writing you know].
CTAs | What calls to action follow your big ol' research project? What happens now? Will you tell your community the patterns you found? Will you be able to better show your higher-ups the value and impact of your community?
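On the A/B note above: if your tooling doesn't do the split for you, a coin flip per member is all it takes. Here's a minimal Python sketch; the member emails and channel labels are illustrative, not tied to any real tool.

```python
# Minimal A/B split sketch: randomly assign each member to one outreach channel.
# The emails and channel labels below are illustrative placeholders.
import random

members = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]

random.seed(42)  # fixed seed so the split is reproducible
shuffled = random.sample(members, k=len(members))
half = len(shuffled) // 2

cohorts = {
    "app_notification": shuffled[:half],  # variant A: outreach via your app
    "email_on_file": shuffled[half:],     # variant B: outreach via email
}

for channel, group in cohorts.items():
    print(channel, group)
```

Keep a record of who landed in which cohort so you can compare response rates later.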
This part is truly all fun and games; user research can be joyous, ambitious and oddly simple [not easy]. I believe that strategizing, executing and presenting on user research is a skillset that can be honed with each iteration of practice. It leads you to increased credibility, it leads your community to potentially deeper feelings of being understood, and it may lead your higher-ups to strengthened trust and value in you. A beauteous cycle.