Some 20 years ago, I used to play an online video game called Red Alert 2.
I once racked up 45 straight wins and my username was listed among the top 10 players in the world with a unique ‘unstoppable’ badge.
I don’t think I ever played another game using that account. The fear of losing my top 10 status and breaking my streak was too high.
Instead, I created another account and began playing for fun again.
To a game designer, it probably made a lot of sense to list and celebrate the top players – especially those with great streaks like mine. But they didn’t truly empathize with how it made top players feel.
We make these mistakes all the time in our communities too, and it’s what our Psychology of Community course is really about.
It’s about being able to very clearly see and understand what members want.
There aren’t any Jedi mind tricks here, just a practical understanding of how to identify what different segments want (you can’t just ask them) and deliver on those wants.
This depth of understanding should inform everything you do, from developing the community platform and welcoming newcomers to rewarding (and sanctioning) members.
The more you understand how members actually think, the easier it becomes to deliver on their wants and avoid common mistakes. Some examples:
- Members care far less about any reward than what the reward represents. How many people received it? How publicly was it given? How does it relate to other rewards?
- Once you begin calling members experts or giving them badges, they feel pressured to always have the right answer and maintain their reputation (see above). This is as likely to reduce their motivation as to increase it.
- Members care far less about 1 million people liking their post than 10 close peers loving their post. If the top 10 people in a field love their article, they’re thrilled.
- Letting members submit ideas without being able to see the impact of those ideas (even if they’re rejected) harms their motivation and attitudes towards the company.
- Members want consistent moderation processes, yet they want to feel special and unique when you engage with them.
- It’s far more motivating for members to participate in a small, exclusive community than in a big, mass community. Mental associations with a community are critical in determining when and how people participate.
- Asking someone to help out in a small way is likely to drive people away; creating a big opportunity that only a tiny number of people can apply for attracts them.
Psychologists have long dropped the notion that people are rational actors who make decisions based upon which option offers them the biggest tangible reward.
It’s time we did too.
Instead, your members (and you) are far more likely to take action based on how they conceive their own identity in relation to the groups they belong to (or want to belong to).
If you want to get better at understanding what your members truly want, I recommend signing up for our Psychology of Community course.
(Reminder: prices for our Strategic Community Management course rise tonight.)
It’s a mistake to believe that you win the argument if you have the right data. During this week’s webinar, I presented this slide and asked attendees whether the level of traffic was going up, down, or staying the same.
The audience was split between the three.
We’re all looking at exactly the same graph and coming to completely different interpretations.
That’s pretty incredible, don’t you think?
But this is true of most data. Do you look for the line of best fit, the trend in the past few months/years, or some sort of average over them?
You can see this in the images here:
This happens with almost every possible metric. Everyone brings their own biases into their interpretation of data. If they don’t know you, don’t believe in you, or don’t understand the community then the same metrics you’re delighted to present can be perceived negatively.
You might be delighted to show the community solved 20% more customer questions than last year while the head of customer support might note it’s still only 5% of the total questions the support team receives. Why bother?
Data is one of many signals that help people understand how the community is doing and what it’s worth. It isn’t the strongest or the weakest. Others include the relationships you’ve developed with senior people, whether you’re delivering impact for them, the narrative around the community, who else supports the community, and whether they’re engaged in the process of community.
It’s better to have great data than not, but don’t imagine it will be a silver bullet that gets you the support you want.
- Why most community strategies are pretty terrible.
- Why it’s important to define the problem (or opportunity) before suggesting solutions.
- The components of a community strategy.
- How to collect and interpret community data.
- And a lot of practical tips on deciding and executing your tactics.
Don’t worry if you missed it, you can now watch it below (or click here).
We also highlighted a lot of the resources we share and teach people to use during the course. This includes:
- A big strategic community management resource.
- Our list of 1600+ brand communities.
- Our member survey template.
- A (simplified) example roadmap.
- A typical project plan template.
- Our community benchmarks.
- And our detailed community strategy template (course only).
I believe learning to think strategically about a community and putting together a clear and comprehensive strategy you actually execute is a skillset every one of us should have.
If you’re working without a strategy today (or if your strategy is missing key elements such as a logical plan of action, risk factors, governance, resource allocations, etc), this course will help.
The course fee rises this Friday, you can sign up using the links below:
Here are responses to a client’s survey question distributed by length of time they’ve been working in the field (n=356).
Take a second and think how you would interpret this data.
One interpretation is to look at the newcomer group (and possibly the veteran groups) and decide to have more relevant discussions, content, to cater to that audience.
For example, a beginner’s guide to the topic, introductory level discussions, topic overviews etc…
Perhaps a more exciting interpretation is to look at these same groups and conclude “hey, maybe my community isn’t for them”.
The survey data could equally be telling you to focus on people with 1 to 10 years of experience in that field.
Instead of trying to balance out the mean, you increase the inequality to focus entirely on a small group of members.
Remember, the danger in trying to accommodate newcomers is the content might become less relevant to the people who find it most relevant today.
It’s often a lot easier to double-down on what is working than trying to fix what isn’t. You can design the entire community concept solely for those who find it most relevant today.
Sometimes, it’s just easier to focus on the good than the bad.
If you want me to ruin your day, ask me if you’re measuring the right things.
I’ll probably ask you what you will do differently if any of your metrics go up or down.
Then I’ll probably try to say something profound like: “If you don’t know what you will do differently because of the data, it really doesn’t matter what data you collect”
Data analysis should be the most powerful tool in your community professional toolkit.
It should tell you exactly what to stop doing, keep doing, and how to fix problems in your community.
But almost no one is doing this today.
We spend a lot of time during our Strategic Community Management course changing how participants think about measurement.
Four Levels of Using Data
In our benchmarks, we rate data skills on a scale of 1 to 4.
You can see a simple breakdown of these below:
Almost everyone begins at level 0 (not even measuring the right things). By the end of the course, we aim to get most people to level 3 or 4.
Here’s a breakdown of what each level looks like:
Level 0–1: Measuring what happened
If you’re dropping monthly metrics into a big spreadsheet to send to your boss, you’re probably at level 0.
You’ve probably selected the metrics which are easiest to get instead of those which best reflect your work.
You’re also probably not presenting the data well or making any inferences about what the data is telling you.
Level 1 is usually the level of being able to pull data (automatically) from multiple sources into a single spreadsheet with simple to read dashboards. Here’s an example we put together for a client below:
This spreadsheet automatically pulls data from Google Analytics and runs SQL queries on the platform (along with some simple calculations) to complete a useful data set.
From this you can put together some simple dashboards to show what is and isn’t working like those below:
p.s. Just from these 4 (of 10) graphs, can you spot a few major problems already?
However, none of this data means anything if you have no idea what to do with it.
This means we need to understand why a metric we care about went up or down.
Level 1–2: Analyzing why it happened
Amazingly, only a tiny fraction of people ever bother to do this.
If growth is rising/falling or if activity is going up or down you need to know why.
Most people prefer to guess rather than analyze the problem in detail.
To analyze why a metric went up or down, you need a decision tree and additional data.
For example, imagine the number of posts in your community drops. There are three possible reasons for this:
- Fewer members are posting.
- Members are making fewer posts.
- Both of the above.
So you need to check the average number of posts per active member and the number of members who made a post in the past month.
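To make the logic concrete, here’s a minimal sketch of that first diagnostic check in Python. The member names and post counts are entirely made up for illustration; in practice you would pull these from your platform’s analytics.

```python
# Hypothetical monthly snapshots: each maps a member id to posts that month.
# All names and numbers are illustrative, not from a real community.
last_month = {"ana": 10, "ben": 4, "cy": 3, "dee": 1}
this_month = {"ana": 5, "ben": 2, "cy": 3}

def diagnose(prev, curr):
    """Return which of the three causes explains a drop in total posts."""
    fewer_posters = len(curr) < len(prev)          # fewer members posting?
    prev_avg = sum(prev.values()) / len(prev)      # posts per active member
    curr_avg = sum(curr.values()) / len(curr)
    fewer_posts_each = curr_avg < prev_avg         # members posting less each?
    if fewer_posters and fewer_posts_each:
        return "both"
    if fewer_posters:
        return "fewer members are posting"
    if fewer_posts_each:
        return "members are making fewer posts"
    return "no drop"

print(diagnose(last_month, this_month))  # prints "both"
```

The same two comparisons can be run over any pair of reporting periods; the point is simply to separate the three causes before reaching for a fix.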
Let’s imagine members are posting less. This is still interesting, but it’s not useful.
To make it useful you need to know which type of members are posting less.
Hence (as you see in the table above) we like to know the average number of posts from the top 1%, top 10%, top 50% and bottom 50% of active members etc…
You can equally segment this by the average number of posts by newcomers, irregulars, veterans etc…
Is there a drop across all categories (which suggests a technology or level-of-interest problem), or is it focused on a single group (i.e. has the mean number of monthly posts from the top 1% dropped over time)?
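The percentile segmentation described above can be sketched in a few lines of Python. The post counts below are invented for illustration; with a real community you would feed in one number per active member for the month.

```python
import math

# Hypothetical monthly post counts for active members, highest first
# (illustrative only).
posts = sorted([120, 40, 25, 12, 9, 7, 5, 3, 2, 1, 1, 1], reverse=True)

def mean_of_top(counts, fraction):
    """Mean posts among the top `fraction` of active members."""
    k = max(1, math.ceil(len(counts) * fraction))  # at least one member
    top = counts[:k]
    return sum(top) / len(top)

for label, frac in [("top 1%", 0.01), ("top 10%", 0.10), ("top 50%", 0.50)]:
    print(label, mean_of_top(posts, frac))

# Bottom 50% for comparison:
bottom = posts[len(posts) // 2:]
print("bottom 50%", sum(bottom) / len(bottom))
```

Tracking these segment means month over month is what lets you say whether a decline is broad-based or concentrated in your top contributors.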
Now you can look specifically at top members.
Was there a sudden drop in their level of participation or has there been a steady decline (usually speeding up until it becomes noticeable)?
This tells you if the problem is an external event (technology or product change) or if it’s a steadily declining loss of interest.
If it’s an external event you can pinpoint the date and see what else changed around that time to fix it.
If it’s a steadily declining loss of interest, you need to figure out whether they are losing interest in the topic altogether or just in your community.
If they participate in other communities, you can ask what they like better about that community and either incorporate features or develop new ones to bring them back.
By the end of this stage, you should be able to make really specific statements like:
The level of activity in our community has dropped in recent months because our top members are participating less. This is due to them preferring to answer questions on Quora, StackOverflow, and other online communities which are easier to use and give them more visibility than they get from our community.
Notice how specific and concrete that is? It also rules out a lot of possible solutions which won’t work.
Most people, if they see engagement dropping, panic and try a bunch of silly tricks to get people to participate more (AMAs, photos, live chats, gamification, etc.).
These are wild guesses which have no chance of fixing the problem.
Now you know exactly what’s broken, but you still don’t know how to fix it.
Level 2–3: How to improve it
This is the critical step.
It doesn’t matter how great you are at analyzing the problem if you don’t know how to fix it.
This means we need to put together a few ideas of how to fix the problem and test them until we get an answer.
From the above, we would probably test ways to improve the community’s ease of use and the visibility members get from writing blog posts, asking questions, and answering questions.
First, we would look at who has solved this specific problem really well.
Asking peers helps (p.s. please stop asking for help until you’ve properly diagnosed the problem), as does your own research into existing brand communities which have managed to keep highly engaged experts.
Because every situation is different, you need to draw up a list of possible solutions to test. For example, this might include:
- Let top members speak at our annual conference.
- Help get positive PR coverage for best member contributions in the trade press.
- Feature member contributions on Twitter, in our newsletter, and link to their solutions within the product documentation.
- Let top members co-create user guides which get sent to all members who use the product.
- Develop a plugin/tool to enable members to share their external contributions easily with the community.
(Aside: as a rule, don’t aim for equivalency with a competitor. Always do something a competitor can’t match.)
If you’re pushed for time, you can try all of these at once. But you’re unlikely to have the resources to do it (and even if you did, you wouldn’t know which was working). So try to test each individually.
Over time, you can rule out options until you find the ones that work.
Remember that any change might take 3 to 6 months to show up on the dashboard.
Level 3–4: Designing an ongoing system for improvement
But what if there isn’t an immediate problem?
This process doesn’t just explain a way to fix problems, it also shows a way to constantly improve your community.
You can invert this process to identify what’s working well, analyze why it’s working, and then double down on the areas which work best.
At the very top level, you want to be dedicating more and more resources to the areas which produce the best results.
This first means identifying whether a tactic achieved its intended impact and deciding whether there is more potential in that impact (i.e. more people who can be reached, a bigger change in behavior, etc.) if you invest more resources in the tactic.
This means you need to design a process where each month (or every 3 months) you measure which tactics are working well/not working well and then decide whether to repeat, kill, tweak, or invest more in the tactic.
The purpose of this system is to gradually allocate more and more of your resources to the areas where they will have the biggest impact.
It doesn’t matter if you have 1 or 10 hours a day, you can still allocate your limited resources (and all resources are limited) to use them as effectively as possible.
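The review loop described above can be sketched as a simple Python routine. The tactic names, impact scores, and thresholds are all hypothetical stand-ins; the repeat / kill / tweak / invest decisions are the ones from the text, and in practice you would calibrate the thresholds against your own data.

```python
# Hypothetical quarterly tactic review. Tactic data and thresholds are
# invented for illustration; only the four decision labels come from the text.
tactics = [
    {"name": "monthly AMA",       "impact": 0.2, "trend": "falling"},
    {"name": "expert spotlight",  "impact": 0.8, "trend": "rising"},
    {"name": "newsletter digest", "impact": 0.5, "trend": "flat"},
]

def review(tactic):
    """Decide whether to repeat, kill, tweak, or invest more in a tactic."""
    if tactic["impact"] >= 0.7 and tactic["trend"] == "rising":
        return "invest more"      # working well, with headroom left
    if tactic["impact"] >= 0.4:
        # Moderate impact: keep it, but adjust if results are declining.
        return "repeat" if tactic["trend"] != "falling" else "tweak"
    return "kill"                 # low impact: free up the resources

for t in tactics:
    print(t["name"], "->", review(t))
```

Running a routine like this every month or quarter is what gradually shifts your limited resources toward the tactics with the biggest impact.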
Now you have a system in place which constantly improves your community as you collect more and more data.
Amateurs use intuition, professionals use data
I’ve seen organizations waste millions of dollars on migrating platforms and hiring new staff to fix problems which they haven’t properly analyzed.
I’ve seen community professionals waste months, even years, struggling to fix problems in their communities because they haven’t properly identified what’s going wrong or found feasible solutions to fix it.
I’ve also seen far too many community professionals waste far too much time and energy on tactics which aren’t working.
These are all amateur-level mistakes, and we’re community professionals.
Amateurs use intuition and gut instinct, professionals use data and reason.
This is why professionals who use data are far more effective than amateurs.
This is how we get great results from our consultancy engagements: we turn data, a free asset, into a powerful tool to diagnose and improve the community.
Beginning on January 28, we’re going to teach you how to do this.
I hope some of you will join us.
Yes, a Facebook Group is free, but you’re still paying for it.
You’re paying for it in missed opportunities, lost knowledge, and the members who drift away from the noise.
You’re paying when they make changes and don’t tell you until they’re live.
You’re paying when you can’t design the community to support what your audience needs.
You’re paying when you can’t move to a better platform when you outgrow the group.
You’re paying for it when you don’t get the SEO traffic which sustains most long-term communities.
You’re paying for it when you can’t design an onboarding journey for newcomers which helps them take their first steps and become top members.
You’re paying for it when you can’t treat members differently based upon their previous contributions.
You’re paying for it when Facebook’s reputation takes a hit and people leave the platform.
You’re paying for it when you can’t identify and reward members for great contributions.
The list goes on and on.
Is saving a few hundred (or even a few thousand) dollars a month really worth the cost?
Late September last year, we were invited to help a struggling superuser program for a SaaS support community.
The community team had spent the past year searching for a ‘motivation button’ they could push to increase participation. They had tried gamification and various rewards. This resulted in some spikes of activity, but it rarely lasted. The total volume of responses was low.
Recently they had been testing adding more people to the private group. As you can see, it didn’t work well. Each superuser was making an average of 1.5 contributions per week (barely more than an average member).
Our preliminary analysis showed just 8 of the 75 registered members were making 5+ contributions per week.
So (with their support) we overhauled the program.
First, we set up calls with all 8 of these members to tell them we were downsizing the program to only those who had really earned their place. They were told they were among the few who made the cut.
Second, we also spent an hour or so with each of them to get a sense of what drove them, what they wanted to see in the program and what kind of rewards they wanted.
We discovered a few important things.
1) Their motivations (some they stated and some we ascertained) varied significantly. There wasn’t a single ‘magic motivation button’ we could push to increase participation from a large group of members.
2) From the eight, two loved seeing the impact of their work (helping others/contributing to the field), two loved getting to know the employees and feeling part of the group, one wanted to be better than another member, two were doing it to promote their own business, and one we struggled to get any read on.
We then worked with the community team to design a system to engage each member of this group with their individual motivations in mind. This means treating each participant a little differently.
The two members who loved seeing the impact of their work were informed how many people their contributions had reached in the past week. We also put together a simple summary of any gratitude expressed by members and sent them a thank you card.
The two members who loved getting to know the employees were invited to participate in the community team’s secret santa and given direct lines to contact any member of the team at any time. If they didn’t reach out every few days, one of the community team would casually contact them to check in.
The two members who were doing it to promote their own business were invited to write long-form blog posts, speak (in a lightning round) at the company’s annual conference and give a monthly webinar to members tackling a common problem.
We didn’t link any of these benefits to a specific number of contributions (though if their contributions plummet, we’ll probably stop), but they are linked directly to our best sense of what motivates each member.
We also provided all 8 members of this group with best answers to the most common questions, direct contacts for any problems, and a few minor additional powers (being able to edit/correct problems in the initial questions).
The participation of 7 of the top 8 members rose by about 80% within the first month.
We also began gradually adding around one new member per week. Each member gets a call (or two) and we try to determine the specific motivation button we can push individually for each of them.
The community isn’t huge, so every single superuser has a huge impact. Previously employees answered 60% of member questions with superusers taking a small minority. Today superusers answer the majority of questions. You can see the impact below:
There’s still work to do. The number of replies is a lot higher but still more volatile than we would like. We also need to ensure the quality of responses remains high and check that member satisfaction has remained high (and hopefully increased). But, small caveats aside, the results have been positive.
Understanding the Psychology Behind Successful Communities
When we talk about understanding the psychology of community, we’re not talking about big, broad theories.
We’re talking about understanding members well enough to take direct, immediate, tactical steps to increase participation and drive better results in your community.
It doesn’t matter how good your strategy is if you don’t understand psychology well enough to execute it. None of the actions we took above are hard to execute, but you have to understand members well enough to do them.
This week we opened enrollment for our Psychology of Community Course to help you understand things like member motivation and use it to help your members do what they want to do. I hope you will join us.
p.s. The fee for the course rises at the end of this week.
p.p.s. Here’s an older talk I gave on Sense of Community in 2014. I believe it’s as relevant today as it’s ever been.
Next Tuesday, I’m hosting a free webinar to break down the community strategy process.
You’re going to learn what a strategic plan looks like, the key components of the plan, how to undertake good research, and why some plans are used while others are ignored.
You can sign up using the link below.
If you’re looking to evaluate your own strategy or be inspired to rebuild your strategy, this webinar will probably help (and if it doesn’t, I promise a full refund!)
I hope to see you there.
p.s. Here are some bonus community strategy resources:
Can you imagine anyone walking around your local community wearing a badge saying “hey, I started a conversation”?
Hopefully not, but that’s what we seem to expect members to do.
Not a single member cares about getting a ‘conversation starter’ badge when they post their first question (tip: they care a lot more about getting an answer to their question).
Forcing people to receive (and display) an embarrassing badge undermines the entire purpose of badges.
Badges work in three ways:
1) They’re connected to dedicated effort. Scouts know this well. You decide what badges you want and work towards obtaining them through dedicated effort (i.e. not something members would do anyway). This works for medals too.
2) They’re an acknowledgment of expertise/status/contributions. They recognise great people for their contributions. The more unique the contribution the better. It’s hard to automate these. Knighthoods, purple hearts, and lifetime achievement awards all fall into this category. They can be applied for or awarded.
3) They’re a (good) hidden surprise. No one knew the badge existed until someone did something completely unique and earned it.
You can create and award an infinite number of badges to members. Automating badges for minor behaviors isn’t just lazy, it’s counter-productive. Badges should be the community equivalents of setting and achieving our goals, winning trophies, being knighted, or a surprise reward.
If you want badges to motivate members to do something extraordinary, you have to give them to members when they do something extraordinary. If a member wouldn’t boast about the badge and proudly display the badge next to their name, don’t create the badge.
Last year, I spoke at CMX.
It was my favourite talk to date.
If you’re looking for a primer to understand how to build a community that’s indispensable to your colleagues and get the most from your members, this might help.
If you couldn’t attend the event and still want to see all the speaker videos, I strongly recommend you purchase the video package here.
Trust me, it’s a bargain.
Joel identified the coherence problem.
Unless you’re selling memberships or advertising, maximizing engagement is useless.
Each additional engagement doesn’t yield an additional dollar (or cent).
Worse, it can decrease valuable activity as key contributions and members are lost in the clutter.
Forget commitment curves, engagement ladders, and the like. Only a tiny fraction of members will ever progress through them anyway. Instead, focus on a simple strategic plan.
The Mayo Clinic’s strategic plan, for example, is:
A plan directly connects your goal to the tactics. To clarify:
- The goal is something the organization cares about (the impact you want to make)
- The objectives are what you need (segments of) the audience to do.
- The strategies are why they will do it.
- The tactics are what you do to get them to do it.
We waste far too much of our community’s potential chasing engagement.
This would be a great year to stop chasing engagement.
The Economist used to boast about its limited circulation.
Major newspapers touted their mass readership and did everything to attract more. The Economist bragged it was “not read by millions of people”.
While print media circulations have collapsed this century, The Economist has grown their audience considerably. They did this without clickbait, SEO hacks, advertorials, or any tactic which would irritate their audience.
You don’t need to be bigger than any other community. Being small is a weapon. Being able to slice off a segment of the audience and cater exclusively to their needs is a weapon. Focusing on being better instead of bigger is a weapon.
Being patient is also a weapon. It’s always tempting to replicate engagement hacks you see elsewhere – especially when they seem successful. But it’s a fool’s game. You attract the audiences you don’t want at the expense of the audience you already have.
Far better to be a lighthouse of quality in a sea of mediocrity. Light a beacon to attract exactly the audience you want and no more. A beacon that shows your values, your focus, and your commitment to supporting your members in the best way possible.
The irony, of course, is that The Economist today has grown their audience into the millions.