Month: June 2014
Adding user profiles and rating systems will increase the perceived credibility of community information.
But does it increase the accuracy of the information? i.e. is it actually more credible or does it just appear to be better information?
In some cases it's both.
Question and answer sites like Yahoo Answers, Quora, and StackExchange do well here.
Usually, it's just the latter. The longest answers, the best written answers, or simply those from people with expertise in an entirely different field are voted the most accurate.
Improving the accuracy of information requires not that everyone be allowed to vote on how good they think it is, but that everyone with relevant information on the topic contributes that information. This is then aggregated (e.g. Wikipedia) to give more accurate results.
If user profiles and rating systems encourage everyone with a relevant field of expertise to share whatever they know on that topic, then they increase accuracy.
You can add user profiles and rating systems. But you will only get more accurate information if you can convince members they have unique insight on an issue, explain how their contributions would make a direct impact on the community, and establish ground rules for submitting information.
Too often, we treat communities as a homogeneous, single-minded, group of individuals all looking to satisfy the same motivations at the same time.
This isn't the case. Communities are complex organisms comprising eclectic groups of people with varying levels of interest in the topic, different personality types, and different means of participating within that topic. The goal of the community professional is to ensure each is as actively engaged in the community as they can possibly be.
If we can segment members effectively, we can develop specific messages solely to engage each individual group.
Most community professionals send the same messages to every member. This is a mistake. The majority of members receive messages that don't appeal to them and ignore all communications from the community. (This may be a cause of participation inequality; we're only catering to the needs of a tiny group of members.)
Basic segmentation can use the two simplest metrics:
1) Date of registration. The date when the member joined the community.
2) Post count. Total number of contributions to the community.
Using just these two segments (how long they have been a member and how many contributions they have made), we can still develop segments to which we can cater our messages and activities.
These metrics are proxies for their level of interest in the community and whether they consider themselves veterans or newcomers.
A developer should be able to integrate MailChimp, Salesforce, or your e-mail tool with the platform in a day to automatically create these segments.
If you're unable to do anything more complex, use this and send different messages to those with higher post counts and those that have been members the longest. Better yet, use this information in all contacts and relationship-building activities with members.
A better method is to develop segments within segments.
3) Number of contributions within the previous month. This shows you how active members currently are. This is a good indicator of how interested they are in this community at this very moment.
Your categories here are generally 0, 1 to 5, 6 to 29, and 30+ posts for lurkers, casual participants, regular participants, and heavy users (roughly one or more posts per day) respectively.
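The segments above can be sketched in a few lines of code. This is a minimal, hypothetical Python sketch: the post-count buckets follow the text, while the tenure thresholds (90 days, one year) and the member data are my own illustrative assumptions.

```python
from datetime import date

# Activity segment from posts in the previous month,
# using the buckets from the text: 0, 1-5, 6-29, 30+.
def activity_segment(posts_last_month: int) -> str:
    if posts_last_month == 0:
        return "lurker"
    if posts_last_month <= 5:
        return "casual participant"
    if posts_last_month <= 29:
        return "regular participant"
    return "heavy user"

# Tenure segment from date of registration.
# The 90-day and 1-year thresholds are illustrative assumptions.
def tenure_segment(joined: date, today: date) -> str:
    days = (today - joined).days
    if days < 90:
        return "newcomer"
    if days < 365:
        return "established"
    return "veteran"

today = date(2014, 6, 1)
member = {"joined": date(2012, 3, 14), "posts_last_month": 12}
print(activity_segment(member["posts_last_month"]))  # regular participant
print(tenure_segment(member["joined"], today))       # veteran
```

Once each member carries these two labels, your e-mail tool can send different messages to each combination (e.g. a re-engagement message to veteran lurkers, a welcome sequence to newcomers).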
Far better, however, to identify members by either the types of activities they have engaged in or look at the individual needs of members.
4) Type of activity. The type of activity members have participated in should also be considered. This gives you a baseline to identify the motivations of members. You can look to see if they participated in self-disclosure discussions when they joined, completed their profiles, attended key events etc…
5) Needs. The needs of members (power, achievement, affiliation) can also be used as a simple heuristic to segment groups. Using responses to a survey or simple task, you can divide members into 3 groups and cater your messages to appeal to them. Use an automated survey to automatically filter members into distinct groups.
Now you can craft specific messages to the specific needs and motivations of your community members. You can develop messages specifically designed to urge members to move on to the next stage of the community membership lifecycle.
Better still, you can create unique segments based upon members that are participating heavily and have clear alignment to one of the three needs highlighted above.
The most effective approach is to have several dozen segments, each carefully catering to the specific needs of members.
You should be able to fully automate the segmentation process based upon the activities of community members. You might even be able to highlight key forum categories (or keywords) that each member is more likely to participate in.
You can use these segments to craft specific messages to unique groups, ensure you're not overwhelming a community with messages, and conduct important activities (e.g. surveys).
Imagine how much easier it becomes to interact with a member if you can see their key needs, the motivation they need to advance to the next level, how long they've been a member, and how many contributions they have made to the community. You will be far more effective in every interaction you have with every member.
In the 10+ years I've been working in this sector, I've met about 5 community professionals who can design and implement a decent segmentation system. These people have the incredible power to increase the levels of activity in a community many times over.
This is the level we need to aspire towards. We can start at a basic level and get more advanced. You can learn how to do this for yourself (or sign up for the advanced module of our community management course).
We're happy to announce we're once-again accepting applications for CommunityGeek.
A year ago we quietly launched a community for many of the world's top community professionals.
Like many communities, we struggled at first. Today the community is thriving.
We have vibrant discussions, a growing number of excellent resources (created by our members), the latest news about community professionals, and a level of debate that goes far deeper than any gathering of community professionals.
If you think you need help from (or can give help to) your fellow community professionals, we would love to have you. You can see how we put our own advice into practice.
Sign up here: http://www.communitygeek.com.
Continuing from yesterday:
A big mistake organisations make is to hire an experienced community manager to create a community.
The skills of managing and founding a community are very different.
Creating a community requires persistence, persuasion, market research skills, technology expertise, a huge dose of graft and, most of all, strong existing relationships.
There aren't many people who have a track record of founding a community.
You may have noticed we advertised the same job position several times last year. If it was a community manager role, that would be easy to fill. It's far harder to find someone who we think can be a great founder of a community.
From the applications we received, around 1 in 50 community managers have ever founded a successful community (that still exists) and just 2 applicants (in total) could prove they've done it more than once.
If you have a track record of founding several communities, advertise that more broadly.
If you're looking to find someone to launch your community, it helps to have someone that's done it before.
Look at the history of most communities and you'll notice the founder plays a pivotal role.
They set the tone, establish the rules, invite the first members.
This article (pages 24 to 31) shows the specific steps the Green Brothers took to develop a powerful community culture.
Compare this with how organisations develop communities. The founder is almost insignificant.
The founder works within strict rules of the organisation. They have little freedom to do the activities the Green Brothers did to develop the culture. It's too risky, it deviates from the norm too much.
Yet it is precisely this deviation from the norm that carves out a unique culture.
By not allowing anything unusual or weird to happen, the community can't develop a unique culture and is unlikely to succeed. Organisations are strangling their own community efforts.
I think the current model is wrong.
Organisations find people to help them launch a community. Instead, they should find people they can help launch a community. They need to better emphasize the role and freedom of the founder. They need a degree of separation: they can own the platform, and they can keep the power to remove the community manager, but they must give the founder the freedom to invite their friends and create a unique culture.
They should find specifically the people that can launch a community and then provide them with all the resources they need to make the community succeed.
This probably won't happen. That's a shame.
Our approach to researching a target audience has evolved over the years.
It now works like this:
Step 1: The Interviews
We will interview 10 to 20 prospective members of the target audience. This number changes if we have more time, but it's roughly accurate.
Our goal is three-fold.
First, we want to develop useful categories for a bigger survey.
Second, we want to build relationships with individuals who might become the founding members.
Third, we want to identify specific words and phrases they use to incorporate them within the community. This makes the community feel authentic.
We want to know the following:
1) How long they have been involved in the topic. This influences how advanced the material and the type of discussions in the community should be.
2) What do they spend most of their time doing (within the topic). This influences what type of content and discussions appear. We can also use these aspects as potential triggers to visit that community.
3) How they became interested/involved in the topic. This usually reveals common symbol systems and wording we can use to get people initially involved in the community.
4) What they hate most about the topic. This influences the initial motivation/call to action for people to join the community. It can also be a pain point we can use in the community concept.
5) What are they most afraid about in the topic. This influences the same things as above.
6) What do they like most about the topic. Again, this is the same as above. We can align the messaging within this.
7) What do they hope to achieve within the topic (or where do they hope to be in the future). We can incorporate these aspirations into the community's purpose and messaging.
8) Who do they consider their peers. This influences who we target the community for. We can often highlight specific segments here. We also know if we can get 20% of their peers in one community, the rest will follow.
9) What existing communities they participate in. This highlights potential competitors and ensures we can develop a unique community concept.
10) Who do they most admire within the topic.
Based upon this, we can create specific answer categories for each survey question. You saw a version of this in our event survey here.
Step 2: The Survey
Answers from 20 people aren't very useful unless they're validated by the broader group.
Now we develop a survey using almost identical questions above but with answers that came up both in the interviews and our own research.
We use SurveyMonkey to randomize how the answers appear. We also force people to rank by priority. This forces people to genuinely think about which issues are most important.
After each question, we add a question box to ask if there is anything else not considered above. This highlights anything glaring we might have missed in the interviews.
In the survey we also ask for demographic information (age/location/profession). This helps us identify potential clusters we can target the community towards to get started.
We've broadly found it better not to incentivize completing the survey and instead simply ask if they would like to help us by taking a survey.
Finally, we would also like their e-mail address to contact them in the future. We usually offer a benefit for providing this (discount, become a first member, free products/services/ebooks etc…).
If they provide their e-mail address, we can contact them in the future about becoming one of the first members (and know specifically what to say to persuade them to join the community).
Step 3: Split Testing Motivational Appeals
If we use BJ Fogg's motivational appeals (pleasure/pain, hope/fear, social inclusion/rejection), we can test which of the appeals identified by community members is most effective.
Split your mailing list into three (or create 3 versions of your landing page using Optimizely), and use the top answers in your motivational appeals.
You should be able to track which attracts members to both join and participate. Our experience is social inclusion based appeals (using exclusivity) is most effective to get a community started, but this might vary by the topic.
This research process takes time but gives you the best results.
I support Patrick's view that members should be able to download their content.
I can't think of a single platform that allows this (feel free to correct me).
I was devastated when one of my earliest communities closed.
I lost several years of connections and contributions.
Every member should be able to download every contribution they have made to a community.
When a newcomer arrives at your community, what have they heard about the community so far?
This affects the messages they need to receive.
Have they heard about something specific happening within the community?
Have they heard about the community as a place where talented people in their sector go?
Have they heard about the community as a place that's fun to hang out in?
If they have heard the community is great and they should join, then converting into members is about reinforcing this perspective. You highlight current successes and lead members to the registration form.
If they have only vaguely heard about the community, then you need to educate them and build their excitement about the community before they hit the registration form.
If they have heard about a specific resource or event taking place within the community, then listing upcoming events on the non-registered landing page of that community matters.
You might want to create special non-member pages for those visiting from specific channels (e.g. search, twitter etc..)
It's important to uncover the mindset of members when they visit the community for the first time.
Broadly we see two types of newcomers: those already primed to join and those visiting out of curiosity. Converting the former requires reinforcement of existing messages. Converting the latter requires slightly more persuasive messages.
If you can survey newcomers randomly, that works fine. If not, look at the origins of traffic (or ask how they heard about the community) and make an educated guess.
You speak to a lot of the people in the sector and realize that everyone would benefit from a community.
The community can be a place where people exchange ideas and learn from one another.
You talk to members of the target audience about the idea. They tell you they love the idea and this is exactly what they want.
You create the platform, fill it with the content they want, and invite prospective members to join and participate.
Then things start to go wrong. A lot of people join, but not many participate.
You ask them what's happening. They tell you they're too busy. They don't have time right now, but will participate soon.
But it doesn't happen. You might prompt someone to respond to a discussion, and they respond, but it doesn't become a habit.
After a year it's clear it's not taking off and the community is cancelled.
The question is this…what did you do wrong?
This is a fairly common story. It feels like you did everything right, but the community still failed.
There wasn't one big thing you did wrong. It's more likely the sum of many mistakes.
In our experience it's usually a combination of the following:
1) Bad community concept. You created a community about the topic, but it didn't have a specific audience-within-the-audience target market. It lacked a goal/purpose.
2) Bad research. If you ask someone if a knowledge sharing community is a great idea, they'll say yes. Of course it is. That's not a neutral question. People will always try to make you happy. You have to see if they develop the idea themselves, find out when they will participate, what time they will commit to it etc…
3) Developing A Content site. By creating a content site, you're encouraging the audience to read, not participate.
4) Lack of real relationships and poor invitations. You didn't establish real, genuine, relationships beforehand and the invitations you used sounded a little too corporateish. The exact words and style of your invitation matter a lot.
5) Inauthentic discussions. The discussions you posted didn't sound like genuine, real, discussions. They sounded a little too formal, a little too scripted. They betrayed that you're desperate for discussions to take place.
6) Lack of community management drive. You're not driving the community enough. You're not there directly inviting people to join the community, nudging people to participate, hosting regular events/activities, or participating enough yourself.
7) No sense of community elements. You fail to bring in any sense of community elements into the community. Members don't visit to bond with one another. They only visit to get free information.
The list goes on.
The point is that you're not going to solve this with one big change. That's not how it works. You're only going to solve it by becoming incrementally better at a lot of different things. That takes time and it's harder to see the immediate results.
This Guardian article outlines how Radio 1 decide what music the show will play.
Given Radio 1's audience, they're essentially anointing the next music stars in the UK, possibly for the world.
The process used to select future stars is interesting.
It's part intuition and part data. They review their own thoughts on the work's quality, the current traction via plays/sales/fans/followers, and whether the artist can sell themselves if they do receive a push.
At our first community, UKTerrorist, we regularly pushed a talented video game player. It was like wrestling, we picked someone interesting and began frequently writing about them. This built up to an interview and main event status. It generated a lot of activity, ensured we always had fresh material, and encouraged others to contribute more to the community.
We repeated this tactic with a non-profit, advocacy, community a year ago. We picked those doing interesting things, checked their twitter followers, and began pushing them within the community if we liked what we saw.
You might not like it, but you too have the power to anoint future stars. You can find them by looking at those writing blogs, publishing books on Amazon, speaking at events, contributing interesting posts to the community etc…
If you're not comfortable in your own ability to do it, find others in the community that can.
If you manage a large, customer-service based community, calculating the ROI is a challenge.
The common approach is to measure the number of visitors, questions asked, questions answered, or the reduction in call volume. Each has its problems.
- Tracking the number of visitors. This is the worst metric, it includes every possible form of visitor regardless of quality, whether they asked a question, or whether the problem was resolved. It includes people searching for something else entirely.
- Tracking the # questions asked. This doesn’t track whether the question was answered. The same question might be asked repeatedly to get an answer. Each would add to the return (when it should reduce it).
- Tracking the # questions answered. This doesn’t track whether the problem was resolved. The answer might be wrong. A member might ask a question, receive an answer, and then call the customer service line to check. This becomes an extra expense.
- Tracking reduced # calls received. This doesn’t track community attribution. When WidgetCo releases a new product, a lot of people will have questions about it. Over time those questions get answered and fewer people need to ask them (call volume naturally declines for new products). It’s hard to attribute this to the community (even with strong correlation).
Using a survey to determine problem resolution.
We instead need a system that tracks a) if the problem was resolved b) if it was resolved satisfactorily, and c) if it led to a reduction in calls to the customer service team.
Google use a simple survey to ask if a visitor’s problem was resolved (collect e-mails to avoid duplicates). They then multiply the % whose problem was resolved by the % of visitors. This provides a call deflection number.
However, the response rate here is low (single-digits). Second, it’s prone to a non-response bias. Those that received a positive response are more likely to click the link to do the survey. You can’t generalize a small, self-selected, sample across the entire community.
Using quotas to generalize across the community.
We need to know the demographics of the audience (you can also do habits and psychographics if you have the resources).
Then determine the breakdown of the audience’s total composition. Most organizations have this information in their CRM system. If you don’t, use an incentivized poll to a large, random, sample of members.
This should tell you, for example, that your audience is 57% male, 43% female, with further age-related segments. By the end you should have 10+ segments, each a known percentage of the audience.
Now you can survey your web visitors (even those that don’t ask questions may be receiving help) to ask if their problem was resolved and use this quota system to ensure it reflects the overall breakdown of the community.
For example, 77% of responses might be from males aged 30 to 45. However, if this segment makes up only 27% of your overall audience, weight it so it accounts for just 27% of the result.
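The reweighting can be expressed as a short calculation: each segment contributes to the overall resolution rate in proportion to its audience share, not its share of survey responses. This is an illustrative Python sketch; the segment names, response counts, and resolution numbers are hypothetical, with only the 77%/27% shares drawn from the text.

```python
# Hypothetical survey data: per segment, its share of the real audience,
# the number of survey responses, and how many said their problem was resolved.
# The over-represented segment (77% of responses, 27% of audience) mirrors
# the example in the text.
segments = {
    "male 30-45":    {"audience_share": 0.27, "responses": 770, "resolved": 700},
    "everyone else": {"audience_share": 0.73, "responses": 230, "resolved": 150},
}

# Naive (unweighted) rate: over-represents whoever answered the survey most.
total_responses = sum(s["responses"] for s in segments.values())
naive_rate = sum(s["resolved"] for s in segments.values()) / total_responses

# Quota-weighted rate: each segment counts by its audience share,
# not by how many of its members happened to respond.
weighted_rate = sum(
    s["audience_share"] * (s["resolved"] / s["responses"])
    for s in segments.values()
)

print(f"naive: {naive_rate:.1%}, weighted: {weighted_rate:.1%}")
```

In this made-up example the naive rate (85%) overstates the true resolution rate (about 72%) because the segment most likely to respond also had the best experience, which is exactly the non-response bias described above.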
Did it lead to less calls to the customer-service line?
This doesn’t show whether the community resolved the problem as well as the customer service team would have. Nor does it show whether the member would have called customer support anyway (or just lived with the problem).
Therefore, we need to ask three important questions.
1) Was your problem resolved?
2) On a scale of 1 to 5, how satisfied are you with the resolution?*
3) If the problem was not resolved, would you have called customer support?
*Question two can be changed with any comparative customer service question (e.g. how happy are you with the customer service, are you likely to recommend etc…?)
Understanding total return
Now we can make a rough calculation of the return on the customer community.
Step 1) Determine the cost per call of traditional customer-service line.
Divide the cost of your customer service efforts by the number of calls to have a cost-per-call of your traditional customer service efforts. Let’s assume this is $1.75.
Step 2) Measure % of web visitors who didn’t call because of the community.
Multiply the % of satisfied customers (answer to survey question 1) by the % that would have called the customer service line. Let’s assume 87% had their problem resolved, but just 73% would have called customer service. This tells you that 64% of web visitors who would have called customer service had their problem resolved through the community instead.
Step 3) Determine the number of calls deflected because of the community.
Multiply the % of call reduction above by the total number of unique visitors to the community. If your community has received 740,000 visits, you can estimate 473,600 of them had their problem resolved through the community.
Step 4) Determine the value of calls deflected because of the community.
Multiply the figure above (473,600) by the cost per call in step 1 ($1.75). This tells you the total return in value terms ($828,800).
Step 5) Compare the satisfaction rate from each method.
If the customer service team scores an average of 4.2 out of 5 (or using any comparative method e.g. NPS, future purchase intentions) and the community satisfaction rate is 3.7, we need to divide 3.7 by 4.2. This gives us a figure of 88.1%. The community resolves problems 88.1% as well as the customer service line.
Step 6) Multiply the total return in $ value by the comparative satisfaction rate.
Now we multiply the value of the calls deflected by the comparative satisfaction rate. In this case: $828,800 by 88.1%. This gives our final figure of $730,172. We can then divide this by the investment in the community (platform, staff, overheads) to determine the ROI.
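The six steps above reduce to a short calculation. This Python sketch uses the figures from the worked example; the rounding (64% deflection, 88.1% comparative satisfaction) follows the article's own rounding.

```python
# Step 1: cost per call of the traditional customer-service line.
cost_per_call = 1.75

# Step 2: % of web visitors who didn't call because of the community.
resolved = 0.87           # had their problem resolved via the community
would_have_called = 0.73  # would otherwise have called customer service
deflection_rate = round(resolved * would_have_called, 2)  # -> 0.64

# Step 3: number of calls deflected.
visitors = 740_000
calls_deflected = visitors * deflection_rate              # -> 473,600

# Step 4: value of the calls deflected.
gross_return = calls_deflected * cost_per_call            # -> $828,800

# Step 5: comparative satisfaction rate (community vs. phone support).
community_sat, phone_sat = 3.7, 4.2
sat_ratio = round(community_sat / phone_sat, 3)           # -> 0.881

# Step 6: discount the return by the comparative satisfaction rate.
net_return = gross_return * sat_ratio                     # -> $730,172.80
print(f"${net_return:,.2f}")
```

Dividing `net_return` by the total community investment (platform, staff, overheads) then gives the ROI percentage discussed below.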
To justify the community, we want a % that covers the costs (a positive return), beats putting the money in a bank (about 5%), and beats the opportunity cost (another 5% to 10%). This usually requires something in the region of 15% to 25%.
A few notes
1) This isn’t a perfect system. It’s a rough idea. It relies upon samples to reflect the overall community. Developing the right quotas and collecting enough data for significance is the hard part.
2) It doesn’t track other benefits. It doesn’t track possible improved SEO, increased lead conversion, shorter sales cycles, improved team morale/beneficial internal changes, product feedback etc…
3) It doesn’t track ‘real’ reduction. It doesn’t actually track if the customer service expense was reduced as a result of the community. It tracks the theoretical reduction. A manager might be reluctant to let staff go.
4) ROI increases over time. A community incurs significant initial costs which negatively skew the ROI until the return has been established.
The goal of this post has been to highlight a (simplified) process for measuring the ROI of customer service communities. At present, too many biases (notably non-response/selection biases) are creeping into measurement. We need most community professionals trained in using data to avoid these biases. I hope this post helps.
If you want to discuss advanced topics in more detail, sign up to www.communitygeek.com.
Discourse and Moot are both simple, clean, lightweight, well-integrated upgrades on traditional forum technology.
Both prove that organizations don't have to spend their community budget on their platform.
Discourse is open-source, developing fast, and technically superior. Moot is embeddable, flexible and easier to use.
They might just be what organizations struggling to pay for community platforms need right now.