Why you should avoid vanity metrics and measure what matters

Penguins. Internet Archive Book Image, CC BY-SA 4.0.

Metrics. Measures. How high? How low? How fast? How slow? Ever since the dawn of humankind, we've had an innate and insatiable desire to measure and compare. We started with the Egyptian cubit and the Mediterranean traders' grain in the 3rd millennium BC. Today we have clicks per second, likes, app downloads, stars, and a zillion other ways to measure what we do. Companies spend ridiculously large sums of cash developing fancy KPI dashboards and chasing their needles on a daily—and sometimes hourly—basis, in the quest for the ultimate, ongoing measure of their health. Is it worth it? The answer to this question is as simple as it is annoying and not particularly helpful: It Depends.

In this column, I'm going to take you on a journey into the treacherous yet rewarding world of community metrics. I'll explore existing research, available tools, case studies (good and bad), and ultimately use what I've learned to develop a playbook for you to reference when creating (or improving) your own community metrics. You can then use these metrics to know—or at least approximate—where your community is today, where it's headed, and (spoiler alert!), what to do when that direction isn't quite what you need to reach your community's goals.

Vanity metrics

As a former community manager of a large open source community, I'm well aware of the addiction to and dangers of metrics that sound impressive, but ultimately mean very little. Approximately once a year when analyst firms would send out questionnaires to vendors of commercial open source, I would get this email:

To: James Falkner awesome_community_manager@project

From: CxO That Is Super Busy And Doesn't Have Time To Deal With This

Subject: updated metrics?

Hey James, what are the latest stats on our community? I'm filling out this analyst request and need:

- Downloads
- Registrations
- Number of developers
- Number of commits
- Number of forum posts

While suppressing my anger and desired response, I would dutifully generate numbers that roughly corresponded to the above. For example, here was the graph I generated for registrations:

Metrics graph

In February 2011 we were "dinged" for what was clearly a slowdown in registrations. Something was very wrong, they said. The ship was clearly on fire, they said, and the community manager was at the helm. Not surprisingly, my LinkedIn activity picked up quite a bit that month. So what happened? Funny story—it turns out, in February we enabled a CAPTCHA on our registration form and started blocking spammers rather effectively, drastically depressing the new registration count. A few months later, after the analyst report, spammers figured out a way to get around the CAPTCHA, and things returned to "normal".
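To illustrate what was going on (with invented numbers), here's a minimal sketch: the raw registration count lumps genuine sign-ups together with spam accounts, so blocking spammers made a healthy community look like it was slowing down. Once you subtract an estimate of the spam, the "slowdown" disappears:

```python
# Hypothetical monthly registration counts; the CAPTCHA goes live in Feb.
raw = {"Dec": 900, "Jan": 950, "Feb": 400, "Mar": 420}

# Estimated spam sign-ups per month (near zero once the CAPTCHA works).
spam_estimate = {"Dec": 520, "Jan": 560, "Feb": 10, "Mar": 15}

# Genuine registrations: the raw count minus estimated spam.
genuine = {month: raw[month] - spam_estimate[month] for month in raw}

print(genuine)  # {'Dec': 380, 'Jan': 390, 'Feb': 390, 'Mar': 405}
```

The raw series drops by more than half in February; the spam-adjusted series is flat-to-growing. Same community, same month, opposite story.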

Photo of road sign

Photo by MikeGogulski on Wikipedia, CC BY 2.5

These and other "first order" metrics belong to a class of metrics called vanity metrics, which is an awesome and accurate term describing their superficiality and meaninglessness. Relying on them as a measure of community health is tempting, but taken out of context, they easily can be misleading, gamed/rigged, or even selectively chosen to fit a desired narrative. This was exemplified in a recent episode of HBO’s Silicon Valley, Daily Active Users. In the show, while the company celebrated the 500,000th download of its software platform, the founders realized the more accurate measure of success—Daily Active Users—was shockingly low, so they hired a click farm to game the system and impress investors. The plan backfired and they ultimately had to resolve the underlying usability issues, teaching them (and us) a valuable lesson in vanity metrics.
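The downloads-versus-DAU gap is easy to see in a toy sketch (all names and numbers below are hypothetical): a download counter is cumulative and only ever grows, while daily active users is a count of *distinct* users on a single day, so it can be tiny even when the headline number looks impressive:

```python
from datetime import date

# Hypothetical usage log: (user_id, day) pairs recorded on each app launch.
events = [
    ("alice", date(2016, 5, 1)),
    ("alice", date(2016, 5, 1)),  # repeat launches the same day count once
    ("bob",   date(2016, 5, 1)),
    ("alice", date(2016, 5, 2)),
]

total_downloads = 500_000  # the vanity number: cumulative, never decreases

def daily_active_users(events, day):
    """Count distinct users who launched the app on a given day."""
    return len({user for user, d in events if d == day})

print(total_downloads)                               # 500000
print(daily_active_users(events, date(2016, 5, 1)))  # 2
```

Half a million downloads, two people actually using the thing: exactly the mismatch the episode hinged on.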

Ignorance is bliss

So what should you do with vanity metrics? It's OK if you want to collect, analyze, and attempt to draw conclusions from them. I get it, I really do. Just be aware of what you're doing. (I will explore various characteristics of metrics, vanity or otherwise, in future columns.) Overcoming an addiction to vanity metrics is possible, but it's much more difficult than the initial effort required to produce them:

If that's all you have, and you have no other choice but to report them to a demanding exec or an overdue analyst questionnaire, then go for it—but consider using the opportunity to explain why having more accurate metrics that align with the business is better in the long run. And then get to work coming up with better metrics that more accurately tell the story of your open source community and help guide your decision making in the future. See you next month!

James Falkner

Technology evangelist, teacher, learner, and author, dedicated to open source and open computing. I work at Red Hat as a technical evangelist for Red Hat's portfolio of open source products. I love what we do, enjoy learning from others, and occasionally teach at conferences.

5 Comments

The irony is that opensource.com publishes the most vanity metrics at the highest frequency I know of. Every week I'm sent stats for: page views, page views from search, monthly page views (to date), comments, Facebook, Twitter. Metrics like those -- measuring views -- can be aligned with the business, if the business is making money from Google AdWords. But I don't think that's Red Hat's real business. So why do you care so much? And do you care more about page views than, say, finding articles of quality? Note that among the metrics, time on site isn't one of them -- which would be a rough measure of quality, if people are spending time to read the articles... Back to this article -- what metrics would be better aligned with Red Hat's business? (Perhaps they _are_ collected, but not sent to the mailing list.)

Hi Brendan,
Thanks for your feedback. I assume you're talking about the writers email list? In any case, you make a valid point, and one that the Opensource.com editorial team discusses often. In fact, our team recently watched a video from an internal training James gave on community metrics, which is why we reached out to him and asked if he'd write up a series for us. We're interested in improving how we gather and share metrics, and readers can learn along with us. (Thanks, James!)

I'm a huge fan of qualitative analysis, which is much harder to report when it comes to the health of communities and publications. Quantitative analysis shows us which articles readers clicked on, but doesn't tell the whole story. Did they read it because they like that topic/author/headline? Was the article a great resource other publications were able to reference? Did it create conversation on our site (comments), or did it lead to an interesting debate on another site/community/list?

Did an article that had 400 views in a month inspire a company/school/organization to adopt an open source tool? If so, I'd call that a huge success story...but we might never hear about it, and we certainly can't get reports that collect this kind of data. Did an article that had a few hundred page views bring much-needed attention to an open source project/community/event that otherwise would have flown below the radar? Again, a successful article (in my humblest of opinions).

Quantitative analysis is easier to get and report, but often qualitative is much more useful. (In fact, I brought this topic up in a journalist round table I participated in earlier this week: https://youtu.be/1sTHTdLaSjM)

If you have other ideas for how we can improve our reporting for writers and readers, feel free to send suggestions to the editorial team at open@opensource.com or to me directly at rikki@opensource.com.

In reply to by Brendan Gregg

Thanks Rikki, I think I get it now -- those metrics (page views, etc.) are intended for the _writers_, hence the writers email list. I'm guessing there's a different set of metrics for the business, ones that are aligned with the business. I was conflating the two. It might be interesting to discuss what the business metrics would be, as an example of metrics that matter to back up the article, but I'm not expecting that, since I presume it would be confidential.

In reply to by Rikki Endsley

"What gets measured, gets improved." John Gall, Systemantics.

In other words, choose your metrics wisely.

Time on site, bounce rate, repeat visitors, "fallen angels" (those who used to visit and have disappeared), engagement tracking (are your readers moving from "wallflower" to "core reader" to "engaging commentator" to "author"? How many moved up? How many moved down?). These are a bit closer to the goals of opensource.com, which is to "create a connection point for conversations about the broader impact that open source can have—and is having—even beyond the software world". Ideally you could trace open source adoption back to patient 0 -- the article that someone from the organization read and that inspired them -- but that's incredibly difficult (and usually approximated with regular surveys of readers). So you try to get as close to ideal as possible, with the resources you have. It's more difficult than just looking at Google Analytics, for sure... but in my opinion, finding a better set of metrics lets you escape the ad trap and forces you to focus on achieving your goals. I'm getting ahead of myself now :)
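A rough sketch of that tier-transition tracking, with hypothetical reader names and tiers invented purely for illustration: rank the engagement tiers, compare each reader's tier between two periods, and count who moved up, down, or stayed put.

```python
# Engagement tiers, ordered from least to most engaged.
TIERS = ["wallflower", "core reader", "engaging commentator", "author"]
RANK = {tier: i for i, tier in enumerate(TIERS)}

# Hypothetical per-reader tiers in two consecutive periods.
last_quarter = {"ann": "wallflower", "ben": "core reader", "cy": "author"}
this_quarter = {"ann": "core reader", "ben": "wallflower", "cy": "author"}

moved_up = moved_down = unchanged = 0
for reader in last_quarter.keys() & this_quarter.keys():
    delta = RANK[this_quarter[reader]] - RANK[last_quarter[reader]]
    if delta > 0:
        moved_up += 1
    elif delta < 0:
        moved_down += 1
    else:
        unchanged += 1

print(moved_up, moved_down, unchanged)  # 1 1 1
```

Net movement across tiers tells you something a raw page-view total never will: whether readers are deepening their involvement or drifting away.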

Creative Commons LicenseThis work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.