Squashing inclusivity bugs in open source software

GenderMag methodology identifies gender bias in software tools to help designers eliminate it.

When people talk about diversity and inclusion in open source, the discussion is usually about how to improve a project culture's inclusivity. But can the software itself be gender biased? Our research says it can. So, how do you know if your software is biased? And, if it is, how can you make it more inclusive?

The GenderMag method is a way to identify gender-inclusiveness problems in your software. It comes as a "kit" that is freely available for download at GenderMag.org.

The method was developed by Oregon State University distinguished professor Margaret Burnett, whose internationally recognized work with students and collaborators has shown gender differences in how people problem-solve with software—from people working with Excel formulas to professional programmers.

She was inspired to design GenderMag by a software product manager who asked her for help with his company's application for medical practitioners to program medical devices for patients' needs. His customer base was mostly women, and unfortunately, many women did not like the software. With an all-male development team, the manager was at a loss for what to do.

In this article, I will share reasons for gender biases in software and then describe what these biases mean for open source tools.

How does gender bias sneak into software?

Individual differences in how people problem-solve and use software features often cluster by gender; that is, certain problem-solving styles are more favored by men than by women (and vice versa). Software tools often support the developers' own preferred problem-solving styles, so when tools are built by male-dominated teams, gender bias can creep in inadvertently.

Research over the past 10 years across numerous populations has identified the following five problem-solving facets that impact how individuals use software:

  1. Motivations for using the software
  2. Style of processing information
  3. Computer self-efficacy
  4. Attitudes toward technological risks
  5. Preferred styles for learning technology
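To make these facets concrete, here is a minimal sketch, in Python, of how a persona's facet values might be represented. The class and the example values are hypothetical illustrations, not part of the GenderMag kit, which defines its personas in prose:

```python
from dataclasses import dataclass

# Hypothetical sketch: the GenderMag kit defines personas in prose, not code.
# Field names mirror the five facets listed above; values are illustrative.
@dataclass
class Persona:
    name: str
    motivation: str               # why the persona uses the software
    information_processing: str   # "comprehensive" vs. "selective"
    computer_self_efficacy: str   # confidence in using unfamiliar technology
    risk_attitude: str            # tolerance for technological risk
    learning_style: str           # "by process" vs. "by tinkering"

abby = Persona(
    name="Abby Jones",
    motivation="uses technology to accomplish her tasks, not for its own sake",
    information_processing="comprehensive",
    computer_self_efficacy="lower than that of her peers",
    risk_attitude="risk-averse",
    learning_style="by process",
)
```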

The GenderMag method has identified inclusivity issues in real-world software teams. As the following chart shows, 17 different software teams across different domains have found gender biases in their own software when using GenderMag.

Percentage of software features with embedded gender biases.

How does GenderMag work?

The GenderMag method consists of a gender-specific cognitive walkthrough with a set of personas. Each persona represents a subset of a system's target users as characterized by the five problem-solving facets listed above. Tool designers perform the GenderMag walkthrough to identify potential usability issues for users who are new to a program or feature.

In a GenderMag walkthrough, tool designers answer three questions through the lens of a specific persona's problem-solving facets—one question about each subgoal in a detailed use case and two questions about each interface action.

Abby Jones persona

To explain, let's look at a walkthrough using the Abby Jones persona:

Subgoal Q: Will Abby Jones have formed this subgoal as a step toward her overall goal? (Yes/no/maybe; why; which facets did you use)

Action Q1: Will Abby Jones know what to do in the user interface at this step? (Yes/no/maybe; why; which facets did you use)

Action Q2: If Abby Jones does the right thing, will she know she did the right thing and that she is making progress toward her goal? (Yes/no/maybe; why; which facets did you use)

If your answer to any of these questions is no or maybe, and that answer is tied to one of the five facets, you may have found an inclusivity bug.
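As a sketch of how a team might record its answers and apply that rule, here is a small, hypothetical Python example (the official tooling for this is the GenderMag Recorder's Assistant, mentioned at the end of this article):

```python
from dataclasses import dataclass, field

# The five problem-solving facets from earlier in the article.
FACETS = {"motivation", "information_processing", "computer_self_efficacy",
          "risk_attitude", "learning_style"}

@dataclass
class WalkthroughAnswer:
    question: str     # "Subgoal Q", "Action Q1", or "Action Q2"
    answer: str       # "yes", "no", or "maybe"
    why: str          # the team's rationale
    facets_used: set = field(default_factory=set)

    def is_inclusivity_bug(self) -> bool:
        # A "no" or "maybe" answer that rests on one of the five facets
        # signals a potential inclusivity bug.
        return self.answer in {"no", "maybe"} and bool(self.facets_used & FACETS)

# Example: a "maybe" tied to self-efficacy flags a potential bug.
a = WalkthroughAnswer(
    question="Action Q1",
    answer="maybe",
    why="The interface assumes prior Git knowledge",
    facets_used={"computer_self_efficacy"},
)
print(a.is_inclusivity_bug())  # True
```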

Where are the biases in open source?

Our research shows that open source software would benefit from considering these individual differences in problem-solving styles in software design, as they might be contributing to open source communities' low diversity rates. In a recent field study, five open source teams used the GenderMag method to analyze open source tools in a code-hosting site, an issue tracker, and project documentation.

Using the GenderMag cognitive walkthrough, the open source teams identified gender bias in more than 70% of the tool issues they uncovered.

For example, one frequent problem showing gender bias is the fragmented way issues and their associated information are recorded in GitHub. The teams' analysis revealed that information fragmentation would disproportionately affect individuals with a comprehensive information processing style (i.e., getting a good understanding of the problem by gathering the pertinent information about it before proceeding with a solution). This fragmentation problem has gender bias because comprehensive information processing is statistically more prevalent among women than men. Even so, solving the fragmentation problem would help everyone who prefers comprehensive information processing, regardless of their gender.

Another problem the teams found was related to individual differences in learning style. When information was spread across the project site, with many actions for a newcomer to consider (e.g., clone, fork, different pull request options, finding issues), the teams showed that newcomers who like to learn by tinkering were likely to become disoriented. This problem also has a gender bias, but it disproportionately affects men, because learning by tinkering is statistically more prevalent among men than women. Here again, although this problem affects one gender more than others, solving it would help everyone who prefers to learn by tinkering.

In these examples, the tools and technology were biased against people with problem-solving styles favored by women in one case, and against styles favored by men in the other case. In total, however, most of the technology-embedded problems the open source software teams in our study found were biased against problem-solving styles favored by women.

A subsequent study of newcomers' experiences with open source showed that these teams' findings were correct. The gender biases identified by the open source teams in our previous study matched the problems the newcomers in the second study reported in their diaries. Over the course of several months, these newcomers recorded the problems they faced with the tools and technology as they worked toward making their first contribution to an open source project. The newcomers' diaries showed statistically significant gender differences in how their problem-solving facets interacted with the barriers they encountered when they tried to participate.

What can you do?

You can help by using the freely available GenderMag method to find and then fix inclusivity bugs in the software you're building. You can also contribute to the GenderMag Recorder's Assistant, an emerging open source tool that aims to make the GenderMag process easier. If you're interested in partnering on other ways to help address gender biases in software, please contact us via the project website. 


Anita Sarma will present Is the Software Itself Gender-Biased? OSS Tools and Gender Inclusivity at the Open Source Summit North America conference, August 29-31 in Vancouver, British Columbia.

Anita Sarma is an Associate Professor at Oregon State University. Before that, she was an Assistant Professor at the University of Nebraska-Lincoln; a postdoctoral scholar at Carnegie Mellon University; and a doctoral student at the University of California, Irvine. Throughout this journey, her passion has been helping humans make better software and work together.


This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.