Denelezh 2.0, a transitional version

At the beginning of April, a new version of Denelezh, a tool to explore the gender gap in the content of Wikidata and Wikimedia projects, was released. This post explains what led to this new version, including the choice of a new methodology to generate the metrics, and what you can expect in future releases. Finally, it gives a technical overview of the tool.

What’s new

A 4th dimension

Since its inception, Denelezh has provided multidimensional analysis. You can explore the gender gap in Wikidata along several dimensions: a human's year of birth, country of citizenship, and occupation. These dimensions can be combined, for example to get metrics on the gender gap for French politicians born in 1901.

The most visible improvement in this new version of Denelezh is the addition of a fourth dimension: the Wikimedia project. All projects that have at least one page about a human according to Wikidata are included: not only the English Wikipedia, but also the young Atikamekw Wikipedia, Wikimedia Commons, Wikispecies, the Polish Wikiquote, …
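To make the combination of dimensions concrete, here is a minimal sketch of how a result slice could be filtered across all four dimensions. The row layout, field names, and counts below are invented for illustration; they are not Denelezh's actual schema.

```python
# Hypothetical pre-aggregated rows: one count per combination of
# (gender, year of birth, country, occupation, project).
rows = [
    ("female", 1901, "France", "politician", "frwiki", 12),
    ("male",   1901, "France", "politician", "frwiki", 88),
]

def slice_counts(rows, **filters):
    """Keep the rows matching every given dimension, then sum by gender."""
    fields = ("gender", "year", "country", "occupation", "project")
    totals = {}
    for row in rows:
        record = dict(zip(fields, row[:5]))
        if all(record[key] == value for key, value in filters.items()):
            totals[record["gender"]] = totals.get(record["gender"], 0) + row[5]
    return totals

# Gender gap for French politicians born in 1901, on the French Wikipedia:
print(slice_counts(rows, year=1901, country="France",
                   occupation="politician", project="frwiki"))
```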

Data is still extracted from Wikidata using its weekly dump. Thus, you can go back in time and observe the evolution of the metrics you are interested in. For example, the French Wikipedia had 16.0 % of its biographies about women in January 2017, 16.3 % in July 2017, 16.6 % in January 2018, and is now, in April 2018, at 16.8 %. This seems encouraging but, in the meantime, 33,107 biographies about men were added to the French Wikipedia and only 11,202 about women.
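The arithmetic behind that caveat is easy to check: women account for only about a quarter of the biographies added over those fifteen months, which is why the overall share climbs so slowly even though it is above the current 16.8 % average.

```python
# Biographies added to the French Wikipedia between January 2017
# and April 2018, from the figures above:
men_added = 33_107
women_added = 11_202

share_of_new = women_added / (men_added + women_added)
print(f"{share_of_new:.1%}")  # ~25.3% of the new biographies are about women
```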

A new methodology

Although it is less visible, the most important improvement in this version is the new methodology used to generate the statistics. The idea is to generalize the statistics produced by the first version of the tool.

In the previous version, only around 50 % of the humans in Wikidata were kept, mainly in the hope that restricting the analysis to humans with all studied dimensions would improve the quality of the metrics provided by the tool. The problem is that this hypothesis was never confirmed (nor contradicted). Now, all of the available data is used.

The tool no longer tries to provide statistics about biases by introducing new biases of its own 🙂
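As a rough illustration of the change, here is a minimal sketch, assuming each human is a dictionary with optional dimension keys: instead of discarding humans with a missing dimension, every record is kept and unknown values fall into a None bucket.

```python
from collections import Counter

def aggregate(humans):
    """Count humans per combination of dimensions.

    A missing dimension becomes the None bucket instead of excluding
    the whole record, so totals now cover all humans in Wikidata."""
    counts = Counter()
    for human in humans:
        key = (
            human.get("gender"),
            human.get("year_of_birth"),
            human.get("country"),
            human.get("occupation"),
        )
        counts[key] += 1
    return counts

# A human without a known year of birth still contributes to the totals:
print(aggregate([{"gender": "female", "country": "France"}]))
```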

Other improvements include:

Future

Main features

Even if they still need to be clearly defined, the main new features will be:

The next version will also mark the transformation of Denelezh into a more general tool, as explained in the next section.

Data quality

I have already worked on data quality in Wikidata, for example by cleaning BnF IDs (in French) or by contributing data about the members of the French Parliament with Dicare (in French). In the latter case, a dedicated dashboard (in French) provides statistics on the data held by Wikidata about members of the French National Assembly, legislature by legislature, along with insights on what needs to be improved.

The idea is to turn Denelezh into a general dashboard that helps Wikimedians contribute content about humans, with not only gender-gap data but also other metrics, like missing properties (the number of people without a gender, without a date of birth, …).
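For instance, the missing-property counts could be derived while scanning the dump. The sketch below is only an assumption about how such a metric might be computed; the property identifiers themselves are the real ones (P21 = sex or gender, P569 = date of birth, P27 = country of citizenship, P106 = occupation).

```python
def missing_property_counts(entities, props=("P21", "P569", "P27", "P106")):
    """Among humans (instance of (P31) human (Q5)), count how many
    lack each studied property."""
    missing = {p: 0 for p in props}
    humans = 0
    for entity in entities:
        claims = entity.get("claims", {})
        is_human = False
        for claim in claims.get("P31", []):
            value = claim.get("mainsnak", {}).get("datavalue", {}).get("value")
            if isinstance(value, dict) and value.get("id") == "Q5":
                is_human = True
                break
        if not is_human:
            continue
        humans += 1
        for p in props:
            if p not in claims:
                missing[p] += 1
    return humans, missing
```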

Usability

Usability is an important topic that needs attention. For example, the form needs to be more understandable and to have dedicated documentation. A lot of little things could drastically improve the tool: links to Wikidata items, links to the Wikidata Query Service for live results, exports in CSV format… Finally, Denelezh needs internationalization: it is quite ironic that an application about gaps is only available in English!
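Some of these little things are cheap to build. As an illustration only (the column names below are invented, not Denelezh's real schema), a CSV export with a direct link back to each Wikidata item could look like this:

```python
import csv

# Illustrative row; Q7259 is Ada Lovelace on Wikidata.
rows = [{"item": "Q7259", "label": "Ada Lovelace", "gender": "female"}]

with open("denelezh-export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["item", "label", "gender", "link"])
    writer.writeheader()
    for row in rows:
        # A link back to the item turns every row into an entry point
        # for fixing the data on Wikidata itself.
        row["link"] = f"https://www.wikidata.org/wiki/{row['item']}"
        writer.writerow(row)
```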

Technical overview

Architecture

The tool is still divided into three parts:

To keep results reproducible, the Wikidata Query Service is not used anymore (in the previous version, it was only used to retrieve labels).
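Dropping the Query Service means labels have to come from the dump itself. Here is a minimal sketch of that approach, assuming the standard Wikidata JSON dump layout (a single JSON array with one entity per line):

```python
import gzip
import json

def iter_entities(path):
    """Stream entities from a Wikidata JSON dump, one per line."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")
            if line and line not in ("[", "]"):
                yield json.loads(line)

def label(entity, lang="en"):
    """Read a label straight from the dump: with the same dump as
    input, the output is always the same, hence reproducible."""
    return entity.get("labels", {}).get(lang, {}).get("value", entity["id"])
```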

Some metrics

Denelezh is installed on a dedicated server with an i5-3570S CPU, 16 GB of RAM, and a slow hard disk, running Debian 8 (Jessie) as the operating system, nginx as the web server, and MySQL 5.7 as the relational database. The processing of the most recent dump (2018-04-09) takes around 11 hours:

From this dump, 29,338,817 sets with at least one human were generated. The corresponding MySQL data file is about 2.7 GB. Data from each dump is stored in a separate MySQL partition to improve performance and to ease maintenance.
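The per-dump partitioning can be sketched as follows. The table and column names are hypothetical, not Denelezh's actual schema; the point is that each weekly dump lives in its own LIST partition, so loading or dropping a dump never rewrites the others.

```python
import mysql.connector  # assumes the mysql-connector-python package

cnx = mysql.connector.connect(user="denelezh", database="denelezh")
cur = cnx.cursor()

# One partition per dump date (MySQL 5.7 LIST COLUMNS partitioning).
cur.execute("""
    CREATE TABLE IF NOT EXISTS human_counts (
        dump_date DATE NOT NULL,
        gender    VARCHAR(32) NULL,
        count     INT UNSIGNED NOT NULL
    )
    PARTITION BY LIST COLUMNS (dump_date)
    (PARTITION p20180409 VALUES IN ('2018-04-09'))
""")

# A new weekly dump only needs a new partition...
cur.execute("ALTER TABLE human_counts"
            " ADD PARTITION (PARTITION p20180416 VALUES IN ('2018-04-16'))")
# ...and removing an old one is a cheap metadata operation.
cur.execute("ALTER TABLE human_counts DROP PARTITION p20180409")
cnx.commit()
```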

Feedback

Feel free to send feedback by email (envel -at- lehir.net) or on my Wikidata talk page.