Every industry will be affected by algorithms and algorithmic bias. But we could reframe the question: do algorithms introduce bias, or do they introduce transparency?
An interesting paper, Assessing Algorithmic Biases for Musical Version Identification, illustrates this point. In the music industry, version identification (VI) systems are used to detect different renditions of a musical composition when determining the royalties due. Three stakeholders can claim royalties: the original artist, the artist who performs a version of the composition, and the composer. There are two types of VI systems: learning-based and rule-based. The paper proposes a framework for quantifying performance disparities across 5 systems and 6 relevant side attributes: gender, popularity, country, language, year, and prevalence. By categorizing the recordings in the dataset using these attributes and stakeholders, the paper analyzes whether the considered VI systems show any implicit biases.
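The core of such an audit is simple to sketch: group the evaluated recordings by a side attribute, compute a per-group identification metric, and compare the groups. The snippet below is a minimal illustrative sketch of that idea, not the authors' actual code; the function name, the data layout, and the toy evaluation log are all assumptions made for the example.

```python
# Hypothetical sketch of the disparity-audit idea: group VI evaluation
# results by a side attribute and compare per-group accuracy.
# All names and data here are illustrative, not from the paper's code.
from collections import defaultdict

def per_group_accuracy(results, attribute):
    """results: list of dicts, each with the side-attribute value and a
    boolean 'hit' flag (did the VI system identify the version correctly)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in results:
        group = r[attribute]
        totals[group] += 1
        hits[group] += int(r["hit"])
    return {g: hits[g] / totals[g] for g in totals}

# Synthetic toy evaluation log with a "gender" side attribute.
results = [
    {"gender": "female", "hit": True},
    {"gender": "female", "hit": True},
    {"gender": "male", "hit": True},
    {"gender": "male", "hit": False},
]
acc = per_group_accuracy(results, "gender")
disparity = max(acc.values()) - min(acc.values())
print(acc, disparity)  # a large gap flags a potential group disparity
```

The paper's framework is considerably richer (multiple systems, metrics, and statistical tests across 6 attributes), but the grouping-and-comparing step above is the basic mechanism behind each of its experiments.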
- The authors find signs of disparities in identification performance for most of the groups they include in their analyses.
- They also find that learning and rule-based systems behave differently for some attributes, which suggests an additional dimension to consider along with accuracy and scalability when evaluating VI systems.
- Overall, they observe that the learning-based systems work better for underrepresented groups.
- For the popularity experiments, they see that all the systems tend to perform better for popular artists and composers.
- The results for the language experiments show disparities only for the systems using melody features, with recordings in languages other than English having better results than recordings in English.
- They observe that the learning-based systems work better for female artists/composers compared to males. While this implies that female composers are likely to be more rewarded by these systems, in contrast, female artists that perform a version of an existing composition (i.e., artists of the queries) are likely to pay more royalties.
- The authors observe that both the learning- and the rule-based systems show performance disparities for certain groups. Specifically, the learning-based systems show disparities for 54.4% of the cases while this ratio is only 30.4% for the rule-based system.
- The authors present various hypotheses for their findings.
- They discuss the limitations of their approach, including the limits of human annotation.
- They share their findings and hypotheses with the industry in the hope of starting a discussion.
- The authors have also shared their dataset.
The findings could be interpreted in a few ways:
- In terms of accuracy, learning-based systems are at a disadvantage; however, because they scale well, they also have advantages.
- For the gender attribute, the study finds that “learning-based systems work better for female artists/composers compared to males. While this implies that female composers are likely to be more rewarded by these systems, in contrast, female artists that perform a version of an existing composition are likely to pay more royalties.” They add: “Therefore, interpreting the fairness outcomes should be considered independently for all the involved parties, as a result of having a multi-sided structure.”
- They conclude that “As the result of 115 experiments in total, we have seen that VI systems may indeed perform differently on certain groups. Their behavior may vary depending on whether they are learning or rule-based, or whether they use melody- or chroma-based input features, but potentially other design choices could have an impact.”
The question is: is this a case of algorithmic bias, or a case of transparency arising from implementing algorithms? I would argue that, in this case, the discussion has created more transparency in an industry that is notoriously non-transparent. In other words, we should not jump to the conclusion that the algorithm is biased, because the underlying process itself could be non-transparent. Moreover, the question of bias can itself be subjective, as the example of female artists vs. female composers shows. In this sense, the implementation of algorithms (when they are explainable) could benefit many industries as a driver for transparency.
Image source: Willie Nelson museum, Pixabay