Discrimination keeps showing up in real life. Let’s all face it: it’s there. Organizations such as the Algorithmic Justice League have documented it in artificial intelligence systems and are on a mission to stop it. Meanwhile, it’s still here, and it affects people of color worldwide, as shown in the Netflix documentary Coded Bias.
Algorithmic discrimination is said to be affecting authors too, specifically authors of color on YouTube’s AuthorTube. Across social media, Black creators must work harder, much harder, to gain profitability, even when their content is stellar. Why is that?
But first, let’s understand that AuthorTube is fairly new. Its main topic is authors and books, having branched off from BookTube. As with everything new, it takes time to grow, catch on, and move. For many AuthorTubers, however, it simply isn’t catching on, and if you scan the crowd, it becomes clear that Black AuthorTubers, and even BookTubers, are hit the hardest.
According to AuthorTuber FaaTima Rose:
“I don’t really want to go into it too much, but the algorithm doesn’t treat everyone equally. No one needs to make up being oppressed. No one needs to make up a ones and zeros algorithm treating them differently. No one needs to make up. No one needs to make that up. That’s not a lie that needs to be told. That doesn’t benefit anybody. So when creators of color specifically Black tiktokers feel like they need to go on strike because they’re not being treated the same, yeah I overstand, and that’s just the truth.”
Another perspective comes from Francina Simone, a BookTuber, who believes that some of the issue lies in greatness being equated with popularity. Could this be the algorithm issue?
“How many times have we, as a people, as individuals, confused greatness with popularity? I honestly feel like that is the problem with BookTube today, confusing greatness with popularity, and because that’s happening so often and so much who actually like reading great books, despite the genre, despite content. We just like a great story, we’re starting to nit pick to the point where we’re asking for things that we don’t necessarily need. I mean, be honest, do you really need to read a story about someone in your exact situation?”
Could it be that both popularity winning out over good content and race are playing major roles in the Writing/Author/Book-Tube algorithm issues we are seeing today across the board?
Could it really be that the algorithm is stuck promoting the same people because its past calculations say they were popular, and because the creators who were succeeding at that time were overwhelmingly white? The computer thinks the way a discriminatory past tells it to, because that past data is the only basis it has to go on.
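To make that concrete, here is a toy sketch of the "rich-get-richer" feedback loop described above. This is not any platform's actual ranking code; the creator names, starting numbers, and rates are all hypothetical. It simply shows that if a recommender hands out exposure in proportion to past clicks, two creators with identical content quality never converge, because the one who starts with a historical head start keeps it forever.

```python
def simulate_feedback_loop(initial_clicks, rounds=50):
    """Toy model of a ranker that recommends in proportion to past clicks.

    initial_clicks: dict of creator -> starting click count (hypothetical).
    Each round, 100 impressions are split proportionally to accumulated
    clicks, and every creator converts impressions at the SAME 10% rate,
    i.e. their content is equally good. Only the starting exposure differs.
    """
    clicks = dict(initial_clicks)
    for _ in range(rounds):
        total = sum(clicks.values())
        for creator in clicks:
            # Exposure is allocated by past popularity, not by quality.
            impressions = 100 * clicks[creator] / total
            # Identical conversion rate for everyone.
            clicks[creator] += round(impressions * 0.1)
    return clicks

# Two equally talented creators; one starts with 9x the historical exposure.
result = simulate_feedback_loop({"creator_a": 90, "creator_b": 10})
print(result)  # creator_a's absolute lead keeps growing every round
```

Running this, creator_a's share of exposure never shrinks: the 90/10 split persists across all fifty rounds, and the absolute gap between the two only widens. Nothing in the loop is "told" to discriminate; the bias lives entirely in the starting data.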
The algorithm only does what it was told to do, discriminatory or not. Fix it.