
Covid-19 has made ending online abuse even more urgent

(Read the original Wired article by Vicki Turk)



Seyi Akiwowo started Glitch! to hold tech platforms to account for online abuse – a mission that is more important than ever.


When the Covid-19 pandemic forced countries into lockdown and more of our lives moved online, alarm bells rang for Seyi Akiwowo, founder and executive director of UK charity Glitch!. “Increased internet usage means increased risk of being abused online,” she says.


Glitch!’s mission is to end online abuse, particularly towards women and non-binary people, who are more likely to be targeted. In September 2020, the charity published the results of a survey it ran in partnership with the End Violence Against Women Coalition that found nearly half of respondents had experienced online abuse during the Covid-19 pandemic, mainly on Twitter, Facebook and Instagram; 38 per cent reported experiencing online abuse in the months prior to the coronavirus outbreak, suggesting an increase during lockdown.


The types of abuse reported included trolling, cyberbullying, gender-based slurs and harassment, and threats of violence. Akiwowo points out that domestic violence also increased during lockdown, and that it can include online elements such as digital stalking, hacking and the non-consensual sharing of private images – “all of those are offshoots of domestic violence”.


Akiwowo, who is 29 years old and based in East London, first had the idea for Glitch! after her own experience of online abuse. In 2016, while working as a councillor for Newham, she took part in a youth event at the European Parliament, during which she made an impassioned short speech standing up for refugees and suggesting that former empires pay reparations to the countries they had colonised. After a video of the speech went online in 2017, Akiwowo received sexist and racist abuse. Recognising a gap in services to support people in her position and to hold tech platforms to account, she started Glitch! as a campaign before establishing it as a charity in February 2020.


Glitch! works to tackle online abuse through several channels, including awareness-raising activities and running workshops to help people understand their rights online and how they can protect themselves. The charity advises tech companies – Akiwowo calls Glitch! a “critical friend” of social media platforms – and is part of the Twitter Trust and Safety Council. It also advocates on matters of government regulation, for example around the Online Harms white paper, which seeks greater accountability for tech companies but which Akiwowo thinks should do more to address abuse against women and marginalised communities.


“At the moment, the proposed bill for online harms is very centred around children and terrorism,” she says. “It rarely conveys how women experience the online space.” Glitch! would also like to see ten per cent of the new Digital Services Tax ringfenced for addressing online abuse.


While Glitch! focuses primarily on gender-based abuse, it emphasises the need to view the issue through an intersectional lens, as different aspects of people’s identities – such as race, age, sexual orientation or disability – may combine to shape their experience online. Older people, for example, may be more frequently targeted by online scams, while one concerning trend Akiwowo has noticed among younger people is password sharing – where someone expects or demands access to their partner’s online accounts so they can monitor, and potentially control, whom that partner interacts with. The same act can also have a different impact on people from different backgrounds: the non-consensual sharing of an intimate image may have greater repercussions for a woman who is part of a conservative religious community than for one who is not.


In 2020, Glitch! offered Black Lives Matter activists and campaigners specific training to protect themselves against racist abuse and hate speech online. It also called on social media companies to institute content controls on images and videos of police brutality and violence against Black people, for example by blurring images or warning users that they are about to see graphic content.


Videos of George Floyd’s death were shared widely on social media, and while Akiwowo does not think that such content should be banned, she emphasises the harm that can be caused by feeds full of such distressing imagery, especially to Black viewers. “There's got to be other ways that we can raise awareness of racial injustice, without the community who we should be centring having to be traumatised by that video,” she says.


In the meantime, she suggests that users exercise “digital self-care” by moderating their own feeds, for instance by turning autoplay off on videos and muting accounts, hashtags or keywords that cause distress.


Akiwowo believes that tech companies do want to improve – abuse on their platforms damages their reputation, after all – but that awareness remains low and it too often takes a real tragedy to force through change (she recalls speaking a couple of years ago to representatives at one company who had never come across the term ‘intersectionality’).


She is pushing for clearer policies on gender-based violence and greater transparency around reporting mechanisms, including what happens when an abusive account is reported and how long it should take to hear back. On the question of content moderation, she would like social media platforms to provide more information on the numbers and demographics of moderators they employ, and what sort of training and support they receive. “There's so many questions to be answered first, before we can even start critiquing,” she says.


As well as holding tech companies to account, Glitch! wants to entrench into society the idea of digital citizenship, emphasising people’s rights but also their responsibilities online. “We haven't yet had that kind of public conversation around online etiquette and behaviours online,” she says. “There's a real assumption that because we've grown up with the internet, like having Windows 95 at school, means that we know how to navigate ourselves online in a safe way and in a respectful way.”


Akiwowo is convinced that it is possible to make the online space safer for everyone. And with people spending more time online, it’s increasingly urgent to do so. “It’s the online space where a lot of decisions are being made, particularly during Covid,” she says. “And it's not representative if women and girls don't feel able to express themselves.”


