Cyberhate Symposium

Gendered Violence Online – A Scholarly "Slam"

The Red Rattler Theatre, Sydney, Australia, 7 July, 2017


This major research and community engagement symposium was the culmination of my three-year, Australian Federal Government-funded study into the impact of gendered cyberhate. Its aim was to move the conversation about cyberhate away from re-articulating the problem or casting blame, and to focus exclusively on formulating innovative potential solutions and interventions. To do this, my co-organiser Nicole A Vincent and I recruited 60 participants from around Australia, including police, lawyers, activists, representatives from government and media, online moderators, designers, and owners, gamers, coders, academics, and cyberhate targets, who were placed in groups to spend a day brainstorming solutions to cyberhate problems.


Commendation by NSW parliament

On August 10, 2017, the NSW parliament unanimously passed a motion congratulating us on the organisation of the symposium.

So, what can be done?

In light of the ideas generated at the symposium, we propose the following six concrete interventions to address the worsening problem of gendered abuse and harassment online:

1) Call it what it is: abuse

It is well past time we recognised the real impact of online violence. To do this, we need to refrain from referring to it with vague, innocuous-sounding, and meaningless terminology. Online abuse will not be taken seriously if violent threats such as "Raped? I'd do a lot worse things than rape you!!" and worse keep being referred to with generic labels like "strong language", "in bad taste", or "trolling".

2) Expand the range of offences

Alongside more accurate language, new categories of criminal offences are needed. Provisions must be made to levy fines for recognised offences, as we already do for parking and speeding violations, or for lewd, disruptive, threatening, and menacing conduct on public transport and in public spaces. Civil remedies are also needed. These should include protection orders and litigation against individual offenders, as well as class action lawsuits against software designers and platform operators who create and maintain unsafe environments (see the fifth point below).

3) Online or not, police have responsibility

Police need reassurance that these offences, despite occurring in the Nowheresville of cyberspace, indeed fall within their jurisdictions. They also need training and clear guidance about what evidence to collect, and the resources to swiftly follow up when victims report online abuse.

4) Technology needs an ethical upgrade

To support all this, our technology is in dire need of an upgrade: not the sort that produces faster processing, more storage, or yet another suite of cute emoticons, but an ethical upgrade. Consider, for instance, a ban on instant/disposable accounts, accompanied by the slow unlocking of full account functionality on various platforms once we have earned it by proving we are good netizens. While so-called "real name" policies are open to criticism, it is also worth considering the advantages of requiring new account applicants to provide enough evidence of who they are as real, flesh-and-blood humans: details that could then be used by authorities to track down offenders, regardless of whether they abandon their accounts after committing abuse.

5) Platforms must consider security

Software designers and platform managers must take responsibility for designing safer spaces — like the safety we build into offline environments (for instance, not designing streets and walkways with dark and dingy nooks and crannies where innocent passers-by can be cornered and attacked).

Why should safety standards only apply to streets and buildings, when online highways and platforms are increasingly becoming the places we spend our time?

Our view is that if software designers and platform managers do not discharge their responsibility with due diligence, then they must accept potential fines, liability, and even criminal sanctions when their patrons get hurt.

6) Include ethics in tech training

To design the right technological solutions, ethics needs to be taught to engineering and design students. Not as a soft subject or "politically correct" inconvenience, but as a way of "baking" ethical, and not just practical, functionality into software and platforms. In fact, our case is that learning to build ethical functionality into artefacts and environments should be an integral part of the training of every designer and engineer, and just as important as learning to build and program any other functional requirement.

The technology writer Nilay Patel has observed that we don't do things "on the internet" any more — we just do things. Accordingly, in addition to updating our language, laws, and policing practices, we think an ethical upgrade of our tech via what is known as "value-sensitive design" should become as essential as improving broadband speeds and site clickability.