Editorial comment: We at My Disability Matters fully appreciate, and believe in, the power of social media and online communities to build a sense of belonging, provide peer support, overcome loneliness and much more.
These benefits can be felt particularly strongly by people with disabilities: recent research – for example by the UK-based Jo Cox Foundation – suggests the disability community is more likely to experience loneliness.
Unfortunately, cyberbullying, trolling and harassment are very prevalent online against people with disabilities and that was the primary reason we created the My Disability Matters Club.
If you are a person with a disability, a friend, family member or carer, or in any way connected to the disability sector, we invite you to check out and join our online disability community based on tolerance and respect. We believe the MDM Community can provide the benefits of social media for people with disabilities, but in a safer, more tolerant and respectful environment.
Do you call that a haircut? I hope you didn’t pay for it.
Oh please this is rubbish, you’re a disgrace to yourself and your profession.
These are just two examples of comments that have followed articles I have written in my career. While they may seem benign compared with the sort of violent and vulgar comments that are synonymous with cyberbullying, they are examples of the uncivil and antisocial behaviour that plagues the internet.
If these comments were directed at me in any of my interactions in everyday life – when buying a coffee or at my monthly book club – they would be incredibly hurtful and certainly not inconsequential.
Drawing on my own research, as well as that of researchers in other fields, my new book “Uncovering Online Commenting Culture: Trolls, Fanboys and Lurkers” attempts to help us understand online behaviours, and outlines productive steps we can all take towards creating safer and kinder online interactions.
Steps we all can take
Online abuse is a social problem that just happens to be powered by technology. Solutions are needed that not only defuse the internet’s power to amplify abuse, but also encourage crucial shifts in social norms and values within online communities.
Recognise that it’s a community
The first step is to ensure we view our online interactions as an act of participation in a community. What takes place online will then begin to line up with our offline interactions.
If any of the cruel comments that often form part of online discussion were said to you in a restaurant, you would expect witnesses around you to support you. We must have the same expectations online.
Know our audience
We learn to socialise offline based on visual and verbal cues given by the people with whom we interact. When we move social interactions to an online space where those cues are removed or obscured, a fundamental component of how we moderate our own behaviour is also eliminated. Without these social cues, it’s difficult to determine whether content is appropriate.
Research has shown that most social media users imagine a very different audience to the one actually reading their updates. We often imagine our audience as the people we associate with regularly offline. However, a political statement that close family and friends may support could be offensive to former colleagues in our broader online network.
Understand our own behaviour
Emotion plays a role in fuelling online behaviour – emotive comments can inspire further emotive comments in an ongoing feedback loop. Aggression can thus incite aggression in others, but it can also establish a behavioural norm within the community that aggression is acceptable.
Understanding our online behaviour can help us take an active role in shaping the norms and values of our online communities by demonstrating appropriate behaviour.
It can also inform education initiatives for our youngest online users. We must teach them to remain conscious of the disjuncture between our imagined audience and the actual audience, thereby ingraining productive social norms for generations to come. Disturbingly, almost 70% of those aged between 18 and 29 have experienced some form of online harassment, compared with one-third of those aged 30 and older.
What organisations and institutions can do
That is not to say that we should absolve the institutions that profit from our online interactions. Social networks such as Facebook and Twitter also have a role to play.
User interface design
The design of user interfaces affects how easily we can interact, which types of individuals choose to comment, and how we behave.
Drawing on psychological research, we can link particular personality traits with antisocial behaviour online. This is significant because simple changes to the interfaces we use to communicate can influence which personality types will be inclined to comment.
Using interface design to encourage participation from those who will leave positive comments, and creating barriers for those inclined to leave abusive ones, is one step that online platforms can take to minimise harmful behaviours.
For example, highly agreeable people prefer anonymity when communicating online. Eliminating anonymity on websites (an often-touted response to hostile behaviour) could therefore discourage the very agreeable individuals who would leave more positive comments.
Conscientious individuals are linked to more pro-social comments. They prefer high levels of moderation, and systems where quality comments are highlighted or ranked by other users.
Riot Games, publisher of the notorious multiplayer game League of Legends, has had great success in mitigating offensive behaviour by putting measures in place to promote the gaming community’s shared values. This included a tribunal of players who could determine punishment for people involved in uncivilised behaviour.
Analytics and reporting
Analytical tools, visible data on who visits a site, and a real-time guide to who is reading comments can help us configure a more accurate imagining of our audience. This could help eliminate the risk of unintentional offence.
Providing clear processes for reporting inappropriate behaviour, and acting quickly to punish it, will also encourage us to take an active role in cleaning up our online communities.
We can and must expect more of our online interactions. Our own behaviour, and how we respond to the behaviour of others, will contribute to a community's shared norms and values.
However, there are institutional factors that can affect the behaviours displayed. It is only through a combination of both personal and institutional responses to antisocial behaviour that we will create more inclusive and harmonious online communities.