Digital Behaviour and Risk Exposure
Lockdowns and remote learning have impacted student digital behaviour and exposure to risk online. Some recent reports include:
- Kids Helpline reported a 55 per cent increase in cyberbullying-related search terms
- The eSafety Commissioner revealed that illegal and harmful content increased by 123 per cent and image-based abuse leapt by more than 172 per cent
- Teachers have mentioned increased digital distraction and rising anxiety levels among students returning to school after full days behind screens
- More parents are describing their struggle to manage children's and teenagers' use of digital technology
- Many children complain to me about their parents' excessive use of mobile phones
Explicit and disturbing content is being ‘served up’ to children online and young people themselves are calling on adults in government and in tech companies to step in to protect them.
Experts are calling for a clear duty of care to be established after research conducted with some of society's most vulnerable children and young people found more than 70 per cent have seen content online that they found concerning, including violent and explicit content.
Reporting for The Social Switch Project at the Australian National University's College of Law, Dr Faith Gordon said:
“Adults and the law are always 10 steps behind… Kids are telling us that they aren't alright online. Yet companies lack transparency and accountability.”
She added: "Alarmingly, we found children as young as two or three years old have been exposed to really violent and sexually explicit content…Children also talked about experiencing unwanted contact, often from adults posing as children or being bombarded with scams."
The report found that only 40 per cent of children who experienced online harm reported it to the platforms they were using. Young people who did complain typically described feeling ‘re-victimised’ when they received no response, or only automated replies, and saw no action taken by the platforms.
Why is it that companies do not appear to be held to account? There needs to be a clear duty of care and companies need to be much more transparent. This needs to be coupled with a legislative framework that upholds and promotes the rights of children.
Children's rights need to be considered in making reforms and that is missing in the frameworks that currently exist in Australia. Although parts of Australia's e-safety model are leading the way internationally, there are grey areas around legal but harmful content online and how rights-based approaches work in practice.
Proposed new privacy legislation would require social media companies, under law, to act in the best interests of children when collecting and handling their data.
A recent US Senate hearing heard that Facebook knows the disruptive consequences that Instagram’s design and algorithms are having on young people in our society, but it has routinely prioritised its own rapid growth over basic safety for our children. It was reported that there exists: “riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”
We know that children and young people can get around age-verification processes. Verification needs to be more rigorous for their age group, and adults, too, should be required to verify their age.
These reports all underline the vital need for parents to supervise their children's use of technology and to model safe, aware use of social media and technology themselves.
Further reading for concerned parents can be found at:
Alan Clarke DGPP MAPS MACE