UK’s Ofcom says one-third of under-18s lie about their age on social media

Companies like Instagram have been heavily fined (and dragged through the publicity coals) over how they have mishandled children’s privacy on their platforms. But if a recent report from Ofcom is accurate, maybe they are getting off lightly.

The UK media watchdog is publishing research today that found one-third of all children aged 8 to 17 are using social media with a falsified adult age, mostly by signing up with a fake date of birth.

It also noted that social media use by these younger consumers is extensive: among 8- to 17-year-olds using social media, some 77% use a service on one of the larger platforms under their own profile; 60% of the younger bracket of that group, aged 8 to 12, have accounts under their own profiles (the rest, it seems, use their parents’).

Up to half of those underage users signed up on their own, and up to two-thirds were aided by a parent or guardian.

The three pieces of research, commissioned by Ofcom from three separate organizations — Yonder Consulting, Revealing Reality, and the Digital Regulation Cooperation Forum — are being published as the UK pushes forward with the Online Safety Bill.

Years in the making (and still, it seems, being altered with each changing political tide in the country), the Bill is expected by Ofcom to finally be ratified in early 2023. But its mandate is a tricky (if not potentially self-contradictory) one, aiming to “make the UK the safest place in the world to be online” while also “defending free expression.”

In that regard, the research Ofcom is publishing could be viewed as a cautionary signal of what not to overlook, and of what could easily tip into mismanagement, regardless of which platform those younger users happen to be on at the moment. It also highlights the case for taking different approaches to different kinds of over-18 content.

Ofcom notes that even within the area of children and digital content, there seems to be a fundamental grey area in adults’ perceptions: some content marked for “adults,” such as social media and gaming, is seen as relatively “less risky” than other adult content like gambling and pornography, which is always inappropriate for underage users. The former is more likely to rely on simple verifications (which are easy to skirt around). Parents and children, the research found, were more inclined to favor “hard identifiers” like ID verification for the latter.

The choices that parents are making also highlight just how entangled digital platforms have become in young people’s lives, and how good intentions can land the wrong way.

Ofcom said parents noted that where they viewed content as “less risky” — such as on social media or gaming platforms — they were balancing keeping children safe against both the peer pressure their children faced (not wanting them to feel left out) and the idea that, as their children grew older, they should learn to manage risks themselves.

But that is not to say that social media is always less risky: the recent court case in the UK investigating the death of a teenaged girl found that self-harm and suicide content she found and browsed on Instagram and Pinterest were factors in her death. That underlines the importance of how sites like these police the content on their platforms, and how they steer users towards or away from it. And given that a child who lies about their age at 8 to get online is still only 13 five years later, aging out of the problem can, disconcertingly, take years.

The aim of keeping freedom of expression intact may well increasingly be put to the test. Ofcom notes that it is coming up to its first full year of regulating video-sharing platforms. Its first report will focus “on the measures that platforms have in place to protect users, including children, from harmful material and set out our strategy for the year ahead.”