
VoiceBox's Response to Ofcom's Three-Year Media Literacy Strategy

A consultation response

Created by VoiceBox

Published on Jul 4, 2024

Advancing media literacy resources and curricula is becoming increasingly important as technology develops at a rapid pace. Our research shows that the swift integration of AI into everyday platforms underlines this urgency, as users must quickly adapt to new technologies.

VoiceBox recently responded to Ofcom’s consultation call about their three-year media literacy strategy. Our response covered the following points: 

  • It is important that AI literacy is a part of future media literacy resources and curriculum 
  • It is crucial that young people are involved in the design of media literacy resources made for them
  • VoiceBox supports Ofcom's initiative to work with big tech companies to promote media literacy, but is concerned about the effectiveness of collaboration among competing tech platforms and urges Ofcom to consider this challenge
  • In order to have effective and accurate media literacy resources, there is a need for greater transparency from tech organisations about how user data is being harvested and utilised
  • It is crucial to have timely media literacy resources for emerging technologies as the current pace is leaving users without the information and support they may need

If you have any questions about our response, please email us at info@voicebox.site.

VoiceBox responses are developed and funded directly by VoiceBox and are entirely independent of any other projects.

You can read our full response below: 

Introduction

VoiceBox is an international youth content platform built to help young creators thrive. We publish a wide variety of content from young people aged 13-25 worldwide, ranging from entertainment to world events to challenging subjects. All creators are paid for their publications. 

Through our content platform and Ambassador programme, VoiceBox acts as an Early Warning System for decision-makers, bringing their attention to emerging trends and topics that young people care about. We lead a variety of research, projects, and campaigns, including our latest Coded Companions report, which explored the impact of Replika and My AI on young people’s relationships and mental health. We have been commissioned by world leaders in tech, including Meta, TikTok, Nominet, and the ICO, to include young people in their work. 

To learn more about VoiceBox, make sure to visit our website.

For this consultation call, we prioritised and responded to the two questions that we felt had the greatest impact on the young people we work with. If you have any questions or want us to elaborate further on our responses, please do not hesitate to reach out. 

Question 2: Do you agree with our proposals in this section for working with platforms? Please explain your reasons and provide any relevant supporting evidence

At VoiceBox, we are a tech-positive enterprise. We believe in working with big tech, not against them, which is why we welcome Ofcom’s steps to engage platforms in the promotion of good media literacy. As a youth organisation, we know that young people are often at the forefront of emerging technologies and trends. This means they are often exposed to new developments in the tech world long before adequate media literacy resources are developed. We can see this concern reflected in Section 3, where there is little mention of platforms integrating AI into their products – Snapchat’s My AI and X’s Grok are key examples of this phenomenon. For many young people, interactions with integrated AI chatbots will be their first exposure to the technology – not sought out, but placed in front of them. It is important they are given the appropriate knowledge and tools to ensure their interactions with AI chatbots are positive. 

We expect that asking platforms to be candid about their design and outcomes will be difficult: our own AI chatbot research report, Coded Companions, found that many platforms are pushing AI onto users in order to collect copious amounts of data. In some cases, that data is being used to feed users targeted in-chat advertisements, a new method of advertising that we worry will become commonplace as AI chatbots continue to grow. This development exemplifies the need for media literacy methods to adapt quickly to the changing online world. Transparency and adequate education are essential: users should know what data is being collected, how that data is being used to target adverts, and how they can request its deletion.

Through our Briefing Sessions, professionals have told us they struggle to keep up with the new technologies and platforms children are using, a sentiment we are sure resonates with Ofcom. Our young people act as an Early Warning System for our stakeholders, keeping them on the cusp of youth issues and online trends, which helps media literacy work reflect the lived experiences of young people. We are pleased to see Ofcom recognising the importance of third-party media literacy projects, including our sister organisation Parent Zone’s Be Internet Legends programme, and we implore them not to cast aside the importance of youth voices in the development of future work. Peer-to-peer or community-led approaches can be far more impactful than a top-down education system, and our work with Meta’s Horizon Worlds Safety Centre is a testament to the value of consulting young people in the development of media literacy programmes to ensure they remain non-partisan and user-centred.

Finally, we would like to express our concern about Ofcom’s ‘Garden of Eden’ approach to stakeholders working together and sharing learnings. Platforms are, ultimately, in competition with each other, and we are hesitant to presuppose success from tech bosses ‘demonstrating leadership’ when tensions between them go back many years. Mark Zuckerberg and Tim Cook’s icy relationship is just one example of these ongoing squabbles, and we wonder whether the small policy teams working hard to improve outcomes for users are enough to foster a collaborative culture. We ask Ofcom, as the regulator, to consider this.

Question 4: Do you agree with our assessment of the potential impact on specific groups of persons?

Our work with young people highlights the need to consult children throughout platform design and the media literacy work that impacts them. So, while we welcome Ofcom’s ‘child-rights based approach’ in Section 4, we would also urge Ofcom to define exactly what a ‘child-rights based approach’ means. We often see performative language used with little context and no strategy. Young people, in particular, are affected by this: ‘consulted’ could mean a 15-minute chat with a private school prefect, while ‘child-centred’ is often defined by assumption and may not involve youth voices at all. Ofcom must clarify exactly how they will take children’s views into account. Will SEND children be included? How will diversity be prioritised? Will there be a youth board? Will children be reimbursed for their time?

We are also keen to understand how Ofcom will target groups of children for specific online issues (such as teenage boys and misogyny). Schools and youth clubs are often an untapped resource, and working closely with them is something Ofcom should be thinking about if they aren’t already. We implore them to avoid taking a ‘one size fits all’ approach; access to hardware in disadvantaged communities, for instance, must be taken into consideration (one smartphone per family, for example, will completely alter a child’s online experience compared with that of a child in a household with multiple devices). Other special cases, such as the impact of misogynistic and extreme content on high-risk children, who may not have the correct support system in place, should also be addressed. 

Once again, we are concerned about the lack of emphasis on AI. Our research has already found cases of young people becoming over-attached to AI technology, with AI chatbots filling a void during what is undoubtedly a loneliness epidemic. We feel the neurodiverse community, in particular, might be affected by AI chatbots and by the potentially limited knowledge of the professionals who support them (a problem that, on its own, illustrates the need for media literacy across age groups and sectors). Neurodiverse young people’s relationships with AI chatbots are something VoiceBox wants to explore further, and we would welcome a discussion with Ofcom to solidify our research. 

Finally, we would like to emphasise that having good media literacy skills doesn’t always mean using them in a positive way. Indeed, those with good media literacy often spread mis- and disinformation, or use snark pages to doxx anonymous creators. Teaching media literacy is, in fact, only half the story: the ability to create and participate is equally important. Our vibrant network of creators is an example of this: young people from all over the world are using their media literacy skills to express what they are passionate about and showcase their talents in a positive way. Ofcom should ensure that this important branch of development isn’t shelved or cast aside. 
