The app TikTok is making headlines for many reasons, and few of them are good. Montana has banned the app on national security grounds. Senators recently introduced a bill that would limit young people's use of all social media in order to protect their mental health.
I first started worrying about TikTok as a mother, watching my daughter scroll through one video after another. After reading the growing research base linking social media use to all kinds of adverse outcomes, I wanted to learn more about how these apps, specifically TikTok, help shape teens’ understanding of the world around them.
The Reboot Foundation, which I founded in 2018 to promote critical thinking and media literacy, commissioned a survey of 1,000 of TikTok's youngest users to better understand their usage habits and how time spent on social media might influence their knowledge and beliefs, especially as they relate to science. Our results showed that the more time teens spend on the app, the more likely they are to question whether science helps the world or hurts it: 42% disagreed with the statement, "Science helps the world more than it harms it." Further, 17% of teenage users couldn't say definitively that the Earth is round, and 58% thought astrology might be a science.
The findings raise questions: Does TikTok push young people toward anti-science views? Does spending a lot of time on the app increase the likelihood that users will fall for conspiracy theories and pseudo-scientific ideas like astrology? These questions aren't academic, particularly for young people who spend hours on the app. TikTok should not be banned, but far more should be done to protect young people from the dangers created by lax content moderation and an algorithm that promotes harmful content to young users.
Part of the issue is the nature of social media itself: it becomes its own media ecosystem.
Research shows that young people seldom seek out news and information from reliable sources beyond their social media apps. In other words, social media apps have become their "Walter Cronkite" – their definitive source of information. Indeed, our data found a strong relationship between the amount of time young people spend on TikTok and whether they perceive the content they view as trustworthy: 42% of heavy TikTok users said they thought the information on the app is "reliable," compared with 23% of those who use it less.
Just as problematic is that social media algorithms are designed to show content with which users are likely to engage. This means that if a child shows interest in anti-science content, they are likely to see more of it in their newsfeeds. TikTok has a particularly effective algorithm, and the app’s “For You” page ensures that young people often aren’t exposed to factual or even different points of view.
Our report is far from the first to show that social media can have a negative impact on people's attitudes toward science. One 2021 study found that people exposed to anti-vaccine content on social media were more likely to be hesitant about getting vaccinated. Another recent study found that people who spend more time on social media are more likely to believe conspiratorial ideas like the government hiding the dangers of 5G cell phone towers.
Still, the solution is not to ban the app, because some aspects of social media do benefit young people.
It enables them to connect and communicate with friends, family, and peers worldwide, fostering a sense of belonging, community, and activism. And it provides a platform for creative expression, allowing them to share their talents, ideas, and perspectives while facilitating access to educational resources, news, and diverse viewpoints.
Instead of banning the apps, far more must be done to protect young people.
One idea that is gaining traction across the country and in Congress is age restrictions. In our survey, 62% of adults agreed that children under the age of 16 should be prohibited from opening social media accounts.
Ostensibly, social media platforms already do restrict their use to those older than 13. But platforms make virtually no effort to enforce these standards, and any child can bypass a platform’s age restriction by simply lying about their birthdate when they set up an account.
In a broader sense, it's time to treat social media for what it is: an addictive activity with serious societal implications.
In our survey, an overwhelming 80% of respondents agreed that social media apps should carry warning labels like those on cigarettes and alcohol.
Products with so many serious downsides typically must operate under safety standards, and they are usually required to warn users about the dangers of overuse.
For example, the federal government has implemented regulations that require gambling establishments to display warning messages about the potential risks of gambling and the signs of gambling addiction. Of course, addictive and dangerous products like tobacco and alcohol are highly regulated, age-restricted, and must carry their own warning labels. No such standards or regulations currently apply to social media companies.
This needs to change. In the end, the nation needs a coordinated, society-wide effort to right-size the role that social media apps like TikTok play in young people's lives — and to ensure that science remains at the center of society, not a fringe conspiracy.