Time Limits for Minors on TikTok Insufficient in Curbing Social Media Addiction
Since the first major platforms emerged roughly two decades ago, social media usage has proliferated rapidly, prompting concerns about its negative health effects on minors. Studies have linked excessive social media use to feelings of isolation, hopelessness, and depression. In the past several years, there has been a steadily escalating bureaucratic assault on TikTok in particular. Former President Donald Trump famously signed an executive order to ban TikTok, though it never took effect. TikTok’s decision to limit screen time could be seen as an extension of this same understanding of social media as harmful. Earlier this month, TikTok announced in a press release a new measure to limit screen time among minors. Under the new policy, users who indicate their age as under 18 when creating a profile will be limited to 60 minutes of TikTok usage per day. After 60 minutes, users will be prompted to enter a passcode before continuing. For users under the age of 13, a guardian will have to enter a separate passcode on their own device.
Some may argue this decision on TikTok’s part is informed largely by science; however, others may be more inclined to label it a strategic move to curry favor with lawmakers and civilians alike. In either case, it is my belief that TikTok pushes harmful content onto minors, relies on a sophisticated algorithm that targets youths, and has historically done very little to mitigate the negative effects of its content.
In theory, TikTok’s new measure will reduce these harmful effects. Studies have linked reduced social media use to improvements in mental health, productivity, and cognition. A reduction in minors’ social media use should be a good thing.
My own experience with screen time-limiting software is narrow. Like many young people, I faced very few screen time restrictions in my later teens. My parents trusted that I would be able to monitor myself. They were only partially correct. At first, my usage was extreme, and I would spend most of my day on digital interfaces rather than engaging with the world. However, in experiencing firsthand the negative effects of social media on my mental health, I learned how to impose my own restrictions on social media usage.
After reaching the 60-minute limit, TikTok users must enter a passcode to continue. What’s preventing minors from learning the passcode, or simply setting it themselves, to bypass the time limit? Also, though minors’ brains are not fully developed in many ways, young people learn and retain information remarkably quickly. What’s stopping them from registering their TikTok accounts with an earlier birthdate, effectively bypassing the screen time monitoring function altogether?
To complicate the issue further, it’s unclear what TikTok’s motives are to begin with. As a social media company, wouldn’t it be in its best interest to maximize users’ in-app time? It makes more money when more people use its app for longer, so what’s influencing it to adopt this policy now? Sure, it may be a morally just, well-guided idea to better the world, but it may also be a PR ploy. After all, TikTok operates one of the most discreet, clever, manipulative, and complicated algorithms in the social media sphere and has been involved in arguably the largest-scale scandals of any social media platform in recent years. For example, a study conducted by the Center for Countering Digital Hate in the U.K. found that TikTok’s algorithm pushes self-harm and eating disorder content to the forefront of teenagers’ feeds, recommending videos that promote intense weight loss, glorify razor blades and sharp objects, and discuss idealized body types and suicidal ideation.
Even assuming TikTok’s decision is in the best interests of its users, it’s still not necessarily effective. According to worldwide statistics from 2022, TikTok was only the sixth most-used social media app of the year, so limiting minors’ screen time on TikTok alone addresses only a fraction of their potential screen time. Many other popular social media platforms, such as YouTube and Instagram, have, at most, limited screen time monitoring functions. Several years ago, Apple introduced its Screen Time feature to monitor usage for everyone, not just minors, but its efficacy in changing habits is questionable. I’ve been part of many conversations in which people bragged about having the highest average daily phone usage, so it seems Apple’s Screen Time initiative may have backfired.
If we’re being honest, the responsibility to monitor screen time in relation to a child’s health should fall on parents, and once a child reaches the age of majority, it becomes their own responsibility to self-regulate and preserve their mental, physical, and emotional health. To me, TikTok’s new measure in some ways represents a sort of inevitable dystopian future, one in which social media and technology create problems, try to solve them, and likely fail. Trends in data suggest people will only become more reliant on social media and technology in the years ahead.
So, while one-off measures to curb screen time, like TikTok’s new feature, may help, there’s a real possibility that we’ll never escape an era in which people are unable to regulate their own technology usage and children ages 8–14 spend an average of six to nine hours a day on screens. Social media is contributing to poor mental health, mostly among young people, at an increasing rate each year. So, personally, I’m in favor of any measure to reduce it.