Apple and Meta Clash Over Responsibility for Safeguarding Kids Online
News Synopsis
Meta and Apple are embroiled in an escalating dispute over who should bear responsibility for protecting children and teenagers from the negative effects of social media, including harm to mental health. Meta has been at the center of criticism from parents, lawmakers, and advocacy groups for contributing to the ongoing mental health crisis among teens.
Last year, Antigone Davis, Meta's head of global safety, advocated in a blog post for government regulation that would put Apple and Google in charge of age-gating. Davis argued that parental consent should be required through app stores whenever teens attempt to download social media apps. In the post, she wrote:
"Parents should approve their teen's app downloads, and we support federal legislation that requires app stores to get parents' approval whenever their teens under 16 download apps. With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase. Parents can decide if they want to approve the download. They can also verify the age of their teen when setting up their phone, negating the need for everyone to verify their age multiple times across multiple apps."
Meta's stance suggests the company is seeking to shift some of the responsibility to Apple and Google by pushing for age-gating at the app store level. Critics, however, argue that Meta is merely deflecting attention from the mental health crisis its own platforms, particularly Instagram, have contributed to.
Meta's Case: Age-Gating at the App Store Level
While Meta's proposal to involve app stores in age verification might sound like an attempt to shift responsibility, the company does raise a valid point. It would indeed be easier for Apple and Google to implement age-gating and parental consent mechanisms at the app store level, rather than expecting each individual app to handle this sensitive issue. Such a unified approach could standardize how apps verify user age and streamline parental control processes.
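To make the proposal concrete, here is a rough sketch of the flow Meta describes, in which the store, rather than each app, knows a user's age and routes download requests from under-16 accounts to a parent for approval. No such app-store API exists today; the types and thresholds below are hypothetical illustrations of the blog post's description, not anything Apple or Google offers.

```swift
import Foundation

// Hypothetical model only: the store verifies age once at setup and gates
// downloads for under-16 accounts behind a parent's decision, much like
// today's purchase-approval prompts.
enum DownloadDecision {
    case installed        // adult account, or parent approved
    case awaitingParent   // parent notified, decision pending
    case blockedByParent  // parent declined
}

struct StoreAccount {
    let userID: String
    let age: Int              // verified once at device setup, per Meta's proposal
    let parentUserID: String? // linked guardian account, if any
}

struct AgeGatedAppStore {
    // Pending and resolved requests keyed by "teenID|appID".
    private var approvals: [String: Bool] = [:]

    mutating func requestDownload(of appID: String, by account: StoreAccount) -> DownloadDecision {
        // 16 and over: no gate, mirroring the threshold in Meta's proposal.
        guard account.age < 16, let parent = account.parentUserID else {
            return .installed
        }
        let key = "\(account.userID)|\(appID)"
        switch approvals[key] {
        case .some(true):  return .installed
        case .some(false): return .blockedByParent
        case .none:
            notifyParent(parent, about: appID, requestedBy: account.userID)
            return .awaitingParent
        }
    }

    mutating func recordParentDecision(teenID: String, appID: String, approved: Bool) {
        approvals["\(teenID)|\(appID)"] = approved
    }

    private func notifyParent(_ parentID: String, about appID: String, requestedBy teenID: String) {
        // Stand-in for a push notification to the guardian's device.
        print("Notify \(parentID): \(teenID) wants to install \(appID)")
    }
}

// Example walk-through of the flow.
var store = AgeGatedAppStore()
let teen = StoreAccount(userID: "teen-01", age: 14, parentUserID: "parent-01")
print(store.requestDownload(of: "com.example.socialapp", by: teen)) // awaitingParent
store.recordParentDecision(teenID: "teen-01", appID: "com.example.socialapp", approved: true)
print(store.requestDownload(of: "com.example.socialapp", by: teen)) // installed
```

The appeal of this model, from Meta's perspective, is that age is verified once at the account level and every app inherits the result, instead of each app building its own verification step.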
Apple's Pushback: No Appetite for Age-Gating Responsibility
However, Apple is less than enthusiastic about the idea. According to a recent report, when legislators in Louisiana proposed a bill that would make app stores responsible for age verification, Apple quickly assembled a team of lobbyists to push back. Their efforts were successful, and the portion of the bill dealing with app store responsibility was removed.
Apple's reluctance is understandable. Taking on age verification across the app ecosystem would introduce a host of privacy concerns. Apple already positions itself as a champion of privacy, having introduced the "Ask App Not to Track" prompt as part of its App Tracking Transparency framework, a change that severely impacted Meta's advertising business. Apple likely views this new responsibility as not only burdensome but also unnecessary, given that it already offers parental controls and age ratings for apps in the App Store.
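For context, the "Ask App Not to Track" prompt is what developers trigger through Apple's AppTrackingTransparency framework. A minimal sketch of that request is below; in a real app the call also requires an NSUserTrackingUsageDescription entry in Info.plist explaining why tracking is requested.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch: requesting tracking permission on iOS 14.5+.
// The system shows the "Ask App Not to Track" prompt only the first time.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed; the advertising identifier (IDFA) is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted:
            // User declined, or a restriction (such as a child account) applies;
            // the IDFA is returned as all zeros.
            print("Tracking not permitted")
        case .notDetermined:
            print("Prompt not yet shown")
        @unknown default:
            print("Unhandled authorization status")
        }
    }
}
```

Because most users decline this prompt, ad networks that relied on the IDFA for targeting and attribution, Meta's among them, lost much of that signal.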
Ongoing Feud Between Meta and Apple
This is just the latest clash in an ongoing rivalry between Meta and Apple. Meta has long harbored resentment towards Apple for its privacy-centric policies, such as the aforementioned App Tracking Transparency feature, which has significantly hurt Meta's advertising revenue. Meta has also criticized Apple's 27% commission on some in-app purchases, adding further fuel to the fire between the two tech giants. Meanwhile, Apple continues to position itself as the more privacy-focused company, often taking subtle jabs at Meta's approach to data privacy.
The Larger Problem: Social Media's Impact on Teens
Amid this corporate feud, the issue at hand remains unresolved. There is a broad consensus that social media is contributing to the growing mental health crisis among teens, but there is no clear solution in sight. Some states, like New York, have already passed laws targeting the algorithms behind social media apps, aiming to limit their addictive nature for young users. Other states, including California, Arkansas, and Utah, have attempted to regulate social media for teens through various means, though many of these laws face scrutiny and legal challenges on First Amendment grounds.
Age verification itself poses challenges. Many users, teens and parents alike, are wary of uploading sensitive personal data, such as driver's licenses, to prove their age. At the same time, simply banning kids under 13 or 15 from social media does little to address the underlying issues: social media has been shown to have a profound impact on the mental health of young people, even those above the minimum age to join these platforms.
State-Level Solutions and School Policies
In the absence of comprehensive federal regulation, individual states are taking a piecemeal approach. While this has produced some progress, the inconsistency in laws creates confusion and fails to address the problem holistically. One practical, immediate step some advocates are pushing for is banning cell phones in schools, which could reduce the time teens spend on social media during school hours. That, in turn, could improve classroom engagement and ease the pressure young people feel from being constantly connected online.
Conclusion: Shared Responsibility
The feud between Meta and Apple is emblematic of a larger issue—who is responsible for protecting young people from the negative effects of social media? Both companies could be doing more to tackle the problem, and government intervention may be necessary to create comprehensive solutions. For now, parents remain at the front line, navigating the challenges of keeping their children safe in an increasingly connected world.