Florida Moves Ahead with New Laws to Restrict Teen Use of Social Media Apps

by TexasDigitalMagazine.com


Florida is the latest U.S. state to implement its own provisions around social media use, with Governor Ron DeSantis signing a new bill that will ban children under 14 from social media platforms entirely, while also requiring 14- and 15-year-old users to gain explicit parental permission to sign up.

That could add some new checks and balances for the major social apps, though the specific wording of the bill is worth noting.

The main impetus, as noted, is to stop youngsters from using social media entirely, in order to protect them from the “harms” of social media interaction.

Social platforms will be required to terminate the accounts of users under 14, as well as those of 14- and 15-year-old users who don’t have parental consent. And that seemingly applies to dedicated, underage-focused experiences as well, including TikTok’s restricted experience for younger users.

That could prove problematic in itself, as there are no perfect measures for detecting underage users who may have lied about their age at sign-up. Platforms have put various systems in place to improve detection, and the bill also calls on them to implement improved verification measures to enforce this element.

Some privacy groups have flagged that requirement as a concern, as stricter verification may reduce anonymity in social platform usage.

Whenever an underage account is detected, platforms will have 10 business days to remove it, or they could face fines of up to $10,000 per violation.

The specific parameters of the bill state that the new rules will apply to any online platform where 10% or more of its daily active users are younger than 16.

There’s also a specific provision covering the distinction between social platforms and messaging apps, with the latter not subject to the new rules:

“The term does not include an online service, website, or application where the exclusive function is e-mail or direct messaging, consisting of text, photographs, pictures, images, or videos shared only between the sender and the recipients, without displaying or posting publicly or to other users not specifically identified as the recipients by the sender.”

That could mean that Meta’s “Messenger Kids” is excluded, while also, depending on interpretation, enabling Snapchat to avoid restriction.

That seems like a gap, especially given Snapchat’s popularity with younger audiences, but again, the specifics will presumably be clarified over time.

It’s another example of a U.S. state going it alone on social media rules, with both Utah and Arkansas also implementing restrictions on social media use by youngsters. In a related push, Montana sought to ban TikTok entirely within its borders last year, though that was less about protecting kids and more about concerns over the app’s links to China, and its potential use as a spying tool for the CCP. A federal district court blocked Montana’s TikTok ban in December.

The concern here is that, by implementing regional rules, each state could eventually be tied to specific parameters set by whichever party is in power at the time, and perspectives on the potential harms of social media and online interaction vary wildly.

China, for example, has implemented tough restrictions on video game time for youngsters, as well as caps on in-app spending, in order to curb the negative behaviors associated with gaming addiction. Heavy-handed approaches like this, initiated by regional governments, could have a big impact on the broader sector, forcing major shifts as a result.

And really, as Meta has argued, such restrictions should be implemented at a broader, national level. Like, say, via the app stores that facilitate app access in the first place.

Late last year, Meta put forward its case that the app stores should take on a bigger role in keeping young kids out of adult-focused apps, or at the least, in ensuring that parents are aware of those apps before their kids download them.

As per Meta:

“US states are passing a patchwork of different laws, many of which require teens (of varying ages) to get their parent’s approval to use certain apps, and for everyone to verify their age to access them. Teens move interchangeably between many websites and apps, and social media laws that hold different platforms to different standards in different states will mean teens are inconsistently protected.”

Indeed, shifting age verification, as well as parental consent for downloads by youngsters, to the app stores could ensure greater uniformity and improved protection, via systems that enable broader controls, without each platform having to build its own processes for the same.

Thus far, that pitch doesn’t seem to be resonating, but it would, at least in theory, solve a lot of key challenges on this front.

And without a national approach, we’re left with regional variances, which could become more restrictive over time, depending on how each local government approaches the issue.

That means more bills, more debates, more regional rule changes, and more custom processes within each app for each region.

Broader policy seems like a better approach, but coordination is also a challenge.  




