Meta's Teen Accounts Are Sugar Pills for Parents, Not Safety for Kids

(overturned.substack.com)

55 points | by kellystonelake 21 hours ago

32 comments

  • Argonaut998 20 hours ago

    I generally hate social media companies and their antics, but this is the one thing I’m taking their side on. Children’s exposure to content on the internet is a parenting issue. It is impossible for Meta, ByteDance, etc. to filter every piece of content that may be unsuitable for minors. Parents should know that.

    Zuckerberg famously doesn’t (didn’t?) let his children use Facebook. Perhaps everyone else should take a hint.

    • bartread 19 hours ago

      > Zuckerberg famously doesn’t (didn’t?) let his children use Facebook. Perhaps everyone else should take a hint.

      And I think he was right to do this.

      But he’s also the person who created Facebook, who allowed it to be used by children, so it’s extremely hypocritical, no?

      Why doesn’t he make Facebook for over 18s only?

      Money is why.

      He’ll protect his own children whilst at the same time thinking it’s fine to harm other people’s. He thinks this whilst knowing full well that not all parents are created equal, and that many parents won’t (or perhaps can’t) do what is best for their children’s welfare when it comes to social media. And this is because he values making money more, and because he’s an out-of-touch, elitist manchild who refuses to take responsibility for what he’s created.

      So, yes, it’s Meta’s - and specifically Mark Zuckerberg’s - fault that children are exposed to harmful content, and both they and he should absolutely be held to account for it.

      Fundamentally social media companies are media companies. Just because it’s a new form of media doesn’t mean they should get to dodge the scrutiny and responsibilities other media companies are subject to.

      • gruez 17 hours ago

        >But he’s also the person who created Facebook, who allowed it to be used by children, so it’s extremely hypocritical, no?

        >Why doesn’t he make Facebook for over 18s only?

        Should non-smokers be allowed to be tobacco executives? I think everyone agrees that smoking is bad, but whether the CEO partakes in it shouldn't really be a relevant factor either way.

        • anang 17 hours ago

          Is it ok if a tobacco executive downplays the risks of smoking while at the same time forbidding their own children from smoking?

          I think that’s a more accurate analogy, and I think it also would be reprehensible behavior.

    • Waterluvian 20 hours ago

      In concept you're not necessarily wrong. But I think this is one of those "why can't people just <do something that's impossible for people who don't live the life you imagine everyone to live>?"

      Back in the day, hordes of kids were just set loose on the city to find empty lots to fuck around in, because a lot of families are just scraping by and the whole concept of full-time supervision of children is laughably naive. Now they're on the TikToks and the Facestagrams, which has its own set of advantages and disadvantages.

      • jhanschoo 19 hours ago

        This is something that I want to emphasize, though of course all this discussion is centered on the present-day US context. It is not realistic to expect parents to curate their children's entire world; that is too much supervision to expect from two people, and historically it was never the case. Parents do have a domain in which they are able to supervise their children, but beyond that they rely on a tripartite social contract with their communities, both immediate (school, family friends, neighbors, etc.) and distal (the Internet, government social programs, parenting/baby products, entertainment), to help provide an environment safe enough for their children to apprehend the world in, even at times when supervision is not possible.

        In this case, the product Meta says is suitable for teens simply doesn't meet the expectations of safety that most parents have, and it's good that there is reporting on it to inform them.

    • dfxm12 19 hours ago

      This is naive, given the situation. Meta is promoting a product under reportedly false pretenses to convince parents to allow their kids into the Meta platform early. In a way, it's like advertising cigarettes with cartoons (except here the advertising is targeting the parents, not the kids).

    • a456463 18 hours ago

      It is totally a parenting issue. Nothing to be solved by regulation or anybody else. That is the "BIG BAD WORLD". And this is from someone who absolutely loathes all Big Tech companies: Apple, FB, Google, MSFT and AMAZON. And I've worked in tech for 18+ years.

      EDIT: What adults need to understand is that IG is an ad platform first, not an IM or connection platform - and then leave it. If you can't convince adults, then children are a totally lost cause. Adults face the same problems; they don't go away when you're no longer a teen. Where your teen goes and what they do is up to you as a parent.

      • kellystonelake 11 hours ago

        It would still be wrong for a company to offer parents safety products that aren’t effective and to market them as effective, right?

    • pjc50 20 hours ago

      How is it possible for parents of average technological ability and limited means to do what a multi-billion-dollar platform cannot?

      • Argonaut998 19 hours ago

        My parents didn’t know how to turn on a PC or use a phone, yet they knew what I was looking at and who I was talking to until I was 16 years old.

        There are all kinds of services that parents can use now to filter this even further than what was possible 10-15 years ago.

        • pjc50 18 hours ago

          My parents let me go and hang out with friends and commute to school by train. I guess everyone is expected to be full helicopter these days.

          One of my friends was busted at school for distributing porn downloaded from a BBS on floppy disks.

        • kellystonelake 19 hours ago

          The point of this research is that these services are often ineffective at "filtering" yet, as your comment demonstrates, make people (parents, regulators) feel like the platforms are safer than they are.

          • Argonaut998 19 hours ago

            Parents should be mature enough not to take a scandal-plagued social media company’s claims seriously without any scrutiny.

            Note I’m not saying that Facebook doesn’t profit from being as ineffectual as possible, but ultimately that parents should know better.

            • a456463 18 hours ago

              I think it is wild that people find what you are saying so hard to accept: take responsibility for your children instead of finding excuses. I had full internet access growing up and I still knew better.

    • observationist 19 hours ago

      It's a parenting issue. There may be kids who can use it responsibly, for a specific purpose. It's up to parents to decide, monitor, and parent that, and I think it's fine for companies not to take it upon themselves to act for the parents. One of the goals of parenting is to teach children how to engage with the world responsibly, and for better or worse, social media is now a fixture.

      In a perfect world, the internet shouldn't be used by anyone under 18 without monitoring by their parents. That doesn't mean we should legislate criminal penalties for parents who fail to "correctly" parent, nor should we penalize companies whose service or product is used poorly or whose use results in a negative outcome.

      Some things are cultural and social, and government isn't the right tool to fix problems. The cost of governance - loss of privacy and agency, unnecessary intrusion and bureaucracy, mistakes and waste of money and time - far exceeds the cost of letting society figure its shit out.

      Yeah, there will be TikTok casualties, zombies, people with feed-fried brains. There will even be suicides and tragedies and stupid avoidable deaths from meme challenges. That's better than the alternatives, and it's not ok to put the responsibility for those negative social outcomes on companies when it's a failure of parenting. It's tragic and terrible, but we shouldn't let sympathy for parents shift the blame onto social media or AI companies.

      That being said, there should be guardrails: no algorithm tuning to target children; rapid detection, reporting, expulsion, and lifetime bans for predators on a platform; no advertising to children; and so forth. Require parental consent and monitoring, with weekly activity summaries emailed to parents - things like that. Empowering parents and guardians and giving them tools to make their lives easier would be admirable.

      Platforms like Roblox are effectively so large that it's impossible to moderate them responsibly, so they get infested with predators and toxic situations.

      I think it's probably going to require society to limit how big platforms and companies can get - if a service, product, or platform cannot be managed responsibly at a certain scale, then it's not allowed to operate at that scale anymore.

    • jchw 19 hours ago

      It's kind of amazing just how horrifically wrong this is all playing out. It actually feels like it's playing out worse than the worst-case scenarios I could come up with.

      First there is the obvious question: who is giving teenagers unfettered access to the Internet? Phones cost money. Home Internet costs money. Mobile data costs money. The best you can say is that kids could get online using McDonald's WiFi with a phone they bought with lawnmower money, but we don't have to play pretend. We know how they got phones and Internet - the same way kids and teens were exposed to TV. Apparently, though, despite this obvious first step in accountability, it all just gets a shrug of the shoulders. This step is apparently so unimportant it's not even worth mentioning.

      I hate to just bitch about parents, because I absolutely know parenting isn't easy, that it's important for society, and that the conditions we're in make it hard to feel "in control" anymore. On the other hand, this isn't exactly a new problem. All the way back in 1999, South Park: Bigger, Longer & Uncut basically addressed the same god damn thing. And I don't mean to equate obscenity on TV with the kinds of safety risks that the Internet can pose; the parallel is the deflection of blame that parents engage in for things they directly facilitated. Seriously, the "Blame Canada" lyrics somehow feel as prescient as ever now:

          Blame Canada
          Shame on Canada
          For the smut we must stop, the trash we must smash
          The laughter and fun must all be undone
          We must blame them and cause a fuss
          Before somebody thinks of blaming us
      
      Though honestly, I don't think any of this means social media companies aren't to blame. Everyone knew we were selling sex and ads to kids. I think Twitter and Tumblr were acutely aware that they were basically selling sex to kids, and if anything they tried as hard as possible to ensure their systems had no way to account for it. (On Twitter, many people have always wanted an option to mark their accounts as 18+ only, but as far as I know you still can't. A few years ago I think they added a flag they can apply to accounts that hides them, but it's still not something users can set. And although Twitter has sensitive content warnings, it doesn't actually let you specify that something is explicit - only that it is sensitive, or maybe that it contains nudity. I think this blanketing is intentional; it provides some deniability.)

      For their role in intentionally blurring lines so they can sell sex and ads to kids while mining their data, they should be penalized.

      But instead, we're going to destroy the entire Internet and rip it apart. Even though the vast majority of the Internet never had the problems that crop up with kids on social media at any particularly alarming scale, we're going to apply broad legislation that enforces ISP-level blocks. In my state there was a proposal to effectively ban VPNs and Internet pornography altogether!

      I think ripping apart the Internet like this is basically evil for all parties involved. It's something regulators can do to show that they really care about child safety. It's something parents can support instead of taking accountability for their role in not doing the thing called "parenting". And I bet all of the massive social media companies will make it out of this just fine, essentially carving their own way around any legislation, while the rest of the Internet burns down - as if it needed anything to help that along.

      We will never learn.

    • HWR_14 18 hours ago

      Why not just prevent people under 18 from using social media?

      • 2OEH8eoCRo0 17 hours ago

        It opens HN's other favorite can of worms: age verification.

  • softwaredoug 20 hours ago

    Is there anyone else like me who just has teens disinterested in social media? My teen spends a lot of time online, but mostly it's group texts / chats with friends.

    • paxys 20 hours ago

      Do you define TikTok and Snapchat as social media? Because teens are definitely interested in those.

    • kellystonelake 20 hours ago

      Over 20% of adolescents met criteria for “pathological” or addictive social media use, with an additional ~70% at risk of mild compulsive use. Teens themselves often recognize the problem - many say social media makes them feel “addicted” or unable to stop scrolling, even when it negatively affects their mood. The Surgeon General highlighted that teens commonly report social media makes them feel worse, “but they can’t get off of it.”

      • softwaredoug 19 hours ago

        One lifehack a parent told me about -- instead of buying your kid a phone, buy them a cellular-capable smartwatch they can still text / call with to some extent.

  • tartoran 20 hours ago

    Social media is toxic for adults and especially for children. Parents should get themselves off social media and then children will likely follow suit.

  • cynicalsecurity 20 hours ago

    I don't quite understand why those activists expect companies to watch kids. It's the parents' job. Facebook and Instagram are like big malls: don't leave your child unattended; a mall is not a kindergarten. Educate parents on how they can protect their kids and make them responsible for it. That is how it's supposed to work. Now internet companies are forced to become nannies for both children and their parents, which is ridiculous. And we must suffer with the UK's fascist internet laws and the upcoming EU chat control nonsense because some parents can't watch their children.

    • pjc50 20 hours ago

      Do people think it is practical to watch children 24/7, including while the child is at school and the parent is at work?

      • bonoboTP 19 hours ago

        Half their ailments are from social media, the other half from obsessive helicopter parenting.

    • kellystonelake 20 hours ago

      Meta markets products as safe for kids and offers products that reinforce this impression for parents and regulators, but they’re not actually keeping kids safe. Meanwhile, kids die. That’s less an issue of parenting and more one of corporate responsibility.

  • keanb 20 hours ago

    This is a bit one-sided, innit? It doesn’t include the explanation given by Meta, just their rejection of the claims.