Study: Social media probably can't be fixed

(arstechnica.com)

143 points | by todsacerdoti 16 hours ago ago

94 comments

  • Zak 9 hours ago ago

    The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.

    The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There’s no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented[0] with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
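
    Roughly, the action space as described above amounts to something like this (a Python sketch; the names and structure are my own illustration, not the paper's actual code):

      import random
      from dataclasses import dataclass

      @dataclass
      class Post:
          author: str
          text: str

      def agent_step(timeline, headlines, choose):
          """One simulated turn: amplify one of the top ten timeline posts,
          share a news headline, or stay passive -- no original content,
          no replies, which is the restriction criticized above."""
          action = choose(["repost", "post_headline", "pass"], context=timeline[:10])
          if action == "repost":
              return choose(timeline[:10], context=timeline[:10])  # amplify a top-ten post
          if action == "post_headline":
              return Post("agent", random.choice(headlines))       # share a headline
          return None                                              # stay passive

      # Stand-in persona that picks at random; the study prompts an LLM instead.
      random_persona = lambda options, context: random.choice(options)
      print(agent_step([Post("a", "hello")], ["Some headline"], random_persona))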

    I mostly use social media to share pictures of birds[1]. This contributes to some of the problems the source article[2] discusses. It causes fragmentation; people who don’t like bird photos won’t follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict[3].

    [0] https://www.socialmediatoday.com/news/internal-research-from...

    [1] https://social.goodanser.com/@zaktakespictures/

    [2] https://arxiv.org/html/2508.03385v1#S3

    [3] https://social.goodanser.com/@zaktakespictures/1139481946021...

    • lemming 5 hours ago ago

      LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.

      This was my initial reaction as well, before reading the interview in full. They admit that there are problems with the approach, but they seem to have designed the simulation in a very thoughtful way. There really doesn't seem to be a better approach, apart from enlisting vast numbers of people instead of using LLMs/agent systems. That has its own problems as well of course, even leaving cost and difficulty aside.

      There’s no option to create original content...

      While this is true, I'd say the vast majority of users don't create original content either, but they still end up shaping the social media environment through the actions the study does model. Again, it's not perfect, but I'm more convinced that it might be useful after reading the interview.

      • Zak 4 hours ago ago

        I'm not sure the experiment can be done other than to try interventions on real users of a public social media service as Facebook did in the article I linked. Of course people running those services usually don't have the incentives to test harm reduction strategies and certainly don't want to publicize the results.

        > the vast majority of users don't create original content

        That's true now, at least most of the time, but I think that's largely because of design and algorithmic decisions by the platforms to emphasize other types of content. Early Facebook in particular was mostly original content shared between people who knew each other. The biggest problem with that was that it wasn't very profitable.

      • nradov 4 hours ago ago

        Nonsense. The vast majority of my Facebook friends post at least some original content.

        • lemming 3 hours ago ago

          Fortunately we don't have to rely on your anecdata; people actually study this stuff:

          https://news.gallup.com/poll/467792/social-media-users-incli...

          U.S. adults commonly engage with popular social media platforms but are more inclined to browse content on those websites and apps than to post their own content to them. The vast majority of those who say they use these platforms have accounts with them, but less than half who have accounts -- and even smaller proportions of all U.S. adults -- post their own content.

          https://www.pewresearch.org/internet/2019/04/24/sizing-up-tw...

          Most users rarely tweet, but the most prolific 10% create 80% of tweets from adult U.S. users

          https://www.pewresearch.org/internet/2021/11/15/the-behavior...

          The analysis also reveals another familiar pattern on social media: that a relatively small share of highly active users produce the vast majority of content.

          • nradov 2 hours ago ago

            That's junk science and doesn't refute the specific point I made. Facebook users are far more likely to post original content than X users. It might just be some blurry backlit vacation photos but it is original content.

            • smackeyacky 2 hours ago ago

              They post, but it doesn't get read; all their friends' feeds are just as swamped with crap as theirs is.

  • unsignedint 8 hours ago ago

    Social media as a vessel for diverse discussion is a tall order. It’s too public, too tied to context, and ultimately a no-win game. No matter how carefully you present yourself, you’ll end up being the “bad guy” to someone. The moment a discussion touches even lightly on controversy, healthy dialogue becomes nearly impossible.

    Think of it this way: you’re hosting a party, and an uninvited stranger kicks the door open, then starts criticizing how you make your bed. That’s about what it feels like to try to “fix” social media.

    • delichon 6 hours ago ago

      Whitelisting solves the problem for me. I curate every tweet I see with a browser extension. Strangers can't kick down the door. I only see content from my direct follows. It dramatically reduces the stress. Maybe a little like horse blinkers.
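
      The core idea is simple enough to sketch. This is not the actual extension, just an illustration of whitelist filtering in Python (the post fields are assumed):

        # Illustrative only: hide anything whose author isn't on an explicit whitelist.
        WHITELIST = {"alice", "bob", "bird_photos"}   # accounts deliberately followed

        def filter_timeline(posts):
            """posts: iterable of dicts like {'author': ..., 'text': ...}.
            Only whitelisted authors get through; strangers never appear."""
            return [p for p in posts if p["author"] in WHITELIST]

        timeline = [
            {"author": "alice", "text": "a heron at dawn"},
            {"author": "rage_account_9000", "text": "you won't BELIEVE..."},
        ]
        print(filter_timeline(timeline))   # only alice's post survives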

      • Scrounger 4 hours ago ago

        > Whitelisting solves the problem for me. I curate every tweet I see with a browser extension. Strangers can't kick down the door. I only see content from my direct follows. It dramatically reduces the stress. Maybe a little like horse blinkers.

        This extension? https://github.com/rxliuli/mass-block-twitter

      • AuthAuth 5 hours ago ago

        How do you find new things to follow? If everyone did this it would be extremely rare to encounter new content.

        • owisd an hour ago ago

          We'd just go back to human curation: you'd whitelist a few curators you liked, and people wanting to promote their content would email a link to a curator. If the curator thought their audience would like it, they'd share it; you'd see it via your whitelist, and if you liked the look of it, you'd whitelist them too.

      • throwawayq3423 5 hours ago ago

        Do you mind sharing this extension? I'd prefer one that also shows retweets from people you follow, as those are an endorsement no matter what people say.

    • Scrounger 5 hours ago ago

      > Social media as a vessel for diverse discussion is a tall order. It’s too public, too tied to context, and ultimately a no-win game. No matter how carefully you present yourself, you’ll end up being the “bad guy” to someone. The moment a discussion touches even lightly on controversy, healthy dialogue becomes nearly impossible.

      Jaron Lanier's book Ten Arguments for Deleting Your Social Media Accounts Right Now is worth reading:

      https://www.amazon.com/Arguments-Deleting-Social-Media-Accou...

    • dlachausse 7 hours ago ago

      I still think old school linear forums are the best format for online discussion. They’re not perfect by any means, but I think they still beat all the alternatives I’ve tried.

      • baubino 6 hours ago ago

        The old school forums also centered around a single topic or interest, which I think helped keep things focused and more civil. Part of the problem with social media is that it wants to be everything for everyone.

      • AraceliHarker 3 hours ago ago

        The internet has become a primary battlefield for making money, and we can't go back to the days when it was just a non-commercial hobby that people enjoyed. To make money online, it's crucial to spread content as widely as possible, and the most effective methods for this are clickbait and ragebait. That's why the enshittification of the internet was inevitable.

      • bluefirebrand 5 hours ago ago

        I like old school forums with an optional chat room for people to sync up in real time if they want.

        The era of sites with a phpBB forum and an IRC channel was really fun for me, and I miss it a lot.

        I made lots of friends that way in the past, close friends, and it's unlike anything I've encountered since then with social media.

  • hansmayer 16 minutes ago ago

    Well, you can't by definition fix something that is a rigged game. Social media exists to maximise ad dollars, not to benefit you.

  • duxup 16 hours ago ago

    A lot of talk goes into how Facebook and other social media use algorithms to encourage engagement, which often means outrage-type content, fake news, rabbit holes, and so on.

    But here's the thing ... people CHOOSE to engage with that, and users even produce that content for social media platforms for free.

    It's hard to escape that part.

    I remember trying Bluesky and while I liked it better than Twitter, for me it was disappointing that it was just Twitter, but different. Outlandish short posts, same lame jokes / pithy appeals to our emotions, and so on. People on there want to behave the same way they wanted to on Twitter.

    • KaiserPro 15 hours ago ago

      > But here's the thing ... people CHOOSE to engage

      Kinda, but they also don't really realise that they have much more control over the feed than they expect (in certain areas)

      For the reel/TikTok/for-you-Instagram feeds, it shows you subjects that you engage with. It will A/B test other subjects that similar people engage with. That's all it's doing: continual A/B testing to see if you like whatever flavour of bullshit is popular.

      Most people don't realise that you can banish posts from your feeds with a long press on the "I don't like this" equivalent. It takes a few times for the machine to work out whether it's an account, a group of accounts, or a theme that you don't like, and then it'll stop showing it to you. (Threads, for example, took a very long time to stop showing me fucking sports.)
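
      A toy model of that loop might look like the following (a Python sketch of my own, not any platform's real ranking code):

        import random
        from collections import defaultdict

        # Toy model only -- not any platform's real ranker.
        scores = defaultdict(lambda: 1.0)      # one user's per-topic affinity

        def record(topic, signal):
            """'engaged' boosts a topic; 'not_interested' (the long-press
            "I don't like this") decays it until it effectively vanishes."""
            if signal == "engaged":
                scores[topic] *= 1.3
            elif signal == "not_interested":
                scores[topic] *= 0.3           # takes a few presses to fully banish

        def next_item(known_topics, trending_topics, explore=0.2):
            """Mostly show what already scores well; occasionally A/B-test a topic
            that similar users engage with, to see if it sticks."""
            if trending_topics and random.random() < explore:
                return random.choice(trending_topics)
            return max(known_topics, key=lambda t: scores[t])

        record("sports", "not_interested")
        record("sports", "not_interested")
        print(next_item(["sports", "birds"], ["crypto"]))   # "birds", most of the time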

      Why don't more people know this? Because it hurts short-term metrics for whatever bollocks the devs are working on, so it's not that well advertised. Just think how unsuccessful the experiments in the Facebook app would have been if you were able to block the "other posts we think you might like" experiments. How sad Zuckerberg would be if his assertion turned out to be bollocks.

      • RiverCrochet 15 hours ago ago

        There's definitely a mass of people who can't/won't/don't get past passive/least-effort relationships with things on screens. These would be the type that in the TV days would simply leave the TV on a specific channel all day and just watch whatever was on, and probably haven't changed their car radio dial from the station they set it to when they bought the car. In modern times they probably have their cable TV they still pay for on a 24 hour news channel and simply have that going all day.

        To be fair, in times far past you really didn't have much choice in TV or radio channels, and I suspect it's this demographic that tends to just scroll down Facebook and take what it gives without much thought beyond pressing Like on stuff.

        • Mouvelie 14 hours ago ago

          Yup. Knowing the exact percentage of those people would be hurtful to my soul, I think, but I suspect they drive a meaningful percentage of business. Like that time Netflix started playing shows for you, because some people couldn't be bothered to actually choose something to watch?

      • CrimsonCape 14 hours ago ago

        Transparency would prove or disprove this. Release the algorithm and let us decide for ourselves. In my experience, Instagram made an algorithm change 3-4 years ago. It used to be that my feed was exactly my interests. Then overnight my feed changed. It became a mix of (1) interracial relationship success stories, (2) scantily clad women clickbait, (3) East Asian "craft project" clickbait, and just general clickbait. It felt as if "here's what other people like you are clicking on" became part of the algorithm.

    • notTooFarGone 15 hours ago ago

      >people CHOOSE to engage with that

      Brains are wired that way. Gossip and rage bait are not things people actively decide on; the pull is subconscious. It's weird to say this is a problem of individuals - propaganda isn't effective because people choose to believe it.

      • prisenco 6 hours ago ago

        Right. When we're talking about the scale of humanity itself, we've moved far past individual actions.

        At the scale we're operating, if only 1% is susceptible to these algorithms, that's enough to translate to noticeable social issues and second-order effects.

        And it's not 1%.

    • avgDev 13 hours ago ago

      Current social media have basically found the "bliss point" of online engagement to generate revenue and keep the eyes attached. These companies found a way to keep people hooked, and strong emotions seem to be a major tool.

      It really isn't a choice. It is very accessible. Many friends are on social networks, and you slowly get sucked into shorts. Then it becomes an addiction as your brain craves the dopamine hits.

      Similar to what Howard Moskowitz did with food.

      • pixl97 8 hours ago ago

        Another way to put it is, social media is an unregulated drug.

    • PaulHoule 15 hours ago ago

      What gets me about some platforms is all the text-in-images and video with senseless motion. I've been dipping my toes into just about any social network where I could possibly promote my photography, and the worst of them all is Instagram, where all the senseless motion drives me crazy.

      • duxup 15 hours ago ago

        Yeah, I miss GeoCities. The pages were ugly, but they were that user's ugly ... gloriously personal ugly.

        Facebook is not my page; it looks nothing like what I want... my content is in many ways the least important thing featured.

    • PaulHoule 14 hours ago ago

      Personally I really enjoy Mastodon and Bluesky, but I am very deliberate about avoiding negative people: I do not follow, and often mute or block, "diss abled" people who complain about everything, people who think I make their life awful because I am cisgender, and people who post 10 articles an hour about political outrage. The Discover page on Bluesky is algorithmic and respects the "less like this" button; last time I looked it had 75% less outrage than the Following page. (A dislike button that works is a human right in social media!)

      Once I get my database library reworked, a project I have in the queue is a classifier that filters out negative people so I can speed-follow without adding a bunch of negativity to my feed. That way I get to enjoy real gems like

      https://mas.to/@skeletor

      Cross posting that would cure some of the ills of LinkedIn!
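
      A minimal sketch of the kind of filter I mean (the keyword check is just a stand-in for a real sentiment model, and the names are made up):

        # Keyword check is a placeholder for a trained sentiment/negativity model.
        OUTRAGE_WORDS = {"outrage", "disgusting", "shameful", "boycott", "worst"}

        def looks_negative(post):
            return bool(set(post.lower().split()) & OUTRAGE_WORDS)

        def worth_following(recent_posts, threshold=0.3):
            """Speed-follow filter: follow only if fewer than `threshold` of the
            account's recent posts look negative."""
            if not recent_posts:
                return True
            negative = sum(looks_negative(p) for p in recent_posts)
            return negative / len(recent_posts) < threshold

        print(worth_following(["great skeletor joke", "daily outrage thread, shameful"]))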

      • Scrounger 4 hours ago ago

        > Bluesky

        FWIW, I've been consistently posting quality stuff on Bluesky for the last year, and despite having a few hundred followers, I get ZERO engagement.

        People in the Bluesky subreddit tell me it's not a "post and ghost" platform, meaning you have to constantly interact with people if you want to earn engagement, but that's too time-consuming.

        In other words, the discovery algorithms on Bluesky suck.

        • immibis an hour ago ago

          It's just Twitter 2. It's the same as Twitter, made by the same people who made Twitter, doing the same thing as Twitter in the same way as Twitter, with the same culture as Twitter, plus a fig leaf of decentralisation.

    • derbOac 15 hours ago ago

      > while I liked it better than Twitter, for me it was disappointing that it was just Twitter, but different

      I feel exactly the same way.

      I think there needs to be a kind of paradigm shift into something different, probably something that people in general don't have a good schema for right now.

      Probably something decentralized or federated is necessary in my opinion, something in between email and twitter or reddit? But there are always these chicken-and-egg issues with adoption: who the early adopters are, how that affects adoption, genuine UX-type issues, etc.

      • 9rx 15 hours ago ago

        > Probably something decentralized or federated is necessary in my opinion, something in between email and twitter or reddit?

        So, Usenet? The medium is the message and all that, sure, but unless you change where the message originates you are ultimately going to still end up in the same place.

  • taeric an hour ago ago

    This seems somewhat disproven by the existence of places like this? Strict moderation really does work wonders to prevent some of the worst behaviors.

    Not that you won't have problems, even here, from time to time. But it is hard to argue that things aren't kept much more civil than in other spots?

    And, in general, avoiding direct capital incentives to drive any questionable behavior seems a pretty safe route?

    I would think this would be a lot like public parks and such. Disallow some commercial behaviors and actually enforce rules, and you can keep some pretty nice places?

    • magzter an hour ago ago

      I generally agree that strict moderation is the key, but there's obviously a threshold of users and activity beyond which this becomes unfeasible - Hacker News user activity is next to nothing compared to sites like Facebook, Twitter, or Reddit. Even on Reddit, you see smaller subreddits able to achieve this.

      But just like a public park, if 2 million people rock up it's going to be next to impossible to police effectively.

  • thrown-0825 4 hours ago ago

    The problem is people.

    As a species we are greedy, self-serving, and short-sighted.

    Social Media amplifies that, and we are well on our way to destroying ourselves.

    • mindwok 3 hours ago ago

      This is true of individuals, but importantly, as a society we have far more agency than it sometimes feels like when you watch us all acting out our own individual self-destruction.

      Banning CFCs, making seatbelts a legal requirement, making drink driving illegal, gun control (in countries outside the USA), regulations on school canteens. These are all examples of coordination where we've solved problems further upstream so that individuals don't have to fight against their own greedy, self-serving, short-sighted nature.

      We do have the ability to fix this stuff, it's just messy.

      • thrown-0825 2 hours ago ago

        If you don’t think societies can be greedy, self serving, and short sighted I don’t know what to say.

        We have raped this planet into a coma and our children will have to scrape together whatever remains when we are done.

        • mindwok an hour ago ago

          I didn't mean to imply that; they definitely can be those things and far worse. But there are many examples of societal coordination that achieve the exact opposite (Scandinavian countries are of course the canonical example).

          Things can change.

  • PaulHoule 15 hours ago ago

    I work in survey research and I'm rather appalled at how many people would rather survey a sample of AIs than a sample of people and claim they can come to some valid conclusion as a result.

    There are many ways AIs differ from real people and any conclusions you can draw from them are limited at best -- we've had enough bad experiments done with real people

    https://en.wikipedia.org/wiki/Stanford_prison_experiment#Int...

    • jerf 15 hours ago ago

      Appalling. The entire question of "fixing social media", for any definition of "fixing", involves not just the initial reaction to some change but the second-and-greater-order effects. LLMs are point-in-time models and intrinsically can not be used for even guessing at second-order effects of a policy over time. This shouldn't have gotten past the proposal phase.

    • reactordev 15 hours ago ago

      I trust your judgement more than Ars Technica.

      For us laymen, the flaw of using AI trained on people for surveys is the humans themselves. Humans have a unique tendency to be spontaneous, wouldn't you say?

      How would a focus group research team approach this when they’re bombarded by AI solutions that want their research funds?

      • PaulHoule 14 hours ago ago

        The worst problem with people these days seems to be that they don't pick up the phone. Probability-based polls are still pretty good about most things unless they involve Donald Trump -- it seems some Trump supporters either don't pick up the phone or lie to pollsters. Some polls correct for this with aggressive weighting, but how scientific that really is remains up in the air.

    • lispisok 13 hours ago ago

      >I work in survey research and I'm rather appalled at how many people would rather survey a sample of AIs than a sample of people and claim they can come to some valid conclusion as a result.

      A YC company just launched doing exactly that.

      https://news.ycombinator.com/item?id=44755654

    • add-sub-mul-div 15 hours ago ago

      "We trained a model on Twitter and Reddit content and were shocked to discover it generates a terrible community."

      It's so weird to live in a time when what you just said needs to be said.

    • Mouvelie 14 hours ago ago

      Genuine question: are you scared for your job? I see this tendency to use "synthetic personas" growing, and frankly, having to explain why this sucks is insulting in itself. Decision makers are just not interested in having this kind of argument.

      • PaulHoule 14 hours ago ago

        Not really. Sales is doing better than it ever has since I've been here. For one thing, AI folks want our data. Despite challenges in the industry, public opinion is more relevant than ever, and the areas where we are really unsurpassed are (1) historical data and (2) the most usable web site, the latter of which I am a part of.

    • naravara 13 hours ago ago

      It doesn’t surprise me if they found that the emergent behaviors didn’t change given their method. Modifying the simulation to make them behave differently would mean your rules have changed the model’s behavior to “jump tracks” into simulating a different sort of person who would generate different outputs. It’s not quite analogous to having the same Bob who likes fishing responding to different stimuli. Sort of like how Elon told Grok to be “unfathomably based” and stop caring about being PC” and suddenly it turned into a Neo-Nazi Chan-troll. Changing the inputs for an LLM isn’t taking a core identity and tweaking it, it’s completely altering the relationships between all the tokens it’s working with.

      I would assume there is so much in the corpus based on behavior optimized for the social media we actually have that the behavior of the bots is not going to change. The bot isn't responding to incentives like a person would; it's mimicking the behavior it's been trained on. And if there isn't enough training data of behavior under the different conditions you're trying to test, you're not actually applying the "treatment" you think you are.

    • richardubright 14 hours ago ago

      Wait, what? Is there an article on this? That sounds absolutely insane.

  • mediumsmart 2 hours ago ago

    Social media is a few people selling the data of many people looking at content made by some people selling something.

    There is also research and promotion of values going on and the thing as a whole is entertaining and can be rigged or filtered on various levels by all participants.

    It’s kind of social. The general point system of karma or followers applies and people can have a career and feeling of accomplishment to look back on when they retire. The cosmic rule of anything. too much, no good applies.

    It’s not really broken but this is the age of idiots and monsters, so all bets are off.

  • SkepticalWhale 15 hours ago ago

    I'd like to see more software that amplifies local social interactions.

    There are apps like Meetup, but a lot of people just find it too awkward. Introverts especially do not want to meet just for the sake of meeting people, so they fall back on social media.

    Maybe this situation is fundamentally not helped by software. All of my best friendships organically formed in real-world settings like school, work, neighborhood, etc.

    • fellowniusmonk 15 hours ago ago

      I ran a co-working space social club that resolved this issue for many introverts in 2015-2017.

      This is at its core a third-places issue; I haven't had the capital to restart it post-COVID.

      • barbazoo 15 hours ago ago

        That sounds interesting. How did that work? Did you rent a place for coworking and then open it up for the social aspect?

        • fellowniusmonk 14 hours ago ago

          My tech company (I was founding engineer and CTO) took over a co-working space and expanded it; we ran that portion of it at break-even.

          We intentionally set out to create a social club/co-working space. A lot goes into it. I'm a non-theist who comes from a multi-generational group of theist church planters (like 100s of churches, just over and over). It's a multi-factorial process with distinct transitions in space-time and community size, where each transition has to be handled so you don't alienate your community's founding members (who will be very different from later members) and are still able to grow.

          People don't do it because they can't see the value while they are in the early mess of it. You have to like people to pull it off, and you have to NOT be a high-control person, though you do need to be able to operate with high control at certain developmental stages. You have to have a moral compass that everyone understands and that you are consistent with; tech people like zero trust. You have to create a maximum-trust environment, which means NOT extracting value from the community but understanding that the value is intrinsic to the community.

          You have to design a space that facilitates work and play. It's not hard, but you have to get everything right: the community can't have a monoculture, it must be enjoyable/uncomfortable, and you must design things so people can choose their level of engagement and grow into the community. It's easier once it has enough inertia that people understand they are building a thing with real benefits.

          Even things like the flow of foot traffic within the space and the narrowing of chokepoints affect how people interact.

          • AuthAuth 5 hours ago ago

            I've been wanting to set up something like a third place that only tries to break even. I'm unfortunately not a very social person.

            Because these third spaces are open to anyone and probably bring people in from internet communities, what do you do when someone comes along who isn't breaking any rules, but it's clear that no one likes them? I've seen it drive entire groups away, but because the person has done nothing wrong I can't/don't want to just say "fuck off kid, no one likes your weird ass".

          • Physkal 3 hours ago ago

            I would love to hear more about this. I am in need of a third place, but unfortunately the only meetups around here are sports or churches. What did the members of your group do after you shut down? Were you open to members offering donations to keep your third place going?

      • WaltPurvis 15 hours ago ago

        That's very interesting. Do you have time to elaborate a bit?

    • mindwok 3 hours ago ago

      This isn't a technology problem. Technology can help accessibility, but fundamentally this is an on-the-ground, social coordination problem.

      Functioning, welcoming, and well-run communities are the only thing that solves this. Unfortunately, technology often makes this worse, because it creates such a convenient alternative and also creates a paradox of choice. I.e., people think "when there are 1000 meetups to check out and this one isn't perfect, I'll just move on to the next one," when actually it's the act of commitment that makes a community good.

    • abnercoimbre 8 hours ago ago

      I call it technology for touching grass e.g. look at The Offline Club [0]

      [0] https://www.theoffline-club.com/

  • seanwilson 2 hours ago ago

    Any interesting work on using LLMs to moderate posts/users? HN is often said to be different because of its moderation; couldn't you train an LLM moderator on similar rules to reduce trolls, ragebait, and low-effort posts at scale?

    A big problem I see is that users acting in good faith are unable to hold back from replying to bad-faith posts, a failure to follow the old "don't feed the trolls" rule.
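
    A rough sketch of what that could look like (call_llm is a placeholder for whatever chat-completion API you'd use; the guideline text and labels are just assumptions loosely modeled on HN's rules):

      GUIDELINES = """Flag comments that are trolling, ragebait, personal attacks,
      or low-effort snark. Thoughtful disagreement is fine."""

      def moderate(comment, call_llm):
          prompt = (
              f"Site guidelines:\n{GUIDELINES}\n\n"
              f"Comment:\n{comment}\n\n"
              "Answer with exactly one word: OK, FLAG, or KILL."
          )
          verdict = call_llm(prompt).strip().upper()
          return verdict if verdict in {"OK", "FLAG", "KILL"} else "FLAG"  # fail closed

      # Stubbed model for the example; swap in a real chat-completion client.
      stub = lambda prompt: "flag"
      print(moderate("You people are all idiots.", stub))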

  • SoftTalker 15 hours ago ago

    > Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects.

    I think this is expected. Think back to newsgroups, email lists, web forums. They were pretty much all chronological or maybe had a simple scoring or upvoting mechanism. You still had outrage, flamewars, and the guy who always had to have the last word. Social media engagement algorithms probably do amplify that but the dysfunction was always part of it.

    The only thing I've seen that works to reduce this is active moderation.

  • positron26 4 hours ago ago

    Point-to-point communication from every human on Earth to every other human on Earth flattens the communication hierarchies that used to amplify expertise and a lot of other behaviors. We created new hierarchies, but they are mostly demagogues pandering to the middle. Direct delegation is sort of like trying to process an image without convolution. Nobody knows what anyone else thinks, so we just trust that one neuron.

  • 627467 8 hours ago ago

    > Ars Technica: I'm skeptical of AI in general, particularly in a research context, but there are very specific instances where it can be extremely useful. This strikes me as one of them, largely because your basic model proved to be so robust.

    You can't accuse them of hiding their bias and contradictions.

    How can a single paper using a tech that is unproven (for this type of research) dispel such (alleged) skepticism?

    People bending over backwards to do propaganda to harvest clicks.

  • standardUser 14 hours ago ago

    > They then tested six different intervention strategies...

    None of these approaches offer what I want, and what I think a lot of people want, which is a social network primarily of people you know and give at least one shit about. But in reality, most of us don't have extended social networks that can provide enough content to consistently entertain us. So, even if we don't want 'outside' content (as if that were an option), we'll gravitate to it out of boredom, and our feeds will gradually morph back into some version of the clusterfucks we all deal with today.

  • burnte 14 hours ago ago

    Social media isn't the problem; people are the problem, and we're still working on how to fix them.

    • mritterhoff 13 hours ago ago

      I think that's an oversimplification. People have problems sure, but just like alcohol, social media can and does exacerbate them. The answer to dealing with the former is regulation. I'm not sure that is feasible for the latter.

      • throwawayq3423 5 hours ago ago

        I guess you could say the problem is that the wrong things are rewarded and amplified, but that just goes back to people.

    • _DeadFred_ 10 hours ago ago

        Social media leveraging the billions spent on marketing over the years, the skills of knowledgeable experts in multiple disciplines, basically thousands of human-years of expertise at manipulating people, against a random person with zero guard up who just wants to chat with friends or make new ones -- that isn't a people problem.

      • throwawayq3423 5 hours ago ago

        Designing social media as a positive place was and continues to be a choice that no one is making. Because it's too damn profitable to make a hellhole / attention vacuum that people can't stop using.

  • bentt 13 hours ago ago

    If you could plug into the inner thoughts of millions of people around the world at once, it would not be pleasant.

    Social media has turned out to basically be this.

  • getnormality 8 hours ago ago

    > ...the dynamics that give rise to all those negative outcomes are structurally embedded in the very architecture of social media. So we're probably doomed...

    No specific dynamics are named in the remainder of the article, so how are we supposed to know if they're "structurally embedded" in anything, let alone if we're doomed?

  • AuthAuth 5 hours ago ago

    I think this problem is partly due to greedy algos and partly due to these sites being so large that they have no site culture.

    Site culture is what prevents mods from having to step in and sort out every little disagreement. Modern social media actively discourages site culture, and post quality becomes a race to the bottom. Sure, it's harder to onboard new users when there are social rules that need to be learnt and followed, but you retain users and have a more enjoyable experience when everyone follows a basic etiquette.

  • ElijahLynn 15 hours ago ago

    I'm reading Tim Urban's book titled "What's Our Problem?".

    It definitely explains the different types of thinking that make up our current society, including social media. I haven't gotten to the part yet where he suggests what to do about it, but it's a fascinating insight into human behavior in this day and age.

  • farceSpherule 15 hours ago ago

    Social media is the new smoking...

    Widespread adoption before understanding risks - embraced globally before fully grasping the mental health, social, and political consequences, especially for young people.

    Delayed but significant harm - can lead to gradual impacts like reduced attention span, increased anxiety, depression, loneliness, and polarization

    Corporate incentives misaligned with public health - media companies design platforms for maximum engagement, leveraging psychological triggers while downplaying or disputing the extent of harm

    • RiverCrochet 15 hours ago ago

      Not an accurate analogy in my opinion, but close.

      - Smoking feels good but doesn't provide any useful function.

      - Some social media use feels good without providing any useful function, but social media is extremely useful for cheaply keeping in touch with friends and family and for discovering and coordinating events.

      Fortunately the "keep in touch" part can be done with apps that don't have so much of the "social media" part, like Telegram, Discord, and even Facebook Messenger versus the main app.

      • mvieira38 14 hours ago ago

        I think most social media power users don't connect with friends and family through the platforms at all. Young Gen Zers just scroll TikTok (or whatever clone they prefer) and share the videos they like through Snapchat/Discord/Telegram/Messenger/SMS/WhatsApp. Some will post stuff for their friends to see through "close friends" or whatever, but it's much less personal than it once was with Facebook groups and whatnot.

        • RiverCrochet 14 hours ago ago

          Agreed. And it's not necessary when you have so many apps. They're using Tiktok for scrolling and Discord when they actually want to chat with their friends.

      • _DeadFred_ 10 hours ago ago

        'Smoking gets me taking breaks, going outside more, and being more social, chatting with my coworkers and others on smoke breaks.'

    • mvieira38 14 hours ago ago

      This analogy undersells the negative impact of social media. Smoking wasn't a propaganda machine at the hands of a few faceless corpos with no clear affiliation, for example, nor did it form a global spynet

  • MeIam 14 hours ago ago

    The main reason it can't be fixed is that political and corporate operators and propaganda bots have taken over. There is always an agenda seeking supremacy running through social media threads, even on mundane topics.

  • KaiserPro 15 hours ago ago

    Social media can be fixed; it's just that the incentives are not aligned.

    To make money, social media companies need people to stay on as long as possible. That means showing people sex, violence, rage, and huge amounts of copyright infringement.

    There is little advantage in creating real-world consequences for bad actors. Why? Because it hurts growth.

    There was a reason the old TV networks didn't let any old twat with a camera broadcast stuff on their network: they would get huge fines if they broke decency "laws" (yes, America had/has censorship, hence why The Simpsons say "whoopee" and "snuggle").

    There are few things that can cause company-ending fines for social media companies, which means we get almost no moderation.

    Until that changes, social media will be "broken"

    • wk_end 15 hours ago ago

      > Social media can be fixed; it's just that the incentives are not aligned.

      So social media can't be fixed. Incentives are what matter.

      • amanaplanacanal 15 hours ago ago

        Incentives can be changed though, through law.

  • IAmGraydon 15 hours ago ago

    Social media in a profit-seeking system can't be fixed. Profit-seeking provides the evolutionary pressure to turn it into something truly destructive to users. The only way it can work is via ownership by a benevolent non-profit. However, even that would likely give in to corruption given enough time. Outlawing it completely, as well as regulating the algorithmic shaping of the online experience, is probably the inevitable future. Unfortunately, it won't come until the current system causes a complete societal fracture and collapse.

    • RiverCrochet 14 hours ago ago

      If enough users are destroyed, advertisers (social media's real customers) won't have sufficient markets for their products, and profits will fall. Social media can't destroy its users and survive.

      Seriously though, I disagree. Social media in a profit-seeking system can work if the users are the ones who pay. The easiest way for this to work -- now that net neutrality is no longer a thing -- is bundling through users' phone bills. If Facebook et al. were bundled similarly to how Netflix, Hulu, and other streaming apps are now packaged with phone plan deals, then the users would be the focus, not the advertisers. This might require that social media be legislatively required to offer true ad-free options, though.

      • moskie 6 hours ago ago

        I think you're on the right track, but not getting to what I view as the logical conclusion: publicly funded options, free at the point of service to everyone. I've also humored the idea of taking it one level of abstraction further: a publicly funded cloud computing infrastructure, access to which is free (up to a level of usage). People could then choose to use these cloud computing resources to host, say, federated instances of open social networks.

        I mean, it will never happen, but I think it's a path that resolves a lot of problems, and therefore a fun thought experiment.

  • westurner 15 hours ago ago

    Do all of these points apply to the traditional media funhouse mirror that we love to hate, too?

    > "The [structural] mechanism producing these problematic outcomes is really robust and hard to resolve."

    I see illegal war, killing without due process, and kleptocracy. It's partly the media's fault. It's partly the people's fault for depending on advertising to subsidize free services, for gawking, for sharing without consideration, for voting in ignorance.

    Social media reflects the people, who can't be "fixed" either.

    If you're annoyed with all of these people on here who are lesser than and more annoying than you, then stop spending so much time at the bar.

    Can the bar be fixed?

    • cwmoore 15 hours ago ago

      Sure can!

      “No smoking, gambling, or loose women.”

      TaDAaaah!