I found that idea interesting. Will we consider it the norm in the future to have a “firewall” layer between news and ourselves?

I once wrote a short story where the protagonist was receiving news of the death of a friend, but it was intercepted by his AI assistant, which said: “When you have time, there is emotional news that does not require urgent action, but which you will need to digest.” I feel it could become the norm.

EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don’t think he is talking about politics there, but it applies quite a bit.

EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding oneself from information one may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about the difference between presenting news in a neutral way and presenting it as “incredibly atrocious crime done to CHILDREN and you are a monster for not caring!”. The second one feels a lot like exploiting emotional backdoors, in my opinion.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone
    6 months ago

    I look forward to factchecker services that interface right into the browser or OS, and immediately recognize and flag comments that might be false or misleading. Some may provide links to deep dives where it’s complicated and you might want to know more.

          • Alexstarfire@lemmy.world
            6 months ago

            That really just reinforces my point. Most people aren’t setting that up themselves. The app defaults do that, i.e. someone or something else is making that determination. Sure, maybe you can still check out the post if you really want, but how many will do that? Can you change how it works? Depends on the app.

            If people want to opt-in to it then I don’t really care. I mostly HATE being forced to opt-out of things though.

      • Worx@lemmynsfw.com
        6 months ago

        Either it’s you deciding as you see it (i.e. there is no filter), or it’s past-you who’s deciding, in which case it’s a different person. I’ve grown mentally and emotionally as I’ve gotten older, and I certainly don’t want me-from-10-years-ago to be in control of what me-right-now is even allowed to see.

        • NounsAndWords@lemmy.world
          6 months ago

          Or you can update it when you see fit, or go for periods without filters to make sure you are still seeing something approximating reality, or base it on people you know personally and currently trust, or half a dozen other things that aren’t coming to me off the top of my head. The point is it’s less black and white than you’re painting it.

          • Worx@lemmynsfw.com
            6 months ago

            If you’re never allowed to see things you don’t like, how will you grow and change? If you never grow and change, why would you update your filters?

            • NounsAndWords@lemmy.world
              6 months ago

              Then you should probably allow yourself to see some things you don’t like. I guess the answer lies somewhere in a middle ground where you both see things you don’t agree with and also filter out people known to spout untrue information or unnecessarily emotion-fueled sentiments? I don’t like genocide, but that doesn’t mean my only options are going fully head-in-the-sand or listening to non-stop Holocaust deniers…

              Pretty close to exactly what we do right now, really, just scaled up for the fast-approaching (or already-here) world of supercharged fake news.

      • neuracnu@lemmy.blahaj.zone
        6 months ago

        Just as with diet, some people prefer balancing food types and practicing moderation, while others overindulge in what makes them feel good in the moment.

        Having food options tightly controlled would restrict personal liberty, but doing nothing and letting people choose will lead to bad outcomes.

        The solution is to educate people on what kinds of choices are healthy and what are not, financially subsidize the healthy options so they are within reach to all, and only use law to restrict things that are explicitly harmful.

        Mapping that back to news and media, I’d like to see public education promoting the value of a balanced media and news diet. Put more money into non-politically-aligned news organizations. Look closely at news orgs that knowingly peddle falsehoods and either bring libel charges against them or create new laws that address the public harm done by maliciously spreading misinformation.

        But I’m no lawyer, so I don’t know how to do that last part without creating some form of tyranny.

    • halfway_neko@lemmy.dbzer0.com
      6 months ago

      isn’t that what the upvote/downvote buttons are for? although to be fair, i’d much rather the people of lemmy decide which things are good and interesting than some “algorithm”

      • fine_sandy_bottom@discuss.tchncs.de
        6 months ago

        There’s a real risk to this belief.

        There are elements of lemmy who use votes to manipulate which ideas appear popular, with the intention of manipulating discourse rather than having open discussions.

        • halfway_neko@lemmy.dbzer0.com
          6 months ago

          yeah. you’re right.

          it’s not like i blindly trust the votes to tell me what’s right and wrong, but they still influence my thoughts. i could just sort by new, but i feel like that’s almost as easy to manipulate.

          i guess it comes back to the topic of the post. where and how i get my information is always going to affect me.

          i’m sure other platforms are no better than lemmy with manipulating content, but maybe for different reasons. i just have to choose the right places to spend my time.

          • fine_sandy_bottom@discuss.tchncs.de
            6 months ago

            Yeah, this is an “unpopular opinion”, but I don’t believe the lemmyverse in its current form is sustainable, for this reason.

            Instances federate with everyone by default. It’s only when instances are really egregious that admins will defederate from them.

            Sooner or later Lemmy will present more of a target for state actors wishing to stoke division and foment unrest. At that point the only redress will be for admins to defederate from other instances by default, and only federate with those whose moderation policies align with their own.

            You might say, the lemmyverse will shatter.

            I don’t think that’s necessarily a bad thing.

            End rant.

    • TrickDacy@lemmy.world
      6 months ago

      Yeah, op seems to think minds are weak and endlessly vulnerable. I don’t believe that, not about myself at least

      • xxd@discuss.tchncs.de
        6 months ago

        I think you’re too optimistic about how hard it is to influence people. Just think of the various obviously false conspiracy theories that some people still believe. I think that for every person there is some piece of information or news that is just believable enough to accept without questioning, and that will nudge their opinion ever so slightly. And with enough nudges, opinions can change.

        • TrickDacy@lemmy.world
          6 months ago

          You’re referring to fringe groups. There are a lot of them, but they’re still very much a minority. Even so, treating adults like especially fragile children isn’t going to help.

          • xxd@discuss.tchncs.de
            6 months ago

            Yes, only fringe groups believe outlandish conspiracies, but it’s unrealistic to believe that most people, including you, can’t be influenced. Just think of ads or common misconceptions. Everyone is susceptible to this to some degree; no one can have their guard up 24/7, child or adult. Having a “firewall” for everything isn’t a good solution, I’d say, but it’s not as if everybody is as resilient as you think.

      • Dave.@aussie.zone
        6 months ago

        “Yeah, op seems to think minds are weak and endlessly vulnerable. I don’t believe that, not about myself at least”

        Your mind is subject to cognitive biases that are extremely difficult to work around. For example, your statement is an example of egocentric bias.

        All you need is content that takes advantage of a few of those biases and it’s straight in past your defences.

        • TrickDacy@lemmy.world
          6 months ago

          Yeah, I understand people are pretty flawed and vulnerable to some degree of manipulation. I just think the idea proposed in this post is not only an overreaction, it also underestimates people’s ability to reject bullshit. We can’t always tell what’s bullshit, sure, but we don’t need to be treated like we’re too fragile to think for ourselves. Once that happens, we would literally become unable to do so.

        • keepthepace@slrpnk.netOP
          6 months ago

          I am fairly armored intellectually, but emotionally I find it draining to be reminded that war is at my doorstep and that kids are dying gruesome deaths in conflicts I barely know about.

  • Desmond373@slrpnk.net
    6 months ago

    People are thinking of the firewall here as something external. You can do this without outside help.

    Who is this source? Why are they telling me this? How do they know this? What information might they be omitting?

    From there, you have enough information to judge for yourself what a piece of information is worth.

  • adderaline@beehaw.org
    6 months ago

    i have a general distaste for the mind/computer analogy. no, tweets aren’t like malware, because language isn’t like code. our brains were not shaped by the same forces that computers are, they aren’t directly comparable structures that we can transpose risks onto. computer scientists don’t have special insight into how human societies work because they understand linear algebra and network theory, in the same way that psychologists and neurologists don’t have special insight into machine learning because they know how the various regions of the human brain interact to form a coherent individual mind, or the neural circuits that go into sensory processing.

    i personally think that trying to solve social problems with technological solutions is folly. computers, their systems, the decisions they make, are not by nature less vulnerable to bias than we are. in fact, the kind of math that governs automated curation algorithms happens to be pretty good at reproducing and amplifying existing social biases. relying on automated systems to do the work of curation for us isn’t some kind of solution to the problems that exist on twitter and elsewhere, it is explicitly part of the problem.

    twitter isn’t giving you “direct, untrusted” information. it’s giving you information served by a curation algorithm designed to maximize whatever it is twitter’s programmers have built, and those programmers might not even be accurately identifying what it is that they’re maximizing for. assuming that we can make a “firewall” that maximizes for neutrality or objectivity is, to my mind, no less problematic than the systems that already exist, because it makes the same assumption: that we can build computational systems that reliably and robustly curate human social networks in ways that are provably beneficial, “neutral”, or unbiased. that just isn’t a power that computers have, nor is it something we should want as beings with agency and autonomy. people should have control over how their social networks function, and that control does not come from outsourcing social decisions to black-boxed machine learning algorithms controlled by corporate interests.

    • keepthepace@slrpnk.netOP
      6 months ago

      I really think that, just as the 20th century saw the rise of basic hygiene practices, we are putting mental hygiene practices in place in the 21st.

  • fine_sandy_bottom@discuss.tchncs.de
    6 months ago

    Not really. An executable controlled by an attacker could likely “own” you. A toot, tweet, or comment cannot; it’s just an idea or thought that you can accept or reject.

    We already distance ourselves from sources of consistently bad ideas. For example, we’re all here instead of on Truth Social.

    • keepthepace@slrpnk.netOP
      6 months ago

      Kind of, but with the guy being a prominent LLM researcher, it kind of hints at the possibility of not inflicting this on humans, nor having to suffer through designing an apolitical structure for it.

  • Lemvi@lemmy.sdf.org
    6 months ago

    I think the right approach would be to learn to deal with any kind of information, rather than to censor anything we might not like hearing.

    • keepthepace@slrpnk.netOP
      6 months ago

      We already have tons of filters in place trying to serve us information that we are interested in, knowledgeable enough to digest, not spammy, in the correct language, not porn or gore, etc… He is just proposing another interesting dimension. For instance, I follow AI news and news about the Ukraine conflict, but I prefer to keep them separate and not be distracted by one while I get my fill of the other.

      The only way I have found to do that with Twitter (and now Mastodon) is to devote Twitter only to tech news.

      • Lemvi@lemmy.sdf.org
        6 months ago

        I don’t think he is proposing another dimension, but rather another scale. As you already said, we already filter the information that reaches us.

        He seems to take this idea of filtering/censorship to an extreme. Where I see filtering mostly as a matter of convenience, he portrays information as a threat that people need to be protected from. He implies that being presented with information that challenges your world view is something bad, and I disagree with that.

        I am not saying that filtering is bad. I too have blocked some communities here on Lemmy. I am saying that it is important not to put yourself in a bubble, where every opinion you see is one you agree with, and every news article confirms your beliefs.

  • I’ve thought about this since seeing Ghost in the Shell as a kid. If direct neural interfaces become commonplace, the threat of hacking expands beyond simply stealing financial information or material for blackmail; attackers may be able to control your entire body!

  • YoFrodo@lemmy.world
    6 months ago

    Reading, watching, and listening to anything is like this. You accept communications into your brain and sort them out there. That’s why people censor things: to shield others and/or to prevent the spread of certain ideas/concepts/information.

    Misinformation, lies, scams, etc. function entirely by exploiting this.

    • fruitycoder@sh.itjust.works
      6 months ago

      Since I was young, I’ve had a mantra: if you don’t know why you are doing something, someone else does. It’s not always conspiracy or malice; it’s literally the basis of the idea of memetics: shareable and spreadable ideas that form the basis of who we are and what we know.

    • Sorgan71@lemmy.world
      6 months ago

      That which influences you is more powerful than your will. You can’t really choose what to do.

  • AwkwardLookMonkeyPuppet@lemmy.world
    6 months ago

    You are responsible for what you do with the information you process. You’re not supposed to just believe everything you read, or let it affect you. We don’t need some government or organization deciding what can be shown online. Read history and see what follows mass censorship.

    • keepthepace@slrpnk.netOP
      6 months ago

      I am bewildered that so many people construe this as suggesting that a government or a company should decide what to show you. Obviously any kind of firewall/filter ought to be optional and user-controlled!

      • Cris@lemmy.world
        6 months ago

        In fairness, social media already has a problem with creating echo chambers that serve only to reinforce and exaggerate your existing worldview. I think to some extent exposure to perspectives other than one’s own is important and healthy.