Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week’s thread

(Semi-obligatory thanks to @dgerard for starting this)

  • Steve@awful.systems · 2 months ago

    I might be wrong but this sounds like a quick way to make the web worse by putting a huge computational load on your machine for the purpose of privacy inside customer service chat bots that nobody wants. Please correct me if I’m wrong.

        WebLLM is a high-performance in-browser LLM inference engine that brings language model inference directly onto web browsers with hardware acceleration. Everything runs inside the browser with no server support and is accelerated with WebGPU.

        WebLLM is fully compatible with OpenAI API. That is, you can use the same OpenAI API on any open source models locally, with functionalities including streaming, JSON-mode, function-calling (WIP), etc.

        We can bring a lot of fun opportunities to build AI assistants for everyone and enable privacy while enjoying GPU acceleration.

        You can use WebLLM as a base npm package and build your own web application on top of it by following the examples below. This project is a companion project of MLC LLM, which enables universal deployment of LLM across hardware environments.

    • self@awful.systems · 2 months ago

      I do believe that’s the chrome-only horseshit that Proton uses for their local LLM, and reputedly it’s very slow and fairly unreliable

      • Steve@awful.systems · edited · 2 months ago

        chrome-only

        The whole concept of responsive design really died in the arse with the onset of the full-stack web developer.

        • self@awful.systems · 2 months ago

          the web platform is great!

          • your apps run everywhere (that a modern version of chrome runs)
          • every API is pointlessly terrible because your app is simultaneously a document and an Angular monstrosity
          • progressive enhancement! (is dead and they’re Weekend at Bernie’s-ing the body around knowing most web developers won’t notice)
          • it’s an open platform! (controlled almost entirely by Google, with Apple’s only role being to slow down the terrible fucking ideas coming out of the standards process, and all other parties being effectively Google mouthpieces)
            • self@awful.systems · 2 months ago

              Mighty comes in at $30 with a 9% discount with a 12 month prepayment. I don’t think Mighty is for everyone. […] However, for people who use resource intensive applications regularly and either prefer or don’t mind using web apps it seems like a no-brainer. I think there is also a legitimate argument for corporations to provide Mighty for their employees purely based on the productivity boost, especially for tech employees.

              It might be hard to get purists to buy into the “browser = OS” value proposition, but in the meantime I’ll be enjoying my 40 GB of RAM.

              oh it’s the web operating system again! things that make a web browser an operating system:

              • it’s extremely expensive every month
              • it rehashes the awful cloud rendering browser shit Amazon tried and gave up on for their underpowered Kindle tablets
              • it bundles a bunch of basic shit you can do in ordinary browser plugins
              • 40GB of RAM and 8 virtual cores! (that’s all? my current work machine unfortunately has 64GB, and my desktop from 2020 is a 32+32GB split between native and a VM. 8 KVM cores is also not fantastic. none of this should be required for a fucking web browser though but here we are)
              • it’s using a data center’s fantastic internet connection which is probably why it’s quick, but it of course requires a perfectly stable connection on your end or it’s gonna suck
              • Steve@awful.systems · 2 months ago

                One other thing, also correct me if I’m wrong, but isn’t it a giant key logger as well?

                • self@awful.systems · 2 months ago

                  …huh, that is true! so another bullet point and this one’s shared by more than one “web operating system”:

                  • it irrevocably breaks the browser’s security model by implementing pointless functionality

                  in this case it’s pretty bad, cause it’s got the same issue as all hosted VMs in that if the host or hypervisor is compromised, so are all the VMs, but also effectively anyone on Mighty’s side with access to the event stream would have enough data to compromise your entire existence

    • carlitoscohones@awful.systems · 2 months ago

      I read this twice as LLM interference engine and was hoping for something like SETI or Folding@Home except my computer could interfere with ChatGPT somehow.