• trslim@pawb.social · 5 points · 3 hours ago

    AI is just a marketing term; there’s nothing intelligent about it. It’s simply large language models: statistical systems that predict what should come next. It’s like asking the prediction bar on your keyboard to write a story.

  • hperrin@lemmy.ca · 26 points · 6 hours ago

    I think the reason so many AI bros are conservative is that conservatives have historically had really bad taste in art/media, so they see the drivel AI creates and think, “oh wow, it looks just like what the artists make,” not realizing that they don’t have the eye to see what it’s missing.

  • kibiz0r@midwest.social · 82 points · 9 hours ago

    I like the way Ted Chiang puts it:

    Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.

    There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.

    I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.

    • MoonMelon@lemmy.ml · 48 points · 9 hours ago

      The dialog pushing AI media seems to start from this assumption that I consume media just to have colors and words and sounds enter my face holes. In fact, I consume art and media because I like hearing, seeing, and reading about how other humans experience the same world I do. It’s a form of communication. I like the product but also the process of people trying to capture the bonkers, ineffable experience we all seem to be sharing in ways I would never think of, but can instantly verify.

      What’s funny is, due to the nature of media, it’s kind of impossible to not communicate something, even if the artwork itself is empty. When I see AI media, I see the communication of a mind that doesn’t know or give a shit about any of this. So in their attempt to make filler, they are in fact making art about how inarticulate they are. It’s unintentional corporate Dadaism.

      • queermunist she/her@lemmy.ml · 2 points · 4 hours ago

        The people pushing AI don’t like hearing, seeing, and reading about how other humans experience the world. They actually do just want flashing colors and sounds poured into their face holes. They’re basically incapable of understanding art.

      • Coelacanth@feddit.nu · 16 points · 9 hours ago

        Yes, this is it right here. The whole point of art is communication and connection with another human being.

        • apotheotic (she/her)@beehaw.org · 8 points · 7 hours ago

          Not even necessarily a human being! I’d appreciate the fuck out of art if any species made it. But there must be more than uncaring, unfeeling, probabilistic interpretation of input data.

    • voracitude@lemmy.world · 14 points · 9 hours ago

      Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly

      I like this a lot. I’m going to thieve it.

      • CheeseNoodle@lemmy.world · 1 point · 4 hours ago

        Tangentially related: the easiest way to come up with a unique and cool idea is to come up with a unique and dumb idea (which is way easier) and then work on it until it becomes cool. (Think how dumb some popular franchises’ core concepts sound if you take the raw idea out of context.)

  • BaraCoded@literature.cafe · 26 points · 11 hours ago

    Writer who has toyed with AI here: yeah, AI writing sucks. It’s consensus-driven and bland, never goes into unexpected territory, and completely fails to understand human nature.

    So we’d better stop calling AI “intelligence”. It’s text-prediction machine learning on steroids, nothing more, and the fact that we’re still calling it “intelligence” shows how gullible we all are.

    It’s just another speculative bubble from the tech bros, as crypto was, except this time the tech bros have made their Nazi coming-out.

    • RobotsLeftHand@lemmy.world · 6 points · edited · 6 hours ago

      I remember reading a longer post on Lemmy. The person was describing their slow realization that the political beliefs they were raised with were leading down a dark path. It was a process that took many years, and the story was full of little moments where cracks in their world view widened and the seed of doubt grew.

      And someone who was bored or overwhelmed by having to read a post over three sentences long fed the story into AI to make a short summary. They then posted that summary as a “fixed your post, bro” moment. So basically, all the humanity was removed. Reminds me of that famous “If the Gettysburg Address were a PowerPoint”: https://norvig.com/Gettysburg/

      • NotMyOldRedditName@lemmy.world · 1 point · edited · 3 hours ago

        That’s really sad.

        I’ve used AI to help clean up my sentence structure for copy, but if I’m not super explicit about telling it not to rewrite what I wrote, it will do exactly what you described and take the human element out of it.

  • rafoix@lemmy.zip · 42 points · 12 hours ago

    Tons of shit games are going to have lots of dialogue written by AI. It’s very likely that those games would have had shit dialogue anyway.

  • Maiznieks@lemmy.world · 5 points · 9 hours ago

    Nah, can’t agree. I had postponed a few ideas for years, was able to vibe them out in a week of evenings, and now I have something usable. 70% of it was vibed; I just had to fix stupid stuff that was partly down to my own queries.

  • BlackLaZoR@fedia.io · 6 points · 10 hours ago

    You can make it produce stylized dialogue, but that’s just surface mannerisms. Underneath, it’s still the same bland AI.

      • saltesc@lemmy.world · 5 points · 12 hours ago

        Implying people are happy to buy the shit, which isn’t likely, especially in a competitive environment.

        • Leon@pawb.social · 8 points · 11 hours ago

          People buy AAA games all the time. Look at Starfield. Garbage game, still sold well.

          • Mister_Feeny@fedia.io · 3 points · 8 hours ago

            Starfield is estimated to have sold 3 million copies; Baldur’s Gate 3, 15 million. Microsoft/Bethesda’s marketing budget makes a difference, but not being garbage makes a much bigger difference.

            • Leon@pawb.social · 2 points · 7 hours ago

              Well yeah, I’m not going to argue that a well made game that respects the player isn’t going to do well. But that doesn’t matter to the publishers and their shareholders when they can pump out AI slop garbage year after year and still have people that drink it up. Just look at the yearly shooters and sports games, they sell enough.

              Besides, what happens when this sort of slop has been normalised? Look at the mobile market, no one bats an eye at the intensely predatory microtransactions, and you’ll even find people defending things like gacha games.

              There was a time when people scoffed at the notion of paying ~$2 for some shitty cosmetics, but now people don’t even blink at the idea. Hell, it’s downright cheap in some cases. The AAA industry just has to slop things up for long enough for people to stop caring, because they will stop caring, and then continue to shell out for the dubious privilege of guzzling their mediocre, uninspiring garbage.

  • Lemming6969@lemmy.world · 2 points · 8 hours ago

    I can see how it could be useful, or even mandatory, in future RPGs. It can generate a framework for a real writer, with extremely large amounts of logical branching, a billion times faster. Then you go over the top of it and use the framework as concepts to adopt or revise. This streamlines the process, unifies the creative vision, and allows for a game so large that, without procedural generation, it would have taken a team 10 years (or never been finished at all) instead of 2.

    • xthexder@l.sw0.com · 8 points · edited · 7 hours ago

      This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.

      If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.

      • Lemming6969@lemmy.world · 2 points · 7 hours ago

        It’s assuming the AI output isn’t very good. It assumes it can create a framework that still needs the actual writers, but now they don’t have to come up with 100% of the framework and can instead work on the actual content. Storyboarding and frameworking are a hodgepodge of nonsense with humans anyway. The goal is to achieve non-linear scaling, not to replace quality writers or have the final product be AI-written.

        • xthexder@l.sw0.com · 5 points · 7 hours ago

          This sounds like it takes away a huge amount of creative freedom from the writers if the AI is specifying the framework. It’d be like letting the AI write the plot, but then having real writers fill in details along the way, which sounds like a good way to have the story go nowhere interesting.

          I’m not a writer, but if I were to apply this strategy to programming, which I am familiar with, it’d be like letting the AI decide what all the features are and then having to go and build them. Considering more than half my job is stuff other than actually writing code, this seems overly reductive, and it underestimates how much human experience matters in deciding a framework and direction.

          • Lemming6969@lemmy.world · 1 point · edited · 4 hours ago

            Even in programming there are common feature frameworks. Having a system enumerate them based on a unified design vision from a single source architect, rather than 50 different design ideas duct-taped together, could help a lot. I’ve seen some horrendous systems where you can tell a bunch of totally separate visions were Frankenstein’d together, and the same happens in games where you can tell different groups wrote different sections.

  • melsaskca@lemmy.ca · 4 points · 10 hours ago

    That means in about 6 months or so the AI content quality will be about an 8/10. The processors spread machine “learning” incredibly fast. Some might even say exponentially fast. Pretty soon it’ll be like that old song “If you wonder why your letters never get a reply, when you tell me that you love me, I want to see you write it”. “Letters” is an old version of one-on-one tweeting, but with no character limit.

    • Arkthos@pawb.social · 4 points · 5 hours ago

      I doubt that. A lot of the poor writing quality comes down to choice. All the most powerful models are inherently trained to be bland, seek harmony with the user, and generally come across as kind of slimy in a typically corporate sort of way. This bleeds into the writing style pretty heavily.

      A model trained specifically for creative writing without such a focus would probably do better. We’ll see.

    • xthexder@l.sw0.com · 10 points · edited · 7 hours ago

      What improvements have there been in the previous 6 months? From what I’ve seen, the AI is still spewing the same 3/10 slop it has since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and clean training data has pretty much dried up, leaving little to train further models on.

      I just don’t see any world where scaling up the compute and power usage is going to suddenly improve the quality by orders of magnitude. By design, LLMs output the most statistically likely response, which almost by definition is the most average, bland response possible.
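
      The “most statistically likely response” point can be shown with a toy next-word sampler. This is a sketch only; the context string, candidate words, and probabilities are all made up for illustration and have nothing to do with any real model:

```python
import random

# Toy next-word distribution; words and probabilities invented for illustration
NEXT_WORD = {
    "the sky is": {"blue": 0.70, "grey": 0.24, "falling": 0.05, "a bruise": 0.01},
}

def pick(context: str, temperature: float) -> str:
    """Sample a continuation. At temperature 0 this degenerates to the argmax:
    always the single most probable (i.e. most average) word."""
    dist = NEXT_WORD[context]
    if temperature == 0:
        return max(dist, key=dist.get)
    # Higher temperature flattens the distribution, letting rarer words through
    weights = [p ** (1.0 / temperature) for p in dist.values()]
    return random.choices(list(dist), weights=weights, k=1)[0]

print(pick("the sky is", temperature=0))  # the safest, most likely continuation
```

      Raising the temperature trades coherence for surprise, but it only reweights what was already in the training distribution; it can’t produce the “original idea expressed poorly” that Chiang describes upthread.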

    • Skua@kbin.earth · 13 points · 10 hours ago

      Only if you assume that its performance will continue improving for a good while and (at least) linearly. The companies are really struggling to give their models more compute or more training data now and frankly it doesn’t seem like there have been any big strides for a while

      • kibiz0r@midwest.social · 12 points · 9 hours ago

        Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.

        Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.