AI is just a marketing term; there’s nothing intelligent about it. It’s simply Large Language Models, statistical systems that predict what should go next. It’s like asking the prediction bar on your phone keyboard to write a story.
I think the reason so many AI bros are conservative is that conservatives have historically had really bad taste in art/media, so they see the drivel AI creates and think, “oh wow, it looks just like what the artists make,” not realizing that they don’t have the eye to see what it’s missing.
I like the way Ted Chiang puts it:
Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.
There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.
I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.
The discourse pushing AI media seems to start from the assumption that I consume media just to have colors and words and sounds enter my face holes. In fact, I consume art and media because I like hearing, seeing, and reading about how other humans experience the same world I do. It’s a form of communication. I like the product, but also the process of people trying to capture the bonkers, ineffable experience we all seem to be sharing in ways I would never think of, but can instantly verify.
What’s funny is, due to the nature of media, it’s kind of impossible to not communicate something, even if the artwork itself is empty. When I see AI media I see the communication of a mind that doesn’t know or give a shit about any of this. So in their attempt to make filler they are in fact making art about how inarticulate they are. It’s unintentional corporate Dadaism.
The people pushing AI don’t like hearing, seeing, and reading about how other humans experience the world. They actually do just want flashing colors and sounds poured into their face holes. They’re basically incapable of understanding art.
Yes, this is it right here. The whole point of art is communication and connection with another human being.
Not even necessarily a human being! I’d appreciate the fuck out of art if any species made it. But there must be more than uncaring, unfeeling, probabilistic interpretation of input data.
Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly
I like this a lot. I’m going to thieve it.
Tangentially related: the easiest way to come up with a unique and cool idea is to come up with a unique and dumb idea (which is way easier) and then work on it until it becomes cool. (Think how dumb some popular franchises’ concepts are if you take the raw idea out of context.)
Writer who has toyed with AI here: yeah, AI writing sucks. It’s consensus-driven and bland, never goes into unexpected territory, or completely fails to understand human nature.
So we’d better stop calling AI “intelligence”. It’s text-prediction machine learning on steroids, nothing more, and the fact that we’re still calling it “intelligence” shows how gullible we all are.
It’s just another speculative bubble from the tech bros, as crypto was, except this time the tech bros have made their Nazi coming out.
I remember reading a longer post on Lemmy. The person was describing their slow realization that the political beliefs they were raised with were leading down a dark path. It was a process that took many years, and the story was full of little moments where cracks in their world view widened and the seed of doubt grew.
And someone who was bored/overwhelmed with having to read a post over three sentences long fed the story into AI to make a short summary. They then posted that summary as a “fixed your post, bro” moment. So basically, all the humanity was removed. Reminds me of that famous “If the Gettysburg Address were a PowerPoint” https://norvig.com/Gettysburg/
That’s really sad.
I’ve used AI to help clean up my sentence structure for copy, but if I am not super explicit with it to not rewrite what I wrote, it will do as you said and take the human element out of it.
Sounds like an interesting read, got a link to said post?
Tons of shit games are going to have lots of dialogue written by AI. It’s very likely that those games would have had shit dialogue anyway.
Sure, but human-written shit still had that human touch. It could be unintentionally funny, it could be a mixed bag that reaches unexpected heights at times. AI writing is just the bland kind of bad, not the interesting kind of bad.
All your base are belong to us.
I should have been the one to fill your dark soul with liiiiight!
Great point. There’s no opportunity for “so bad it’s good”. The Room wouldn’t have been a thing if Tommy had used AI.
You’re absolutely right, Lisa!
It would probably have been less shit though.
It would probably just have been less dialogue
They already said less shit.
Which would be totally fine.
Eh, we’re talking about the bottom of the barrel here. I’m thinking there will be fewer typos, but also an occasional “as an LLM” slip-up, so about the same quality as before.
Nah, can’t agree. I had postponed a few ideas for years, and was able to vibe them out in a week of evenings; now I have something usable. 70% of it was vibed, I just had to fix stupid stuff that was partially my own queries’ fault.
That’s the difference between an amateur writer and a professional writer.
You can make it stylized dialogue, but that’s just surface mannerisms. Underneath it’s still the same bland AI.
Guy says robot can’t replace him. News at 11.
I mean it can: 10% of the quality and units sold at 1% of the cost increases your profits by 10x.
Implying people are happy to buy the shit, which isn’t likely, especially in a competitive environment.
People buy AAA games all the time. Look at Starfield. Garbage game, still sold well.
Starfield is estimated to have sold 3 million copies; Baldur’s Gate 3, 15 million. Microsoft/Bethesda’s marketing budget makes a difference, but not being garbage makes a much bigger difference.
Well yeah, I’m not going to argue that a well made game that respects the player isn’t going to do well. But that doesn’t matter to the publishers and their shareholders when they can pump out AI slop garbage year after year and still have people that drink it up. Just look at the yearly shooters and sports games, they sell enough.
Besides, what happens when this sort of slop has been normalised? Look at the mobile market, no one bats an eye at the intensely predatory microtransactions, and you’ll even find people defending things like gacha games.
There was a time when people scoffed at the notion of paying ~$2 for some shitty cosmetics, but now people don’t even blink at the idea. Hell, it’s downright cheap in some cases. The AAA industry just has to slop things up for long enough for people to stop caring, because they will stop caring and then continue to shell out for the dubious privilege of guzzling their mediocre, uninspiring garbage.
Get your slop 'ere! Fresh from the data center!
I can see how it could be useful, or even mandatory, in future RPGs. It can generate a framework for a real writer, with extremely large amounts of logical branching, a billion times faster. Then you go over the top of it and use the framework as concepts to use or revise. This streamlines the process, unifies the creative vision, and allows a game so large (without procedural generation) that it would have taken a team 10 years, or never been made at all, to be done in 2.
And you end up with… ta-da, the same old things, just rebranded. Very creative (not).
This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.
If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.
It’s assuming the AI output isn’t very good. It assumes it can create a framework that still needs the actual writers, but now they don’t have to come up with 100% of the framework and can instead work on the actual content only. Storyboarding and frameworking is a hodgepodge of nonsense with humans anyway. The goal is to achieve non-linear scaling, not to replace quality writers or have the final product be AI-written.
This sounds like it takes away a huge amount of creative freedom from the writers if the AI is specifying the framework. It’d be like letting the AI write the plot, but then having real writers fill in details along the way, which sounds like a good way to have the story go nowhere interesting.
I’m not a writer, but if I were to apply this strategy to programming, which I am familiar with, it’d be like letting the AI decide what all the features are, and then I’d have to go and build them. Considering more than half my job is stuff other than actually writing code, this seems overly reductive, and underestimates how much human experience matters in deciding a framework and direction.
Even in programming there are common feature frameworks. Having a system enumerate them based on a unified design vision from a single source architect rather than 50 different design ideas duct taped together could help a lot. I’ve seen some horrendous systems where you can tell a bunch of totally separate visions were frankenstein’d together, and the same happens in games where you can tell different groups wrote different sections.
That means in about 6 months or so the AI content quality will be about an 8/10. The processors spread machine “learning” incredibly fast. Some might even say exponentially fast. Pretty soon it’ll be like that old song “If you wonder why your letters never get a reply, when you tell me that you love me, I want to see you write it”. “Letters” is an old version of one-on-one tweeting, but with no character limit.
Wake me up when that happens. Like literally, @mention me somewhere
I doubt that. A lot of the poor writing quality comes down to choice. All the most powerful models are inherently trained to be bland, seek harmony with the user, and generally come across as kind of slimy in a typically corporate sort of way. This bleeds into the writing style pretty heavily.
A model trained specifically for creative writing without such a focus would probably do better. We’ll see.
What improvements have there been in the previous 6 months? From what I’ve seen, AI is still spewing the same 3/10 slop it has since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and the supply of clean training data has pretty much dried up, leaving little to train new models on.
I just don’t see any world where scaling up the compute and power usage is going to suddenly improve the quality by orders of magnitude. By design, LLMs output the most statistically likely response, which almost by definition is going to be the most average, bland response possible.
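That “most statistically likely response” point can be illustrated with a toy sketch. This is nothing like a real transformer; it’s just a bigram counter with greedy decoding (the corpus and names are made up), but it shows how always picking the most frequent continuation regresses to the most average text:

```python
# Toy "language model": count next-word frequencies, then greedily
# pick the most common continuation at each step (greedy decoding).
from collections import Counter, defaultdict

corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat ate the fish .").split()

# Tally how often each word follows each other word (bigram counts).
bigrams = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    bigrams[a][b] += 1

def generate(start, n=5):
    out = [start]
    for _ in range(n):
        nxt = bigrams[out[-1]]
        if not nxt:
            break
        # Greedy choice: the single most likely next word, i.e. the
        # most "average" continuation the counts allow.
        out.append(nxt.most_common(1)[0][0])
    return " ".join(out)

print(generate("the", 4))  # always the same, maximally typical output
```

Real LLMs soften this with temperature sampling, but the pull toward the high-probability center of the training distribution is the same mechanism.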
I love how you idiots think this tech hasn’t already hit its ceiling. It’s been functionally stagnant for some time now.
Hey. I’m just one idiot. Who else are you talking about?
You’re not just one, you’re one of many. All saying the same shit.
Lots and lots of people have told me that.
That tracks.
My hot take: it will never get to even a 6/10. I bet it will just spit out 3/10 slop faster and faster.
Only if you assume that its performance will continue improving for a good while, and (at least) linearly. The companies are really struggling to give their models more compute or more training data now, and frankly it doesn’t seem like there have been any big strides in a while.
Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.
Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.
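The “linear gains need exponential resources” point is easy to eyeball with made-up numbers. Assuming loss follows a power law L(N) = a·N^(−b), as scaling-law papers roughly suggest (the constants below are invented, real fits differ):

```python
# Toy illustration: under an assumed power-law loss curve
# L(N) = a * N**(-b), each equal-sized drop in loss demands a
# rapidly growing multiple of data/compute N.
a, b = 10.0, 0.1  # invented constants, for illustration only

def tokens_needed(loss):
    # Invert L = a * N**(-b)  ->  N = (a / loss) ** (1 / b)
    return (a / loss) ** (1 / b)

prev = None
for loss in [5.0, 4.0, 3.0, 2.0]:
    n = tokens_needed(loss)
    note = "" if prev is None else f"  (~{n / prev:.0f}x more than the last step)"
    print(f"loss {loss}: {n:.2e} tokens{note}")
    prev = n
```

Each one-point drop in loss costs a bigger multiple of resources than the last, which is the shape behind the claim above.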