• 49 Posts
  • 602 Comments
Joined 2 years ago
Cake day: July 5th, 2023


  • There’s a solid foundation, but there just isn’t the userbase. For example, I still have to go to Reddit if I want to discuss Dota, CS2, or War Thunder theorycrafting. These are massive, internationally popular games with complex mechanics and shifting metas that encourage constant discussion, and yet there is barely a post a month across all three communities combined, let alone meaningful discussion. If we can’t even manage a baseline pulse for popular, dedicated niches that appeal to the Fediverse’s nerdy userbase, what can we offer regular users looking for smaller niches, let alone local topics?

    Don’t get me wrong, I hope the Fediverse succeeds. I am still here, contributing to the best of my ability, but not everyone wants to try to kickstart a new platform in their spare time. For most users, Lemmy will need to grow tenfold before it can even compete with Reddit, let alone replace it.







  • Using your clones example, the Slay the Spire “clones” that give roguelike deckbuilders a bad name aren’t Inscryption or Monster Train or Balatro. It’s things like Across the Obelisk and Wildfrost, which are good but fail to capture what makes the others great, and the numerous low-effort copies you’ve likely never heard of, whose developers saw the genre as an easy way to make a good game without understanding it. It’s not that roguelike deckbuilders are bad, obviously; it’s that lazy or thoughtless use of their mechanics is. A game isn’t one mechanic, and trying to treat it as such just results in a messy or bad game.


  • It’s a crutch because it’s expected to hold the game up, rather than the game supporting its own weight. In your bullet hell example, dodging isn’t a crutch; it’s the foundational mechanic. A better example would be a slot machine system (something that is near-inherently engaging) being added to a bullet hell game, not because it fits but because it’s fun independently and helps distract from the fact that no effort went into the core gameplay. The mechanic isn’t a crutch; its inclusion as a tacked-on addition is.


  • The mechanic itself isn’t the issue, but how it is implemented.

    It depends on how (and where) it’s implemented, which is his point. It needs to be woven into the combat system as it is in FromSoft games, Batman, Ultrakill, or Cuphead, not tacked on because it’s easy or popular. Each of those uses parrying in a different way to enhance its combat. On the other hand, if you take these mechanics without the greater context or an understanding of why they work, they tend to stand out as bad, or go unused. Doom Eternal is an example that immediately comes to mind. The whole game is about fast-paced combat, with a plethora of new mobility mechanics, that is, until you encounter one of the enemies you need to parry. Then the game comes to a grinding halt while you wait for the enemy to act so you can react, completely at odds with the rage-fueled persona and the mobility focus of every other mechanic. Compare that to Ultrakill, where parrying isn’t just a reactive way to mitigate damage; it’s a situational attack that lets you keep moving and keep up your carnage.

    Game mechanics work best when they’re cohesive. Parrying, due to its simplicity, can be tacked on easily, breaking that cohesion if it isn’t given the same weight as the rest of the mechanics.


  • My point of contention is that the arguments you’re using are flawed, not your intentions. OpenAI, Meta, Disney, etc. are in the wrong because they pirate/freeboot and infringe on independent artists’ licenses. It’s not their use of the technology or the derivative nature of the works it produces that is the problem: making AI the face of the issue moves the blame away from the companies, and allows them to continue to pirate/freeboot/plagiarize (or steal, as you define it) from artists.

    Yes, part of my point is that capitalism is bad, but that’s further up the chain than what I was arguing. My point is that copyright law, and more importantly its implementation and enforcement, is broken. Basically all of your issues originate not with AI but with the fact that independent artists have no recourse when their copyrights are violated. AI wouldn’t be an issue if AI companies actually paid artists for their work, and artists could sue companies who infringe on their rights. The problem is that artists are being exploited and have no recourse.

    Using an allegory to hopefully make my point a bit clearer: imagine you have a shop of weavers (artists). The company running the shop brings in a loom (AI), starts chaining its workers to it, and claims it’s an Automatic Weaver™ (pirating and violating artists’ rights). The problem isn’t the loom, and blaming it shifts blame away from whoever decided to enslave the workers. Banning the loom doesn’t prevent the shop from just chaining the workers to their desks, as was often done in the past, nor does it prevent them from bringing in Automatic Potters™. If you want to stop this, even ignoring the larger spectre of capitalism, it should be the slavery that is outlawed (already done) and punished (not done), not the use of looms.

    If you are trying to fix/stop the current state of AI and prevent artists from being exploited by massive companies in this way, banning AI will only slow it down, while limiting potentially useful technology (that artists should be paid for). Rather than tackle one of the end results of the problem, you need to target it closer to its root: the fact that large companies can freely pirate, freeboot, and plagiarize smaller artists.


  • It isn’t current AI voice tech that was the issue; it was the potential of future AI they were worried about. AI voices as they are now are of similar quality to pulling someone off the street and putting them in front of a mid-range mic. If you care about quality at all (barring massive changes to how AI tech functions), you’ll always need a human.

    And to be clear, what about AI makes it the problem, rather than copyright? If I can use a voice synthesizer to replicate an actor’s voice, why is that fine and AI not? Shouldn’t the reproduction of an actor’s voice be right or wrong based on why it’s done and what its implications are, rather than because of the technology used to replicate it?

    Edit: And to be clear, just because a company can use it as an excuse to lower wages doesn’t mean it’s a viable alternative to hiring workers. Claims that they could replace their workers with AI are just the usual capitalist bullshit excuses to exploit their workers.


  • Big movie studios will use it to generate parts (and eventually all) of a movie. They can use this as leverage to pay the artists less and hire fewer of them. Animators, actors, voice actors.

    Only if it’s profitable, and given that AI output is inherently very limited, it won’t be. AI can only produce lower-quality, derivative works. Some outputs might not be easy to distinguish in isolation, but that only holds on a small scale.

    If a movie studio pirated work and used it in a film, that’s against copyright and we could sue them under current law.
    But if they are paying openAI for a service, and it uses copyrighted material, since openAI did the stealing and not the studio then it’s not clear if we can sue the studio.

    You can sue the studio, in the same way you would sue the studio if an artist working there (or even someone directing artists) created something that violates copyright, even by accident. If they publish a work that infringes on copyright, you can sue them.

    Seems like it’s being argued that because of the layer of abstraction that is created when large quantities of media is used, rather than an individual’s work, that it’s suddenly a victimless crime.

    By that logic, anything that takes inspiration, no matter how broad, or uses another’s work in any way, no matter how transformative, should be prevented from making its own work. That is my point. AI is just an algorithm that takes thousands of images and blends them together. It isn’t evil, any more than a paintbrush is. What is evil is piracy for commercial use and non-transformative copyright infringement. Both of these are already illegal, but artists can’t do anything about it, not because companies haven’t broken the law, but because an independent author trying to take, for example, Meta to court is going to bankrupt themselves.

    Edit: Also notable for companies deciding whether to use AI is the fact that even transformative and “original” AI work cannot be copyrighted. If Disney makes a movie that’s largely AI, we can just share it freely without paying them.


  • No, it is theft. They use an artist’s work to make an image they would otherwise pay the artist to make (a worse version, but still). And given how I’ve seen an image with a deformed patreon logo in the corner, they didn’t pay what they should have for the images. They stole a commission.

    But were they (the AI users) going to pay for the content? I have never paid for a Patreon, given that I don’t really have any disposable income. Why would I start, just because AI exists? Just because a sale might be made in some contexts doesn’t mean one has been made.

    And it is copyright violation. There have been successful lawsuits over much less than a direct image of RDJ in the iron man suit with the infinity stones on his hand.

    It’s a copyright violation when material is made that violates existing copyright. It isn’t copyright infringement to take data from media, or to create derivative works.

    And if they won’t pay an artist’s rates, there’s no way they’d pay whatever Disney would charge them

    Disney has lawyers. Small artists don’t.

    AI is a nazi-built, kitten blood-powered puppy kicking machine built from stolen ambulance parts. Even if stealing those ambulance parts is a lesser sin than killing those kittens, it’s still a problem that needs to be fixed. Of course, AI will never be good, so we need to get rid of the whole damn thing.

    Banning AI doesn’t stop the Nazis from running the government or influencing the populace, it doesn’t stop them from burning the planet, and it doesn’t stop them from pirating work and otherwise exploiting artists. Hell, politicians have been doing all of these things without repercussions for a century. If you want the rich and powerful to stop pirating and freebooting artists’ work, maybe the first step is to ban that (or rather, enforce the existing bans) rather than a technology two steps removed?


  • AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we’ll call it by the closest equivalent: theft.

    I’d argue it’s much closer to piracy or freebooting. Generally, its use doesn’t hurt artists, seeing as a random user isn’t going to spend hundreds or thousands to hire a talented artist to create shitposts for them. That doesn’t necessarily make it okay, but it also doesn’t directly hurt anyone. In cases of significant commercial use, or copyright infringement, I’d argue it’s closer to freebooting: copying another’s work and using it for revenue without technically, directly damaging the original. Both of these are crimes, but both are closer comparisons than, and less severe than, actual theft, seeing as the artist loses nothing.

    Also, someone did an experiment and typed “movie screenshot” into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.

    Copyrighted material is fed into an AI as part of how it works. This doesn’t mean that anything that comes out of it is or is not copyrighted. Copyrighted material is also used in Photoshop, for example, but as long as you don’t use Photoshop to infringe on someone else’s copyright, there isn’t anything intrinsically wrong with Photoshop’s output.

    Now, if your complaint is that much of the training data is pirated or infringes on the licensing it’s released under, that’s another matter. Endgame isn’t a great example, given that it can likely be bought with standard copyright limitations, and ignoring that, it’s entirely possible Disney has been paid for their data. We do know huge amounts of smaller artists have had their work pirated to train AI, though, and because of the broken nature of our copyright system, they have no recourse, not through any fault of AI, but because of corrupt, protectionist governments.

    All that said, there are still plenty of reasons to hate AI (and especially AI companies), but I don’t think the derivative nature of the work is the primary issue. Not when they’re burning down the planet, flooding our media with propaganda, and bribing governments, just to create derivative, acceptable-at-best “art”. Saying AI is the problem is an oversimplification: we can’t just ban AI to solve this. Instead, we need to address the problematic nature of our copyright laws, legal system, and governments.