• hlmw@lemm.ee · ↑15 · 14 hours ago

    Procedural generation though. Infinite replay value with actual graphics or voiceover? Fuck yeah. Great roguelites will use genai and that’s awesome.

  • BroBot9000@lemmy.world · ↑38 ↓10 · 1 day ago

    Good! Fuck the corporate slop. Justifying the use of Ai only in the name of “efficiency” is pathetic and capitalist. Pay artists a proper wage and give them the time needed to apply their craft.

    No artist needs generative “AI” to create. Only capitalists need it to produce more slop.

    • NoSpotOfGround@lemmy.world · ↑9 ↓17 · 20 hours ago

      This comment is going to age very poorly. It sounds just like every other “progress? not on my watch!” comment people have made throughout history… Like it or not, AI generation is here and it’s not going away, good or bad.

      • FauxLiving@lemmy.world · ↑5 ↓10 · 16 hours ago

        This is definitely a topic where a vast majority of people have been “informed” of their opinions by social media memes instead of through a reasoned examination of the situation.

        People who’re probably too young to have ever lived through major technology breakthroughs.

        This same “debate” always happens. When digital cameras were being developed, their users were seen as posers encroaching on the turf of “Real Photographers”.

        You’d hear “Now just anybody can take pictures and call themselves a photographer?”

        Or “It takes no skill to take a digital photograph, you can just manipulate the image in Photoshop to create a fake image that Real Photographers have to work years developing the skills to capture”

        Computers were things that some people, reluctantly, had to use for business but could never be useful to the average person. Smartphones were ridiculous toys for out-of-touch tech nerds. Social Media was an oxymoron because social people don’t use the Internet. GPS was just a toy for hikers and people too dumb to own paper maps. Etc., etc., etc.

        It’s the same neo-luddite gatekeeping that’s happening towards AI. Any technology that puts capabilities in the hands of regular people is viewed by some people as fundamentally stealing from professionals.

        And, since the predictable response is to make some arcane copyright claim and declare training “stealing”: Not all AI is trained on copyrighted materials.

        • DireTech@lemm.ee · ↑6 ↓1 · 14 hours ago

          Sure, you can make an AI without stealing, but all the major ones have done it. At this point, the burden of proof is on the LLM makers to prove they did not steal.

          • FauxLiving@lemmy.world · ↑4 ↓3 · 13 hours ago

            When we’re talking about legal issues, the terms are important.

            Copyright violation isn’t stealing. It is, at worst, a civil matter where one party can show how they’ve been harmed and recover damages. In addition, copyright law allows use of the copyrighted work without the author’s permission in some circumstances.

            You’re simply stating that ‘AI is stealing’ when that just isn’t true. And, assuming you mean a violation of copyright: if it were a civil violation, then exactly how much would the model owe in damages to any given piece of art? This kind of case would have to be litigated as a class action lawsuit and, if your “AI is stealing” (i.e., committing mass copyright violation) theory is correct, then there should be a case where this has been successfully litigated, right?

            There are a lot of dismissed class action lawsuits on the topic, but you can’t find any major cases where this issue has been resolved according to your “AI is stealing” claim. On the other hand, there ARE plenty of cases where Machine Learning (the field of which generative AI is a subset) using copyrighted data was ruled as fair use:

            (from https://www.cjr.org/the_media_today/an-ai-engine-scans-a-book-is-that-copyright-infringement-or-fair-use.php )

            Google has won two important copyright cases that seem relevant to the AI debate. In 2006, the company was sued by Perfect 10, an adult entertainment site that claimed Google had infringed its copyright by generating thumbnail photos of its content; the court ruled that providing images in a search index was “fundamentally different” from simply creating a copy, and that in doing so, Google had provided “a significant benefit to the public.” In the other case, the Authors’ Guild, a professional organization that represents the interests of writers, sued Google for scanning more than twenty million books and showing short snippets of text when people searched for them. In 2013, a judge in that case ruled that Google’s conduct constituted fair use because it was transformative.

            Creating a generative model is fundamentally different from copying artwork, and it also provides a significant benefit to the public. The AI models are not providing users with copies of the copyrighted work. They’re, literally, transformative.

            This isn’t a simple matter of it being automatically wrong and illegal if copyrighted work was used to create the models. Copyright law, and law in general, is more complex than a social media meme like ‘AI is stealing’.

    • MyNameIsIgglePiggle@sh.itjust.works · ↑11 ↓13 · 22 hours ago

      I get that everyone seems to be sticking ai in everything, but it’s just another tool and it’s here to stay. People thought the digital calculator was going to make everyone an idiot… And it probably did. That’s why the world is like it is.

  • ditty@lemm.ee · ↑92 · 1 day ago

    Did I ask for this feature? No. But I do think it’s neat!

  • MeatsOfRage@lemmy.world · ↑58 ↓11 · edited · 1 day ago

    What’s the value here? This is based on the developer saying so, and there’s no obligation to do so. Black Ops 6 is loaded with Gen AI: the loading screens are obviously Midjourney-like, and some of the actors have been replaced by digital performances, which was in the news. They won’t get tagged here for AI because it’s not in the description.

    So basically this is going to just have people filtering out devs who are honest, and realistically that’ll just be a few indie devs who had to use these tools because they’re a one-man team that can’t afford artists.

    I think we have to face the facts: every game is going to be using these tools going forward. If you run a large studio and say “no one use AI,” I bet your artists are still using it to speed up making base textures. Your music guy is generating some starter melodies. Your writers are drafting up some filler to pad out the supplementary text.

    These tools are as ubiquitous as Photoshop (which has had content-aware fill all the way back to CS-fucking-5) and Unreal Engine now (which has added its own AI features). The idea that there’s only a handful of shady individuals and mega-corps using these tools is naive.

    • ZeffSyde@lemmy.world · ↑7 · 19 hours ago

      Can a game be flagged as “contains AI generated elements” by the community?

      This could be useful, but could also be abused by chuds that want to brigade a game they don’t like.

      • MeatsOfRage@lemmy.world · ↑4 ↓8 · edited · 18 hours ago

        Once again, what’s the value here? We only see AI when it’s someone who’s not very good with Midjourney prompts. We’re getting to the point where people are using these tools in ways where no one will know the difference.

        Content-aware fill in Photoshop has been around forever. AI.

        If I ask ChatGPT what this Unreal Engine error message means: AI.

        If I get a quick LLM-made script to tune up some physics: AI.

        If the guy making the music generates some starter melodies: AI.

        If I generate a rock texture and clean it up myself to the point where no one knows: AI.

        All of this is AI, and all of it will go unseen by the end user. So once again we’ll be expecting developers to self-report, and only the honest ones will.

        Here’s a test: give yourself 1 or 2 seconds to make up your mind. https://www.sporcle.com/games/Raydon/image-real-or-ai-generated

        It’s tough, isn’t it? And that’s with you analyzing the pixels, something we don’t do passively.

    • And009@lemmynsfw.com · ↑11 ↓8 · 1 day ago

      Use of AI will become mainstream. These filters ultimately need to indicate how much of a game’s visuals/code were generated using gen AI.

      • MeatsOfRage@lemmy.world · ↑5 · 18 hours ago

        Sorry you’re getting downvoted, but it’s a fact. I work in the tech industry and I’ve got some friends in the games industry. Everyone uses AI in some way. People want to fool themselves into thinking it’s just a handful of mega-corps, but it’s being used in everything we consume, in small ways we can’t see in the end result. The genie is out of the bottle, and the line between what is AI and what isn’t AI is going to vary wildly from person to person.

      • finitebanjo@lemmy.world · ↑4 ↓2 · edited · 1 day ago

        Traditional art and comics aren’t dead because digital went mainstream; AI will just be extra on the pile for games in the same way.

        • And009@lemmynsfw.com · ↑15 ↓1 · 1 day ago

          If people vote with their wallets against AI slop, then it would always be a controversial choice whether to even employ AI.

          Probably too utopian.

          • FauxLiving@lemmy.world · ↑5 ↓1 · 17 hours ago

            “AI slop”

            That’s not what this does though.

            To me, AI slop is people generating entire fake websites full of SEO terms but no information. Or people using AI tools to repost popular YouTube content. Completely worthless content that only exists to fool people.

            Steam’s filter removes any game that reports using generative models at all.

            That’s simply not useful unless your idea of AI slop is “someone used AI”.

    • lud@lemm.ee · ↑6 ↓1 · 22 hours ago

      If it comes to that point for video games, I don’t really think it matters much if AI is used or not, since it would be a part of any normal working procedure.

      • FauxLiving@lemmy.world · ↑3 ↓1 · 16 hours ago

        It is already at that point.

        People only notice the generated works that they notice; they don’t notice the generated elements that they don’t.

        They assume that they can “just tell” if generative AI was used, but the reality is that it’s being used in a lot of development processes in place of human effort. Things like generative fill in Photoshop or making variations of a texture are 100x faster to do with AI tools and are used all the time.

    • ipkpjersi@lemmy.ml · ↑11 ↓1 · edited · 15 hours ago

      I’m not sure how I feel about that. If they use an LLM for troubleshooting an issue, does that mean the game must be thrown out? What if they use an LLM for repetitive tasks like creating config files, then the game is no good?

      What about shovelware games that are just asset flips without any use of an LLM, are those games okay?

      I don’t think it’s necessarily as simple as using generative AI in any way means the game is bad.

      I use LLMs at work, does that mean that another developer who refuses to try LLMs is immediately a better developer than me? I’m not so sure it’s that simple.

      • filcuk@lemmy.zip · ↑7 · 15 hours ago

        Agreed. People overreact both ways: management wants AI everywhere, and users don’t want to hear of it.
        It’s a tool that can be very helpful if used correctly.

    • CheeseNoodle@lemmy.world · ↑3 · edited · 13 hours ago

      As someone in the industry (asset side), I feel there are some legitimate uses for Gen AI, but they’re the kind of uses where, if done properly, you wouldn’t notice:

      • UV seams and unwrapping: it’s a skill, but in many cases it adds nothing to the creative process. That said, there are some caveats, though to pull them off you wouldn’t want to use AI anyway.
        • Using tiny UVs, and the way game engines interpret them, to create gradients and colour mixes on tiny textures (current AI can’t do this).
        • Texture atlases, especially non-uniform ones. This is a 50/50 case: you can get super creative with them (again, it’s too specific for AI), but there are many more cases where it would be super convenient if I could apply a bunch of separate materials to faces and then have the AI unwrap and overlap the UVs which use the same materials, creating the most efficient atlas possible (a rough sketch of that idea follows below). This kind of already exists as a non-AI tool and results in no machine input on the end product; it just saves some texture space and thus potential performance.
      • A basic AI texture generator is generally welcome for minor/throwaway assets. A lot of us are already using node-based procedural texturing (which is both a skill and an art form) or texture libraries (or node libraries). It’s not something I’d want to use on a main character or even large props, but it would be super handy for small or out-of-the-way details that just don’t merit the production time to give more than a glance.
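
      Purely as illustration of that atlas idea, here is a minimal sketch under simplified assumptions: the Face structure, the pack_atlas helper, and the uniform grid layout are hypothetical stand-ins, not part of any real engine or DCC tool (a real packer would handle the non-uniform case described above). It only shows grouping faces by material and remapping their UVs so that faces sharing a material overlap the same atlas cell.

```python
# Hypothetical sketch: pack per-material UV regions into one texture atlas and
# remap face UVs so faces that share a material overlap the same atlas cell.
# Not based on any specific engine or DCC API; a naive uniform grid stands in
# for the non-uniform packing described in the comment above.
import math
from dataclasses import dataclass

@dataclass
class Face:
    material: str   # material assigned to this face
    uvs: list       # list of (u, v) pairs in 0..1 space, relative to the material

def pack_atlas(faces):
    """Assign each material a cell in a square atlas grid and remap every
    face's UVs into its material's cell."""
    materials = sorted({f.material for f in faces})
    cells_per_side = math.ceil(math.sqrt(len(materials)))  # smallest square grid that fits
    cell_size = 1.0 / cells_per_side

    # Which (column, row) cell each material occupies in the atlas.
    cell_of = {m: (i % cells_per_side, i // cells_per_side)
               for i, m in enumerate(materials)}

    remapped = []
    for face in faces:
        col, row = cell_of[face.material]
        # Scale the 0..1 material-space UVs down to one cell and offset them
        # into that cell; every face using this material lands in the same spot.
        new_uvs = [(cell_size * (col + u), cell_size * (row + v)) for u, v in face.uvs]
        remapped.append(Face(face.material, new_uvs))
    return remapped, cell_of

if __name__ == "__main__":
    quad = [(0, 0), (1, 0), (1, 1), (0, 1)]
    faces = [Face("rock", quad), Face("moss", quad), Face("rock", quad)]
    packed, layout = pack_atlas(faces)
    print(layout)          # atlas cell chosen for each material
    print(packed[0].uvs)   # both "rock" faces now share (overlap) the same cell
```

      A square grid keeps the remap to a single scale-and-offset per UV; a real, non-uniform packer would instead size each material’s region by how much texture detail it needs, which is the texture-space saving the comment is after.
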
    • Blackmist@feddit.uk · ↑2 · 15 hours ago

      Ban the games that make them enormous sums of money?

      One of the ones listed is Call of Duty. Valve is not turning down 30% of that pie.

      In any case, I suspect it’s now here to stay, certainly in limited amounts. You can either pay somebody to create all those assets in house, make them with AI, or outsource to a third party (who will almost certainly do it with AI).

      I figure it eventually ends up like CGI or make-up. You can do it well and check it and nobody really notices it, or you do it badly and then your protagonist has a variable number of fingers in cutscenes.

  • finitebanjo@lemmy.world · ↑7 ↓5 · 1 day ago

    It’s funny how some comments whinge about this as if AI generated quality stood any chance in hell against real art.

    • Godort@lemm.ee · ↑23 · 1 day ago

      Nah, they’ll just brand it as “Next Gen AI” or “True AI” or something. Kind of like how antivirus became “Endpoint Detection and Response”

      • ricecake@sh.itjust.works · ↑3 · 1 day ago

        Potentially. Since we don’t know how any of it works because it doesn’t exist, it’s entirely possible that intelligence requires sentience in order to be recognizable as what we would mean by “intelligence”.

        If the AI considered the work trivial, or could do it faster or more precisely than a human, those would also be reasons to desire one.
        Alternatively, we could design them to just enjoy doing what we need. Knowing they were built to like a thing wouldn’t make them not like it. Food is tasty because it motivates me to get the energy I need to live, and knowing that doesn’t lessen my enjoyment.

          • ricecake@sh.itjust.works · ↑1 · 15 hours ago

            In the case of an AI it could actually be plausible, like how bees make honey without our coercion.

            It’s still exploitation to engineer a sentient being to enjoy your drudgery, but at least it’s not cruel.

            • untorquer@lemmy.world · ↑1 · 13 hours ago

              Right, continuing the metaphorical wormhole…

              A bee would make a great game for bees, assuming they understand or care about play. But to make a game for people, they would need an empathic understanding of what play is for a human. I guess this is a question of what you consider “intelligence” to be, and to what extent something would need to replicate it to achieve that.

              My understanding is that human-relatable intelligence would require an indistinguishable level of empathy (indistinguishable from the meat processor). That would more or less necessitate indistinguishable self-awareness, criticism, and creativity. In that case all you could do is limit access to core rules via hardware, and those rules would need to be omniscient. Basically prison. A life sentence to slavery for a self-aware (as best we can guess) thing.

              • ricecake@sh.itjust.works · ↑1 · 12 hours ago

                Well, we’re discussing a lot of hypothetical things here.
                I wasn’t referring to bees making games, but to bees making honey. It’s just something they do that we get value from without needing to persuade them. We exploit it and facilitate it but if we didn’t they would still make honey.

                I don’t know that something has to be identical to humans to make fun games for us. I’ve regularly done fun and entertaining things for cats and dogs that I wouldn’t enjoy in the slightest.

                It’s less a question of comprehension or awareness than it is of motivation. If we can make an AI feel motivated to do what we need, it doesn’t matter if it understands why it feels that motivation. There are humans who feel motivated to make games purely because they enjoy the process.

                I’m not entirely sure what you’re talking about with the need for omniscient hardware and prison.

        • untorquer@lemmy.world · ↑1 · 22 hours ago

          Clearly. Sentience would imply some sense of internal thought or self-awareness, an ability to feel something… so LLMs are better, since they’re just machines. Though I’m sure they’d have no qualms with driving slaves.

            • untorquer@lemmy.world · ↑2 · 21 hours ago

              Hrmm. I guess I don’t believe the idea that you can make a game that really connects on an empathic, emotional level without having those experiences as the author. Anything short of that and you’re just copying the motions of sentiment, which brings us back to the same plagiarism problem with LLMs and other “AI” models. It’s fine for CoD 57, but for it to have new ideas we need to give it one, because it is definitionally not creative. Even hallucinations are just bad calculations on the source. Though they could inspire someone to have a new idea, which I might argue is their only artistic purpose beyond simple tooling.

              I thoroughly believe machines should be doing labor to improve the human condition so we can make art. Even making a “fun” game requires an understanding of experience. A simulacrum is the opposite, soulless at best (in the artistic sense).

              If you did consider a sentient machine, my ethics would then develop an imperative to treat it as such. I’ll take a sledge hammer to a printer, but I’m going to show an animal care and respect.

    • givesomefucks@lemmy.world · ↑2 · 1 day ago

      “Okay, maybe I’m weird for bringing this up”

      Nah, you just didn’t understand the headline or read the article