• carl_dungeon@lemmy.world · 1 day ago

    Yeah I mean, that sounds reasonable. There is a big difference between generating all your game assets with AI and using Claude to refactor methods and write docs.

    • artyom@piefed.social · 1 day ago

      Big difference but I would argue both require disclosure because I will opt out of any of it. Add it to the long list of bullshit in the gaming industry I will not condone with my money.

      • sudoku@programming.dev · 1 day ago

        The problem is that it’s unenforceable. I bet that’s one of the reasons Valve is rephrasing.

        • cecilkorik@piefed.ca · 1 day ago

          Even pure AI art is unenforceable unfortunately. Like any form of cheating, some will be amateurish and obvious. But others will be sophisticated, skilled, and will simply blend into a gray area where you can’t easily define a line.

          How much “AI tool assistance” does it take before it’s called “AI generated content”? It’s totally arbitrary, and in many cases it’s going to be completely unenforceable.

          That doesn’t mean it has no value, but it does mean it’s not a silver bullet, and no amount of tweaking is going to make it one. We can quickly use it to take out the obvious slop, but the well-crafted examples will pass beneath anyone’s notice. When examples fall into the gray area, we’ll all bounce around inside it with arguments about who we believe and how much is normal and acceptable, until we eventually reach an arbitrary, per-game consensus, or maybe adjust the “rules” a little to accommodate them. But nothing really changes: we’ll probably be arguing about whether games contain “too much AI” for decades, and there will never be a clear solution or answer.

        • artyom@piefed.social · 1 day ago

          Sometimes it is, sometimes it’s not. Better to make the rule and enforce it where they can than to just forget about it. Maybe some honest devs will disclose it.

      • Drigo@sopuli.xyz · 1 day ago

        Just uninstall all games made after 2022 then, because I can assure you LLMs have been used for code in some capacity in every game. But I would argue there is a big difference between using AI for asset generation and using it to help read docs, get ideas for refactoring some code, etc.

    • MountingSuspicion@reddthat.com · 1 day ago

      Can I ask why you think that? AI has stolen code and art and is regurgitating both without any credit or attribution to the originators. What makes art different from code in your opinion?

    • kibiz0r@midwest.social · 1 day ago

      There is a big difference, and I’d argue the Claude refactoring is worse. Content was already pursuing the common denominator. But open source was a place where you could actually bring some nuance, examine things in detail, and build a shared understanding of deeper truths. But why bother with the icky social factors of working together to build something with people all around the world that can evolve and last for 10+ years, when you can boil a swimming pool to produce a half-baked one-off solution instead?

    • [deleted]@piefed.world · 1 day ago

      As long as it is used as a tool, with human refinement as part of the process, it would be comparable to CGI replacing background matte paintings and motion capture replacing manual manipulation of CGI to create movement. Gollum worked because of the blend of technologies that were replacing existing practices as a new approach, not as a cost-cutting measure.

      The problem is entirely about using the output directly as a replacement for humans to cut costs, not having another tool that can be used as part of a process. This is coming from someone who absolutely hates LLM and genAI slop, which is taking the horrible approach.

      • iamthetot@piefed.ca · edited · 1 day ago

        That is, in fact, not the problem in its entirety.

        The move to CGI didn’t require stealing the artwork of the matte painters. The move to mocap didn’t require raping the land of all its water. The move to either didn’t require the entire world supply of computing power, leaving it affordable only for the world’s richest. The move to either didn’t create a corporate circle jerk that damn near the whole world economy was propped up by.

        • [deleted]@piefed.world · 1 day ago

          So the thing is, the server farms for rendering Toy Story and ILM’s server farms for CGI were and are massive, but they were built over time and focused on a particular purpose. The same thing can be done with localized LLM and genAI models.

          The giants who are buying all the parts and choosing to strain the grid and add polluting energy sources in order to stuff absolutely everything into massive all-in-one models are a related but distinct issue, with different goals that encourage shitty practices. Yes, if a game company uses the tools of the LLM and genAI slop producers, that is a negative. If they use homegrown, or at least dedicated, models that are closer in scale to what is already used for CGI, then it isn’t automatically a negative.

          It is like advertising. A little is fine, because awareness is needed for people to know something exists. Massively invasive methods of jamming advertising into literally every moment of the day is a problem. ChatGPT and OpenAI are the latter and a problem. Or how Nestlé doing literally anything is horrible even though other companies do the same thing without being nearly as horrible.

          I would prefer to know what AI tools were used, so that I can avoid the ones using the AI slop machines that are a negative.

    • CarbonIceDragon@pawb.social · 1 day ago

      I mean, if we’re talking about efficiency tools for artists to use rather than straight-up automatically generating the assets (I’m not sure what those tools would be at the moment, but I haven’t been following what the AI industry actually releases for a while because it’s always seemed a bit useless), then the result should be an increase in the output of those artists rather than replacing them with a statistical amalgam.

      • NotMyOldRedditName@lemmy.world · edited · 8 hours ago

        It’s to cover things like code completions built into the development tooling, or asking Claude/ChatGPT questions.

        It covered everything before, and unless you’re a solo dev, not using anything at all across an entire game studio is pretty much impossible at this point.

        Even as a solo dev truly trying not to use any AI, just googling something and reading the AI summary at the top would count as having used AI and require disclosure. It was unenforceable, and everyone would lie or was lying to themselves.

        This is a much better policy and more enforceable.

    • dukemirage@lemmy.world · 1 day ago

      Tools like SpeedTree won’t get dropped because the efficiency gains are enormous and the downsides negligible.

    • FishFace@piefed.social · 22 hours ago

      If you don’t like automation in video games then I trust you’re opposed to IntelliSense-style refactoring, IDEs in general, and in fact that you work through every single instruction executed by the computer in your own head.

  • 58008@lemmy.world · 21 hours ago (9 up / 9 down)

    The Hottest of Takes:

    If we’re talking artistic credibility (as opposed to job security, plagiarism, and environmental impact), I want anti-AI people to uninstall their desktop graphics applications like Photoshop and GIMP. If you depend on buttons, value inputs, and algorithms to get the art you want out of the machine, as opposed to using an easel and scanning your work into the PC with only minimal touch-ups after the fact, then you’re no better than the person typing book-length prompts to get what they want. If you animate with key frames instead of hand-drawing every frame, you’re likewise just as credible (or not) as the prompt jockey. Hell, if you at any point use CTRL+Z, CTRL+C, CTRL+X or CTRL+V, you’re as artistically incredible as Paulie Promptnuts.

    Just to be clear, I don’t think any of those things. But if you’re dismissing art on the basis that AI was used at some stage in its development, you should be thinking those things.

    • alessandro@lemmy.caOP · edited · 41 minutes ago

      As I am replying, you’ve got 9 upvotes and 9 downvotes; looks like the perfect “storm” in which to add my hot take too.

      We’ve got psychotic people on both sides; with this degree of polarization, people usually lose perspective.

      AI is a technology, a human logical construct like math: AI runs on very advanced (probabilistic) math. Math is not the evil… but an actual evil does exist.

      There’s a difference between a LLM chatbot that runs on your local GPU… and one in the cloud.

      The chatbot on your GPU is “trapped” by your questions, your needs, your choices.

      Today the chatbot on the cloud will tell you that Elon Musk is a controversial person, tomorrow it will tell you Elon Musk is the savior of the Earth and you’re not worthy to kiss his feet.

      People who see absolute evil in AI are against you running your chatbot locally, on your PC.

      People enthusiastic about AI will accept any “gift” (or AI GF) Elon Musk will give them.

    • baronofclubs@lemmy.world · 9 hours ago

      I used autocorrect to write this sentence, which is a language model trained on copyrighted works. It just so happens to have been developed in the 2000s instead of the 2020s.
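      To illustrate the point: classic predictive text really is a (tiny) statistical language model. Here’s a toy sketch of the idea — a bigram counter over an invented corpus; the corpus and function names are illustrative, not any real autocorrect implementation:

```python
# Toy bigram "autocorrect": count word pairs in a training corpus and
# suggest the most frequent follower. 2000s-era predictive text worked
# on the same statistical principle, just with far larger corpora and
# smarter smoothing. The corpus below is made up for illustration.
from collections import Counter, defaultdict

corpus = "the quick brown fox jumps over the lazy dog the quick brown cat"

pairs = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    pairs[prev][nxt] += 1

def suggest(prev_word):
    """Return the word most often seen after prev_word, or None."""
    followers = pairs.get(prev_word)
    return followers.most_common(1)[0][0] if followers else None

print(suggest("quick"))  # "brown" follows "quick" twice in the corpus
```

      Whether you train on a phone’s keystroke corpus or on scraped books, the mechanism is the same — only the scale (and the provenance of the training text) changed between the 2000s and the 2020s.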

    • Buffy@libretechni.ca · 17 hours ago

      There are definitely lines being arbitrarily drawn around AI, and there’s no consistent logic to what any individual believes is an acceptable AI use case. Nobody has really sat down and fully documented every modern AI use and weighed the benefits and detriments each has on society. I think many of the outward haters of AI are likely just as ignorant as the blind defenders of it.