Mod+ AI-generated content in RPGs

Generative AI is going to be a substantial part of all our work lives, I think. I just hope that it gets sensibly regulated - you should be required to license your training sets, for example, and it should be completely illegal to use any kind of machine learning model to make decisions about, e.g. loan approvals and the like. We shall see.

More on topic: I think it's clear that companies will start allowing AI. Apart from anything else, it's going to become part of the normal workflow of most digital artists. Maybe not the prompted image generation side of things so much, but getting AI to fill in the background when you move something, or to expand the canvas, and so on, is just going to become completely normal, and the line between "generative art" and "using the AI to clean up a bit" is going to be completely blurred.
Here's something I did about 9 months ago using Stable Diffusion. I wanted this clip from a YouTube video to be a little less wide so it would fit more nicely onto a web page. I just cut a chunk out of it, put the two sides together, and had the AI model touch it up. It's not perfect, but I think it's passable. This is definitely on the "clean it up a bit" end of the spectrum, but you're 100% right that this is just going to be part of the standard toolkit at some point.

before_and_after.png
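For anyone curious, here's a minimal sketch of how that kind of seam touch-up can be done with Stable Diffusion inpainting via the Hugging Face diffusers library. The model ID, file names, mask, and prompt are illustrative assumptions rather than the exact settings used here:

```python
# Minimal Stable Diffusion inpainting sketch (model ID and file names are assumed).
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Load an inpainting checkpoint.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# The stitched frame, plus a mask image where white marks the seam to repaint.
image = Image.open("stitched_frame.png").convert("RGB").resize((512, 512))
mask = Image.open("seam_mask.png").convert("RGB").resize((512, 512))

# Ask the model to fill the masked strip so it blends with the surroundings.
result = pipe(
    prompt="seamless background matching the surrounding scene",
    image=image,
    mask_image=mask,
    num_inference_steps=30,
).images[0]
result.save("touched_up.png")
```

The same idea covers canvas expansion: pad the image, mark the new area in the mask, and let the model fill it in.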
 
I just saw this on DTRPG:


Selling AI generated Art? In Art packs? This seems a bit questionable...
It is not new. David Kelly is known as a strong proponent of using AI-generated art. He has told numerous folks to go pound salt when criticized.
 
Selling AI generated Art? In Art packs? This seems a bit questionable...

I don't think there's an inherent problem with doing that, so long as it's clear what you are buying. However, I do wonder what will happen with the legalities around it depending on which generator he used.
 
As long as it isn’t misrepresented I don’t see the issue. AI is here to stay whether people like it or not. It is used heavily by businesses and is making headway into government to include public safety.
 
I don't think there's an inherent problem with doing that, so long as it’s clear what you are buying. However, I do wonder what will happen with the legalities around it depending on which generator he used.
That is a valid point and the main issue I could see: if a specific generator is found to be culling images illegally, I could see having to take down any products that came from it.
 
Why? It isn't like you have to buy it, and it takes effort to work with the AI-generated images to assemble a collection to put up for sale.
The largest issue with AI is provenance. If you don't know the provenance, you don't know if it was trained on works that the AI didn't have rights to. The more you lengthen that provenance stream, the harder it is to police.
 
It is not new. David Kelly is known as a strong proponent of using AI-generated art. He has told numerous folks to go pound salt when criticized.
Given DTRPG's stance on AI art, I was just surprised.
 
As long as it isn’t misrepresented I don’t see the issue. AI is here to stay whether people like it or not. It is used heavily by businesses and is making headway into government to include public safety.
Machine learning has been around for longer than most people realize. It has even been involved in healthcare for a while now. Health insurance companies use AI models to determine which treatments they will cover. At least, that's what they're being sued for. :grin:
 
Machine learning has been around for longer than most people realize. It has even been involved in healthcare for a while now. Health insurance companies use AI models to determine which treatments they will cover. At least, that's what they're being sued for. :grin:

Weapons of Math Destruction is a rather good book on the problems of Big Data-driven algorithms.
 
I have purchased a few RPGs that have used AI in them, but they also state, in some cases, that it was then manipulated by a human artist.

Yellow Byte definitely use AI art, and what is in their books is absolutely amazing, but so is their content, and it is seeing this level of work from amateurs that resulted in me teaching myself Adobe InDesign (I bought an instruction book too).
 
The largest issue with AI is provenance. If you don't know the provenance, you don't know if it was trained on works that the AI didn't have rights to. The more you lengthen that provenance stream, the harder it is to police.
Training an AI is Fair Use. What an AI generates is public domain.
 
Training an AI is Fair Use. What an AI generates is public domain.
That is incorrect. They haven't ruled on whether you can train an AI on copyrighted material; that's what all of the hubbub is about. The EFF can make whatever interpretations they want, but the truth of the matter is they are not a trial court, which is the reason for the terminology they use.

Like copying to create search engines or other analytical uses, downloading images to analyze and index them in service of creating new, noninfringing images is very likely to be fair use. When an act potentially implicates copyright but is a necessary step in enabling noninfringing uses, it frequently qualifies as a fair use itself.

They use those words for a reason.
 
They haven't ruled on whether you can train an AI on copyrighted material.

They use those words for a reason.
It's fair use until someone rules that it isn't, which would be easy to do if opponents could prove any of their claims. Which they have not, nor are they likely to: we're talking about Microsoft and Adobe, who have teams of lawyers, not some AI copyright cowboys.
 
It's fair use until someone rules that it isn't, which would be easy to do if opponents could prove any of their claims. Which they have not, nor are they likely to: we're talking about Microsoft and Adobe, who have teams of lawyers, not some AI copyright cowboys.
That's... not how copyright works. And you can believe that if it's ruled against, people who think that are going to be scrambling.
 
That's... not how copyright works. And you can believe that if it's ruled against, people who think that are going to be scrambling.
My personal take is that even if they rule it's not fair use, companies will figure out what is being learned and teach the models to come to the same results another way.

The cheapest way might be to just hire a bunch of artists on a work-for-hire basis to reproduce the works. Like Mechanical Turk.
 
That's... not how copyright works. And you can believe that if it's ruled against, people who think that are going to be scrambling.
If you believe someone violated your copyright with a derivative work, you file a claim. Then the supposed violator has to prove that it isn't derivative by proving it's a transformative work, and the results of that system have been really unpredictable on what's fair use and what isn't. That would cover the output of AI, and each work would have to be judged on a case-by-case basis under the current system.

The way denoising generation works is by analysis of an artwork's properties by a model that then informs another model how to draw. A lot of opponents claim that this is use. I see it as learning, especially since the model that draws never "sees" source images, and because the information about the source material is so small (8 bits). But my opinion has very little to do with copyright law.

The fact is that someone is going to have to decide that the indexing done by the learning model is in fact Use, and legally distinct from the protection afforded search engines like Google under the DMCA. If that's the case, it's likely to be ruled not fair use, as a non-human can't create a copyrightable work, or (I think) a transformative one. But I'm not a copyright lawyer, and it's probably going to be eventually decided by a bunch of geezers in robes who don't understand tech at all.
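To make the "never sees the source images" point concrete, here's a toy sketch of the kind of training step involved; the shapes and stand-in modules are purely illustrative assumptions, not any real pipeline:

```python
# Toy denoising training step (illustrative only): noise is added to an encoded
# image, and a small stand-in "denoiser" learns to predict that noise.
import torch
import torch.nn.functional as F

latent = torch.randn(1, 4, 64, 64)              # stand-in for an encoded training image
denoiser = torch.nn.Conv2d(4, 4, 3, padding=1)  # stand-in for the drawing model (a U-Net in practice)

noise = torch.randn_like(latent)                # random corruption
alpha = 0.7                                     # noise level for one sampled timestep
noisy = alpha ** 0.5 * latent + (1 - alpha) ** 0.5 * noise

pred = denoiser(noisy)                          # real models also condition on timestep and text
loss = F.mse_loss(pred, noise)                  # learn to predict (and so remove) the noise
loss.backward()                                 # only small gradient nudges reach the weights;
                                                # the training image itself is never stored
```

Whether that analysis step legally counts as "use" is exactly the question the courts will have to settle.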
 
If you believe someone violated your copyright with a derivative work, you file a claim. Then the supposed violator has to prove that it isn't derivative by proving it's a transformative work, and the results of that system have been really unpredictable on what's fair use and what isn't. That would cover the output of AI, and each work would have to be judged on a case-by-case basis under the current system.

The way denoising generation works is by analysis of an artwork's properties by a model that then informs another model how to draw. A lot of opponents claim that this is use. I see it as learning, especially since the model that draws never "sees" source images, and because the information about the source material is so small (8 bits). But my opinion has very little to do with copyright law.

The fact is that someone is going to have to decide that the indexing done by the learning model is in fact Use, and legally distinct from the protection afforded search engines like Google under the DMCA. If that's the case, it's likely to be ruled not fair use, as a non-human can't create a copyrightable work, or (I think) a transformative one. But I'm not a copyright lawyer, and it's probably going to be eventually decided by a bunch of geezers in robes who don't understand tech at all.
But what I'm saying is that you aren't protected by fair use just de facto. Several people have found this out the very hard way.
 
Copyright seems like an antiquated concept in a world where 5.5 billion people have access to the Internet, and a rudimentary form of generative AI is emerging. Maybe they'll have to start compensating some artists for using their work as training data. Still, there will be an inflection point where the total output of generative AI reaches orders of magnitude beyond the recorded works of humanity. The claim that human artists have over the inputs' provenance will be diluted to nothing. We can have philosophical conversations about the meaning of art and subjectivity, but chances are most real people will prefer the content that AI produces because human preferences will guide its development.
 
Machine learning has been around for longer than most people realize. It has even been involved in healthcare for a while now. Health insurance companies use AI models to determine which treatments they will cover. At least, that's what they're being sued for. :grin:
I remember when Amazon dropped their AI recruiting algorithm because they couldn't get it to stop being sexist.
 
That is incorrect. They haven't ruled on whether you can train an AI on copyrighted material; that's what all of the hubbub is about. The EFF can make whatever interpretations they want, but the truth of the matter is they are not a trial court, which is the reason for the terminology they use.
Yeah. And even if it ends up being ruled as fair use (there's no way this doesn't eventually end up in court) here in the US, I find it extremely unlikely that the EU courts would follow suit. Which means selling AI art packs becomes a special case where business rules have to be applied at the point of sale. I'm willing to bet that's more hassle than small sites like DTRPG want to get into.
 
Yeah. And even if it ends up being ruled as fair use (there's no way this doesn't eventually end up in court) here in the US, I find it extremely unlikely that the EU courts would follow suit. Which means selling AI art packs becomes a special case where business rules have to be applied at the point of sale. I'm willing to bet that's more hassle than small sites like DTRPG want to get into.
Yeah, I am 90% sure the EU will rule against this, and so will the UK. Any exceptions carved out will be for non-commercial/academic use. Even with the US having more generous fair use laws, I'm not sure the AI companies will win, especially for commercial use. IANAL, but this seems pretty murky to me.
 
Totally made my playlist.

Built from the ground up with no outside music sources so no copyright issues of any kind.
 
I have said quite a bit concerning gen-AI tools, including that I am not 100% opposed to their use, so long as they are not trained on copyrighted works without consent, credit, or compensation. But unfortunately we still have a very long way to go before the tools are indeed ethical or safe for commercial use.

The personal risks aside, my extensive research has uncovered that gen-AI tools, and the companies that are profiting off of them (to the tune of billions of dollars), are relying on exploitative labor. Not only have they stolen the copyrighted works of countless writers and artists, they are also reliant on sweatshop labor, prison labor, and child labor, paid pennies and sometimes not paid at all, in order to have what we refer to today as "AI tools."

Simply put, without this exploited labor, these tools would not exist in their current form. Regardless of "whataboutisms," these companies have billions of dollars; they can afford to pay for copyrighted data, and they can afford to pay workers as well, but they chose the path that most corporations take: the shortest path towards profits.

There is no judgement here; I am not saying you are bad if you use AI tools, I am simply bringing the facts to light. Even if you don't like these facts, it doesn't change the fact that they are indeed the truth. What you do with that truth is up to you, but I personally will not support the use of AI tools until this has been resolved. Since it is likely that this will not change in the near future, I will not be engaging with the tools. That's my choice; what you choose to do is up to you. Just be careful about your choices, as copyright law will likely take years to catch up, and when that happens, I wouldn't want to be left holding that particular bag.

I am only trying to inform the public about the inherent risks, as these companies have no incentive to warn the public that you may be putting yourself and your business at risk by using them (that would be an obvious admission of guilt on their part). Transparency is needed, as well as some level-headed oversight; these things take time, though.

Here are several articles presenting evidence on the subject.

Uncovering the Labor Exploitation That Powers AI

Child Sweatshops Power the AI Industry

The Invisible Human Prisoners Training AI

Hope you have a good day. Thank You For Your Time.

Art
 
Facts can't be copyrighted. Under US copyright law anybody can take a copyrighted work, prepare an index, and sell that index without seeking permission from the original author. In fact, the author of an index enjoys a copyright to that specific index. It can't stop someone else from making their own index, but that index can't be a duplicate of the existing one.

The training of AI relies on statistical analysis of works. Once that is done, the model the company creates can be distributed without the original works needing to be distributed as well, just as an index can be distributed without copying the original work.

Until copyright law is changed to allow only the original author to prepare indexes, lawsuits trying to stop AI companies from training on copyrighted works will continue to fail.

Where these lawsuits will succeed is when a company is caught using pirated works to train their models. While the training itself can't be stopped, the fact that they pirated troves of works on an epic scale leaves them vulnerable to garden-variety copyright infringement along with damages. And if they engage in labor abuses to develop their models, then they should be held accountable for that as well.

But as a cautionary note: if you change the law to allow authors to control how facts about their works are distributed, then you will open the gate to all kinds of unintended consequences. For example, shutting down reviews and sites like Goodreads.
 
If you want a more nuanced discussion of the wages in the countries impacted, you can see it here:

Basically, the wages called "slave wages" in the articles turn out to be average wages in the countries where the work is being done, so it is not horrible pay for what amounts to a job requiring minimal skills.

People living in these countries, judging from the comments in the discussion, generally see the income as a good thing that raises the standard of living, leading to eventual rises in overall wages for those countries if the jobs last long enough.

The articles themselves say the pay is above the minimum wage, yet make baseless claims of "digital slavery" while showing only limited examples of force (prisons/internships) to support that. In low-wage countries it appears this work is a choice, and the pay is around the average for other jobs in the country.

If the reporters here had done a better job of providing context and support, I would be more moved by the information; but as it is, they seem focused on hyperbole.
 
Training an AI is Fair Use. What an AI generates is public domain.

This is kinda interesting, because I'm currently reading Golgotha by Greg Saunders, which is littered with AI art (from Midjourney).
Now I wouldn't be interested in this, but surely this could mean anyone could copy out these art pieces and re-use them?
 
This is kinda interesting, because I'm currently reading Golgotha by Greg Saunders, which is littered with AI art (from Midjourney).
Now I wouldn't be interested in this, but surely this could mean anyone could copy out these art pieces and re-use them?
Yes, but only if they are unmodified. If a person took the AI image and reworked it, then that work is probably copyrightable. But it would probably have to pass the transformative standard, i.e., it adds "new expression, meaning, or message."

The most likely thing the courts will do when dealing with this is apply the same standards as they would to the use of traditional public domain art.
 