Add an "ai-assisted" tag or something similar, especially for Resources people make

I think this is a simplification of what is going on. A guy in 2018 tried to register a piece of AI art. Last week, in 2026, the US Supreme Court refused to hear the case when it finally got to them. This guy was clearly just testing the waters. The Supreme Court upheld the lower court decision.

But this guy is not a multi-billion or multi-trillion dollar corporation. Assuming that this is going to be how it is for all time is short-sighted, in my opinion. A similar example is Roe v. Wade, which for decades was the settled law of the land in the U.S., until someone brought another case, got it in front of the Supreme Court, and the Supreme Court decided to hear it. They then overturned that decision.

Sure, it gives us some guidance for now, but do you really think that if someone like Elon Musk (Grok) decides to throw their hat in the ring that this won’t be revisited?

2 Likes

My point was on topic of the OP: that when there is a tag for “ai-assisted”, this has some further implications, e.g. that there is no ownership of the AI assets that someone could offer here. And that there is the possibility that when someone else uses an asset offered here for free (e.g. GenAI music from Suno, or image-gen textures), this could lead to problems for the user down the road.
It wouldn’t be just a tag; it would imply copyright questions.

EDIT: As an argument for the tag, and against a “non-AI” tag. And whether there should be consequences if you don’t tag your AI assets.

EDIT2: And that it wouldn’t be just a “religious” or “philosophical” motivation to add such a tag.

1 Like

Another technicality to add here:
When the Supreme Court refuses to hear a case it is functionally the same as upholding the lower court decision.
The lower court’s decision, however, is then specific to the case and does not set precedent for other cases, as it would if the Supreme Court had ruled on it.

1 Like

I agree. I am just too afraid people would avoid using it. I know on Roblox when we wanted to open a bug report, there was an entire process. It could be a great idea to have a post creation process that explicitly requires the user to mark whether or not the listed content is covered by the different situations you mentioned.

Is it just me or is this post above the pinned post that welcomes you to the Godot forum?!?

What do you mean xD

I did nothing, but this post is above the “Welcome to the Godot forum” thread. Never seen this before.

oohhh
got it, pinned posts are hidden after inactivity, because it’s odd if there’s a post always at the top.

Oh ok. Never saw that before lol. My bad ig lol

1 Like

this would be perfect, and as someone who hates AI I feel it would help me filter out the slop from the good stuff (as AI-generated content tends to be worse)

1 Like

What I find hilarious, though: to keep AI code detectors from detecting code as AI, one has to make the code appear sloppier. Too bad for pedantically well-organized coders.

1 Like

AI code detectors are really inaccurate because… surprise surprise, they also use AI :slight_smile:

At one point I did a test: I wrote a simple utility function, ran it through an AI code detector, and it came back as 97% likely AI because it “Had no comments” and “No business-specific logic”

So I added a bunch of comments without changing the code, and it was suddenly “100% likely human”
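The kind of surface signal described in that test is easy to illustrate. The actual utility function isn’t shown in the post, so the sketch below uses a made-up stand-in: two byte-for-byte identical implementations, differing only in comments, which is exactly the sort of difference a detector keying on comment density would react to.

```python
# Hypothetical stand-in (the post doesn't show the real function).
# Both versions have identical logic; only the comments differ --
# the surface feature the detector reportedly keyed on.

def clamp_bare(value, low, high):
    return max(low, min(value, high))

def clamp_commented(value, low, high):
    # Clamp value into the inclusive range [low, high].
    # max/min reads cleaner than branching for two bounds.
    return max(low, min(value, high))

# Identical behavior, yet a comment-sensitive detector could score
# these two very differently.
assert clamp_bare(15, 0, 10) == clamp_commented(15, 0, 10) == 10
```

The point being: a classifier that flips from "97% AI" to "100% human" on a change like this is measuring style, not provenance.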

6 Likes

I asked Opus 4.6 to write code that could not be detected as AI. The first attempt came back 50% AI. I just told Opus that, and it already correctly guessed by itself which parts most likely still made it look AI-ish. The second attempt got 0% AI detection. I didn’t check what the code did, though. I just wondered about that, since I know that many universities try to catch students plagiarizing code with AI.

Okay, let me expand on it a bit more.

I gave my own C++ networking library to ChatGPT’s AI code detector, and the results were AWFUL : (
(FYI: most of the code is from the Winsock docs, with a few tweaks from my limited C++ knowledge)

It says that the code is human-written because of:

  1. Casual / personal comments
  2. Natural developer thoughts

And the reasons it says it might be AI-generated is:

  1. Clean namespace structure
  2. Consistent error checks

At the end of the day, it mostly looks at the comments.

And I guess…it’s because of the power of my comments xD

// Error detection is a key part of successful networking code - Fellow Microsoft employee

Btw, I got my networking lib to work!

2 Likes

At this point I am so confused at how humans use AI nowadays.
In my opinion, Artificial Intelligence has depleted humanity’s intelligence dramatically, and the fact that so many individuals rely on AI for simple operations is honestly sad.

I use AI to an extent, but I do not see any logical reason to trust AI with my entire codebase, and I do not allow nonsensical code in my workspaces. I will honestly admit that, being 4 months into GDScript development, it seems to do a cleaner job than me in some aspects; nonetheless, I prefer to simply ask about whatever small proper-syntax lines of code I am not sure about and then build the puzzle myself with the proficiency I currently possess. I don’t provide AI assets at all, and I don’t intend to, but I’d say AI has made us programmers’ lives a lot harder, because the general public now expects us to make FAR more than what we are physically capable of.

My first programming language was JS, and later C#. I have mostly gotten the hang of GDScript, and as long as I use AI like the tool it is SUPPOSED to be, I am perfectly fine. Nonetheless, I don’t really see the point in calling out AI as long as the person using it fully knows what they are doing.
I won’t judge anyone for using AI or not, but for humanity’s IQ, I would hope that those who do aren’t controlled by it. (I don’t really like to use other people’s assets in my own works, and I don’t really care to brag either; this doesn’t exactly matter much from my perspective, but it’s a conversation.)

*Apologies if I misworded things

2 Likes

I think we have to look at programming relative to the possibilities of the times we live in.
Back when I got interested in computers there were Basic, Pascal, and Assembler. You had to dig deep into the hardware to get anything “fast” done on your computer.
Fast-forwarding, today there are game engines like Godot that take the burden of low-level programming off those mainly interested in getting their game running. The other problem is the forest of possibilities when it comes to programming languages: Python, GDScript, C#, C++… How to even try to keep up? Become an ant-like specialist? Or a hardcore purist (only what I do myself is worth anything)?
I don’t think that AI takes away anyone’s IQ, mainly because we’re born with a certain intelligence. The environment we live in does, to some degree, affect how we can use our IQ.
A few years ago I felt almost lost in my IT job, since things had become too complicated and too intertwined (too many dependencies). Only when I tried out GPT-3.5 on a load of 1000+ log lines and it could actually point me to where the issue was did I feel some relief. Now maybe I’m getting old and there should be a max age for IT workers :wink: But I’m curious where this AI thing leads. I see the downsides, but also some hope.

2 Likes