Is it bad to use ChatGPT to help me learn specific situations in Godot?

I'm new to Godot and coding. GDScript helped me a lot with the general basics, but ChatGPT has helped me learn about specific situations in my game, like how hitboxes and hurtboxes work for the characters. I don't copy-paste the code because it might ruin the game, so instead of Ctrl+C / Ctrl+V-ing the code it gives, I just ask the AI to explain specific lines of code to me.
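
To give an idea of the kind of thing I'm asking about, here is a minimal sketch of the hitbox/hurtbox pattern as I currently understand it. The "hitboxes" group name and the take_damage() method are my own assumptions for illustration, not code from any particular tutorial:

```gdscript
# Hurtbox.gd - attached to an Area2D child of the character.
# Assumes attack hitboxes are Area2Ds added to a "hitboxes" group,
# and the parent character script defines take_damage().
extends Area2D

func _ready() -> void:
	area_entered.connect(_on_area_entered)

func _on_area_entered(hitbox: Area2D) -> void:
	if hitbox.is_in_group("hitboxes"):
		get_parent().take_damage(1)
```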

I tried ChatGPT and other LLMs, and I can tell you that they confuse Godot 4 and Godot 3 functions. Sometimes I received completely invented functions. Personally, I only query an LLM like that when I'm desperate, because some complicated algebra formula doesn't work. The math teacher behind ChatGPT is good, while the ChatGPT programmer is less smart, in my opinion!

2 Likes

I have the same experience as the previous comment. It will "hallucinate" all over the place, confidently giving you code that doesn't work.

2 Likes

For general things like understanding concepts, LLMs like ChatGPT are OK. Just keep in mind that if it doesn't have an answer, it will make one up. It will also make up names of objects and functions that don't exist in Godot, or confuse versions of Godot, as @fogenp said.

Recent real-world studies of using LLM AIs to code have shown that, on average, using one slows experienced developers down by about 20%. It also generates up to 10 times as many security vulnerabilities as not using one. Here's an article published yesterday about how the reality of AI programming has fallen far short of predictions.

I also wouldn't trust it on math. While AI has gotten better at math, it can still "hallucinate" (which is just a marketing word meaning it has lots of bugs that create wrong answers). Any output you get from an LLM must be checked by a human. That's easier to do with things like AI-generated pictures, videos, and music: if you don't like what it has created, you ask it to create it again. With things like writing, programming, etc., if you don't know more than the LLM about the subject, you cannot determine if the answer it gives you is what you want - because you do not know how to determine if it is correct.

11 Likes

It's OK if you need to fix a bug or something, but if you want a full script of any kind from it, then no.

That's real, and thanks for the knowledge.

1 Like

Never ever use these…
When I was a complete newbie, I used it, but it doesn't know anything properly about Godot! It mixes Godot 3 with Godot 4 functions, and a lot more nonsense…

And also, somehow it thinks Godot's GDScript is a very high-level language (sometimes), so it uses made-up functions like Is_Player_On_Floor_And_Moving_Forward (LOL)
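
For comparison, the real Godot 4 API is much finer-grained; something like that made-up function would just be a one-liner you write yourself. A sketch (the is_moving_forward name and the "forward is +x" assumption are mine):

```gdscript
extends CharacterBody2D

# is_on_floor() and velocity are real CharacterBody2D members in Godot 4;
# the "moving forward" part is your own game logic, not an engine function.
func is_moving_forward() -> bool:
	return is_on_floor() and velocity.x > 0.0
```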

2 Likes

I don't think it's fair advice to say "never" in this day and age. Treat LLMs as what they are: a certain tool for a certain job.

As mentioned a few times in this thread, don't rely on it generating scripts for you, and try to avoid letting it do things you don't understand (as you can't verify its output), but do use it to learn.

Ask it for the best design patterns to use for certain use cases in your game, ask it to explain things to you in pseudocode, have it link to documentation you didn't know existed, let it handle mundane repetitive tasks to speed up your process, have it explain code you found online (like on Stack Overflow), or documentation you don't quite understand, etc.

Use the right tool for the right task. You might use a hammer to smash a screw into wood and it might work, but there are better tools for that.

4 Likes

That's true, but for coding, it sucks!
I use AI for game plans and more, but not for coding, not at all, because ChatGPT just sucks at GDScript. Sometimes it can't even explain code.

Well, we have the Godot forum!

Have Fun!!!

2 Likes

Perplexity is pretty useful for Godot and C#.
I use it all the time, and the quality of the responses is great.

Maybe from a learning perspective: using an LLM to generate (prototype) code for more complex problems is very close to any act of copying in an educational context. You cut a corner and miss a foundation that would make your life easier a few steps ahead.

But as with any tool or very confident friend: as long as you acknowledge the mistakes they can make, and your own bias towards believing them, you will do fine. Try to judge for yourself: do I really understand this? Must I understand this in the future? If yes, then you're good. If no, well, do the work.

As a side note: LLMs and the influx of new software designers are still quite new, and the patience of the Godot community here is - frankly - amazing. The forum, and even more so the Discord, get flooded with really basic and simple questions, packed in weirdly complex code. I don't want to be pessimistic, but I have seen communities grow sour from answering the same questions every day, again and again, even though there is proper documentation, a heap of learning resources, and a board full of already-answered questions.
LLMs might foster a certain laziness or ignorance towards proper learning - I'm not too deep into current studies about this.
But I can say: I love the feeling of a few hours of trying things out, testing, failing, and finally understanding a concept. Getting this feeling is easier in the beginning, so I would never skip it :wink:

2 Likes

I'd add to what @ximossi said: I think when you are learning, you benefit a LOT more from following a YouTube tutorial or paid class than from asking ChatGPT or any LLM, because that person (or people) spent time thinking about how to teach others and takes you step-by-step through the information you need to learn how to do something. That means your learning retention is better.

There's also another reason. Watching free videos helps the creators who run ads make money and encourages them - whether they make money or not - to make more Godot content. That benefits the community. The same goes for supporting people who charge for classes, as it literally pays for new classes to be made. In an open-source community where the developers' time is limited, this is the only way people learn advanced topics and engage with the software.

This is the reason I spend so much time on the forums answering questions. It’s my way to support the project because I don’t have money to donate to it right now.

When one uses ChatGPT or another LLM, they do not support Godot in any way - even indirectly - because LLMs do not create new content. They just scrape these forums and regurgitate them.

4 Likes

I think chatbots are great as tools for learning. There are problems (like sometimes recommending a function that no longer exists, or not being aware of a new feature), but the same can be said about many old (and new) tutorials.
I think people exaggerate the issues and don't give the bots enough credit for how good they actually are. For example, the other day I had a weird bug that appeared after I added timers to some of my equipment nodes (like "shoot out a fireball every 5 seconds"). Whenever I unequipped the node and stopped the timer, the game would crash upon scene change. I struggled a bit with finding it, but I copy-pasted two scripts to ChatGPT and told it which lines I had added and removed. It turned out the problem was that the "await" keyword was still waiting for the timer's timeout, which never resolved. So GPT suggested that instead of using await, I should check for a certain value, and when I want to cancel the timer, set it to null so it bypasses the check. Problem instantly solved, and I learned something useful. And this is a kind of weird issue, I think, that doesn't pop up often and that I haven't seen mentioned elsewhere.
Just remember to be specific and give the bot some help. If I had not pointed out which lines were causing the issues, maybe it would not have found the bug as easily and would instead have made something up. It is easy to get led on a wild goose chase if you trust the bots too much and don't know enough to ignore some of their less useful ideas.
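
A rough sketch of the pattern, for anyone curious (node and method names are made up for illustration): instead of awaiting the timeout, connect the signal and check a value that you null out when unequipping, so no coroutine is left hanging:

```gdscript
extends Node

var timer: Timer = null  # set when equipped, nulled when unequipped

func equip() -> void:
	timer = Timer.new()
	timer.wait_time = 5.0
	timer.timeout.connect(_on_timer_timeout)
	add_child(timer)
	timer.start()

func _on_timer_timeout() -> void:
	if timer == null:
		return  # equipment was removed; ignore any stale timeout
	shoot_fireball()  # assumed method

func unequip() -> void:
	if timer:
		timer.stop()
		timer.queue_free()
		timer = null  # bypasses the check instead of leaving an await pending
```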

1 Like

This is true, but they will also literally make answers up out of whole cloth. I've seen multiple times where an LLM invented a function name that never existed.

Perhaps, but your example (which is a good one) is an example of using an LLM to debug your code. That is a targeted use where you already knew your code and what was wrong, which goes back to what I was saying about checking the output.

This issue comes up every few weeks on here. Two weeks ago: Custom resources in Godot Mono Two weeks before that: Demotivation using AI

Personally I think this is a much better use of an LLM: Offline AI-Powered NPCs Teaching Sustainable Farming — Built with Godot 4.x and Gemma 3n

3 Likes

I don't recommend it. There are many people who have used Godot for a while who can help on YouTube and the forums, and you can also use the documentation. ChatGPT also messes a lot of things up.

1 Like

Pretty much what everyone else has said. It's OK for general things, but it confuses a lot of other things. If it doesn't know something, it's programmed to come up with a solution… even when it doesn't know the answer. It literally can't just say "I don't know". The Godot 3/4 mix-up is real :joy:

Generally it seems OK to use if you just want to speed up some typing, but you need to already know what you want it to do - i.e., you specifically guide it, are aware of the common pitfalls ChatGPT will face, and tell it specifically how to avoid them… at which point, you might be better off doing it yourself.

Maybe you can ask it to walk you through some common concepts, though. It's good at explaining things it does actually know.
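
As a concrete illustration of the Godot 3/4 mix-up, here are two renames I see LLMs get wrong constantly (both snippets are meant to do the same thing; the Godot 3 form is left in comments since it no longer compiles):

```gdscript
# Godot 3 (KinematicBody2D):
#   yield(get_tree().create_timer(1.0), "timeout")
#   velocity = move_and_slide(velocity)

# Godot 4 (CharacterBody2D):
await get_tree().create_timer(1.0).timeout
move_and_slide()  # velocity is now a built-in property, not an argument
```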

3 Likes

I'm new to Godot as well, and I come from another industry (telecom), which makes me curious about AI usage in Godot and other game engines (e.g., Unity)…

Check out some of the links I posted above.