This is true, but they will also literally make answers up out of whole cloth. I’ve seen multiple cases where an LLM invented a function name that never existed.
Perhaps, but then your example (which is a good one) shows how to use an LLM to debug your code: a targeted use where you already knew your code and what was wrong with it. Which goes back to what I was saying about checking your input.
This issue comes up every few weeks on here.
Two weeks ago: Custom resources in Godot Mono
Two weeks before that: Demotivation using AI
Personally, I think this is a much better use of an LLM: Offline AI-Powered NPCs Teaching Sustainable Farming — Built with Godot 4.x and Gemma 3n