Godot Version
4.5.stable
Question
I have a script which has type SceneTree, I cannot get it to run. I am new to godot so if something is super obvious, just tell me.
Program:
extends SceneTree
const server = {"host": "https://localhost","port": 11434}
const model = "gpt-oss:120b-cloud"
var prompt = [
{
"role": "user",
"content": "Hello! I am Nathanne."
},
{
"role": "assistant",
"content": "Hello, Nathanne. I am ChatGPT, an AI assistant developed by OpenAI."
},
{
"role": "user",
"content": "Can you tell me back my name? (Answer in one sentence only)"
}
]
# Preload the NokoModel module, which provides functions to manage AI models.
const NokoModel = preload("res://addons/noko/modules/NokoModel.gd")
# Preload the NokoPrompt module, which provides functions to interact with AI models via prompts.
const NokoPrompt = preload("res://addons/noko/modules/NokoPrompt.gd")
func _init():
call_deferred("_Call_Models()")
func _Call_Models():
print("Loading Models")
var root = get_root()
if await NokoModel.load_chat_model(root, server, model, false):
print("Successfully loaded model: " + model)
else:
print("Something went wrong trying to load model: " + model)
quit() # Terminate the application if loading fails.
# Asynchronously send the conversation prompt to the AI model and await the response.
var generated = await NokoPrompt.chat(root, server, model, prompt)
# Output the assistant's response from the AI model.
print(prompt)
print(model + ":\t" + generated["body"]["message"]["content"])
# Asynchronously unload the chat model from the server to free up resources.
if await NokoModel.unload_chat_model(root, server, model, false):
print("Successfully unloaded model: " + model)
else:
print("Something went wrong trying to unload model: " + model)
For reference, I am using this: GitHub - nthnn/noko: User-friendly Godot plugin that facilitates seamless interaction with Ollama models via API that empowers developers to enhance their games with interactive Large Language Models (LLMs), enabling dynamic dialogues, intelligent NPCs, and more.