Image Format and Rendering Server confusion

Godot Version

4.6 Stable

Question

I’m trying to create a texture from an image through the RenderingDevice and display it via a Texture2DRD. This works with a float format, but not with an integer format.
The scene tree is a Node2D with the following script attached and a child TextureRect.

extends Node2D
var rdmain := RenderingServer.get_rendering_device()
var level_picture : Image = load("res://boid_particles/Trash.png").get_image()
var level_size : Vector2
var level_texture := Texture2DRD.new()

func _ready() -> void:
	#setup_image.convert(Image.FORMAT_RGBAF)
	level_picture.convert(Image.FORMAT_RGBA16I)
	level_size = level_picture.get_size()
	var texture_format = _create_texture_format()
	var view = RDTextureView.new()
	var local_texture = rdmain.texture_create(texture_format,view,[level_picture.get_data()])
	level_texture.texture_rd_rid = local_texture
	print("Local texture is ", local_texture)
	$TextureRect.texture = level_texture


func _create_texture_format() -> RDTextureFormat:
	var format := RDTextureFormat.new()
	format.array_layers = 1
	format.depth = 1
	#format.format = RenderingDevice.DATA_FORMAT_R32G32B32A32_SFLOAT 
	format.format = RenderingDevice.DATA_FORMAT_R16G16B16A16_UINT
	format.width = int(level_size.x)
	format.height = int(level_size.y)
	format.usage_bits = RenderingDevice.TEXTURE_USAGE_CAN_COPY_FROM_BIT \
	| RenderingDevice.TEXTURE_USAGE_CAN_COPY_TO_BIT \
	| RenderingDevice.TEXTURE_USAGE_SAMPLING_BIT \
	| RenderingDevice.TEXTURE_USAGE_STORAGE_BIT

If I convert the image to RGBAF and use the R32G32B32A32_SFLOAT, the image shows up properly in the scene. (Both commented out at the moment)
If I use the code as given, RGBA16I and R16G16B16A16_UINT, I wind up with nothing showing, as if the texture is fully transparent or doesn’t exist.

Am I doing something wrong here, or is the image type not supported by the main rendering device natively? Project is set to Forward+, if that makes any difference.
The same situation occurs if I use Godot’s icon.svg instead of the image file I’ve put in (both show as RGBA8 types in the editor).

Why?

@export var texture_2d: Texture2D

Drag and drop res://boid_particles/Trash.png into the texture field.

Done.

Or:

const TRASH = preload("res://boid_particles/Trash.png")

var texture_2d: Texture2D

func _ready() -> void:
	texture_2d = TRASH

Check if the format is supported by your device using RenderingDevice::texture_is_format_supported_for_usage().

@DragoN Why? Because I’m planning to modify it with a compute shader later and am testing the smaller things to start with, as well as learning how to handle shaders and other possible uses later. Also because I’m trying to learn, and I’m confused as to why a float type will draw just fine but an integer type won’t, especially when no errors are being thrown and everything I can find in the documentation seems to support the current setup.

@normalized It’s returning “true”.
print(rdmain.texture_is_format_supported_for_usage(RenderingDevice.DATA_FORMAT_R16G16B16A16_UINT,RenderingDevice.TEXTURE_USAGE_SAMPLING_BIT))
I had checked the documentation beforehand to make sure, and I’m not getting any errors, so it’s weird that simply nothing shows up. If I intentionally screw it up and convert the image to a float while telling the computer to take an integer format, I do get an error, and a pink-and-black checkerboard texture shows up.

I played with it some more and I got confirmation that the texture is valid and the data appears correct:
rdmain.texture_is_valid(local_texture)
rdmain.texture_get_data(local_texture,0)

The above works both with float and integer format settings, so I’m still not sure why the integer version doesn’t draw anything.

Could be a sampler issue in the shader. There are some caveats mentioned in the docs regarding the 16 bit integer pixel format.

What happens if you just create the texture from 16I data using ImageTexture::create_from_image() and try to display it?
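For reference, a minimal sketch of that test (names and paths taken from the original script; assuming a child TextureRect as in the scene described above):

```gdscript
# Hypothetical minimal test: build an ImageTexture straight from the
# converted Image, bypassing the RenderingDevice entirely.
var img: Image = load("res://boid_particles/Trash.png").get_image()
img.convert(Image.FORMAT_RGBA16I)
var tex := ImageTexture.create_from_image(img)
$TextureRect.texture = tex
```

If this also shows nothing, the problem is unlikely to be in the RenderingDevice setup itself.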

If I put
ImageTexture.create_from_image(level_picture)
after reformatting it to 16I, I still get no image. I tested a few other formats, including float, and they work. So I’m guessing the main rendering device doesn’t cover the usampler stuff mentioned in the docs and the 16I format doesn’t work natively on the default rendering server.

Looks like it. Try drawing it with a custom shader that uses usampler.
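A rough sketch of such a shader, assuming the integer texture is passed in as a uniform (the hint and normalization constant here are assumptions for a 16-bit unsigned format; adjust to your data):

```glsl
shader_type canvas_item;

// usampler2D reads unsigned-integer texel data without normalization.
uniform usampler2D int_tex : filter_nearest;

void fragment() {
	// texelFetch avoids filtering, which integer samplers don't support.
	ivec2 px = ivec2(UV * vec2(textureSize(int_tex, 0)));
	uvec4 texel = texelFetch(int_tex, px, 0);
	// Map 16-bit unsigned values back to 0..1 for display.
	COLOR = vec4(texel) / 65535.0;
}
```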

Thanks. I’ll work on that later, should only be using 1 or 2 images for the thing I’m working on so that shouldn’t be too much work.


Although I don’t think the format is actually unsupported on the rendering server. It may just be that the material shader code never uses usampler. If you try drawing it with a custom shader and it draws properly, then everything is fine with the texture data on the rendering server/device side.