How to pass a list/array of texture3Ds into shader as uniform input?

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By toblin

I’m using shaders for speedy calculations, and I have several values per “vertex” that I need to provide as input to a shader. The best way I’ve figured how to do so is to have a texture3D for each number of inputs, and then let the shader get its inputs by reading them using its own vertex coordinates.

The question therefore is: how do you pass an array/list of texture3Ds as input to a shader? Alternatively: how else would one solve the above use case?

:bust_in_silhouette: Reply From: klaas

Godot does not support sampler3D arrays.
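A common workaround is to declare a fixed number of separate `sampler3D` uniforms instead of an array. A minimal sketch, assuming Godot 3.2 (where `Texture3D` and `sampler3D` uniforms exist); the uniform names here are made up for illustration:

```glsl
shader_type spatial;

// one uniform per 3D texture; the maximum count is fixed when the shader is written
uniform sampler3D tex_a;
uniform sampler3D tex_b;
uniform sampler3D tex_c;
```

From GDScript each one is then set individually, e.g. `material.set_shader_param("tex_a", my_texture3d)`.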

Addressing the vertices by their spatial position seems like a huge waste of memory, because most positions will just be empty. You can go another way and provide the extra data through a texture that is packed more tightly.

Since Godot does not give you the vertex index (what a shame), you can store the id in the mesh yourself:

func id_to_uv2(mesh):
	var mdt = MeshDataTool.new()
	mdt.create_from_surface(mesh, 0)
	var vCount = mdt.get_vertex_count()
	for i in vCount:
		mdt.set_vertex_uv2(i, Vector2(i, 0))
	# write the modified vertices back into the mesh
	mesh.surface_remove(0)
	mdt.commit_to_surface(mesh)

Now you can read the vertex index in your vertex shader.
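A sketch of what that read could look like in the shader (the `UV2` built-in carries the id written from GDScript above):

```glsl
void vertex() {
    // UV2.x was filled with the vertex index by the GDScript snippet
    int vertex_id = int(UV2.x);
    // ... use vertex_id to address the data texture
}
```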

You now have to define your data length per vertex … let’s say 4 RGBA pixels (4 × 4 values):

vec4 get_data(vec2 uv2){
    int offset = int(uv2.x) * 4;
    vec4 pixel_data_1 = get_pixel( offset );
    vec4 pixel_data_2 = get_pixel( offset + 1 );
    vec4 pixel_data_3 = get_pixel( offset + 2 );
    vec4 pixel_data_4 = get_pixel( offset + 3 );
    // combine the four data pixels however your use case needs;
    // here only the first one is returned
    return pixel_data_1;
}

vec4 get_pixel( int offset ){
    //get the pixel size of the texture
    ivec2 tSize = textureSize(dataTexture, 0);
    //compute column and row in the texture
    ivec2 uv = ivec2(offset % tSize.x, offset / tSize.x);
    //read and return the pixel
    return texelFetch(dataTexture, uv, 0);
}

Of course you have to make sure that the data fits into the texture.

The data image can be filled in the same manner from GDScript.
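For instance, something like this untested sketch, where `data` is a hypothetical flat array holding 4 floats per vertex:

```gdscript
func make_data_texture(data, vCount):
	var img = Image.new()
	# one RGBAF pixel stores 4 float values per vertex
	img.create(vCount, 1, false, Image.FORMAT_RGBAF)
	img.lock()
	for i in vCount:
		img.set_pixel(i, 0, Color(data[i * 4], data[i * 4 + 1],
				data[i * 4 + 2], data[i * 4 + 3]))
	img.unlock()
	var tex = ImageTexture.new()
	tex.create_from_image(img, 0) # flags = 0: no filtering or mipmaps
	return tex
```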

P.S.: I haven’t tested the code, so you’ll surely have to fiddle around with type casts etc.

Ok … I had missed that one. I didn’t know about that.

Since addressing vertices by their spatial position is such an enormous waste of memory, I’ve just corrected my answer and will let it stand.

Thanks for the insight!

klaas | 2020-09-03 15:07

I don’t think I understand what is being said, nor how to use the information =). And I’m not sure it fits my use case. (I assume that my lack of knowledge is the problem.)

My use case
Just to be clear about my particular use case: I have an artificial neural net that I need to run for each position in a 3D cubical lattice. The neural net has a varied structure, such that each node in the network has a varying number of input nodes, and there is no way for me to know in advance how many input nodes a given node has. As a consequence, my implementation needs to allow for a varying number of inputs per node.

The way I currently implement this is by using the CPU, and running the neural net sequentially for each position in the 3D lattice. Since the only thing that differs between the calculations is the position-based inputs to the neural net, this should be possible to speed up using GPU and shaders.

The simplest way I’ve come up with to use shaders is to let each node in the network have its own 3D texture. For each node, I can then pass the textures of its input nodes as input, and store the calculated values in the texture of the given node.

Does your answer cover this use case? If it does, then I will definitely dig deeper to understand what you are telling me.

toblin | 2020-09-08 07:03

I don’t think my answer fits this use case well.

P.S. I wonder if Godot is even a good platform for this use case.

klaas | 2020-09-08 08:07

Ok, thanks for confirming =).

P.S. The artificial network is used to create 3D shapes, and I need Godot to save time on implementing graphics and rendering. I’m not a very experienced programmer, and getting into Godot is the best option given the time I have available for the project. I’ve only run into problems now that it comes to GPU acceleration.

toblin | 2020-09-08 09:17