JPH, Ville - help with user bitmaps Import help for a script

Discuss the ZGameEditor Visualizer plugin for FL-Studio.

Post by VilleK »

Kjell wrote:Since ZGE doesn't support floating-point buffers ( it actually does internally for the GPU Array, but it's not exposed anywhere in the Editor ) you need to pull some ( fixed-point ) tricks to get the data around.
I can't test right now because I'm reading the forum on a small netbook with integrated graphics, but can't the buffer be passed to the shader just like how AudioArray is defined (a ShaderVariable with array binding)?
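
On the GLSL side I'd expect such an array binding to surface as a plain uniform array, something like this (the array size and usage here are just assumptions for illustration):

// Sketch of a vertex shader reading an array-bound ShaderVariable.
// The size 256 is an assumed value, not a ZGE built-in.
uniform float AudioArray[256];

void main()
{
    // Index the array with a per-vertex texture coordinate
    float value = AudioArray[int(gl_MultiTexCoord0.x * 255.0)];
    vec4 pos = gl_Vertex;
    pos.z += value;  // displace the vertex by the array value
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}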

Post by StevenM »

alphanimal wrote:OK thanks that's useful!

How can I store float (or fixed if necessary) variables for each vertex?
I would help you with that - but I'm too much of a noob myself. Kjell can probably help, but you will have to do some research -

A lot of data is processed in parallel in GLSL, which is why it is so incredibly fast - and as Ville put it, with GLSL you have to "think like the GPU".

So it's a paradigm that is different from what you may be used to.

The Orange Book is a great start - the 2006 2nd edition is free online -

http://wiki.labomedia.org/images/1/10/O ... dition.pdf

Post by Kjell »

Hey guys,
VilleK wrote:can't the buffer be passed to the shader just like how AudioArray is defined (a ShaderVariable with array binding)?
For any constant values that's probably the way to go, yes ( due to the lack of vertex attributes ) .. but the results of his physics calculations should never leave the GPU, so using a ( CPU-bound ) array for those is obviously not an option ( if you're going for high performance at least ).

Edit - Actually, I just went through the source ( Renderer.pas line 2103 ) and it seems that the data is uploaded each frame, in which case it probably pays off to bake a Bitmap instead ( and take the fixed-point overhead if needed ).
alphanimal wrote:How can I store float (or fixed if necessary) variables for each vertex?
Haven't looked at your effect closely, but you're probably going to end up with at least two FBO's ( RenderTarget ) between which you "ping-pong" double buffer style. One containing the results for the current frame, the other for the previous frame.
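
As a minimal sketch ( untested, state layout assumed ), the fragment program of the simulation pass just reads the previous frame's buffer and writes the updated state:

// Read the state rendered into the *other* RenderTarget last frame,
// advance it, and write it out - next frame the two buffers swap roles.
uniform sampler2D tex1;  // previous frame's buffer

void main()
{
    vec2 coord = gl_TexCoord[0].xy;
    vec4 state = texture2D(tex1, coord);  // previous state of this "cell"
    state.r = fract(state.r + 0.01);      // advance the simulation
    gl_FragColor = state;                 // becomes next frame's input
}
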
StevenM wrote:So it's a paradigm that is different from what you may be used to.
You're making it sound more mythical / difficult than it really is :wink:

K

Post by alphanimal »

Kjell wrote:Haven't looked at your effect closely, but you're probably going to end up with at least two FBO's ( RenderTarget ) between which you "ping-pong" double buffer style. One containing the results for the current frame, the other for the previous frame.
OK thanks! Can you give me some example code?

I'm not sure if I need a double buffer. Why would I? As long as I can store some values somewhere I'm fine.

Particles don't need to interact - no calculation includes any parameter from a different vertex (in case that allows for any parallel-processing tweaks or simplifies things).

cheers

Post by Kjell »

Hi alphanimal,
alphanimal wrote:I'm not sure if I need a double buffer. Why would I?
OpenGL doesn't like it when you're reading from and writing to the same FBO. Check out the following video ( the right side of the screen ) to see what happens if you do.
alphanimal wrote:As long as I can store some values somewhere I'm fine.
The only way to write / store values from a shader is by rendering to a buffer.
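
And if one 8-bit channel isn't precise enough, you can spread a single [0..1) float across all four channels of a regular RGBA buffer .. the usual fixed-point packing trick ( just a sketch, untested here ):

// Pack a [0..1) float into four 8-bit channels and back again.
vec4 packFloat(float v)
{
    vec4 p = fract(v * vec4(1.0, 255.0, 65025.0, 16581375.0));
    p -= p.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return p;
}

float unpackFloat(vec4 p)
{
    return dot(p, vec4(1.0, 1.0 / 255.0, 1.0 / 65025.0, 1.0 / 16581375.0));
}

void main()
{
    float result = 0.123456;           // some simulation result in [0..1)
    gl_FragColor = packFloat(result);  // "store" it by rendering it
}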

K

Post by VilleK »

Kjell is the most helpful and knowledgeable person on this forum; the only problem is that he sometimes suggests solutions that only he himself has the skills to implement :)

Post by alphanimal »

indeed I'm lost :)

In the example above...

How does the variable tex1 get to the shader? It's declared but never assigned anything (unlike the coord var).

I can imagine how it works though. texture() reads color information from the texture tex1, but where is it defined that my Bitmap that's assigned to the Material is called "tex1"?

Also, if I change the vertex coordinates within the shader code, it does not affect the actual vertices but only how they are rendered, right?

How can I define a buffer and how can I "render" my physics properties into that?

Post by Kjell »

:)
alphanimal wrote:How does the variable tex1 get to the shader? It's declared but never assigned anything (unlike the coord var). I can imagine how it works though. texture() reads color information from the texture tex1, but where is it defined that my Bitmap that's assigned to the Material is called "tex1"?
This is a bit counter-intuitive, yes. While you do explicitly need to define a variable name for a ShaderVariable, ZGE assigns these names automatically when it comes to Textures. The first texture is called tex1, the second tex2, and so on.
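
So on the GLSL side you just declare the samplers using those names yourself:

// The sampler names follow ZGE's automatic numbering - the first
// texture on the Material is tex1, the second tex2, and so on.
uniform sampler2D tex1;
uniform sampler2D tex2;

void main()
{
    vec2 coord = gl_TexCoord[0].xy;
    gl_FragColor = texture2D(tex1, coord) * texture2D(tex2, coord);
}
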
alphanimal wrote:Also, if I change the vertex coordinates within the shader code, it does not affect the actual vertices but only how they are rendered, right?
Correct, it only changes the way they are rendered, not the actual mesh data.
alphanimal wrote:How can I define a buffer and how can I "render" my physics properties into that?
If you want to store results calculated by a shader, you can use RenderTarget in combination with the SetRenderTarget component.

K

Post by Kjell »

8)

Anyway, attached is a simple example of a double-buffer FBO setup. The actual "simulation" is super boring and frame-dependent .. but that's beside the point. What matters is that it's running entirely on the GPU.

K
Attachments
Buffer.zgeproj
(2.83 KiB) Downloaded 1122 times

Post by VilleK »

Good stuff!

I noticed I had to change "frac" to "fract" in the fragment shader to get it to work. Apparently the GLSL function is called "fract", but some graphics cards also accept "frac" because that is what the function is called in Cg syntax.
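
For reference, fract is defined in GLSL as x - floor(x):

float a = fract(1.75);             // 0.75
vec2  b = fract(vec2(1.2, -0.3));  // (0.2, 0.7) - negatives wrap into [0..1)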

Post by Kjell »

Oops,

Cg was the first shader language I learned .. so I guess it still creeps in once in a while :oops:

File attached to my previous post is fixed now.

K

Post by alphanimal »

impressive :) though I have no idea what's going on here.

Can you explain what steps are carried out in the OnRender node?

Why did you switch to a Mesh instead of the DisplayList (whatever that is)?

What's this about:
glMatrixMode(0x1700);
glPushMatrix();
glLoadIdentity();

What's going on in the BufferShader and in the EffectShader?
If I understand correctly, BufferShader is used to process "invisible" parameters.
EffectShader uses BufferShader's output to manipulate vertex positions.

Post by Kjell »

Hi alphanimal,
alphanimal wrote:Why did you switch to a Mesh instead of the DisplayList
Just being lazy, taking the Display List route requires a bit more typing :P
alphanimal wrote:What's this about:
glMatrixMode(0x1700);
glPushMatrix();
glLoadIdentity();
Sorry, this is poor thinking on my part. Should have simply ignored the OpenGL matrices in the vertex program of the Buffer shader instead. Updated the file again :)
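
In other words, the vertex program of the Buffer pass can just pass everything through untransformed .. roughly:

// Pass-through vertex program for the buffer pass: the quad is fed
// directly in clip-space coordinates, so no matrix juggling is needed.
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = gl_Vertex;  // skip modelview / projection entirely
}
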
alphanimal wrote:What's going on in the BufferShader and in the EffectShader?
The Buffer shader performs the calculations / simulation, while the Effect shader is simply used to show the results on the screen .. so you can ignore that.

K

Post by VilleK »

It's important to understand that in order to let the GPU do the physics you need to get the output of the calculations somehow. But what are shaders designed to output? Pixels! So this is what Kjell takes advantage of here: the physics is calculated in the shader when it renders a polygon to an off-screen buffer. It stores the value in the r-channel of the output pixel.

This off-screen buffer is then used as a texture when rendering the mesh. So the vertex shader of EffectShader reads a pixel from the texture and uses the r-value to modify the z-coordinate of the vertex. Clever tricks like this are required to make GPUs do calculations.
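
Roughly, the idea in that vertex shader looks like this (a sketch with assumed names; it also requires hardware that supports vertex texture fetch):

// Fetch the physics result from the off-screen buffer and displace
// the vertex along z by the value stored in the red channel.
uniform sampler2D tex1;  // the RenderTarget holding the results

void main()
{
    // texture2DLod is the safe lookup in a vertex shader (no derivatives)
    vec4 state = texture2DLod(tex1, gl_MultiTexCoord0.xy, 0.0);
    vec4 pos = gl_Vertex;
    pos.z += state.r;  // use the r-value as the new height
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}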

Post by alphanimal »

Thanks for wrapping that up! :)