Conversation

@Vipitis (Contributor) commented Dec 6, 2025

Had this idea for a while... think I got a working prototype.

rendervans_material.mp4 (video attachment)

It's likely easier to just create a pygfx texture from a wgpu texture... but why not do some dependency inversion?
This branch is mainly here to share the idea and the code; perhaps it's better suited in the pygfx repo. It's not really a custom context, since I replace the WgpuContext, but this way it can be used by downstream apps without large changes (apart from using the same device).
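
To make the intended data flow concrete, here is a minimal usage sketch. All canvas-side names (`TextureRenderCanvas`, `get_pygfx_texture`, `shared_device`, `my_draw_function`) are placeholders rather than the actual API of this branch; only the pygfx calls are real.

```python
# Purely illustrative sketch: a rendercanvas renders into a wgpu texture on the
# same device that pygfx uses, so the result can be sampled as a texture map
# without a CPU round-trip.
import pygfx as gfx

# Placeholder for the texture-backed canvas this branch introduces.
preview_canvas = TextureRenderCanvas(size=(512, 512), device=shared_device)
preview_canvas.request_draw(my_draw_function)  # downstream app draws as usual

# The wgpu render target, wrapped as a pygfx texture (the tricky part).
tex = preview_canvas.get_pygfx_texture()

# From here it's ordinary pygfx: use the texture as a material map on a quad.
quad = gfx.Mesh(
    gfx.plane_geometry(2, 2),
    gfx.MeshBasicMaterial(map=tex),
)
```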

connections: fastplotlib/fastplotlib#943, pygfx/wgpu-py#704 (comment)

@almarklein (Member) commented:

Super interesting!

Some loose thoughts for if/when we want to move this forward (a rough sketch of the PyGfxRenderCanvas idea follows the list):

  • I like that the canvas inherits from OffscreenRenderCanvas. That way it already has means to set size, pixel ratio etc. Maybe Pygfx can implement a subclass?
  • A first thought was to have an extra present-method 'texture', which would be similar to WgpuContextToBitmap, but without downloading to a bitmap. But then we still need to wrap that wgpu texture in a Pygfx texture somehow.
  • Another approach could be that the present-method is 'screen' (which really means 'present to a texture that's provided by the canvas/surface'). And then BaseContext._create_wgpu_py_context should somehow be overloadable by the canvas. That way a pygfx.PyGfxRenderCanvas could simply provide its own texture to render to.
  • How to deal with resizing is also an issue; it would mean that a new texture is needed, and how would we replace that in the pygfx scene?
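
For illustration, the third bullet could look roughly like the sketch below. The `get_target_texture` hook is invented for this sketch, the import path for `OffscreenRenderCanvas` is assumed, and how such a hook would connect to `BaseContext._create_wgpu_py_context` is exactly the open design question.

```python
# Illustrative sketch only, not a proposal for the final API.
import wgpu
from rendercanvas.offscreen import OffscreenRenderCanvas  # import path assumed


class PyGfxRenderCanvas(OffscreenRenderCanvas):
    """A canvas that provides its own wgpu texture to present to."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._target_texture = None  # created lazily on the shared device

    def get_target_texture(self, device):
        # Hypothetical hook: return the texture the context should present to.
        if self._target_texture is None:
            w, h = self.get_physical_size()  # size handling comes from the base class
            self._target_texture = device.create_texture(
                size=(w, h, 1),
                format="rgba8unorm",
                usage=wgpu.TextureUsage.RENDER_ATTACHMENT
                | wgpu.TextureUsage.TEXTURE_BINDING,
            )
        return self._target_texture
```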

@Vipitis (Contributor, Author) commented Dec 8, 2025

Thanks for your ideas. It likely helps to think about a few more use cases; my own application was having previews inside FPL plots. I remember a pygfx issue about rendering data that's already on the GPU, but I can't find it at the moment.

  • I like that the canvas inherits from OffscreenRenderCanvas. That way it already has means to set size, pixel ratio etc. Maybe Pygfx can implement a subclass?

I think it's possible to put the texture constructor inside the canvas instead of taking it as a parameter. Perhaps get_current_texture("pygfx") or something similar could return the wrapped version for use in pygfx/FPL?
You could also think about some sort of "rendercanvas material", moving it more towards the existing pygfx systems. It sort of feels like the same functionality can be achieved in many different ways, so it's a design question of what makes sense for which users.
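
Something like this, just to make the idea concrete. The "pygfx" flavor, the wrapping constructor, and `_get_or_create_target_texture` are all hypothetical; wrapping an existing wgpu texture in a `gfx.Texture` is exactly the missing piece (today `gfx.Texture` is constructed from data or a size).

```python
# Hypothetical method sketch for a flavor-aware getter on the canvas/context.
def get_current_texture(self, flavor="wgpu"):
    wgpu_texture = self._get_or_create_target_texture()  # the raw render target
    if flavor == "wgpu":
        return wgpu_texture
    if flavor == "pygfx":
        import pygfx as gfx
        return gfx.Texture(from_wgpu_texture=wgpu_texture)  # hypothetical constructor
    raise ValueError(f"unknown texture flavor: {flavor!r}")
```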

  • Another approach could be that the present-method is 'screen' (which really means 'present to a texture that's provided by the canvas/surface'). And then BaseContext._create_wgpu_py_context should somehow be overloadable by the canvas. That way a pygfx.PyGfxRenderCanvas could simply provide its own texture to render to.

It's a single resource, not an actual swap chain (a swap chain would only be needed if you sample from and render to the same texture in one render pass, and it might be better for performance). So having a way to construct the pygfx wrapper from an existing GPU texture might be really useful (for other uses too).

  • A first thought was to have an extra present-method 'texture', which would be similar to WgpuContextToBitmap, but without downloading to a bitmap. But then we still need to wrap that wgpu texture in a Pygfx texture somehow.

Technically the bitmap only gets downloaded when you call .draw(), so instead you can just call .present() and use the resource on the GPU. I wanted to validate that this really works either way.

  • How to deal with resizing is also an issue; it would mean that a new texture is needed, how to replace that in the pygfx scene?

Is there ever a situation where you need to resize a texture that's used in a 3D scene? You could maybe consider LoD, so you only render at a lower resolution in some cases. I also thought about only rendering to the pixels that are visible, using something like a depth stencil, and maybe even recovering the quad vertices to get perspective-correct rendering etc. But that would be something like a shader material and mostly a different problem to solve. Here it's just a constant-size render target (that you can update at any point you want, or not update at all).

We (the Shadertoy community Discord) were discussing making virtual "galleries" where you display your shader art on canvases in a 3D scene that you can walk through. Here the user logic could decide whether a canvas is visible or being looked at, to only animate the relevant shaders. A bit like https://cineshader.com/ (which is currently not working because the Shadertoy API is unavailable).
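
The gating itself could stay entirely in user code, for example as in this minimal sketch. Here `preview_canvas` stands in for the texture-backed canvas from above, the distance check is just one possible heuristic, and `force_draw()` is one way to trigger a single render into its texture.

```python
import numpy as np


def update_previews(camera, gallery_items, max_distance=10.0):
    """Re-render only the shader previews that are near the camera.

    gallery_items: list of (mesh, preview_canvas) pairs, where preview_canvas
    is the texture-backed canvas sketched earlier (names are placeholders).
    """
    for mesh, preview_canvas in gallery_items:
        # Distance to the camera as a cheap stand-in for a real visibility test.
        distance = np.linalg.norm(mesh.world.position - camera.world.position)
        if distance < max_distance:
            # Render one frame into this canvas's texture; skip the rest entirely.
            preview_canvas.force_draw()
```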

@almarklein (Member) commented:

Is there ever a situation where you need to resize a texture that's used in a 3D scene?

I was thinking about it because the offscreen canvas allows resizing. But I guess if there is to be something like a PyGfxTextureRenderCanvas, its size would be determined at initialization and fixed from there.
