
Egui Integration #42

Open
seabassjh wants to merge 14 commits into main

Conversation


@seabassjh commented on Jan 13, 2022

Checklist

  • I have read the Contributor Guide
  • I have read and agree to the Code of Conduct
  • I have added a description of my changes and why I'd like them included in the section below

Description of Changes

Adds two new crates: ash-egui and kajiya-egui. They're heavily based on their imgui counterparts, with a few modifications to work with egui.

Related Issues

List related issues here

@seabassjh requested a review from h3r2tic as a code owner on January 13, 2022 20:45
@seabassjh changed the title from "egui integration" to "Egui Integration" on Jan 13, 2022
@h3r2tic (Collaborator) commented on Jan 19, 2022

Sorry for the delay here! I've nerd-sniped myself into looking at HDR -> LDR color mapping for the past few weeks 😅 Will have a look tomorrow!

@h3r2tic (Collaborator) left a review comment

Thank you for this! ❤️🚀 The code looks good -- I only have some minor nitpicks 🕵️

Quick question first: Have you tested whether Vulkan validation layers are happy with the use of the API?

Now for the main course: I do have a high-level concern -- currently there's no way to test the egui renderer in main, which means that it will likely get broken at some point and rot over time. Upon realizing this, I attempted to add egui support to the kajiya-simple crate, thinking that it would be neat to have a stand-alone example using egui (alongside the view and hello binaries). However, I quickly ran into my unfamiliarity with egui, and some confusion with the backend implementation (see the comments about prepare_frame and handle_event).

Would you perchance be interested in adding egui support to kajiya-simple? 😅 Otherwise this PR will probably have to wait until I can allocate the time -- it's important that all features of the renderer can be tested (ideally by CI -- though the dlss feature is an exception due to licensing reasons) and checked against Vulkan validation layers.

physical_device_properties: &vk::PhysicalDeviceProperties,
physical_device_memory_properties: &vk::PhysicalDeviceMemoryProperties,
egui: &mut CtxRef,
raw_input: RawInput,
@h3r2tic (Collaborator):
Feels weird to be passing input to the renderer's constructor. Are all the fields actually needed? The doc from egui says "Set the values that make sense, leave the rest at their Default::default()." Maybe an instance of RawInput with only the fields relevant to the font texture creation could be constructed in-place before the call to egui.run?
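
For illustration, a minimal sketch of that idea, assuming the egui ~0.15/0.16 API used in this PR (CtxRef, RawInput with Default, begin_frame/end_frame in place of run); the helper name and parameters are illustrative:

```rust
use egui::{pos2, vec2, CtxRef, RawInput, Rect};

// Hypothetical helper: build a throwaway RawInput with only the fields the font
// atlas depends on (screen rect and DPI), leaving everything else at Default.
fn warm_up_fonts(egui: &mut CtxRef, width: f32, height: f32, pixels_per_point: f32) {
    let raw_input = RawInput {
        screen_rect: Some(Rect::from_min_size(
            pos2(0.0, 0.0),
            vec2(width, height) / pixels_per_point,
        )),
        pixels_per_point: Some(pixels_per_point),
        ..Default::default()
    };
    // Running one empty frame makes egui lay out its font atlas, so the renderer's
    // constructor no longer needs to take a RawInput from the caller.
    egui.begin_frame(raw_input);
    let _ = egui.end_frame();
}
```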


assert_eq!(texture.pixels.len(), texture.width * texture.height);
let srgba_pixels: Vec<u8> = texture
    .srgba_pixels(0.25)
@h3r2tic (Collaborator):

0.25 looks oddly close to the 0.24 which would be used for sRGB gamma correction. I think if you used R8G8B8A8_SRGB for the texture format, you might find that 1.0 works correctly.

Also it looks like the source font texture is actually just alpha, so R8_SRGB should be sufficient (though I'm not sure if the shader you're using is compatible with a red-only texture font).
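
For illustration, a hedged sketch of that suggestion, assuming the egui ~0.15 font texture type (egui::Texture) with srgba_pixels(gamma) and ash's vk::Format; this is not the PR's actual upload code:

```rust
use ash::vk;

// With an sRGB image format the GPU decodes gamma at sample time, so the CPU-side
// conversion can use a neutral gamma of 1.0 instead of a tuned constant.
fn font_atlas_pixels(texture: &egui::Texture) -> (vk::Format, Vec<u8>) {
    // vk::Format::R8_SRGB would also do for an alpha-only atlas if the shader samples .r
    let format = vk::Format::R8G8B8A8_SRGB;
    let srgba_pixels: Vec<u8> = texture
        .srgba_pixels(1.0)
        .flat_map(|c| [c.r(), c.g(), c.b(), c.a()])
        .collect();
    (format, srgba_pixels)
}
```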

@seabassjh (Author) commented on Mar 12, 2022:

With all of the changes to the backend after fixing the Vulkan settings, 0.5 seems to produce the clearest image. I'm not sure why that number... do you think that's fine?

Comment on lines 604 to 606
// TODO: NEED???
// update font texture
// self.upload_font_texture(command_buffer, &self.egui.fonts().texture());
@h3r2tic (Collaborator):

This would be good to resolve 😅

Comment on lines 696 to 702
let next_vertex_offset = vertex_offset + mesh.vertices.len();
let next_index_offset = index_offset + mesh.indices.len();
// if next_vertex_offset > Renderer::VERTEX_COUNT_PER_FRAME
// || next_index_offset > Renderer::INDEX_COUNT_PER_FRAME
// {
// break;
// }
@h3r2tic (Collaborator):

What will happen if the number of vertices is greater than the size of the preallocated buffers? If it crashes, then this check needs to happen similarly to the egui renderer.

@seabassjh (Author):

Added the check back to make sure the buffer offsets do not exceed the buffer sizes.
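
Concretely, re-enabling the commented-out check quoted above would look roughly like this (inside the mesh loop; names follow that excerpt):

```rust
// Stop recording meshes once the preallocated per-frame vertex/index buffers
// would overflow, rather than writing past their end.
let next_vertex_offset = vertex_offset + mesh.vertices.len();
let next_index_offset = index_offset + mesh.indices.len();
if next_vertex_offset > Renderer::VERTEX_COUNT_PER_FRAME
    || next_index_offset > Renderer::INDEX_COUNT_PER_FRAME
{
    break;
}
```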

Comment on lines 102 to 119
pub fn handle_event(
    &mut self,
    _window: &winit::window::Window,
    _egui: &mut ash_egui::egui::Context,
    _event: &winit::event::Event<'_, ()>,
) {
}

pub fn prepare_frame(&mut self, context: &mut CtxRef, dt: f32) {
    // update time
    if let Some(time) = self.raw_input.time {
        self.raw_input.time = Some(time + dt as f64);
    } else {
        self.raw_input.time = Some(0.0);
    }

    context.begin_frame(self.raw_input.take());
}
@h3r2tic (Collaborator):

handle_event is obviously dead code, so it probably shouldn't be here. As for prepare_frame, I'm not sure -- I don't see it being used anywhere here or in your bevy_kajiya_egui, so I'm a bit confused about how the backend is supposed to be used 👀

@seabassjh (Author) commented on Mar 12, 2022:

I changed the frame-prep API to EguiBackend::prepare_frame(&mut self.egui), where self.egui is of type EguiState. It is now an associated function because prepare_frame() only operates on fields of EguiState and needs no data from EguiBackend, unlike the imgui version.

Basically, EguiState contains the egui Context instance and the RawInput state that the library user interacts with directly, while EguiBackend only needs that data handed over from EguiState once per frame in order to render.

Decoupling the backend into EguiState and EguiBackend makes it trivial for me to use kajiya-egui from either kajiya-simple or bevy-kajiya-egui, as sketched below.
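
A minimal sketch of that split, assuming the egui ~0.15/0.16 types (CtxRef, RawInput); the field names and the dt parameter are illustrative rather than the PR's exact definitions:

```rust
use egui::{CtxRef, RawInput};

// What the library user touches directly: the egui context plus the input
// accumulated for the next frame.
pub struct EguiState {
    pub ctx: CtxRef,
    pub raw_input: RawInput,
}

// Owns only GPU resources (buffers, pipeline, font texture); holds no UI state.
pub struct EguiBackend {
    // renderer resources elided
}

impl EguiBackend {
    // Associated function: it reads and writes EguiState only, so it needs no
    // &self and can be driven from kajiya-simple or bevy-kajiya-egui alike.
    pub fn prepare_frame(egui: &mut EguiState, dt: f32) {
        let time = egui.raw_input.time.unwrap_or(0.0);
        egui.raw_input.time = Some(time + dt as f64);
        egui.ctx.begin_frame(egui.raw_input.take());
    }
}
```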

impl EguiBackend {
    pub fn new(
        device: Arc<Device>,
        window_settings: (u32, u32, f64),
@h3r2tic (Collaborator):

This would probably be better as the individual scalars that you destructure it into in the body of the function.


layout(binding = 0, set = 0) uniform sampler2D font_texture;

void main() { outColor = inColor.rgba * texture(font_texture, inUV); }
@h3r2tic (Collaborator):

This would allow us to use a single-component texture -- or do you have plans for using the other components?

Suggested change:
- void main() { outColor = inColor.rgba * texture(font_texture, inUV); }
+ void main() { outColor = inColor.rgba * texture(font_texture, inUV).r; }

@seabassjh (Author):

No plans. Is it wasteful to leave the other components unused?

@seabassjh (Author) commented:

I agree. I will implement it in kajiya-simple and formalize the API so that it's trivial to use with either bevy or kajiya-simple, hopefully soon if work slows down. Thanks for looking this over!

@h3r2tic (Collaborator) commented on Apr 3, 2022

Hey! I've finally had some time to come back to this 😅 The PR works pretty well, and I have some fixes to make it work even better 😀
I have a branch here: https://github.com/EmbarkStudios/kajiya/compare/egui-fixes based on yours.

In this commit I've done a bunch of fixes and improvements:

  • Fixed a Vulkan validation error about texture format mismatches
  • Fixed the remaining sRGB/gamma stuff
  • Prevented input from being passed to the view app if egui wants it -- so that the sun direction / camera don't move when you drag a slider (see the sketch after this list)
  • Switched to egui-winit to map the winit events instead of manually poking at RawInput
  • ... and, as necessitated by the above, upgraded a bunch of crates
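
As a side note, a hedged sketch of the input-capture idea from the list above (not the actual code in the egui-fixes branch); it assumes egui's Context::wants_pointer_input / wants_keyboard_input and a winit ~0.25 WindowEvent:

```rust
// Hypothetical helper: only forward a window event to the app's camera /
// sun-direction controls when egui does not want it for its own widgets.
fn app_should_handle(ctx: &egui::Context, event: &winit::event::WindowEvent<'_>) -> bool {
    use winit::event::WindowEvent;
    match event {
        WindowEvent::CursorMoved { .. }
        | WindowEvent::MouseInput { .. }
        | WindowEvent::MouseWheel { .. } => !ctx.wants_pointer_input(),
        WindowEvent::KeyboardInput { .. } | WindowEvent::ReceivedCharacter(_) => {
            !ctx.wants_keyboard_input()
        }
        _ => true,
    }
}
```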

Now almost everything works, except one bit which I haven't figured out yet: egui won't accept focus on any of the input fields, preventing values from being entered by typing numbers. Clicking on the fields should enable keyboard input, but here nothing happens. I haven't used egui before, so I don't know if there's some incantation we might be missing. Any ideas?

@seabassjh (Author) commented on Feb 26, 2023

Poke... I know it's been eons, but I have some time to close this out. Still worth it after all the changes since this was last touched? @h3r2tic

@h3r2tic (Collaborator) commented on Mar 2, 2023

Hmm, good question, I'm not sure!

On the one hand, if egui is useful to folks (such as yourself), it would be great to have it!
On the other hand, I'm not actively working on kajiya right now, but I'm eventually planning to port it to a new Vulkan backend that I wrote for Tiny Glade, which comes with its own egui integration.

Up to you! :)
