Questions about the coopvec branch #387

Closed
spock-the-wizard opened this issue Apr 29, 2025 · 1 comment

@spock-the-wizard

Hi,

Thanks for the great work.

I’m implementing a BSDF plugin in Mitsuba with a small neural network (MLP) inside. It currently relies on PyTorch interoperability via `dr.wrap`, but I am looking for faster options.

I came across the “coopvec” branch in this repo, which looks a lot like what I need. I have a few questions:

- Can we train MLPs without PyTorch?
- Can we run NNs in symbolic mode?
- Can we expect a speedup compared to torch implementations? (Approximately how much?)
- Are you planning on integrating the coopvec branch into Mitsuba any time soon?

Posting a very brief example of my current implementation below, just in case. Thanks.

import torch
import torch.nn as nn

import mitsuba as mi
mi.set_variant('cuda_ad_rgb')
mi.set_log_level(mi.LogLevel.Info)

import drjit as dr
import time

def mega_kernel(state):
    # Toggle megakernel (symbolic) compilation. These are the legacy flag
    # names; newer Dr.Jit versions call them SymbolicLoops, SymbolicCalls
    # and OptimizeCalls. Disabling them forces evaluated mode, which the
    # dr.wrap/PyTorch bridge requires.
    dr.set_flag(dr.JitFlag.LoopRecord, state)
    dr.set_flag(dr.JitFlag.VCallRecord, state)
    dr.set_flag(dr.JitFlag.VCallOptimize, state)

class MyBSDF(mi.BSDF):
    """
    Custom BSDF with neural network inside
    """
    def __init__(self, props):
        mi.BSDF.__init__(self, props)

        # Advertise the lobe to integrators (matches the diffuse base BSDF)
        self.m_flags = mi.BSDFFlags.DiffuseReflection | mi.BSDFFlags.FrontSide
        self.m_components = [self.m_flags]

        self.device = torch.device("cuda")
        self.network = MLP(device=self.device)
        self.base_bsdf = mi.load_dict({"type": "diffuse",
                                       "reflectance": {'type': 'rgb', 'value': [0.4, 1.0, 1.0]}})

    def sample(self, ctx, si, sample1, sample2, active):
        # Delegate sampling to the base BSDF
        return self.base_bsdf.sample(ctx, si, sample1, sample2, active)

    def eval(self, ctx, si, wo, active):
        # The network takes features evaluated at si (currently si.p as a
        # dummy input) and outputs a color
        return self.eval_network(si.p)

    def pdf(self, ctx, si, wo, active):
        # Use the base BSDF's pdf and sampling routine
        return self.base_bsdf.pdf(ctx, si, wo, active)
    
    @dr.wrap(source='drjit', target='torch')
    def eval_network(self, input):
        # dr.wrap hands the Dr.Jit array over as a torch tensor; transpose
        # so the batch dimension comes first for the Linear layers, then
        # transpose the result back
        input = input.T.to(self.device)
        return self.network(input).T

class MLP(nn.Module):
    """Minimal MLP used inside MyBSDF"""
    def __init__(self, in_size=3, hidden_size=16, out_size=3, device='cuda'):
        super().__init__()

        self.device = device
        self.layers = nn.Sequential(
            nn.Linear(in_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, out_size),
            nn.ReLU(),
        ).to(self.device)

    def forward(self, input):
        return self.layers(input)


mi.register_bsdf("mybsdf", lambda props: MyBSDF(props))
if __name__ == "__main__":

    # Disable megakernel mode (required for the dr.wrap/PyTorch bridge)
    mega_kernel(False)
    # mega_kernel(True)

    scene_dict = mi.cornell_box()
    # Replace an object's BSDF with the custom BSDF
    scene_dict['large-box']['bsdf'] = {'type': 'mybsdf'}
    scene = mi.load_dict(scene_dict)

    start = time.time()
    img = mi.render(scene, spp=128)
    end = time.time()
    print(f"Rendering time: {end - start:.3f} s")

    mi.util.write_bitmap("test.png", img)

@njroussel (Member)

Hi @spock-the-wizard

> Can we train MLPs without PyTorch?

Yes, they'd be written in pure Dr.Jit.
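
For a rough idea, here is a minimal sketch of what that looks like with the `drjit.nn` module from that branch. The class names and signatures below follow the in-progress API and may differ slightly in the merged version:

import drjit as dr
import drjit.nn as nn
from drjit.cuda.ad import Float16, TensorXf16, Array3f

# Declare a small MLP; cooperative vectors are geared towards FP16 weights
net = nn.Sequential(
    nn.Linear(-1, 16),   # -1: input width is inferred in alloc()
    nn.ReLU(),
    nn.Linear(-1, 16),
    nn.ReLU(),
    nn.Linear(-1, 3),
)

# Instantiate the weights for a 3-D input on the CUDA backend
net = net.alloc(TensorXf16, 3)

# Pack all parameters into a single flat buffer; 'weights' is a
# differentiable tensor that can be trained in pure Dr.Jit, e.g. with
# Dr.Jit's own Adam optimizer
weights, net = nn.pack(net, layout='training')

# Evaluate: inputs and outputs are cooperative vectors kept in registers
x = dr.linspace(Float16, 0, 1, 1024)
rgb = Array3f(net(nn.CoopVec(x, x, x)))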

> Can we run NNs in symbolic mode?

Yes.
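
That means the `mega_kernel(False)` workaround above becomes unnecessary: the network evaluation can stay inside the recorded megakernel. A small sketch, using the current Dr.Jit 1.x flag names:

import drjit as dr

# On by default; with a drjit.nn network there is no need to disable
# them, unlike with the dr.wrap/PyTorch bridge
dr.set_flag(dr.JitFlag.SymbolicCalls, True)  # record BSDF virtual calls
dr.set_flag(dr.JitFlag.SymbolicLoops, True)  # record the path-tracer loop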

> Can we expect a speedup compared to torch implementations? (Approximately how much?)

I haven't personally compared this, so I can't say for sure. As a reminder, the cooperative vector feature is only really meant for smaller neural networks. Theoretically, you should see some speedup there because you no longer need to write your NN inputs to memory (disadvantage of evaluated mode). For larger models, PyTorch is still the way to go.
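
Applied to the snippet above, the idea would be to replace the `dr.wrap`-based `eval_network` with a direct call to a packed `drjit.nn` network. A hypothetical, untested adaptation, assuming `self.net` holds the packed network from the earlier sketch:

    def eval(self, ctx, si, wo, active):
        from drjit.cuda.ad import Float16
        from drjit.nn import CoopVec
        # No dr.wrap round-trip: the call is recorded into the megakernel,
        # so si.p never has to be written out to global memory
        out = self.net(CoopVec(Float16(si.p.x),
                               Float16(si.p.y),
                               Float16(si.p.z)))
        r, g, b = out  # unpack the cooperative vector (assumed API)
        return mi.Color3f(r, g, b)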

> Are you planning on integrating the coopvec branch into Mitsuba any time soon?

I'm hoping to merge the Dr.Jit PR #384 this week. Once it's merged, it should be available in mitsuba:master fairly quickly.
