Implementing non-stationary gibbs kernel #372
Welcome :) I'm keen to see this functionality made available too, so thanks for opening the issue. Yes, this should be straightforward to do. The Gibbs kernel is an interesting example, because it depends on both the original data and the lengthscale function evaluated at each input. Consequently, my inclination would be to say that the best way to go would be to create a new kernel which is parametrised in terms of the lengthscale function. Would you be interested in opening a PR to implement this?
Hi @willtebbutt, thanks for replying! Yeah, I'm interested in trying to get this done for sure. I mainly work in Python and only recently started looking at Julia, so I will need some hand-holding to get this done if you are OK with that. Also, I'm fairly new to thinking about GPs, so apologies if I say something stupid :) But like I said, I'm very happy to give this a go.
Ahh, sorry, I should have been clearer. I wasn't thinking so much about the mathematical relationship with the squared-exponential kernel. The kind of type I was imagining was something like

```julia
struct GibbsKernel{Tl} <: Kernel
    l::Tl
end
```

so letting the type of `l` (the lengthscale function) be anything.
Very happy to help. The most important thing is to keep the PRs (in particular the first) as small as possible, so that we can work through the process of getting you acquainted with the contribution process as efficiently as possible and future PRs proceed smoothly -- this can be hard to do if the PR is large. Things generally go much more smoothly if we use a number of small PRs rather than a single big PR. For example, I would suggest that a first PR contain the above type, an implementation of its evaluation function -- something with the signature

```julia
function (k::GibbsKernel)(x, y)
    # code to implement the Gibbs kernel
end
```

-- and associated tests. Having said that the ScaledKernel is a good thing to build off, now that I think about it I would actually avoid implementing `kernelmatrix` methods in the first PR. The ScaledKernel tests should give you an idea of the tests that need implementing. Lines 12-16 are standardised tests that should probably also be run on the GibbsKernel (maybe with different inputs though?), whereas the preceding lines are ScaledKernel-specific tests, so you'd need to construct different tests here. It might be an idea, for example, to check that the GibbsKernel reduces to the squared-exponential kernel when the lengthscale function is constant.
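For concreteness, here is a self-contained sketch of what such a first PR might contain. The formula follows Gibbs (1997) as cited in this issue; the local `Kernel` stand-in and all names are illustrative only, not the final KernelFunctions API:

```julia
# Stand-in so the sketch runs on its own; in a real PR you would
# subtype KernelFunctions.Kernel instead.
abstract type Kernel end

# Gibbs kernel with input-dependent lengthscale function `l`.
struct GibbsKernel{Tl} <: Kernel
    l::Tl
end

# Scalar evaluation (Gibbs, 1997):
# k(x, y) = sqrt(2 l(x) l(y) / (l(x)^2 + l(y)^2)) *
#           exp(-(x - y)^2 / (l(x)^2 + l(y)^2))
function (k::GibbsKernel)(x, y)
    lx, ly = k.l(x), k.l(y)
    s = lx^2 + ly^2
    return sqrt(2 * lx * ly / s) * exp(-(x - y)^2 / s)
end
```

With a constant lengthscale `l(x) = λ` the prefactor is 1 and this collapses to `exp(-(x - y)^2 / (2λ^2))`, i.e. a squared-exponential kernel with lengthscale λ, which is exactly the consistency check suggested above.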
Hey @willtebbutt, thanks for the pointers! I have added a PR with a very basic first go at this. I don't use the proper kernel infrastructure yet, and I think I will need to spend some time understanding that and thinking about the best way to implement the kernel properly. I actually think that putting the Gibbs kernel into "src/basekernels" might be best, because it's not a simple transform of the squared exponential, right? But I'm not so used to thinking about this, so I'm looking forward to your thoughts. Best
Many thanks for opening the PR -- it shouldn't need many tweaks before it can be merged.
Good idea. We can move things around before merging your PR.
Now that #374 is in, what would you like to work on next, @Cyberface?
That is really cool! Thanks for the super helpful discussions on the PR! I guess the next thing to do is to get kernelmatrix working with it? Ultimately, I want to get to a point where the Gibbs kernel lengthscale can be modelled with a separate GP and fit/optimised.
So I guess I should copy and edit some of the code from ScaledKernel.jl, like you suggested before?
Well, getting `kernelmatrix` working sounds like a sensible next step.
Nice. I look forward to seeing that. Always good to see a deep GP!
Yeah -- that should provide a good template on which to build.
Would be great to extend the GibbsKernel to work with any (stationary) base kernel. Effectively it's just changing the metric! (Though it might be simpler to implement it as a wrapped kernel, i.e. add a field for the base kernel.)
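As a rough illustration of that wrapped-kernel idea (all names here are hypothetical; the rescaled-distance form in 1-D follows the generalisation in [4], and with an exponentiated-quadratic base it reduces to the Gibbs kernel discussed in this thread):

```julia
# Stand-in so the sketch runs on its own; a real implementation would
# subtype KernelFunctions.Kernel.
abstract type Kernel end

# Hypothetical wrapper: any stationary base kernel, given as a
# function of the (rescaled) distance r, plus a lengthscale function.
struct GeneralisedGibbsKernel{Tk,Tl} <: Kernel
    kernel::Tk  # stationary base kernel, called as kernel(r)
    l::Tl       # input-dependent lengthscale function
end

function (k::GeneralisedGibbsKernel)(x, y)
    lx, ly = k.l(x), k.l(y)
    s = lx^2 + ly^2
    r = abs(x - y) / sqrt(s / 2)  # "changing the metric"
    return sqrt(2 * lx * ly / s) * k.kernel(r)
end
```

With `kernel = r -> exp(-r^2 / 2)` this recovers the squared-exponential-based Gibbs kernel, and other stationary kernels (Matérn, etc.) could be dropped in the same way.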
I'm more than happy to give this a go! ... with some hand-holding :) I quickly put together something along the lines of what you suggested, but I'm sure I've done it wrong, haha. I'm keen to work on this but don't have as much time these days, so you'll have to excuse me if I'm slow in replying!
Hi,
I'm new to Julia and Gaussian processes, but recently came across a couple of interesting papers about non-stationary spectral mixture kernels [1], [2]. I can see that you have already implemented a spectral mixture kernel here. (As a side note, I couldn't figure out how to get that to work.)
I'm interested in trying to implement this, but starting off small, so I'd just try to implement one part first: the input-dependent lengthscale kernel called the Gibbs kernel [3], which can be more easily seen in [1] just before equation 7. (I just found another reference for it as well [4].)
I just took a screenshot from a talk that describes this so you can clearly see the difference between the typical RBF type kernel and this variable length-scale kernel.
Basically, if you take the length-scale parameter from a Squared-Exponential kernel and turn it into a function that depends on the input coordinates, you get a non-stationary kernel called the "Gibbs kernel".
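Written out, this is the squared-exponential form of the Gibbs kernel from [3], reproduced here for reference:

```latex
k(x, x') = \sqrt{\frac{2\,\ell(x)\,\ell(x')}{\ell(x)^2 + \ell(x')^2}}
           \exp\!\left(-\frac{(x - x')^2}{\ell(x)^2 + \ell(x')^2}\right)
```

Note that when ℓ(x) is constant, the prefactor is 1 and this is just the usual squared-exponential kernel.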
The variable length-scale kernel has a learnable function ℓ(x), which in [1] they parameterise by another GP. Can anyone help me implement this? From reading how KernelFunctions works, this looks like it could be done easily using some kind of `Transform`? Thanks in advance for any help!
[1]: Non-Stationary Spectral Kernels
[2]: Neural Non-Stationary Spectral Kernel
[3]: Gibbs (1997)
[4]: Nonstationary Covariance Functions for Gaussian Process Regression (NIPS 2004)