Suggestions for code implementations of deep kernel learning

Hi,

I am looking for any available implementations of Deep Kernel Learning (arXiv:1511.02222). I am aware that GPax (GitHub - ziatdinovmax/gpax: Gaussian Processes for Experimental Sciences) has one, specifically the example gpax/examples/gpax_viDKL_plasmons.ipynb, but it is built on top of JAX.

Does anyone know if there is already an implementation out there in BoTorch (the motivation being that BoTorch already has many optimizers to play with)?


While it’s not a direct answer/solution, this thread may be of help!


You might look into GPyTorch, which is the GP backend for BoTorch. Depending on your familiarity with GPs it has a bit of a learning curve, but they have several tutorials on deep kernel learning:

https://docs.gpytorch.ai/en/stable/examples/06_PyTorch_NN_Integration_DKL/index.html

https://docs.gpytorch.ai/en/stable/examples/06_PyTorch_NN_Integration_DKL/KISSGP_Deep_Kernel_Regression_CUDA.html
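For a rough picture of what those tutorials boil down to, here is a minimal sketch of exact-GP deep kernel learning in GPyTorch: a small PyTorch feature extractor feeding a standard kernel, with the NN weights and GP hyperparameters trained jointly on the marginal likelihood. The class names (FeatureExtractor, DKLRegression), layer sizes, and toy data below are illustrative choices, not taken from the tutorials verbatim.

```python
import torch
import gpytorch


class FeatureExtractor(torch.nn.Sequential):
    """Small feed-forward network mapping raw inputs to a low-dimensional feature space."""

    def __init__(self, input_dim, feature_dim=2):
        super().__init__(
            torch.nn.Linear(input_dim, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, feature_dim),
        )


class DKLRegression(gpytorch.models.ExactGP):
    """Exact GP whose kernel acts on the NN embedding rather than on the raw inputs."""

    def __init__(self, train_x, train_y, likelihood, feature_dim=2):
        super().__init__(train_x, train_y, likelihood)
        self.feature_extractor = FeatureExtractor(train_x.shape[-1], feature_dim)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=feature_dim)
        )

    def forward(self, x):
        z = self.feature_extractor(x)  # deep embedding
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )


# Toy data; NN weights and GP hyperparameters are optimized jointly on the marginal likelihood.
train_x = torch.rand(50, 4)
train_y = torch.sin(train_x.sum(-1))
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = DKLRegression(train_x, train_y, likelihood)

model.train()
likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # includes the NN weights
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(200):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    optimizer.step()
```

The linked tutorials differ mainly in the details around this skeleton (e.g. the second one additionally uses a KISS-GP grid-interpolation kernel for scalability); swapping the feed-forward extractor for a convolutional one is in principle just a change to the feature extractor module.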


@Utkarsh - there is a GPyTorch version of deep kernel learning (the "PyTorch NN Integration (Deep Kernel Learning)" tutorial in the GPyTorch documentation), which should be compatible with BoTorch. However, one of the reasons I added DKL to GPax was that I didn't find the GPyTorch implementation flexible enough or easily customizable. For example, trying to run exact DKL for convnets was quite a painful process, and the same goes for placing priors over the NN weights. Is the problem with JAX in particular, or is there anything I can add to the current GPax implementation that may help you?
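As a rough illustration of what "compatible with BoTorch" could look like in practice, here is a hedged sketch that mixes BoTorch's GPyTorchModel helper into an exact-GP DKL model so it exposes the posterior() interface that BoTorch acquisition functions expect. The class name BoTorchDKL and the network architecture are illustrative, not library API.

```python
import torch
import gpytorch
from botorch.models.gpytorch import GPyTorchModel


class BoTorchDKL(gpytorch.models.ExactGP, GPyTorchModel):
    """Single-output exact-GP DKL model exposing BoTorch's posterior() interface."""

    _num_outputs = 1  # tells BoTorch this is a single-output model

    def __init__(self, train_X, train_Y, feature_dim=2):
        likelihood = gpytorch.likelihoods.GaussianLikelihood()
        super().__init__(train_X, train_Y.squeeze(-1), likelihood)
        self.feature_extractor = torch.nn.Sequential(
            torch.nn.Linear(train_X.shape[-1], 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, feature_dim),
        )
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=feature_dim)
        )
        self.to(train_X)  # match the dtype/device of the training data

    def forward(self, x):
        z = self.feature_extractor(x)  # kernel operates on the learned features
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )
```

Training can reuse the same Adam/marginal-likelihood loop as in the sketch above; after that the model can be handed to standard BoTorch acquisition functions.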


@maxim.ziatdinov

I want to train the DKL with multiple objectives. I noticed that extending the JAX version might not allow me to use the multi-objective optimizers available in BoTorch. However, as you mentioned, the GPyTorch implementation is not particularly friendly for customization, especially for convolutional networks. I'm a bit confused about how to proceed. Could you provide some thoughts?


If the goal is to use an advanced suite of multi-objective optimization tools in BoTorch, I'm afraid there's no other way but to deal with the pain of customizing GPyTorch's DKL models. I won't have the capacity to add those advanced multi-objective optimization tools to GPax for the foreseeable future.
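To make that suggestion a bit more concrete, here is a rough, untested sketch of how independently trained single-output DKL surrogates might be wrapped in a ModelListGP and passed to a BoTorch multi-objective acquisition function such as qExpectedHypervolumeImprovement. It assumes the BoTorchDKL class sketched earlier in this thread is in scope and uses synthetic toy data; the reference-point heuristic is just one common choice.

```python
import torch
from botorch.models import ModelListGP
from botorch.acquisition.multi_objective import qExpectedHypervolumeImprovement
from botorch.utils.multi_objective.box_decompositions.non_dominated import (
    FastNondominatedPartitioning,
)
from botorch.optim import optimize_acqf

# Assumes BoTorchDKL (single-output exact-GP DKL, BoTorch-compatible) is defined as above.
train_X = torch.rand(30, 4, dtype=torch.double)
train_Y = torch.stack(
    [torch.sin(train_X.sum(-1)), torch.cos(train_X.sum(-1))], dim=-1
)  # two objectives, shape (30, 2)

# One DKL surrogate per objective; training (e.g. the Adam loop above) is omitted here.
models = [BoTorchDKL(train_X, train_Y[:, i : i + 1]) for i in range(train_Y.shape[-1])]
model = ModelListGP(*models)

# Reference point slightly worse than the worst observed values of each objective.
ref_point = train_Y.min(dim=0).values - 0.1
partitioning = FastNondominatedPartitioning(ref_point=ref_point, Y=train_Y)
acqf = qExpectedHypervolumeImprovement(
    model=model, ref_point=ref_point.tolist(), partitioning=partitioning
)

bounds = torch.stack([torch.zeros(4), torch.ones(4)]).to(torch.double)
candidate, _ = optimize_acqf(acqf, bounds=bounds, q=1, num_restarts=10, raw_samples=128)
```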


Thank you! I will try customising GPyTorch's DKL model.

Just to update: this was achieved here - GitHub - utkarshp1161/Active-learning-in-microscopy: Notebooks for Active learning in microscopy.

Thanks!
