Or, if you wish to have access to all [slimgroup](https://github.com/slimgroup/) software, you can add our registry and install our packages in the standard Julia way:
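A minimal sketch of the setup from the command line. The registry URL below is an assumption (the SLIM registry appears to live at `SLIMregistryJL` under the slimgroup organization); check https://github.com/slimgroup/ for the current location:

```shell
# Add the SLIM registry so Julia's package manager can resolve slimgroup packages.
# NOTE: the registry URL is assumed -- verify it on https://github.com/slimgroup/.
julia -e 'using Pkg; Pkg.Registry.add(RegistrySpec(url="https://github.com/slimgroup/SLIMregistryJL.git"))'

# Afterwards, packages such as InvertibleNetworks.jl install in the standard way:
julia -e 'using Pkg; Pkg.add("InvertibleNetworks")'
```

Adding the registry only needs to be done once per Julia depot; subsequent `Pkg.add` calls resolve slimgroup packages alongside the General registry.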
- Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. [ArXiv](https://arxiv.org/pdf/1612.08083.pdf)

- Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017. [ArXiv](https://arxiv.org/abs/1605.08803)

- Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. [ArXiv](https://arxiv.org/abs/1807.03039)

- Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. [ArXiv](https://arxiv.org/abs/1905.10484)

- Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. [ArXiv](https://arxiv.org/abs/1911.10914)

- Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. [ArXiv](https://arxiv.org/abs/1905.10687)
## Related work and publications
The following publications use [InvertibleNetworks.jl](https://github.com/slimgroup/InvertibleNetworks.jl):
- **["Preconditioned training of normalizing flows for variational inference in inverse problems"](https://slim.gatech.edu/content/preconditioned-training-normalizing-flows-variational-inference-inverse-problems)**

- **["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"](https://slim.gatech.edu/content/parameterizing-uncertainty-deep-invertible-networks-application-reservoir-characterization)**

- **["Generalized Minkowski sets for the regularization of inverse problems"](https://slim.gatech.edu/content/generalized-minkowski-sets-regularization-inverse-problems-1)**
This package uses functions from [NNlib.jl](https://github.com/FluxML/NNlib.jl), [Flux.jl](https://github.com/FluxML/Flux.jl) and [Wavelets.jl](https://github.com/JuliaDSP/Wavelets.jl).