Update on documentation #121
Merged
Commits (6)
- f878739: Updating the code description (jayjay-park)
- eacde87: Made temporary update to fix the failing case in test (jayjay-park)
- ceb0623: Update README.md (jayjay-park)
- 3659c36: Deleted .CondaPkg (jayjay-park)
- 01cd6e3: Updated README.md accordingly (jayjay-park)
- fc1fd51: Update for CI (jayjay-park)
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| | | `@@ -3,3 +3,4 @@ data` |
| | | Manifest.toml |
| | | settings.json |
| | | *.png |
| | | + .CondaPkg/ |
| Original file line number | Diff line number | Diff line change |
|---|---|---|
|
|
@@ -5,169 +5,205 @@ | |
| |[](https://slimgroup.github.io/InvertibleNetworks.jl/stable/) [](https://slimgroup.github.io/InvertibleNetworks.jl/dev/)| [](https://github.com/slimgroup/InvertibleNetworks.jl/actions/workflows/runtests.yml)| [](https://doi.org/10.21105/joss.06554) | ||
|
|
||
|
|
||
| Building blocks for invertible neural networks in the [Julia] programming language. | ||
| ## 🎯 Overview | ||
|
|
||
| - Memory efficient building blocks for invertible neural networks | ||
| - Hand-derived gradients, Jacobians $J$ , and $\log |J|$ | ||
| - [Flux] integration | ||
| - Support for [Zygote] and [ChainRules] | ||
| - GPU support | ||
| - Includes various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification | ||
| InvertibleNetworks.jl provides memory-efficient building blocks for invertible neural networks with hand-derived gradients, Jacobians, and log-determinants. The package is designed for high-performance scientific computing and machine learning applications. | ||
|
|
||
| ### ✨ Key Features | ||
|
|
||
| ## Installation | ||
| - **Memory Efficient**: Hand-derived gradients, Jacobians J, and log|J| for optimal memory usage | ||
| - **Flux Integration**: Seamless integration with Flux.jl for automatic differentiation | ||
| - **AD Support**: Support for [Zygote] and [ChainRules] automatic differentiation | ||
| - **GPU Support**: NVIDIA GPU support via CuArray | ||
| - **Comprehensive Examples**: Various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification | ||
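The log-determinant output listed above is what makes maximum-likelihood training of normalizing flows tractable. A minimal sketch of the change-of-variables objective (the `logdet=true` keyword and two-value return follow the Quick Start usage; the loss expression itself is illustrative, not a built-in package API):

```julia
using InvertibleNetworks

# Any invertible layer built with logdet=true returns both the
# transformed samples and the log-determinant of its Jacobian.
an = ActNorm(10; logdet=true)

X = randn(Float32, 64, 64, 10, 4)
Z, logdet = an.forward(X)

# Negative log-likelihood under a standard normal base distribution,
# per the change-of-variables formula (illustrative objective):
nll = 0.5f0 * sum(Z.^2) / size(Z, 4) - logdet
```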
|
|
||
| InvertibleNetworks is registered and can be added like any standard Julia package with the command: | ||
| ## 🚀 Quick Start | ||
|
**Member:** Add back the Installation section

**Contributor (Author):** Installation section was located at the back, but I will bring it up front to the original location. |
||
|
|
||
| ``` | ||
| ] add InvertibleNetworks | ||
| ``` | ||
| ### Basic Usage | ||
|
|
||
| ```julia | ||
| using InvertibleNetworks, Flux, LinearAlgebra  # LinearAlgebra provides norm for the check below | ||
|
|
||
| ## Uncertainty-aware image reconstruction | ||
| # Create a simple activation normalization layer | ||
| an = ActNorm(10; logdet=true) | ||
|
|
||
| # Forward pass | ||
| X = randn(Float32, 64, 64, 10, 4) | ||
| Y, logdet = an.forward(X) | ||
|
|
||
| Due to its memory scaling, InvertibleNetworks.jl has been particularly successful at Bayesian posterior sampling with simulation-based inference. To get started with this application, refer to a simple example ([Conditional sampling for MNIST inpainting](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/conditional_sampling/amortized_glow_mnist_inpainting.jl)), but feel free to modify this script for your application, and please reach out to us for help. | ||
| # Inverse pass | ||
| X_reconstructed = an.inverse(Y) | ||
|
|
||
|  | ||
| # Test invertibility | ||
| @assert norm(X - X_reconstructed) < 1e-6 | ||
| ``` | ||
|
|
||
| ### GPU Support | ||
|
|
||
| ## Building blocks | ||
| ```julia | ||
| using InvertibleNetworks, Flux | ||
|
|
||
| - 1x1 Convolutions using Householder transformations ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_convolution_1x1.jl)) | ||
| # Move data to GPU | ||
| X = randn(Float32, 64, 64, 10, 4) |> gpu | ||
| AN = ActNorm(10; logdet=true) |> gpu | ||
|
|
||
| - Residual block ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_residual_block.jl)) | ||
| # Forward pass on GPU | ||
| Y, logdet = AN.forward(X) | ||
| ``` | ||
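The hand-derived gradients are exposed through each layer's `backward` pass, which consumes the output-side gradient together with the output itself and recomputes the input on the fly; this is where the memory savings come from. A sketch, assuming the `backward(ΔY, Y)` signature used throughout the layer examples:

```julia
using InvertibleNetworks

AN = ActNorm(10; logdet=true)
X = randn(Float32, 64, 64, 10, 4)
Y, logdet = AN.forward(X)

# Gradient of some scalar loss with respect to Y (placeholder here)
ΔY = Y ./ size(Y, 4)

# backward returns the input-side gradient and the recomputed input,
# so intermediate activations never need to be stored.
ΔX, X_ = AN.backward(ΔY, Y)
```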
|
|
||
| - Invertible coupling layer from Dinh et al. (2017) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_glow.jl)) | ||
| ## 🧱 Building Blocks | ||
|
|
||
| - Invertible hyperbolic layer from Lensink et al. (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hyperbolic.jl)) | ||
| ### Core Layers | ||
|
|
||
| - Invertible coupling layer from Putzky and Welling (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_irim.jl)) | ||
| - **ActNorm**: Activation normalization (Kingma and Dhariwal, 2018) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_actnorm.jl)) | ||
| - **Conv1x1**: 1x1 Convolutions using Householder transformations ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_convolution_1x1.jl)) | ||
| - **ResidualBlock**: Invertible residual blocks ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_residual_block.jl)) | ||
| - **CouplingLayerGlow**: Invertible coupling layer from Dinh et al. (2017) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_glow.jl)) | ||
| - **CouplingLayerHINT**: Invertible recursive coupling layer HINT from Kruse et al. (2020) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hint.jl)) | ||
| - **CouplingLayerHyperbolic**: Invertible hyperbolic layer from Lensink et al. (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hyperbolic.jl)) | ||
| - **CouplingLayerIRIM**: Invertible coupling layer from Putzky and Welling (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_irim.jl)) | ||
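The core layers share a common forward/inverse interface, so each can be checked for invertibility the same way. A sketch using `Conv1x1` (constructor and call pattern assumed to mirror the ActNorm usage above):

```julia
using InvertibleNetworks, LinearAlgebra

# 1x1 convolution over 10 channels, parameterized by Householder reflections
C = Conv1x1(10)

X = randn(Float32, 64, 64, 10, 4)
Y  = C.forward(X)
X_ = C.inverse(Y)

# Round-trip should match up to floating-point error
@assert norm(X - X_) / norm(X) < 1f-5
```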
|
|
||
| - Invertible recursive coupling layer HINT from Kruse et al. (2020) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hint.jl)) | ||
| ### Activation Functions | ||
|
|
||
| - Activation normalization (Kingma and Dhariwal, 2018) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_actnorm.jl)) | ||
| - **ReLU**: Rectified Linear Unit | ||
| - **LeakyReLU**: Leaky Rectified Linear Unit | ||
| - **Sigmoid**: Sigmoid activation with optional scaling | ||
| - **Sigmoid2**: Modified sigmoid activation | ||
| - **GaLU**: Gated Linear Unit | ||
| - **ExpClamp**: Exponential with clamping | ||
|
|
||
| - Various activation functions (Sigmoid, ReLU, leaky ReLU, GaLU) | ||
| ### Utilities | ||
|
|
||
| - Objective and misfit functions (mean squared error, log-likelihood) | ||
| - **Jacobian Computation**: Hand-derived Jacobians for memory efficiency | ||
| - **Dimensionality Manipulation**: squeeze/unsqueeze (column, patch, checkerboard), split/cat | ||
| - **Wavelet Transform** | ||
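The dimensionality utilities trade spatial resolution for channels and back, which the multiscale networks rely on between levels. A sketch, assuming `squeeze`/`unsqueeze` accept a `pattern` keyword matching the options listed above:

```julia
using InvertibleNetworks

X = randn(Float32, 64, 64, 10, 4)

# Halve each spatial dimension and quadruple the channel count,
# then undo the rearrangement (a pure permutation of entries).
Y  = squeeze(X; pattern="checkerboard")   # expected size: 32 x 32 x 40 x 4
X_ = unsqueeze(Y; pattern="checkerboard")

@assert X ≈ X_
```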
|
|
||
| - Dimensionality manipulation: squeeze/unsqueeze (column, patch, checkerboard), split/cat | ||
|
|
||
| - Squeeze/unsqueeze using the wavelet transform | ||
| ## 🌐 Network Architectures | ||
|
|
||
| ### Pre-built Networks | ||
|
|
||
| ## Examples | ||
| - **NetworkGlow**: Generative flow with invertible 1x1 convolutions ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_glow.jl), [source](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/src/networks/invertible_network_glow.jl)) | ||
| - **NetworkHINT**: Multi-scale HINT networks | ||
| - **NetworkHyperbolic**: Hyperbolic networks | ||
| - **NetworkIRIM**: Invertible recurrent inference machines (Putzky and Welling, 2019) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_irim.jl)) | ||
| - **NetworkConditionalGlow**: Conditional Glow networks | ||
| - **NetworkConditionalHINT**: Conditional HINT networks | ||
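The pre-built networks compose these layers into full flows with the same forward/inverse interface. A sketch for `NetworkGlow`, assuming the `NetworkGlow(n_in, n_hidden, L, K)` constructor from the linked Glow example, where `L` is the number of multiscale levels and `K` the flow steps per level (hyperparameter values here are hypothetical):

```julia
using InvertibleNetworks

# 4 input channels, 32 hidden channels, L = 2 levels, K = 4 steps per level
G = NetworkGlow(4, 32, 2, 4)

X = randn(Float32, 16, 16, 4, 8)
Z, logdet = G.forward(X)

# Sampling: draw latent codes from the base distribution and invert
X_gen = G.inverse(randn(Float32, size(Z)...))
```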
|
|
||
| - Invertible recurrent inference machines (Putzky and Welling, 2019) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_irim.jl)) | ||
|
|
||
| - Generative models with maximum likelihood via the change of variable formula ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/application_glow_banana_dist.jl)) | ||
| ## 🔍 Uncertainty-aware Image Reconstruction | ||
|
|
||
| - Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_glow.jl), [source](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/src/networks/invertible_network_glow.jl)) | ||
| Due to its memory scaling, InvertibleNetworks.jl has been particularly successful at Bayesian posterior sampling with simulation-based inference. To get started with this application, refer to a simple example ([Conditional sampling for MNIST inpainting](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/conditional_sampling/amortized_glow_mnist_inpainting.jl)), but feel free to modify this script for your application, and please reach out to us for help. | ||
|
|
||
| ## GPU support | ||
|
|
||
| GPU support is supported via Flux/CuArray. To use the GPU, move the input and the network layer to GPU via `|> gpu` | ||
| ### Example: MNIST Inpainting | ||
|
|
||
| ```julia | ||
| # See examples/applications/conditional_sampling/amortized_glow_mnist_inpainting.jl | ||
| # for a complete example of conditional sampling for MNIST inpainting | ||
| ``` | ||
| using InvertibleNetworks, Flux | ||
|
|
||
| # Input | ||
| nx = 64 | ||
| ny = 64 | ||
| k = 10 | ||
| batchsize = 4 | ||
|
|
||
| # Input image: nx x ny x k x batchsize | ||
| X = randn(Float32, nx, ny, k, batchsize) |> gpu | ||
|  | ||
|
|
||
| # Activation normalization | ||
| AN = ActNorm(k; logdet=true) |> gpu | ||
| ### Other Examples | ||
|
|
||
| # Test invertibility | ||
| Y_, logdet = AN.forward(X) | ||
| ``` | ||
| - **Invertible recurrent inference machines** (Putzky and Welling, 2019) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_irim.jl)) | ||
|
|
||
| ## Reference | ||
| - **Generative models with maximum likelihood** via the change of variable formula ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/application_glow_banana_dist.jl)) | ||
|
|
||
| If you use InvertibleNetworks.jl in your research, we would be grateful if you cite us with the following bibtex: | ||
| - **Glow**: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_glow.jl), [source](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/src/networks/invertible_network_glow.jl)) | ||
|
|
||
| ``` | ||
| @article{Orozco2024, doi = {10.21105/joss.06554}, url = {https://doi.org/10.21105/joss.06554}, year = {2024}, publisher = {The Open Journal}, volume = {9}, number = {99}, pages = {6554}, author = {Rafael Orozco and Philipp Witte and Mathias Louboutin and Ali Siahkoohi and Gabrio Rizzuti and Bas Peters and Felix J. Herrmann}, title = {InvertibleNetworks.jl: A Julia package for scalable normalizing flows}, journal = {Journal of Open Source Software} } | ||
| ``` | ||
| ## 📖 Documentation | ||
|
|
||
| - **API Documentation**: [Stable](https://slimgroup.github.io/InvertibleNetworks.jl/stable/) | [Development](https://slimgroup.github.io/InvertibleNetworks.jl/dev/) | ||
| - **Examples**: See the `examples/` directory for comprehensive usage examples | ||
| - **Tests**: The `test/` directory contains extensive unit tests | ||
|
|
||
| ## Papers | ||
| ## 🤝 Contributing | ||
|
|
||
| The following publications use [InvertibleNetworks.jl]: | ||
| We welcome contributions! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines. | ||
|
|
||
| - **["Reliable amortized variational inference with physics-based latent distribution correction"]** | ||
| - paper: [https://arxiv.org/abs/2207.11640](https://arxiv.org/abs/2207.11640) | ||
| - [presentation](https://slim.gatech.edu/Publications/Public/Submitted/2022/siahkoohi2022ravi/slides.pdf) | ||
| - code: [ReliableAVI.jl] | ||
| ### Development Setup | ||
|
|
||
| - **["Learning by example: fast reliability-aware seismic imaging with normalizing flows"]** | ||
| - paper: [https://arxiv.org/abs/2104.06255](https://arxiv.org/abs/2104.06255) | ||
| - [presentation](https://slim.gatech.edu/Publications/Public/Conferences/KAUST/2021/siahkoohi2021EarthMLfar/siahkoohi2021EarthMLfar.pdf) | ||
| - code: [ReliabilityAwareImaging.jl] | ||
|
|
||
| - **["Enabling uncertainty quantification for seismic data pre-processing using normalizing flows (NF)—an interpolation example"]** | ||
| - [paper](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2021/kumar2021SEGeuq/kumar2021SEGeuq.pdf) | ||
| - code: [WavefieldRecoveryUQ.jl] | ||
|
|
||
| - **["Preconditioned training of normalizing flows for variational inference in inverse problems"]** | ||
| - paper: [https://arxiv.org/abs/2101.03709](https://arxiv.org/abs/2101.03709) | ||
| - [presentation](https://slim.gatech.edu/Publications/Public/Conferences/AABI/2021/siahkoohi2021AABIpto/siahkoohi2021AABIpto_pres.pdf) | ||
| - code: [FastApproximateInference.jl] | ||
| ```julia | ||
| using Pkg | ||
| Pkg.develop("InvertibleNetworks") | ||
| ``` | ||
|
|
||
| - **["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]** | ||
| - paper: [https://arxiv.org/abs/2004.07871](https://arxiv.org/abs/2004.07871) | ||
| - [presentation](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2020/rizzuti2020SEGuqavp/rizzuti2020SEGuqavp_pres.pdf) | ||
| - code: [https://github.com/slimgroup/Software.SEG2020](https://github.com/slimgroup/Software.SEG2020) | ||
| ### Running Tests | ||
|
|
||
| - **["Generalized Minkowski sets for the regularization of inverse problems"]** | ||
| - paper: [http://arxiv.org/abs/1903.03942](http://arxiv.org/abs/1903.03942) | ||
| - code: [SetIntersectionProjection.jl] | ||
| ```julia | ||
| using Pkg | ||
| Pkg.test("InvertibleNetworks") | ||
| ``` | ||
|
|
||
| ## Contributing | ||
| ## 📄 Citation | ||
|
|
||
| If you use InvertibleNetworks.jl in your research, please cite: | ||
|
|
||
| ```bibtex | ||
| @article{Orozco2024, | ||
| doi = {10.21105/joss.06554}, | ||
| url = {https://doi.org/10.21105/joss.06554}, | ||
| year = {2024}, | ||
| publisher = {The Open Journal}, | ||
| volume = {9}, | ||
| number = {99}, | ||
| pages = {6554}, | ||
| author = {Rafael Orozco and Philipp Witte and Mathias Louboutin and Ali Siahkoohi and Gabrio Rizzuti and Bas Peters and Felix J. Herrmann}, | ||
| title = {InvertibleNetworks.jl: A Julia package for scalable normalizing flows}, | ||
| journal = {Journal of Open Source Software} | ||
| } | ||
| ``` | ||
|
|
||
| We welcome contributions and bug reports! | ||
| Please see [CONTRIBUTING.md](https://github.com/slimgroup/InvertibleNetworks.jl/blob/master/CONTRIBUTING.md) for guidance. | ||
| ## 📚 Related Publications | ||
|
|
||
| InvertibleNetworks.jl development subscribes to the [Julia Community Standards](https://julialang.org/community/standards/). | ||
| The following publications use InvertibleNetworks.jl: | ||
|
|
||
| ## Authors | ||
| - **Reliable amortized variational inference with physics-based latent distribution correction** | ||
| - Paper: [https://arxiv.org/abs/2207.11640](https://arxiv.org/abs/2207.11640) | ||
| - Code: [ReliableAVI.jl] | ||
|
|
||
| - Rafael Orozco, Georgia Institute of Technology [rorozco@gatech.edu] | ||
| - **Learning by example: fast reliability-aware seismic imaging with normalizing flows** | ||
| - Paper: [https://arxiv.org/abs/2104.06255](https://arxiv.org/abs/2104.06255) | ||
| - Code: [ReliabilityAwareImaging.jl] | ||
|
|
||
| - Philipp Witte, Georgia Institute of Technology (now Microsoft) | ||
| - **Enabling uncertainty quantification for seismic data pre-processing using normalizing flows** | ||
| - Paper: [https://slim.gatech.edu/Publications/Public/Conferences/SEG/2021/kumar2021SEGeuq/kumar2021SEGeuq.pdf](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2021/kumar2021SEGeuq/kumar2021SEGeuq.pdf) | ||
| - Code: [WavefieldRecoveryUQ.jl] | ||
|
|
||
| - Gabrio Rizzuti, Utrecht University | ||
| - **Preconditioned training of normalizing flows for variational inference in inverse problems** | ||
| - Paper: [https://arxiv.org/abs/2101.03709](https://arxiv.org/abs/2101.03709) | ||
| - Code: [FastApproximateInference.jl] | ||
|
|
||
| - Mathias Louboutin, Georgia Institute of Technology | ||
| - **Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization** | ||
| - Paper: [https://arxiv.org/abs/2004.07871](https://arxiv.org/abs/2004.07871) | ||
|
|
||
| - Ali Siahkoohi, Georgia Institute of Technology | ||
| ## 👥 Authors | ||
|
|
||
| - **Rafael Orozco** - Georgia Institute of Technology [rorozco@gatech.edu] | ||
| - **Philipp Witte** - Georgia Institute of Technology (now Microsoft) | ||
| - **Gabrio Rizzuti** - Utrecht University | ||
| - **Mathias Louboutin** - Georgia Institute of Technology | ||
| - **Ali Siahkoohi** - Georgia Institute of Technology | ||
|
|
||
| ## 🙏 Acknowledgments | ||
|
|
||
| This package uses functions from: | ||
| - [NNlib.jl](https://github.com/FluxML/NNlib.jl) | ||
| - [Flux.jl](https://github.com/FluxML/Flux.jl) | ||
| - [Wavelets.jl](https://github.com/JuliaDSP/Wavelets.jl) | ||
|
|
||
| ## Acknowledgments | ||
| ## 📄 License | ||
|
|
||
| This package uses functions from [NNlib.jl](https://github.com/FluxML/NNlib.jl), [Flux.jl](https://github.com/FluxML/Flux.jl) and [Wavelets.jl](https://github.com/JuliaDSP/Wavelets.jl) | ||
| This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details. | ||
|
|
||
| [Flux]:https://fluxml.ai | ||
| [Julia]:https://julialang.org | ||
| [Zygote]:https://github.com/FluxML/Zygote.jl | ||
| [ChainRules]:https://github.com/JuliaDiff/ChainRules.jl | ||
| [InvertibleNetworks.jl]:https://github.com/slimgroup/InvertibleNetworks.jl | ||
| ["Learning by example: fast reliability-aware seismic imaging with normalizing flows"]:https://slim.gatech.edu/content/learning-example-fast-reliability-aware-seismic-imaging-normalizing-flows | ||
| ["Enabling uncertainty quantification for seismic data pre-processing using normalizing flows (NF)—an interpolation example"]:https://slim.gatech.edu/content/ultra-low-memory-seismic-inversion-randomized-trace-estimation-0 | ||
| ["Preconditioned training of normalizing flows for variational inference in inverse problems"]:https://slim.gatech.edu/content/preconditioned-training-normalizing-flows-variational-inference-inverse-problems | ||
| [ReliabilityAwareImaging.jl]:https://github.com/slimgroup/Software.SEG2021/tree/main/ReliabilityAwareImaging.jl | ||
| [WavefieldRecoveryUQ.jl]:https://github.com/slimgroup/Software.SEG2021/tree/main/WavefieldRecoveryUQ.jl | ||
| [FastApproximateInference.jl]:https://github.com/slimgroup/Software.siahkoohi2021AABIpto | ||
| ["Generalized Minkowski sets for the regularization of inverse problems"]:https://slim.gatech.edu/content/generalized-minkowski-sets-regularization-inverse-problems-1 | ||
| [SetIntersectionProjection.jl]:https://github.com/slimgroup/SetIntersectionProjection.jl | ||
| ["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]:https://slim.gatech.edu/content/parameterizing-uncertainty-deep-invertible-networks-application-reservoir-characterization | ||
| ["Reliable amortized variational inference with physics-based latent distribution correction"]:https://slim.gatech.edu/content/reliable-amortized-variational-inference-physics-based-latent-distribution-correction | ||
| [ReliableAVI.jl]:https://github.com/slimgroup/ReliableAVI.jl | ||
| [Flux]: https://fluxml.ai | ||
| [Julia]: https://julialang.org | ||
| [Zygote]: https://github.com/FluxML/Zygote.jl | ||
| [ChainRules]: https://github.com/JuliaDiff/ChainRules.jl | ||
| [InvertibleNetworks.jl]: https://github.com/slimgroup/InvertibleNetworks.jl | ||
| [ReliableAVI.jl]: https://github.com/slimgroup/ReliableAVI.jl | ||
| [ReliabilityAwareImaging.jl]: https://github.com/slimgroup/Software.SEG2021/tree/main/ReliabilityAwareImaging.jl | ||
| [WavefieldRecoveryUQ.jl]: https://github.com/slimgroup/Software.SEG2021/tree/main/WavefieldRecoveryUQ.jl | ||
| [FastApproximateInference.jl]: https://github.com/slimgroup/Software.siahkoohi2021AABIpto | ||
**Member:** NVIDIA GPU. "Full GPU" would mean AMD (ROCArray), Apple (MtlArray), and Intel (SYCL array) as well.
**Author:** Yes, I will update that part.