
Commit 7b9aa21: "fix links"
1 parent: 5f571ec
6 files changed: 46 additions, 48 deletions

.github/workflows/docs.yml (1 addition, 1 deletion)

```diff
@@ -10,7 +10,7 @@ jobs:
   build:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v3

       - uses: julia-actions/setup-julia@latest
```

.github/workflows/runtests.yml (1 addition, 1 deletion)

```diff
@@ -25,7 +25,7 @@ jobs:

     steps:
       - name: Checkout InvertibleNetworks.jl
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3

       - name: Setup julia
         uses: julia-actions/setup-julia@v1
```

Project.toml (1 addition, 1 deletion)

```diff
@@ -1,7 +1,7 @@
 name = "InvertibleNetworks"
 uuid = "b7115f24-5f92-4794-81e8-23b0ddb121d3"
 authors = ["Philipp Witte <p.witte@ymail.com>", "Ali Siahkoohi <alisk@gatech.edu>", "Mathias Louboutin <mlouboutin3@gatech.edu>", "Gabrio Rizzuti <g.rizzuti@umcutrecht.nl>", "Rafael Orozco <rorozco@gatech.edu>", "Felix J. herrmann <fherrmann@gatech.edu>"]
-version = "2.2.1"
+version = "2.2.2"

 [deps]
 CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
```
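The diff above is a patch-level bump under semantic versioning. As a quick sanity check, Julia's built-in `VersionNumber` literals (`v"..."`) encode exactly this ordering; the two version strings below are taken from the diff:

```julia
old_version = v"2.2.1"  # version before this commit
new_version = v"2.2.2"  # version after this commit

# VersionNumber comparisons follow semver ordering.
@assert new_version > old_version

# Patch release: major and minor components are unchanged,
# only the patch component is incremented.
@assert (new_version.major, new_version.minor) == (old_version.major, old_version.minor)
@assert new_version.patch == old_version.patch + 1
```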

docs/src/api.md (26 additions, 14 deletions)

````diff
@@ -1,36 +1,48 @@

-## Invertible Layers
+# Invertible Networks API reference

-### Types
+```@autodocs
+Modules = [InvertibleNetworks]
+Order = [:function]
+Pages = ["neuralnet.jl", "parameter.jl"]
+```
+
+## Activations functions

 ```@autodocs
 Modules = [InvertibleNetworks]
-Order = [:type]
-Filter = t -> t<:NeuralNetLayer
+Order = [:function]
+Pages = ["activation_functions.jl"]
 ```

-## Invertible Networks
+## Dimensions manipulation
+
+```@autodocs
+Modules = [InvertibleNetworks]
+Order = [:function]
+Pages = ["dimensionality_operations.jl"]
+```

-### Types
+## Layers

 ```@autodocs
 Modules = [InvertibleNetworks]
-Order = [:type]
-Filter = t -> t<:InvertibleNetwork
+Order = [:type]
+Filter = t -> t<:NeuralNetLayer
 ```

-## Activations functions
+## Networks

 ```@autodocs
 Modules = [InvertibleNetworks]
-Order = [:function]
-Pages = ["activation_functions.jl"]
+Order = [:type]
+Filter = t -> t<:InvertibleNetwork
 ```

-## Dimensions manipulation
+## AD Integration

 ```@autodocs
 Modules = [InvertibleNetworks]
-Order = [:function]
-Pages = ["dimensionality_operations.jl"]
+Order = [:function]
+Pages = ["chainrules.jl"]
 ```
````

docs/src/index.md (15 additions, 31 deletions)

````diff
@@ -17,65 +17,49 @@ This package is developped and maintained by Felix J. Herrmann's [SlimGroup](htt

 ## Installation

-To install this package you can either directly install it from its url:
+THis package is registered in the Julia general registry and can be directly installed in the julia REPL package manager (`]`):


 ```julia
-] add https://github.com/slimgroup/InvertibleNetworks.jl
-```
-
-or if you wish to have access to all [slimgroup](https://github.com/slimgroup/)' softwares you can add our registry to have access to our packages in the standard julia way:
-
-
-```julia
-] registry add https://Github.com/slimgroup/SLIMregistryJL.git
-] add InvertibleNetworks
+] add/dev InvertibleNetworks
 ```

 ## References

-- Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. https://arxiv.org/pdf/1612.08083.pdf
+- Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. [ArXiv](https://arxiv.org/pdf/1612.08083.pdf)

-- Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017, https://arxiv.org/abs/1605.08803
+- Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017, [ArXiv](https://arxiv.org/abs/1605.08803)

-- Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. https://arxiv.org/abs/1807.03039
+- Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. [ArXiv](https://arxiv.org/abs/1807.03039)

-- Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. https://arxiv.org/abs/1905.10484
+- Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. [ArXiv](https://arxiv.org/abs/1905.10484)

-- Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. https://arxiv.org/abs/1911.10914
+- Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. [ArXiv](https://arxiv.org/abs/1911.10914)

-- Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. https://arxiv.org/abs/1905.10687
+- Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. [ArXiv](https://arxiv.org/abs/1905.10687)

 ## Related work and publications

 The following publications use [InvertibleNetworks.jl]:

-- **[“Preconditioned training of normalizing flows for variational inference in inverse problems”]**
+- **[“Preconditioned training of normalizing flows for variational inference in inverse problems”](https://slim.gatech.edu/content/preconditioned-training-normalizing-flows-variational-inference-inverse-problems)**
     - paper: [https://arxiv.org/abs/2101.03709](https://arxiv.org/abs/2101.03709)
     - [presentation](https://slim.gatech.edu/Publications/Public/Conferences/AABI/2021/siahkoohi2021AABIpto/siahkoohi2021AABIpto_pres.pdf)
-    - code: [FastApproximateInference.jl]
+    - code: [FastApproximateInference.jl](https://github.com/slimgroup/Software.siahkoohi2021AABIpto)

-- **["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]**
+- **["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"](https://slim.gatech.edu/content/parameterizing-uncertainty-deep-invertible-networks-application-reservoir-characterization)**
     - paper: [https://arxiv.org/abs/2004.07871](https://arxiv.org/abs/2004.07871)
     - [presentation](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2020/rizzuti2020SEGuqavp/rizzuti2020SEGuqavp_pres.pdf)
     - code: [https://github.com/slimgroup/Software.SEG2020](https://github.com/slimgroup/Software.SEG2020)

-- **["Generalized Minkowski sets for the regularization of inverse problems"]**
+- **["Generalized Minkowski sets for the regularization of inverse problems"](https://slim.gatech.edu/content/generalized-minkowski-sets-regularization-inverse-problems-1)**
     - paper: [http://arxiv.org/abs/1903.03942](http://arxiv.org/abs/1903.03942)
-    - code: [SetIntersectionProjection.jl]
+    - code: [SetIntersectionProjection.jl](https://github.com/slimgroup/SetIntersectionProjection.jl)


 ## Acknowledgments

 This package uses functions from [NNlib.jl](https://github.com/FluxML/NNlib.jl), [Flux.jl](https://github.com/FluxML/Flux.jl) and [Wavelets.jl](https://github.com/JuliaDSP/Wavelets.jl)

-[Flux]:https://fluxml.ai
-[Julia]:https://julialang.org
-[Zygote]:https://github.com/FluxML/Zygote.jl
-[ChainRules]:https://github.com/JuliaDiff/ChainRules.jl
-[InvertibleNetworks.jl]:https://github.com/slimgroup/InvertibleNetworks.jl
-[“Preconditioned training of normalizing flows for variational inference in inverse problems”]:https://slim.gatech.edu/content/preconditioned-training-normalizing-flows-variational-inference-inverse-problems
-[FastApproximateInference.jl]:https://github.com/slimgroup/Software.siahkoohi2021AABIpto
-["Generalized Minkowski sets for the regularization of inverse problems"]:https://slim.gatech.edu/content/generalized-minkowski-sets-regularization-inverse-problems-1
-[SetIntersectionProjection.jl]:https://github.com/slimgroup/SetIntersectionProjection.jl
-["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]:https://slim.gatech.edu/content/parameterizing-uncertainty-deep-invertible-networks-application-reservoir-characterization
+
+
````
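The rewritten installation section uses the REPL package-mode shorthand. For scripts, the same operations are available through the `Pkg` API; a minimal sketch (both calls contact the package registry over the network when run):

```julia
using Pkg

# Equivalent of `] add InvertibleNetworks`: install the latest registered release.
Pkg.add("InvertibleNetworks")

# Equivalent of `] dev InvertibleNetworks`: clone the package locally for development.
Pkg.develop("InvertibleNetworks")
```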

examples/networks/network_glow.jl (2 additions, 0 deletions)

```diff
@@ -6,6 +6,7 @@ using InvertibleNetworks, LinearAlgebra, Flux
 import Flux.Optimise.update!

 device = InvertibleNetworks.CUDA.functional() ? gpu : cpu
+
 # Define network
 nx = 64 # must be multiple of 2
 ny = 64
@@ -31,6 +32,7 @@ end

 # Evaluate loss
 f = loss(X)
+@time loss(X)

 # Update weights
 opt = Flux.ADAM()
```
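The added `@time loss(X)` runs after `f = loss(X)` has already triggered compilation, so the reported time excludes Julia's JIT overhead. The pattern is sketched below with a stand-in function (`sq` and `X` here are illustrative, not from the example):

```julia
# Stand-in for the example's loss: sum of squared entries, like a misfit term.
sq(x) = sum(abs2, x)
X = randn(Float32, 64, 64)

f = sq(X)    # first call compiles the method for this argument type
@time sq(X)  # second call reports runtime without compilation overhead
```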
