
reverse glow for MAP #65

Open — wants to merge 5 commits into master
Conversation

rafaelorozco (Collaborator)

Add some backward_inv functions so that Glow can get derivatives with respect to the latent variable after reversing it. Now we can do MAP solutions for learned regularization with a non-conditional Glow network.
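A rough sketch of the intended use, assuming the usual InvertibleNetworks.jl conventions (`NetworkGlow`, `reverse`, `.forward`/`.backward`); the MAP loop, sizes, and step size below are illustrative and not taken from this PR:

```julia
using InvertibleNetworks

# Pretrained, non-conditional Glow network (sizes here are illustrative).
G     = NetworkGlow(1, 32, 2, 5)   # n_in, n_hidden, L, K
G_rev = reverse(G)                 # reversed network: latent z -> data x

nx, ny, nc, nb = 16, 16, 1, 1
z = randn(Float32, nx, ny, nc, nb) # latent variable we optimize over
d = randn(Float32, nx, ny, nc, nb) # observed data (identity forward map, for brevity)
λ = 0.1f0                          # weight of the Gaussian prior on z

# MAP with learned regularization: min_z 1/2||G_rev(z) - d||^2 + λ/2||z||^2
for it = 1:10
    x  = G_rev.forward(z)              # decode latent to data space
    Δx = x .- d                        # gradient of the data-misfit term
    Δz = G_rev.backward(Δx, x)[1]      # what this PR enables: gradient w.r.t. z
    z .-= 0.05f0 .* (Δz .+ λ .* z)     # gradient-descent step on the latent
end
```

The key point is the `backward` call on the reversed network: without the added backward_inv machinery, gradients could not be pulled back to the latent variable after reversing.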

codecov bot commented Sep 11, 2022

Codecov Report

Base: 88.06% // Head: 51.90% // This PR decreases project coverage by 36.16% ⚠️

Coverage data is based on head (f55745e) compared to base (6d2fba0).
Patch coverage: 47.56% of modified lines in pull request are covered.

Additional details and impacted files
@@             Coverage Diff             @@
##           master      #65       +/-   ##
===========================================
- Coverage   88.06%   51.90%   -36.17%     
===========================================
  Files          33       33               
  Lines        2439     2576      +137     
===========================================
- Hits         2148     1337      -811     
- Misses        291     1239      +948     
Impacted Files Coverage Δ
src/networks/invertible_network_glow.jl 0.00% <0.00%> (-89.00%) ⬇️
src/layers/invertible_layer_glow.jl 96.80% <96.87%> (-0.49%) ⬇️
src/layers/invertible_layer_actnorm.jl 92.68% <100.00%> (-3.79%) ⬇️
src/networks/invertible_network_hyperbolic.jl 0.00% <0.00%> (-97.15%) ⬇️
...rc/networks/invertible_network_conditional_hint.jl 0.00% <0.00%> (-94.74%) ⬇️
src/networks/invertible_network_irim.jl 0.00% <0.00%> (-91.79%) ⬇️
src/utils/chainrules.jl 0.00% <0.00%> (-91.49%) ⬇️
src/utils/invertible_network_sequential.jl 0.00% <0.00%> (-89.71%) ⬇️
...rc/networks/invertible_network_conditional_glow.jl 0.00% <0.00%> (-82.09%) ⬇️
src/utils/jacobian.jl 0.00% <0.00%> (-81.82%) ⬇️
... and 15 more


☔ View full report at Codecov.


Review comment on src/layers/invertible_layer_glow.jl:

function jacobian(ΔX::AbstractArray{T, N}, Δθ::Array{Parameter, 1}, X, L::CouplingLayerGlow) where {T,N}
    # Recompute forward state
    #Y, Y1, X2, S = forward(X, L; save=true)

Member:

leftover
@mloubout (Member)

Is this ready?
