reverse glow for MAP #65
base: master
Conversation
Codecov Report
Base: 88.06% // Head: 51.90% // Decreases project coverage by -36.17%

Additional details and impacted files

@@            Coverage Diff             @@
##           master      #65       +/-   ##
===========================================
- Coverage   88.06%   51.90%    -36.17%
===========================================
  Files          33       33
  Lines        2439     2576       +137
===========================================
- Hits         2148     1337       -811
- Misses        291     1239       +948

☔ View full report at Codecov.
src/layers/invertible_layer_glow.jl (Outdated)
function jacobian(ΔX::AbstractArray{T, N}, Δθ::Array{Parameter, 1}, X, L::CouplingLayerGlow) where {T,N}
    # Recompute forward state
    #Y, Y1, X2, S = forward(X, L; save=true)
Leftover commented-out code?
Is this ready?
Add some backward_inv functions so that Glow can compute derivatives with respect to the latent variable after reversing the network. MAP solutions for learned regularization can now be computed with a non-conditional Glow network.
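A minimal sketch of how this could be used for a latent-space MAP estimate. The `NetworkGlow` constructor arguments, the `G.inverse` call, and especially the `backward_inv` signature (mirroring the library's usual `backward(ΔY, Y)` convention) are assumptions for illustration, not the PR's tested interface; the problem sizes, identity forward operator, prior weight, and step size are likewise hypothetical.

```julia
using InvertibleNetworks

# Hypothetical problem sizes, chosen only for illustration.
nx, ny, n_ch, batch = 16, 16, 4, 1

# A (pretrained, in practice) non-conditional Glow network;
# constructor arguments are (n_in, n_hidden, L, K).
G = NetworkGlow(n_ch, 32, 2, 4)

# Noisy observation of an image we want to recover (identity forward
# operator here, to keep the sketch short).
x_true = randn(Float32, nx, ny, n_ch, batch)
y = x_true .+ 0.01f0 .* randn(Float32, nx, ny, n_ch, batch)

λ = 0.1f0                                  # weight on the latent-space prior
z = randn(Float32, nx, ny, n_ch, batch)    # latent estimate to optimize

for it in 1:100
    # Reverse pass: map the current latent estimate back to image space.
    x = G.inverse(z)

    # Gradient of the data-misfit term 0.5*||x - y||^2 in image space.
    ΔX = x .- y

    # Assumed signature from this PR: backward_inv pulls an image-space
    # gradient back through the reversed network to the latent space.
    ΔZ, _ = G.backward_inv(ΔX, x)

    # Gradient step on the MAP objective 0.5*||x - y||^2 + 0.5*λ*||z||^2.
    z .-= 0.01f0 .* (ΔZ .+ λ .* z)
end
```

The point of `backward_inv` in this setting is that optimization happens over z rather than x, so the Gaussian prior on the latent variable acts as the learned regularizer while the reversed network provides the image-space parameterization.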