Move ad.jl to DynamicPPL (#2158)
* The `getADbackend` methods are renamed to `getADType` and moved to the `Inference` module, as is `LogDensityProblemsAD.ADgradient(ℓ::LogDensityFunction)`
* The `LogDensityProblemsAD.ADgradient(adtype, ℓ)` methods specific to ReverseDiff and ForwardDiff are moved to DynamicPPL
* The idea is that with DynamicPPL, a call to `ADgradient` must also supply the `adtype`; in Turing, we just use the `adtype` from the algorithm (see the sketch below)
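
For illustration, a minimal sketch of the DynamicPPL side of this split (the `demo` model is hypothetical; `ADgradient(adtype, ℓ)` with an `ADTypes` backend is an existing `LogDensityProblemsAD` entry point):

```julia
using DynamicPPL, Distributions
import LogDensityProblemsAD
using ADTypes: AutoForwardDiff

@model demo() = x ~ Normal()
ℓ = DynamicPPL.LogDensityFunction(demo())

# In DynamicPPL, the AD backend must be supplied explicitly:
∇ℓ = LogDensityProblemsAD.ADgradient(AutoForwardDiff(), ℓ)
```

On the Turing side, the single-argument `ADgradient(ℓ)` shown later in this diff fills the `adtype` in from the context stored in `ℓ`.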

---------

Co-authored-by: Hong Ge <[email protected]>
sunxd3 and yebai authored Feb 19, 2024
1 parent cf647b1 commit 616a07f
Showing 7 changed files with 23 additions and 71 deletions.
7 changes: 7 additions & 0 deletions HISTORY.md
@@ -1,3 +1,10 @@
# Release 0.30.5

- `essential/ad.jl` is removed; the `ForwardDiff` and `ReverseDiff` integrations via `LogDensityProblemsAD` are moved to `DynamicPPL` and live in the corresponding package extensions.
- `LogDensityProblemsAD.ADgradient(ℓ::DynamicPPL.LogDensityFunction)` (i.e. the single-argument method) is moved to the `Inference` module. It creates the `ADgradient` using the `adtype` information stored in the `context` field of `ℓ`.
- The `getADbackend` function is renamed to `getADType`. The interface is preserved, but packages that previously used `getADbackend` should be updated to use `getADType`.
- `TuringTag` for ForwardDiff is also removed; `DynamicPPLTag` is now defined in the `DynamicPPL` package and serves the same [purpose](https://www.stochasticlifestyle.com/improved-forwarddiff-jl-stacktraces-with-package-tags/).
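
A hedged before/after sketch of the rename (`spl` stands for any `Turing.Sampler`; module paths as in this diff):

```julia
# Before (Turing <= 0.30.4):
adtype = Turing.Essential.getADbackend(spl)

# After (Turing 0.30.5): same interface, new name and module:
adtype = Turing.Inference.getADType(spl)
```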

# Release 0.30.0

- [`ADTypes.jl`](https://github.com/SciML/ADTypes.jl) replaced Turing's global AD backend. Users should now specify the desired `ADType` directly in sampler constructors, e.g., `HMC(0.1, 10; adtype=AutoForwardDiff(; chunksize))` or `HMC(0.1, 10; adtype=AutoReverseDiff(false))` (`false` indicates not to use a compiled tape).
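
For example, a complete sampling call under the per-sampler `adtype` convention (a sketch; the `gdemo` model is illustrative):

```julia
using Turing

@model function gdemo(x)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x ~ Normal(m, sqrt(s²))
end

# The AD backend is chosen per sampler instead of via a global switch:
chain = sample(gdemo(1.5), HMC(0.1, 10; adtype=AutoReverseDiff(false)), 1000)
```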
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Turing"
uuid = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
version = "0.30.4"
version = "0.30.5"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -58,7 +58,7 @@
Distributions = "0.23.3, 0.24, 0.25"
DistributionsAD = "0.6"
DocStringExtensions = "0.8, 0.9"
DynamicHMC = "3.4"
DynamicPPL = "0.24"
DynamicPPL = "0.24.7"
EllipticalSliceSampling = "0.5, 1, 2"
ForwardDiff = "0.10.3"
Libtask = "0.7, 0.8"
8 changes: 0 additions & 8 deletions src/Turing.jl
@@ -32,14 +32,6 @@
function setprogress!(progress::Bool)
return progress
end

# Standard tag: Improves stacktraces
# Ref: https://www.stochasticlifestyle.com/improved-forwarddiff-jl-stacktraces-with-package-tags/
struct TuringTag end

# Allow Turing tag in gradient etc. calls of the log density function
ForwardDiff.checktag(::Type{ForwardDiff.Tag{TuringTag, V}}, ::LogDensityFunction, ::AbstractArray{V}) where {V} = true
ForwardDiff.checktag(::Type{ForwardDiff.Tag{TuringTag, V}}, ::Base.Fix1{typeof(LogDensityProblems.logdensity),<:LogDensityFunction}, ::AbstractArray{V}) where {V} = true
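
For reference, a standalone sketch of the package-tag pattern the removed code implemented, using only generic `ForwardDiff` API (`MyPkgTag` and `f` are illustrative names, not Turing's):

```julia
using ForwardDiff

struct MyPkgTag end

f(x) = sum(abs2, x)

# Allow our custom tag to pass ForwardDiff's tag check for this function:
ForwardDiff.checktag(
    ::Type{ForwardDiff.Tag{MyPkgTag,V}}, ::typeof(f), ::AbstractArray{V}
) where {V} = true

x = randn(3)
tag = ForwardDiff.Tag(MyPkgTag(), eltype(x))
cfg = ForwardDiff.GradientConfig(f, x, ForwardDiff.Chunk(x), tag)
ForwardDiff.gradient(f, x, cfg)  # stacktraces now mention MyPkgTag
```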

# Random probability measures.
include("stdlib/distributions.jl")
include("stdlib/RandomMeasures.jl")
3 changes: 0 additions & 3 deletions src/essential/Essential.jl
@@ -14,11 +14,8 @@
using StatsFuns: logsumexp, softmax
using ADTypes: ADTypes, AutoForwardDiff, AutoTracker, AutoReverseDiff, AutoZygote

import AdvancedPS
import LogDensityProblems
import LogDensityProblemsAD

include("container.jl")
include("ad.jl")

export @model,
@varname,
45 changes: 0 additions & 45 deletions src/essential/ad.jl

This file was deleted.

16 changes: 14 additions & 2 deletions src/mcmc/Inference.jl
@@ -29,7 +29,6 @@
import AdvancedHMC; const AHMC = AdvancedHMC
import AdvancedMH; const AMH = AdvancedMH
import AdvancedPS
import BangBang
import ..Essential: getADbackend
import EllipticalSliceSampling
import LogDensityProblems
import LogDensityProblemsAD
@@ -78,7 +77,6 @@
abstract type ParticleInference <: InferenceAlgorithm end
abstract type Hamiltonian <: InferenceAlgorithm end
abstract type StaticHamiltonian <: Hamiltonian end
abstract type AdaptiveHamiltonian <: Hamiltonian end
getADbackend(alg::Hamiltonian) = alg.adtype

"""
ExternalSampler{S<:AbstractSampler}
@@ -98,6 +96,20 @@
Wrap a sampler so it can be used as an inference algorithm.
"""
externalsampler(sampler::AbstractSampler) = ExternalSampler(sampler)

getADType(spl::Sampler) = getADType(spl.alg)
getADType(::SampleFromPrior) = AutoForwardDiff(; chunksize=0)

getADType(ctx::DynamicPPL.SamplingContext) = getADType(ctx.sampler)
getADType(ctx::DynamicPPL.AbstractContext) = getADType(DynamicPPL.NodeTrait(ctx), ctx)
getADType(::DynamicPPL.IsLeaf, ctx::DynamicPPL.AbstractContext) = AutoForwardDiff(; chunksize=0)
getADType(::DynamicPPL.IsParent, ctx::DynamicPPL.AbstractContext) = getADType(DynamicPPL.childcontext(ctx))

getADType(alg::Hamiltonian) = alg.adtype

function LogDensityProblemsAD.ADgradient(ℓ::DynamicPPL.LogDensityFunction)
return LogDensityProblemsAD.ADgradient(getADType(ℓ.context), ℓ)
end

function LogDensityProblems.logdensity(
f::Turing.LogDensityFunction{<:AbstractVarInfo,<:Model,<:DynamicPPL.DefaultContext},
x::NamedTuple
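
A usage sketch of the single-argument method added in this hunk (the `demo` model is hypothetical; with the default `DefaultContext`, the `IsLeaf` fallback above yields `AutoForwardDiff(; chunksize=0)`):

```julia
using Turing, DynamicPPL, Distributions
import LogDensityProblems, LogDensityProblemsAD

@model demo() = x ~ Normal()

f = DynamicPPL.LogDensityFunction(demo())
# The adtype is read from f.context rather than passed explicitly:
∇f = LogDensityProblemsAD.ADgradient(f)
LogDensityProblems.logdensity_and_gradient(∇f, [0.5])
```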
11 changes: 0 additions & 11 deletions test/essential/ad.jl
@@ -165,17 +165,6 @@

end

@testset "tag" begin
for chunksize in (0, 1, 10)
ad = Turing.AutoForwardDiff(; chunksize=chunksize)
@test ad === Turing.AutoForwardDiff(; chunksize=chunksize)
@test Turing.Essential.standardtag(ad)
for standardtag in (false, 0, 1)
@test !Turing.Essential.standardtag(Turing.AutoForwardDiff(; chunksize=chunksize, tag=standardtag))
end
end
end

@testset "ReverseDiff compiled without linking" begin
f = DynamicPPL.LogDensityFunction(gdemo_default)
θ = DynamicPPL.getparams(f)
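
The compiled-tape path this testset exercises can be sketched with the two-argument form that now lives in DynamicPPL's ReverseDiff extension (assuming `gdemo_default` from Turing's test utilities):

```julia
using DynamicPPL, ReverseDiff
import LogDensityProblems, LogDensityProblemsAD
using ADTypes: AutoReverseDiff

f = DynamicPPL.LogDensityFunction(gdemo_default)
θ = DynamicPPL.getparams(f)

# `true` requests a compiled ReverseDiff tape, built once and reused:
∇f = LogDensityProblemsAD.ADgradient(AutoReverseDiff(true), f)
LogDensityProblems.logdensity_and_gradient(∇f, θ)
```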

2 comments on commit 616a07f

@sunxd3 (Member, Author) commented on 616a07f Feb 19, 2024

@JuliaRegistrator register

Release notes:

  • essential/ad.jl is removed; the ForwardDiff and ReverseDiff integrations via LogDensityProblemsAD are moved to DynamicPPL and live in the corresponding package extensions.
  • LogDensityProblemsAD.ADgradient(ℓ::DynamicPPL.LogDensityFunction) (i.e. the single-argument method) is moved to the Inference module. It creates the ADgradient using the adtype information stored in the context field of ℓ.
  • The getADbackend function is renamed to getADType. The interface is preserved, but packages that previously used getADbackend should be updated to use getADType.
  • TuringTag for ForwardDiff is also removed; DynamicPPLTag is now defined in the DynamicPPL package and serves the same purpose.

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/101195

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the GitHub interface, or via:

git tag -a v0.30.5 -m "<description of version>" 616a07fa0ca91496119e4ce66beb28f28e10fa33
git push origin v0.30.5
