
Commit

fix docs
mcabbott committed Nov 23, 2024
1 parent 317816c commit 0bef0d6
Showing 1 changed file with 4 additions and 9 deletions.
13 changes: 4 additions & 9 deletions docs/src/reference/training/enzyme.md
@@ -79,22 +79,17 @@ julia> opt_state = Flux.setup(Adam(0), model);
 julia> Flux.train!((m,x,y) -> sum(abs2, m(x) .- y), dup_model, [(x1, y1)], opt_state)
 ```
 
-## Listing
+## Second-order AD
+
 If you calculate a gradient within the loss function, then training will involve 2nd derivatives.
 While this is in principle supported by Zygote.jl, there are many bugs, and Enzyme.jl is probably a better choice.
 
-Flux functions:
+## Listing
+
 ```@docs
 Flux.gradient(f, args::Union{Flux.EnzymeCore.Const, Flux.EnzymeCore.Duplicated}...)
 Flux.withgradient(f, args::Union{Flux.EnzymeCore.Const, Flux.EnzymeCore.Duplicated}...)
 Flux.train!(loss, model::Flux.EnzymeCore.Duplicated, data, opt)
 ```
 
-EnzymeCore types:
-
-```@docs
-Flux.EnzymeCore.Duplicated
-Flux.EnzymeCore.Const
-```
-
 Enzyme.jl has [its own extensive documentation](https://enzymead.github.io/Enzyme.jl/stable/).
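The `@docs` listing in this file covers the Enzyme-backed methods of `Flux.gradient`. A minimal sketch of calling them, with a hypothetical toy model and data (assumes a Flux version whose Enzyme extension is active, i.e. Enzyme.jl is loaded alongside Flux):

```julia
using Flux, Enzyme  # loading Enzyme activates Flux's EnzymeCore-based extension

# Hypothetical toy model and data, just to have something to differentiate:
model = Dense(2 => 1)
x = randn(Float32, 2, 8)
y = randn(Float32, 1, 8)

# Duplicated pairs the model with storage for its gradient, and tells
# Flux.gradient to use Enzyme rather than Zygote:
dup_model = Duplicated(model)

loss(m, x, y) = sum(abs2, m(x) .- y)

# Matches the documented signature: every argument is Const or Duplicated.
grads = Flux.gradient(loss, dup_model, Const(x), Const(y))
```

Here `Const` marks arguments whose gradients are not wanted, mirroring the `Union{Flux.EnzymeCore.Const, Flux.EnzymeCore.Duplicated}` signatures shown in the listing.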

