Releases · FluxML/NNlib.jl
v0.8.17
NNlib v0.8.17
Closed issues:
- Multi-head attention? (#385)
- Batched multiplication support for ndims > 3 (#391)
- Symmetric Padding (#463)
Merged pull requests:
- implement dot_product_attention (#455) (@CarloLucibello)
- symmetric and circular padding (#465) (@nikopj)
- Reduce `lpnormpool` unnecessary conversion (#467) (@skyleaworlder)
- Update deprecated actions & change coverage upload entry (#469) (@skyleaworlder)
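A sketch of the two headline additions in this release, `dot_product_attention` (#455) and symmetric/circular padding (#465). Array layouts, keyword names, and return values here are assumptions based on the PR titles and NNlib conventions, not a guaranteed API:

```julia
using NNlib

# dot_product_attention (#455): arrays assumed laid out as (embed_dim, seq_len, batch).
q = rand(Float32, 8, 4, 2)
k = rand(Float32, 8, 6, 2)
v = rand(Float32, 8, 6, 2)
y, α = dot_product_attention(q, k, v)  # y keeps q's embed_dim and seq_len
@show size(y)                          # (8, 4, 2)

# Symmetric and circular padding (#465): pad one element on each side of dims 1 and 2.
x = reshape(collect(1.0:9.0), 3, 3)
xs = pad_symmetric(x, (1, 1, 1, 1); dims=(1, 2))  # mirrors edge values
xc = pad_circular(x, (1, 1, 1, 1); dims=(1, 2))   # wraps around periodically
@show size(xs)                                    # (5, 5)
```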
v0.8.16
NNlib v0.8.16
Closed issues:
- `batched_vec` >1000X slower than `batched_mul` (#462)
Merged pull requests:
- Add `lppool` implementation (#447) (@skyleaworlder)
- Refactor `batched_vec` (#464) (@jondeuce)
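The `batched_mul` / `batched_vec` pair discussed in this release can be sketched as follows (sizes are chosen purely for illustration):

```julia
using NNlib

A = rand(Float32, 3, 4, 5)   # a batch of five 3×4 matrices
B = rand(Float32, 4, 2, 5)   # a batch of five 4×2 matrices
C = batched_mul(A, B)        # five independent matrix-matrix products
@show size(C)                # (3, 2, 5)

b = rand(Float32, 4, 5)      # one length-4 vector per batch element
y = batched_vec(A, b)        # matrix-vector product within each batch slice
@show size(y)                # (3, 5)
```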
v0.8.15
v0.8.14
NNlib v0.8.14
Closed issues:
- support arbitrary number of batch dimensions in batched_mul (#451)
Merged pull requests:
v0.8.13
v0.8.12
v0.8.11
NNlib v0.8.11
Closed issues:
- (Flaky?) CI failures on GHA latest + Buildkite (#359)
Merged pull requests:
- Trigger tagbot on issue comments (#440) (@Saransh-cpp)
- Remove threading from all `∇*conv_filter` and re-enable old tests (#441) (@ToucheSir)
- Slightly faster softplus (#443) (@Sleort)
- Add fold and unfold (#444) (@nikopj)
v0.8.10
NNlib v0.8.10
Closed issues:
- Incorrect gradient of convolution w.r.t. weights (#197)
- Create independent documentation for `NNlib.jl`? (#430)
- Nested AD failure with `logσ` after JuliaDiff/ChainRules.jl#644 (#432)
Merged pull requests:
- Add minimal infrastructure for the docs (#431) (@Saransh-cpp)
- Widen activation broadcast rules (#433) (@mcabbott)
- Add basic benchmark harness (#436) (@ToucheSir)
- Remove negative margin in docs CSS (#437) (@ToucheSir)
- Create root level index.html (#438) (@Saransh-cpp)
- Update readme (#439) (@mcabbott)
v0.8.9
NNlib v0.8.9
Merged pull requests:
- make BatchedAdjOrTrans return correct BroadcastStyle (#424) (@chengchingwen)
- Move `ctc_loss` from Flux to NNlib (#426) (@mcabbott)
v0.8.8
NNlib v0.8.8
Closed issues:
- Activation functions have to be broadcasted by the user to act on arrays (#422)
Merged pull requests:
- support complex input for upsample (#421) (@mloubout)
- Define activation functions taking arrays as input (#423) (@theabhirath)
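A minimal sketch of what #422/#423 describe, assuming the new array methods simply apply the scalar activation elementwise (explicit broadcasting remains equivalent and continues to work):

```julia
using NNlib

x = [-1.0 0.5; 2.0 -3.0]
yb = relu.(x)   # explicit broadcast, required before this release (#422)
ya = relu(x)    # array method introduced by #423; assumed equal to relu.(x)
@show yb        # [0.0 0.5; 2.0 0.0]
```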