Releases: LuxDL/Lux.jl
Lux v1.5.0
Merged pull requests:
- CompatHelper: add new compat entry for ProgressTables at version 0.1 for package CIFAR10, (keep existing compat) (#1154) (@github-actions[bot])
- `device(NN)` should only give warning once (#1156) (@vpuri3)
- feat: conditional VAE (#1157) (@avik-pal)
- fix: ConditionalVAE on CI (#1159) (@avik-pal)
- CompatHelper: add new compat entry for Optimisers at version 0.4 for package ConditionalVAE, (keep existing compat) (#1161) (@github-actions[bot])
- CompatHelper: add new compat entry for StableRNGs at version 1 for package ConditionalVAE, (keep existing compat) (#1162) (@github-actions[bot])
- CompatHelper: add new compat entry for Comonicon at version 1 for package ConditionalVAE, (keep existing compat) (#1163) (@github-actions[bot])
- [MLDataDevices] Bump deps (#1164) (@pxl-th)
- docs: migrate most examples to Reactant (#1180) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.4 to 1.29.4 (#1183) (@dependabot[bot])
- fix: pass in RNG to shuffle (#1188) (@avik-pal)
- feat: allow no grad option for reactant (#1190) (@avik-pal)
Closed issues:
- Recurrent cells cannot be chained with other layers (#1155)
- No Grad option for TrainState `single_train_step(!)` (#1181)
- sparse_init doesn't use provided rng fully (#1185)
- Incorrect IR generated for some neural networks (#1186)
- `WeightInitializers.DeviceAgnostic` doesn't respect Reactant (#1187)
- Export utilities in partial.jl (#1189)
WeightInitializers-v1.1.0
Diff since WeightInitializers-v1.0.5
Merged pull requests:
Closed issues:
- `WeightInitializers.DeviceAgnostic` doesn't respect Reactant (#1187)
WeightInitializers-v1.0.5
Merged pull requests:
- Rewrite (#7) (@avik-pal)
- Rename to Lux (#11) (@avik-pal)
- Initial Documentation (#14) (@avik-pal)
- Minor Updates (#15) (@avik-pal)
- Better CUDNN Dispatches (#16) (@avik-pal)
- Tutorials (#21) (@avik-pal)
- Proper dispatch for types not supported by CUDNN (#23) (@avik-pal)
- [WIP] Recurrent Neural Networks (#24) (@avik-pal)
- Fix math display in docs (#27) (@gdalle)
- Initial ViT Implementation & Pretrained ImageNet Models (#29) (@avik-pal)
- CompatHelper: bump compat for Setfield to 1, (keep existing compat) (#30) (@github-actions[bot])
- Code Formatting -- SciMLStyle (#31) (@avik-pal)
- Cleanup generated function style (#33) (@avik-pal)
- Update README.md (#37) (@zsz00)
- Fix doc for `PairwiseFusion` (#39) (@theabhirath)
- Extending `Scale` to allow for multiple dimension inputs (#40) (@theabhirath)
- Fix Zygote error caused due to `fill!` (#41) (@theabhirath)
- CompatHelper: bump compat for ComponentArrays to 0.12, (keep existing compat) (#43) (@github-actions[bot])
- Update JET tests to allow julia v1.6 (#47) (@avik-pal)
- Formatting updates and relax parameter type (#48) (@avik-pal)
- Enable doctests in CI (#51) (@avik-pal)
- fix quickstart example (#52) (@visr)
- Test on 1.8 (#54) (@avik-pal)
- Separate out testing unreleased julia versions (#55) (@avik-pal)
- Cleaner and Better Documentation (#56) (@avik-pal)
- Bump Pkg Compats (#66) (@avik-pal)
- CompatHelper: bump compat for MLDatasets to 0.7 for package examples, (keep existing compat) (#67) (@github-actions[bot])
- Manual to translate Flux to Lux (#69) (@avik-pal)
- Try codecov for doctests (#70) (@avik-pal)
- Add tests for utility functions (#74) (@avik-pal)
- Add tip to install packages (#76) (@Karthik-d-k)
- More Testing + Deprecate Nonsensical Functions + Better Naming for Kwargs (#80) (@avik-pal)
- CompatHelper: add new compat entry for Optimisers at version 0.2, (keep existing compat) (#82) (@github-actions[bot])
- Update rrules so that we can support Yota (#85) (@avik-pal)
- CompatHelper: bump compat for FluxMPI to 0.6 for package examples, (keep existing compat) (#86) (@github-actions[bot])
- Update comparison section in overview.md (#88) (@ToucheSir)
- Fix typos (#89) (@claforte)
- Fix minor typos in the docs (#93) (@gabrevaya)
- making x Float32 in migrate from Flux example (#97) (@gabrevaya)
- add init_hidden_state function (#101) (@gabrevaya)
- JLArray is now registered (#103) (@YichengDWu)
- [LuxTraining] Wrappers for less clunky training loops (#104) (@avik-pal)
- Use OneHotArrays (#105) (@YichengDWu)
- Fixes WeightNorm with zero Parameter bug (#106) (@avik-pal)
- fix state update in NeuralODE example (#107) (@gabrevaya)
- Deprecate `elementwise_*` and `applyactivation` (#113) (@avik-pal)
- Go through the dense bias deprecation (#114) (@avik-pal)
- Fix Scale's paramlength (#116) (@lungd)
- Trainable hidden states (#117) (@lungd)
- Rnn bias deprecation (#120) (@lungd)
- Add use_bias kwarg to LSTMCell and GRUCell (#121) (@lungd)
- Update docs for dense layer (#124) (@avik-pal)
- Upper bound ComponentArrays (#125) (@avik-pal)
- Relax ComponentArrays compat (#126) (@avik-pal)
- Layer Normalization Implementation (#127) (@avik-pal)
- LSTM docs: don't go over first element in sequence twice (#132) (@visr)
- fix PairwiseFusion docs (#133) (@YichengDWu)
- Generic recurrent cells (#136) (@jumerckx)
- relu tests with finite diff is too unreliable (#137) (@avik-pal)
- Add kaiming initialization (#138) (@YichengDWu)
- Remove Val in typeinfo of WeightNorm (#140) (@avik-pal)
- Named Layers inside Generic Containers (#143) (@avik-pal)
- Allow fmapping over the model (#144) (@avik-pal)
- Update Imagenet example (#147) (@avik-pal)
- Make normalization more AD friendly (Diffractor) (#148) (@avik-pal)
- Fix CuArray -> Array rrule (#149) (@avik-pal)
- Allow indexing into Chains (#150) (@avik-pal)
- API for freezing layers (#151) (@avik-pal)
- Allow controlling fast activation transformation (#153) (@avik-pal)
- Introducing LuxLib.jl: Effectively pullout some of the custom layer implementations from Lux.jl (#154) (@avik-pal)
- Try relaxing JET version (#155) (@avik-pal)
- Update to use LuxLib (#156) (@avik-pal)
- Allow dispatch using `Lux.apply` (#158) (@avik-pal)
- Mark non differentiable code paths (#160) (@avik-pal)
- Fix generic GN dispatch for non 4D arrays (#161) (@avik-pal)
- Add dispatch for subarray (#162) (@avik-pal)
- Add More Layers (#163) (@avik-pal)
- Fix type stability in normalization implementation (#164) (@avik-pal)
- Codecov for lib directories Take 2 (#165) (@avik-pal)
- Add freeze tests to runtests (#166) (@avik-pal)
- Precompile common workflows + check invalidations (#167) (@avik-pal)
- Make normalization typestable (#168) (@avik-pal)
- Add a manual page on precompilation (#169) (@avik-pal)
- Deprecate Lux.transform in favor of Flux2Lux.jl (#170) (@avik-pal)
- Remove dead code and improve var for Tracker.jl support (#171) (@avik-pal)
- Hyper Network Example (#172) (@avik-pal)
- Modify mkdocs settings (#173) (@avik-pal)
- Make ViT work on GPUs (#174) (@avik-pal)
- Add sensible recurrent layer wrappers (#175) (@avik-pal)
- `setup` only on AbstractRules (#176) (@avik-pal)
- Start using Flux2Lux (#177) (@avik-pal)
- Fix some displays (#178) (@avik-pal)
- Relax dropout types (#179) (@avik-pal)
- Add instancenorm and alpha_dropout implementations (#180) (@avik-pal)
- Add InstanceNorm and AlphaDropout (#181) (@avik-pal)
- CompatHelper: bump compat for MLUtils to 0.3 for package examples, (keep existing compat) (#184) (@github-actions[bot])
- remove convert rrule (#185) (@ArnoStrouwen)
- CompatHelper: bump compat for OneHotArrays to 0.2 for package examples, (keep existing compat) (#186) (@github-actions[bot])
- CompatHelper: bump compat for Turing to 0.22 for package examples, (keep existing compat) (#188) (@github-actions[bot])
- Fix layer_map for custom layers (#189) (@avik-pal)
- add example of DDIM implementation (#190) (@yng87)
- LuxCore.jl: Extremely light dependency for Lux Compatibility (#191) (@avik-pal)
- Revert github workflows for merged LuxCore.jl (#193) (@avik-pal)
- CompatHelper: bump compat for MLUtils to 0.3 for package ImageNet, (keep existing compat) (#194) (@github-actions[bot])
- CompatHelper: bump compat for Setfield to 1 for package ImageNet, (keep existing compat) (#195) (@github-actions[bot])
- CompatHelper: bump compat for OneHotArrays to 0.2 for package ImageNet, (keep existing compat) (#196) (@github-actions[bot])
- ADAM -> Adam (#197) (@cossio)
- CompatHelper: bump compat for Functors to 0.4, (keep existing compat) (#199) (@github-actions[bot])
- CompatHelper: bump compat for Functors to 0.4 for package examples, (keep existing compat) (#200) (@github-actions[bot])
- CompatHelper: bump compat for Functors to 0.4 for package ImageNet, (keep existing compat) (#201) (@github-actions[bot])
- Add easy tied weights/parameter sharing support (#202) (@avik-pal)
- CompatHelper: bump compat for Functors to 0.4 for package LuxCore, (keep existing compat) (#203) (@github-actions[bot])
- CompatHelper: add new compat entry for Zygote at version 0.6 for package DDIM, (keep existing compat) (#218) (@github-actions[bot])
- Update DDIM compat requirements (#219) (@avik-pal)
- Update examples (#221) (@avik-pal)
- CompatHelper: bump compat for Turing to 0.23 for package examples, (keep existing compat) (#222) (@github-actions[bot])
- Fix docs (#223) (@avik-pal)
- CompatHelper: bump compat for MLUtils to 0.4 for package examples, (keep existing compat) (#226) (@github-actions[bot])
- CompatHelper: bump compat for MLUtils to 0.4 for package ImageNet, (keep existing compat) (#227) (@github-actions[bot])
- CompatHelper: bump compat for MLUtils to 0.4 for package DDIM, (keep existing compat) (#228) (@github-actions[bot])
- Functor ambiguity fix (#229) (@avik-pal)
- Add all compats together (#238) (@avik-pal)
- CompatHelper: bump compat for Turing to 0.24 for package examples, (keep existing compat) (#241) (@github-actions[bot])
- CompatHelper: bump compat for JET to 0.7 for package test, (keep existing compat) (#251) (@github-actions[bot])
- [WIP] Use Extensions for Flux2Lux (#261) (@avik-pal)
- Cleaner test workflow (#262) (@avik-pal)
- Add a patch for #243 (#263) (@avik-pal)
- Update LuxLib dependencies (#265) (@avik-pal)
- Dropping Julia 1.6 support for Lux (#266) (@avik-pal)
- Purge unnecessary dependencies into weak dependencies (#267) (@avik-pal)
- Add ForwardDiff Extension: Dropout (#269) (@avik-pal)
- Add Tracker as an Extension (#272) (@avik-pal)
- CompatHelper: bump compat for AbstractDifferentiation to 0.5 for package examples, (keep existing compat) (#273) (@github-actions[bot])
- Some Improvements (#274) (@avik-pal)
- Tracker has some of the rules (#275) (@avik-pal)
- Temporary CA + Tracker Patches (#276) (@avik-pal)
- Add CUDA and AMDGPU trigger packages (#277) (@avik-pal)
- ReverseDiff Extension (#280) (@avik-pal)
- Bump peter-evans/create-pull-request from 3 to 4 (#283) (@dependabot[bot])
- Bump actions/cache from 1 to 3 (#284) (@dependabot[bot])
- Bump actions/checkout from 1 to 3 (#285) (@dependabot[bot])
- Return the history for Recurrence (#287) (@avik-pal)
- Truncate tuples and namedtuples (#290) (@avik-pal)
- [WIP] Remove projects from `lib` to `LuxDL` (#291) (@avik-pal)
- Patch freeze (#292) (@avik-pal)
- Add dispatch for no activation (#293) (@avik-pal)
- Remove weakdeps from deps (#295) (@avik-pal)
- Try restoring lts support (#296) (@avik-pal)
- Testing using LuxTestUtils.jl (#297) (@avik-pal)
- CompatHelper: bump compat for Boltz to 0.2 for package ImageNet, (kee… (#298) (@avik-pal)
- Bump peter-evans/create-pull-request from 4 to 5 (#299) (@dependabot[bot])
- remove Dataloaders (#300) (@avik-pal)
- Update docs (#301) (@avik-pal)
- Fix bug in recurrence ordering (#303) (@avik-pal)
- Update LuxComponentArraysExt.jl (#304) (@avik-pal)
- CompatHelper: bump compat for Turing to 0.25 for package examples, (keep existing compat) (...
LuxLib-v1.4.1
Merged pull requests:
- feat: update ConvMixer to support reactant (#1063) (@avik-pal)
- fix: update default rng for reactant (#1152) (@avik-pal)
- Change update step to thousands in PINN 2D PDE (#1153) (@abhro)
- CompatHelper: add new compat entry for ProgressTables at version 0.1 for package CIFAR10, (keep existing compat) (#1154) (@github-actions[bot])
- `device(NN)` should only give warning once (#1156) (@vpuri3)
- feat: conditional VAE (#1157) (@avik-pal)
- fix: ConditionalVAE on CI (#1159) (@avik-pal)
- CompatHelper: add new compat entry for Optimisers at version 0.4 for package ConditionalVAE, (keep existing compat) (#1161) (@github-actions[bot])
- CompatHelper: add new compat entry for StableRNGs at version 1 for package ConditionalVAE, (keep existing compat) (#1162) (@github-actions[bot])
- CompatHelper: add new compat entry for Comonicon at version 1 for package ConditionalVAE, (keep existing compat) (#1163) (@github-actions[bot])
- [MLDataDevices] Bump deps (#1164) (@pxl-th)
- docs: migrate most examples to Reactant (#1180) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.4 to 1.29.4 (#1183) (@dependabot[bot])
- fix: pass in RNG to shuffle (#1188) (@avik-pal)
Closed issues:
LuxCore-v1.2.2
Merged pull requests:
- feat: update ConvMixer to support reactant (#1063) (@avik-pal)
- test: re-enable flux testing (#1123) (@avik-pal)
- chore: bump minimum Reactant version (#1125) (@avik-pal)
- fix: try fixing cuda install in tests (#1126) (@avik-pal)
- docs: run partial dataset only on CI (#1128) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.1 to 1.28.2 (#1130) (@dependabot[bot])
- fix: preserve object when device is same (#1133) (@avik-pal)
- fix: use functors for testing wrapped arrays (#1134) (@avik-pal)
- fix: remove old patches around reactant bug (#1135) (@avik-pal)
- CompatHelper: bump compat for Flux in [weakdeps] to 0.16, (keep existing compat) (#1136) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.28.2 to 1.28.3 (#1137) (@dependabot[bot])
- fix: update to new reactant changes (#1140) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.3 to 1.28.4 (#1144) (@dependabot[bot])
- don't declare implicitly exported functions public (#1147) (@simeonschaub)
- use `return_type` instead of `_return_type` (#1148) (@simeonschaub)
- feat: more nested AD rules (#1151) (@avik-pal)
- fix: update default rng for reactant (#1152) (@avik-pal)
- Change update step to thousands in PINN 2D PDE (#1153) (@abhro)
- CompatHelper: add new compat entry for ProgressTables at version 0.1 for package CIFAR10, (keep existing compat) (#1154) (@github-actions[bot])
- `device(NN)` should only give warning once (#1156) (@vpuri3)
- feat: conditional VAE (#1157) (@avik-pal)
- fix: ConditionalVAE on CI (#1159) (@avik-pal)
- CompatHelper: add new compat entry for Optimisers at version 0.4 for package ConditionalVAE, (keep existing compat) (#1161) (@github-actions[bot])
- CompatHelper: add new compat entry for StableRNGs at version 1 for package ConditionalVAE, (keep existing compat) (#1162) (@github-actions[bot])
- CompatHelper: add new compat entry for Comonicon at version 1 for package ConditionalVAE, (keep existing compat) (#1163) (@github-actions[bot])
- [MLDataDevices] Bump deps (#1164) (@pxl-th)
- docs: migrate most examples to Reactant (#1180) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.4 to 1.29.4 (#1183) (@dependabot[bot])
- fix: pass in RNG to shuffle (#1188) (@avik-pal)
Closed issues:
- Immutable Arrays (#8)
- Downstream Compat Updates (#880)
- Re-enable Flux compatibility testing (#1070)
- Documentation Build Stalls (#1120)
- CUDA Test CI is broken (#1121)
- [MLDataDevices] devices don't preserve identity (#1129)
- Random Numbers & Reactant (#1131)
- Problem with Lux & SymbolicsLuxExt (#1132)
- How to implement a detach operation similar to Pytorch? (#1138)
- Unexpected handling of LR Schedulers in TrainState (#1143)
- Directly construct Optimiser state on Reactant buffers (#1145)
- `(AbstractDevice)(x)` should respect `Adapt.adapt_structure` (#1149)
- CUDA 2nd order AD with MaxPool and logsoftmax (#1150)
- Recurrent cells cannot be chained with other layers (#1155)
- sparse_init doesn't use provided rng fully (#1185)
- Incorrect IR generated for some neural networks (#1186)
- `WeightInitializers.DeviceAgnostic` doesn't respect Reactant (#1187)
MLDataDevices-v1.6.7
Diff since MLDataDevices-v1.6.6
Merged pull requests:
- feat: update ConvMixer to support reactant (#1063) (@avik-pal)
- Change update step to thousands in PINN 2D PDE (#1153) (@abhro)
- CompatHelper: add new compat entry for ProgressTables at version 0.1 for package CIFAR10, (keep existing compat) (#1154) (@github-actions[bot])
- `device(NN)` should only give warning once (#1156) (@vpuri3)
- feat: conditional VAE (#1157) (@avik-pal)
- fix: ConditionalVAE on CI (#1159) (@avik-pal)
- CompatHelper: add new compat entry for Optimisers at version 0.4 for package ConditionalVAE, (keep existing compat) (#1161) (@github-actions[bot])
- CompatHelper: add new compat entry for StableRNGs at version 1 for package ConditionalVAE, (keep existing compat) (#1162) (@github-actions[bot])
- CompatHelper: add new compat entry for Comonicon at version 1 for package ConditionalVAE, (keep existing compat) (#1163) (@github-actions[bot])
- [MLDataDevices] Bump deps (#1164) (@pxl-th)
Closed issues:
- Recurrent cells cannot be chained with other layers (#1155)
Lux v1.4.4
MLDataDevices-v1.6.6
Diff since MLDataDevices-v1.6.5
Merged pull requests:
- fix: remove old patches around reactant bug (#1135) (@avik-pal)
- CompatHelper: bump compat for Flux in [weakdeps] to 0.16, (keep existing compat) (#1136) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.28.2 to 1.28.3 (#1137) (@dependabot[bot])
- fix: update to new reactant changes (#1140) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.3 to 1.28.4 (#1144) (@dependabot[bot])
- don't declare implicitly exported functions public (#1147) (@simeonschaub)
- use `return_type` instead of `_return_type` (#1148) (@simeonschaub)
- feat: more nested AD rules (#1151) (@avik-pal)
- fix: update default rng for reactant (#1152) (@avik-pal)
Closed issues:
- Random Numbers & Reactant (#1131)
- Problem with Lux & SymbolicsLuxExt (#1132)
- How to implement a detach operation similar to Pytorch? (#1138)
- Unexpected handling of LR Schedulers in TrainState (#1143)
- Directly construct Optimiser state on Reactant buffers (#1145)
- `(AbstractDevice)(x)` should respect `Adapt.adapt_structure` (#1149)
- CUDA 2nd order AD with MaxPool and logsoftmax (#1150)
LuxLib-v1.4.0
Merged pull requests:
Closed issues:
- CUDA 2nd order AD with MaxPool and logsoftmax (#1150)
Lux v1.4.3
Merged pull requests:
- CompatHelper: bump compat for Flux in [weakdeps] to 0.16, (keep existing compat) (#1136) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.28.2 to 1.28.3 (#1137) (@dependabot[bot])
- fix: update to new reactant changes (#1140) (@avik-pal)
- chore: bump crate-ci/typos from 1.28.3 to 1.28.4 (#1144) (@dependabot[bot])
- don't declare implicitly exported functions public (#1147) (@simeonschaub)
- use `return_type` instead of `_return_type` (#1148) (@simeonschaub)
Closed issues: