
Distinct init for kernel and recurrent #2522

Merged
merged 2 commits on Nov 10, 2024

Conversation

MartinuzziFrancesco
Contributor

Adds distinct initializer options for the input (kernel) matrix and the recurrent matrix in recurrent cells, as described in #2514. Similar approaches are available in other frameworks.

An example of the implementation is as follows:

function RNNCell((in, out)::Pair, σ=relu;
    kernel_init = glorot_uniform,
    recurrent_kernel_init = glorot_uniform,
    bias = true)
    Wi = kernel_init(out, in)             # input-to-hidden weights
    U = recurrent_kernel_init(out, out)   # hidden-to-hidden weights
    b = create_bias(Wi, bias, size(Wi, 1))
    return RNNCell(σ, Wi, U, b)
end
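For illustration, here is a self-contained sketch of the constructor pattern above, using the `init_kernel`/`init_recurrent_kernel` keyword names settled on later in this thread. The `SimpleRNNCell` type and the `uniform_init`/`zero_init` stand-in initializers are hypothetical and only mimic Flux's `glorot_uniform`-style API; they are not Flux code:

```julia
# Stand-in initializers; Flux provides glorot_uniform, kaiming_normal, etc.
uniform_init(dims...) = 0.1 .* rand(dims...)
zero_init(dims...) = zeros(dims...)

# Minimal cell mirroring the PR's constructor pattern (not Flux's RNNCell)
struct SimpleRNNCell{F, I, H, V}
  σ::F
  Wi::I
  Wh::H
  bias::V
end

function SimpleRNNCell((in, out)::Pair, σ = tanh;
    init_kernel = uniform_init,
    init_recurrent_kernel = uniform_init)
  Wi = init_kernel(out, in)              # input-to-hidden weights
  Wh = init_recurrent_kernel(out, out)   # hidden-to-hidden weights
  b = zeros(out)
  return SimpleRNNCell(σ, Wi, Wh, b)
end

# Different initializers for the two matrices:
cell = SimpleRNNCell(3 => 5; init_kernel = uniform_init,
                     init_recurrent_kernel = zero_init)
size(cell.Wi)  # (5, 3)
size(cell.Wh)  # (5, 5)
```

The point of the separate keywords is exactly this: the recurrent matrix can get a different scheme (here all zeros, purely for illustration) than the input matrix.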

PR Checklist

  • Tests are added (should not be necessary)
  • Entry in NEWS.md
  • Documentation, if applicable (updated docstrings)


codecov bot commented Nov 9, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 60.37%. Comparing base (c51e6fb) to head (bd2d257).
Report is 1 commits behind head on master.

Additional details and impacted files
@@             Coverage Diff             @@
##           master    #2522       +/-   ##
===========================================
+ Coverage   33.40%   60.37%   +26.96%     
===========================================
  Files          31       31               
  Lines        1907     1938       +31     
===========================================
+ Hits          637     1170      +533     
+ Misses       1270      768      -502     


struct RNNCell{F, I, H, V}
    σ::F
    Wi::I
    Wh::H
    bias::V

Member

Suggested change:
-    σ::F
+  σ::F

We have two spaces indent in Flux, unfortunately

Contributor Author

yeah, I figured, although https://github.com/FluxML/Flux.jl/blob/master/.JuliaFormatter.toml#L1

indent = 4

kinda confused me

Member

ah, that should be fixed, or even better we should move to 4 spaces indent

Contributor Author

With 0.15 coming up that could be a good opportunity! Also at the moment if you do follow https://github.com/FluxML/Flux.jl/blob/master/CONTRIBUTING.md

julia dev/flux_format.jl --verbose .

results in 80 files being formatted, so I guess it's formatting everything with 4-space indent

@CarloLucibello
Member

can we use init_kernel and init_recurrent_kernel instead?

@MartinuzziFrancesco
Contributor Author

The naming at the moment follows Flax's, while TF has kernel_initializer and recurrent_initializer. But I have nothing against changing it to init_kernel and init_recurrent_kernel if that's preferred!

@CarloLucibello
Member

Lux has init_weight, init_bias, .... I think the init_* convention is better, as the arguments stand out more clearly as similar.

@MartinuzziFrancesco
Contributor Author

makes sense then, I'll change it

@mcabbott mcabbott changed the title Differentiating init for kernel and recurrent Distinct init for kernel and recurrent Nov 10, 2024
@mcabbott mcabbott added the RNN label Nov 10, 2024
@CarloLucibello CarloLucibello merged commit 51ca97f into FluxML:master Nov 10, 2024
10 of 12 checks passed
@CarloLucibello CarloLucibello mentioned this pull request Nov 12, 2024