I want to initialize Lux layer weights using an initializer other than ones/zeros/rand/randn but still specify the type such as Float16 or Float32.
The way it's set up, Lux layers like `Dense` expect `init_weight` to be a function with signature `(rng, dims...)`. The only customization option is something like `init_weight = glorot_uniform(gain = 0.5)`, and the `__partial_apply` methods are set up so that when Lux initializes the layer it will always call the resulting closure as `(rng, dims...)` (line 324 in the initializer).
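For context, here is the pattern that does work today (a minimal sketch; the `Dense(2 => 4)` layer and the `Xoshiro(0)` seed are illustrative, and it assumes a Lux version that re-exports `glorot_uniform` from WeightInitializers):

```julia
using Lux, Random

# glorot_uniform(gain = 0.5) partially applies the keyword argument and
# returns a closure with signature (rng, dims...), which is the form
# Dense expects for init_weight.
nn = Dense(2 => 4; init_weight = glorot_uniform(gain = 0.5))
ps, st = Lux.setup(Xoshiro(0), nn)
```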
I might be missing something obvious, but I don't see a `__partial_apply` that exposes the right signature so that I could do something like `init_weight = glorot_uniform(Float64)`. If I try it, it of course complains:
```julia
julia> ps, st = Lux.setup(Xoshiro(), nn)
ERROR: MethodError: no method matching glorot_uniform(::Xoshiro, ::Type{Float64}, ::Xoshiro, ::Int64, ::Int64)
```
So for now I use an anonymous function.
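Roughly like this (a sketch of the workaround; `Dense(2 => 4)` is again illustrative, and it relies on the positional `glorot_uniform(rng, ::Type, dims...)` method that the error message above hints at):

```julia
using Lux, Random

# Wrap the typed call in a closure matching the (rng, dims...) signature
# Lux expects, forwarding the element type explicitly.
init_weight = (rng, dims...) -> glorot_uniform(rng, Float64, dims...)

nn = Dense(2 => 4; init_weight = init_weight)
ps, st = Lux.setup(Xoshiro(0), nn)
eltype(ps.weight)  # Float64
```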