Hi everyone, I've run into the following problem while working on a recurrent neural network, and I can't figure out why, so I'm posting here for help.
My approach is to run the recurrent network over the steps of a sequence, take the last output, and compare it with the actual label to form the loss function. The loss value itself computes fine, but when I try to take the gradient it fails.
using Flux
using Flux: GRU, onehotbatch, params, logitcrossentropy

m = Chain(GRU(300, 64), Dense(64, 3))

# Broadcast the model over the sequence, keep only the last output,
# and reset the recurrent state afterwards.
function eval_model(x)
    out = m.(x)[end]
    Flux.reset!(m)
    out
end

loss(x, y) = logitcrossentropy(eval_model(x), y)

x = [rand(300) for i = 1:5]  # a sequence of 5 steps, each a 300-dimensional vector
y = [0, 0, 1]

f(xx, yy) = gradient(loss(xx, yy), params(m))
f(x, y)
The error:
julia> f(x,y)
ERROR:
MethodError: objects of type Float32 are not callable
Stacktrace:
[1] macro expansion at /home/jerrywang/.julia/packages/Zygote/1GXzF/src/compiler/interface2.jl:0 [inlined]
[2] _pullback(::Zygote.Context, ::Float32) at /home/jerrywang/.julia/packages/Zygote/1GXzF/src/compiler/interface2.jl:13
[3] pullback(::Float32, ::Zygote.Params) at /home/jerrywang/.julia/packages/Zygote/1GXzF/src/compiler/interface.jl:172
[4] gradient(::Float32, ::Zygote.Params) at /home/jerrywang/.julia/packages/Zygote/1GXzF/src/compiler/interface.jl:53
[5] f(::Array{Array{Float64,1},1}, ::Array{Int64,1}) at ./REPL[8]:1
[6] top-level scope at REPL[9]:1
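Reading the error message again, it complains that an object of type Float32 is not callable, and in f I am passing the value of loss(xx, yy) (which is a Float32) as the first argument to gradient instead of a function. Below is a minimal sketch of what I guess the call should look like, wrapping the loss in a zero-argument closure over the implicit params; I'm not sure this is the whole story, so corrections are welcome.

# Sketch: give gradient a zero-argument closure rather than the already-computed
# loss value, so Zygote can differentiate it with respect to params(m).
f(xx, yy) = gradient(() -> loss(xx, yy), params(m))
gs = f(x, y)  # should return a Zygote.Grads indexed by the parameters of m

Is this the right way to call gradient with params, or is there something else going wrong with the recurrent layers?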