Default simulate fails after clean install of KomaMRI version 0.7.5 #298
Thank you for your patience! We are experiencing problems with the latest version of Julia (1.10) on Windows. The problems have already been fixed, and a new version of Koma is planned to be released today. This is also related to #295. In particular, we are waiting for the following package to be registered:

After that, we will be able to register KomaMRI 0.8, which should fix these problems.
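Once 0.8 is registered, it should also be possible to request it explicitly instead of waiting for the resolver to pick it up. A minimal sketch (the exact version string here is an assumption):

```julia
using Pkg
# Request the 0.8 series explicitly; Pkg errors if no registered
# version in that range is compatible with the active environment.
Pkg.add(name="KomaMRI", version="0.8")
```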
The new version 0.8.0 of KomaMRI has been released 😄! If you face any problems, make sure that after running:

```julia
using Pkg
begin
    println("OS $(Base.Sys.MACHINE)") # OS
    println("Julia $VERSION")         # Julia version
    # Koma sub-packages
    for (_, pkg) in filter(((_, pkg),) -> occursin("KomaMRI", pkg.name), Pkg.dependencies())
        println("$(pkg.name) $(pkg.version)")
    end
end
```

you have the following output:
I have tried reinstalling (Pkg.rm -> Pkg.gc -> Pkg.add) and updating KomaMRI, but I still get the following versions:

OS x86_64-w64-mingw32

Is there some delay before version 0.8+ becomes the default download, perhaps?
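For reference, the reinstall sequence mentioned above corresponds roughly to the following sketch (assumes the default active environment):

```julia
using Pkg
Pkg.rm("KomaMRI")   # drop KomaMRI from the active environment
Pkg.gc()            # garbage-collect package versions no longer referenced
Pkg.add("KomaMRI")  # re-add, resolving to the newest registered version allowed
```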
Update: I removed all compiled files related to KomaMRI, repeated the Pkg.add, and I am now on the new version:

OS x86_64-w64-mingw32

Simulation and reconstruction were successful! Thanks for the quick response!
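For anyone hitting the same stale-version problem: one way to remove Koma's precompiled files is to delete its entries from the depot's `compiled` directory. This is a sketch of one interpretation of "removed all compiled files"; the depot layout (`compiled/v<julia-version>/<Package>`) is standard, but check the paths before deleting anything:

```julia
# Delete precompilation caches for all KomaMRI sub-packages.
# DEPOT_PATH[1] is usually ~/.julia (C:\Users\<user>\.julia on Windows).
compiled = joinpath(DEPOT_PATH[1], "compiled")
for ver in readdir(compiled; join=true)      # e.g. .../compiled/v1.10
    isdir(ver) || continue
    for dir in readdir(ver; join=true)
        occursin("KomaMRI", basename(dir)) && rm(dir; recursive=true)
    end
end
```

After this, the packages are precompiled again by the next `Pkg.add`/`using KomaMRI` (or explicitly via `Pkg.precompile()`).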
Nice 🎊 !!! We will mark this issue as solved.
What happened?
After a clean install of Julia and KomaMRI, the simulation fails with the default settings. The UI is stuck on a spinning circle, and the command window stops at the statements below. I tried running it on two different PCs with Windows 11.
From a brief contact with the developers, I was informed that this was an issue with the latest Julia version and that a fix is forthcoming.
Best,
/F
---- LOG -----
[ Info: Currently using KomaMRICore v0.7.6
julia> MethodError: no method matching length(::Nothing)
Closest candidates are:
length(::Base.MethodSpecializations)
@ Base reflection.jl:1166
length(::Tables.DictRowTable)
@ Tables C:\Users\FilipSz\.julia\packages\Tables\NSGZI\src\dicts.jl:118
length(::HDF5.BlockRange) (method too new to be called from this world context.)
@ HDF5 C:\Users\FilipSz\.julia\packages\HDF5\MIuzl\src\dataspaces.jl:190
...
Stacktrace:
[1] #s597#122
@ C:\Users\FilipSz\.julia\packages\GPUCompiler\S3TWf\src\cache.jl:18 [inlined]
[2] var"#s597#122"(f::Any, tt::Any, ::Any, job::Any)
@ GPUCompiler .\none:0
[3] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
@ Core .\boot.jl:602
[4] cached_compilation(cache::Dict{UInt64, Any}, job::GPUCompiler.CompilerJob, compiler::typeof(CUDA.cufunction_compile), linker::typeof(CUDA.cufunction_link))
@ GPUCompiler C:\Users\FilipSz\.julia\packages\GPUCompiler\S3TWf\src\cache.jl:71
[5] cufunction(f::GPUArrays.var"#broadcast_kernel#26", tt::Type{Tuple{CUDA.CuKernelContext, CUDA.CuDeviceVector{Float32, 1}, Base.Broadcast.Broadcasted{CUDA.CuArrayStyle{1}, Tuple{Base.OneTo{Int64}}, typeof(convert), Tuple{CUDA.CuRefType{Float32}, Base.Broadcast.Extruded{CUDA.CuDeviceVector{Float32, 1}, Tuple{Bool}, Tuple{Int64}}}}, Int64}}; name::Nothing,
always_inline::Bool, kwargs::@kwargs{})
@ CUDA C:\Users\FilipSz\.julia\packages\CUDA\BbliS\src\compiler\execution.jl:300
[6] cufunction
@ C:\Users\FilipSz\.julia\packages\CUDA\BbliS\src\compiler\execution.jl:293 [inlined]
[7] macro expansion
@ C:\Users\FilipSz\.julia\packages\CUDA\BbliS\src\compiler\execution.jl:102 [inlined]
[8] #launch_heuristic#252
@ C:\Users\FilipSz\.julia\packages\CUDA\BbliS\src\gpuarrays.jl:17 [inlined]
[9] launch_heuristic
@ C:\Users\FilipSz\.julia\packages\CUDA\BbliS\src\gpuarrays.jl:15 [inlined]
[10] _copyto!
@ C:\Users\FilipSz\.julia\packages\GPUArrays\5XhED\src\host\broadcast.jl:65 [inlined]
[11] copyto!
@ C:\Users\FilipSz\.julia\packages\GPUArrays\5XhED\src\host\broadcast.jl:46 [inlined]
[12] copy
@ C:\Users\FilipSz\.julia\packages\GPUArrays\5XhED\src\host\broadcast.jl:37 [inlined]
[13] materialize
@ .\broadcast.jl:903 [inlined]
[14] adapt_storage(T::Type{Float32}, xs::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer})
@ KomaMRICore C:\Users\FilipSz\.julia\packages\KomaMRICore\wY77R\src\simulation\GPUFunctions.jl:90
[15] adapt_structure(to::Type, x::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer})
@ Adapt C:\Users\FilipSz\.julia\packages\Adapt\cdaEv\src\Adapt.jl:57
[16] adapt
@ C:\Users\FilipSz\.julia\packages\Adapt\cdaEv\src\Adapt.jl:40 [inlined]
[17] #162
@ C:\Users\FilipSz\.julia\packages\KomaMRICore\wY77R\src\simulation\GPUFunctions.jl:88 [inlined]
[18] ExcludeWalk
@ C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:106 [inlined]
[19] (::Functors.CachedWalk{Functors.ExcludeWalk{Functors.DefaultWalk, KomaMRICore.var"#162#163"{DataType}, typeof(Functors.isleaf)}, Functors.NoKeyword})(::Function, ::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer})
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:146
[20] recurse
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:37 [inlined]
[21] map
@ Base .\tuple.jl:294 [inlined]
[22] map(f::Functors.var"#recurse#19"{Functors.CachedWalk{Functors.ExcludeWalk{Functors.DefaultWalk, KomaMRICore.var"#162#163"{DataType}, typeof(Functors.isleaf)}, Functors.NoKeyword}}, t::Tuple{String, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, KomaMRICore.var"#115#130", KomaMRICore.var"#116#131", KomaMRICore.var"#117#132"})
@ Base .\tuple.jl:294
[23] map(::Function, ::@NamedTuple{name::String, x::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, y::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, z::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, ρ::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T1::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T2::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T2s::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Δw::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dλ1::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dλ2::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dθ::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, ux::KomaMRICore.var"#115#130", uy::KomaMRICore.var"#116#131", uz::KomaMRICore.var"#117#132"})
@ Base .\namedtuple.jl:269
[24] _map(f::Function, x::@NamedTuple{name::String, x::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, y::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, z::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, ρ::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T1::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T2::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, T2s::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Δw::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dλ1::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dλ2::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, Dθ::CUDA.CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}, ux::KomaMRICore.var"#115#130", uy::KomaMRICore.var"#116#131", uz::KomaMRICore.var"#117#132"})
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:1
[25] (::Functors.DefaultWalk)(::Function, ::Phantom{Float32})
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:76
[26] ExcludeWalk
@ C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:106 [inlined]
[27] (::Functors.CachedWalk{Functors.ExcludeWalk{Functors.DefaultWalk, KomaMRICore.var"#162#163"{DataType}, typeof(Functors.isleaf)}, Functors.NoKeyword})(::Function, ::Phantom{Float32})
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:146
[28] execute
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\walks.jl:38 [inlined]
[29] #fmap#28
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\maps.jl:11 [inlined]
[30] fmap
@ Functors C:\Users\FilipSz\.julia\packages\Functors\rlD70\src\maps.jl:3 [inlined]
[31] paramtype
@ KomaMRICore C:\Users\FilipSz\.julia\packages\KomaMRICore\wY77R\src\simulation\GPUFunctions.jl:88 [inlined]
[32] f32
@ KomaMRICore C:\Users\FilipSz\.julia\packages\KomaMRICore\wY77R\src\simulation\GPUFunctions.jl:99 [inlined]
[33] |>(x::Phantom{Float32}, f::typeof(f32))
@ Base .\operators.jl:917
[34] simulate(obj::Phantom{Float64}, seq::Sequence, sys::Scanner; simParams::Dict{String, Any}, w::Blink.AtomShell.Window)
@ KomaMRICore C:\Users\FilipSz\.julia\packages\KomaMRICore\wY77R\src\simulation\SimulatorCore.jl:201
[35] (::KomaMRI.var"#67#112"{String, String})(args::Int64)
@ KomaMRI C:\Users\FilipSz\.julia\packages\KomaMRI\CVhId\src\KomaUI.jl:252
[36] #invokelatest#2
@ .\essentials.jl:887 [inlined]
[37] invokelatest
@ .\essentials.jl:884 [inlined]
[38] handle_message(o::Blink.Page, m::Dict{String, Any})
@ Blink C:\Users\FilipSz\.julia\packages\Blink\tnO3a\src\rpc\callbacks.jl:7
[39] macro expansion
@ C:\Users\FilipSz\.julia\packages\Lazy\9Xnd3\src\macros.jl:268 [inlined]
[40] ws_handler(ws::HTTP.WebSockets.WebSocket)
@ Blink C:\Users\FilipSz\.julia\packages\Blink\tnO3a\src\content\server.jl:50
[41] (::Mux.var"#9#10"{Mux.App})(sock::HTTP.WebSockets.WebSocket)
@ Mux C:\Users\FilipSz\.julia\packages\Mux\PipQ9\src\server.jl:48
[42] upgrade(f::Mux.var"#9#10"{Mux.App}, http::HTTP.Streams.Stream{HTTP.Messages.Request, HTTP.Connections.Connection{Sockets.TCPSocket}}; suppress_close_error::Bool, maxframesize::Int64, maxfragmentation::Int64, kw::@kwargs{})
@ HTTP.WebSockets C:\Users\FilipSz\.julia\packages\HTTP\bDoga\src\WebSockets.jl:440
[43] upgrade
@ C:\Users\FilipSz\.julia\packages\HTTP\bDoga\src\WebSockets.jl:420 [inlined]
[44] (::Mux.var"#14#15"{Mux.App, Mux.App})(http::HTTP.Streams.Stream{HTTP.Messages.Request, HTTP.Connections.Connection{Sockets.TCPSocket}})
@ Mux C:\Users\FilipSz\.julia\packages\Mux\PipQ9\src\server.jl:81
[45] #invokelatest#2
@ .\essentials.jl:887 [inlined]
[46] invokelatest
@ .\essentials.jl:884 [inlined]
[47] handle_connection(f::Function, c::HTTP.Connections.Connection{Sockets.TCPSocket}, listener::HTTP.Servers.Listener{Nothing, Sockets.TCPServer}, readtimeout::Int64, access_log::Nothing)
@ HTTP.Servers C:\Users\FilipSz\.julia\packages\HTTP\bDoga\src\Servers.jl:450
[48] (::HTTP.Servers.var"#16#17"{Mux.var"#14#15"{Mux.App, Mux.App}, HTTP.Servers.Listener{Nothing, Sockets.TCPServer}, Set{HTTP.Connections.Connection}, Int64, Nothing, Base.Semaphore, HTTP.Connections.Connection{Sockets.TCPSocket}})()
@ HTTP.Servers C:\Users\FilipSz\.julia\packages\HTTP\bDoga\src\Servers.jl:386
Environment