Use Julia 1.12, Pkg workspaces and JuliaC sans trimming #2629
Conversation
Ugh: Full SciMLBasePythonCallExt precompile error
I've seen this pop up before; we don't use it directly. Trying the Null option from https://github.com/JuliaPy/CondaPkg.jl in fa7c122.
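For reference, a minimal sketch of opting into the Null backend (the JULIA_CONDAPKG_BACKEND variable is documented by CondaPkg.jl; whether fa7c122 sets it this way or via a CondaPkg.toml instead is not shown here):

```julia
# Sketch: select CondaPkg.jl's Null backend so no conda environment is created
# or resolved; Python dependencies are assumed to be provided externally.
# The backend must be chosen before CondaPkg loads, e.g. in the CI task setup.
ENV["JULIA_CONDAPKG_BACKEND"] = "Null"

using CondaPkg
CondaPkg.resolve()  # with the Null backend this does not install anything
```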
PackageCompiler crashes during the build. On Linux and Windows we get different errors; it could be a TeamCity issue or a PackageCompiler issue. I wanted to do it in a separate PR, but I'm tempted to directly switch to https://github.com/JuliaLang/JuliaC.jl/ (without trimming) here. Based on an early effort in #2460.
See #2629. But now setting this for all tasks at once, because we never want to create new environments. This failed again on the same PR in the JuliaC precompilation step.
Status: this produces a fully functional binary of the same size, and all tests pass. Building the binary seems quite a bit faster. However, I did notice that running the basic model took 1m30s total (not just computation time), versus 30s before. JuliaC doesn't support using our existing preferences setup, so I set:

```toml
[preferences.Ribasim]
precompile_workload = true
specialize = true
```

This caused the compilation to take longer, but seemed to have no effect on latency. Possibly I need to try again or just hardcode this for a bit. So this needs some more investigation to get the precompile workload to be effective and avoid the 3x latency regression.
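For context, roughly the same preferences can also be written programmatically with Preferences.jl; a sketch only (the UUID below is a placeholder, not Ribasim's real one):

```julia
using Preferences, UUIDs

# Placeholder UUID: substitute the `uuid` field from Ribasim's Project.toml.
ribasim_uuid = UUID("00000000-0000-0000-0000-000000000000")

# export_prefs=true writes the values under [preferences.Ribasim] in the active
# Project.toml instead of a LocalPreferences.toml next to it.
set_preferences!(ribasim_uuid,
    "precompile_workload" => true,
    "specialize" => true;
    export_prefs = true,
    force = true)
```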
Ah nevermind, I set the preferences in the root TOML instead of the core TOML. Now the latency is about the same, as expected. I will fix up the preferences mechanism. The main thing that is different now is that the CPU target is set to native. I'm worried that this will make it less portable to other machines, so we'll need to do something about that before merging. See JuliaLang/JuliaC.jl#33. EDIT: both done in 2d698c8.
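As a hedged illustration of the portability point (not necessarily what 2d698c8 does): instead of native, the build environment can request a multi-microarchitecture target, for example via JULIA_CPU_TARGET:

```julia
# Sketch: the multi-target string Julia's own x86_64 binaries are built with,
# so generated code falls back gracefully on CPUs older than the build machine.
# This needs to be set in the environment of the build/precompile process,
# not at runtime.
ENV["JULIA_CPU_TARGET"] = "generic;sandybridge,-xsaveopt,clone_all;haswell,-rdrnd,base(1)"
```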
```diff
@@ -1,4 +1,8 @@
-@testitem "libribasim" begin
+@testmodule libribasim begin
+    include("../../build/libribasim.jl")
```
Maybe a norm/joinpath here?
Why? The `include` docs mention:

> The argument path is normalized using normpath which will resolve relative path tokens such as `..` and convert `/` to the appropriate path separator.
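For illustration, the more explicit variant the reviewer hints at would look roughly like this; it is equivalent here, since include already resolves the path relative to the including file:

```julia
# Sketch: build the path from the test file's directory instead of a
# relative string literal; include normalizes either form.
include(joinpath(@__DIR__, "..", "..", "build", "libribasim.jl"))
```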
```diff
-@pytest.fixture(scope="session")
-def basic_transient(basic) -> ribasim.Model:
-    return basic_transient_model(basic)
```
What did this temporal model do?
It wasn't used in the tests, so I removed it.
Supersedes #2440.
Let's see about the tests. From the original PR this point stands:
`julia_init` is removed there. We need to update our Rust wrapper for that as well. This may apply to the `ribasim_api` package as well as the Rust wrapper. See also https://github.com/JuliaLang/PackageCompiler.jl/pull/1047/files and https://docs.julialang.org/en/v1/manual/embedding/.