Multiple DSS contexts #81

Closed · PMeira opened this issue Jul 13, 2022 · 5 comments · Fixed by #88

PMeira commented Jul 13, 2022

DSS C-API 0.12 supports creating multiple DSS instances that can be used separately, in threads, etc.

Should we expose this in ODD.jl? If so, what would be the best approach?


kdheepak commented Jul 13, 2022

Where can I find examples for the C API with contexts?


tshort commented Jul 13, 2022

With multiple dispatch, it's easy to add an optional context that controls the DSS instance.

# Normal usage (backwards compatible)
Circuit.SetActiveBus(bus)
v = Bus.Voltages()
# Calling a specific instance
Circuit.SetActiveBus(dss1, bus)
v = Bus.Voltages(dss1)

Here's how a definition could look:

    """Set active bus name"""
    function SetActiveBus(BusName::String)::Int
        return @checked Lib.Circuit_SetActiveBus(ACTIVECONTEXT, Cstring(pointer(BusName)))
    end
    function SetActiveBus(dss::TDSSContext, BusName::String)::Int
        return @checked Lib.Circuit_SetActiveBus(dss, Cstring(pointer(BusName)))
    end


PMeira commented Jul 13, 2022

@kdheepak I added a minimal C sample using OpenMP in https://github.com/dss-extensions/dss_capi/blob/master/examples/ctx_openmp.c
We should have some proper docs explaining the implementation and how it differs from the official OpenDSS by the end of next week. I have some older slides but they're not in English yet.

Currently, every function that depends on the context was duplicated, so we have Circuit_SetActiveBus (without the context pointer) and ctx_Circuit_SetActiveBus (with the pointer). We could remove the old versions without the context in 0.13, since that release is planned to break and clean up the API; 0.12 is still very backwards compatible.
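
For illustration, here is a minimal sketch of how the two entry points differ when wrapped with ccall from Julia; these are not the actual Lib bindings, and the library handle name (libdss_capi) and exact C signatures are assumptions:

# Hypothetical low-level wrappers, for illustration only.
# `libdss_capi` and the exact C signatures are assumptions.
function Circuit_SetActiveBus(BusName::String)
    # Classic API: the active DSS instance is implicit (global state).
    ccall((:Circuit_SetActiveBus, libdss_capi), Cint, (Cstring,), BusName)
end

function ctx_Circuit_SetActiveBus(ctx::Ptr{Cvoid}, BusName::String)
    # ctx_* API: the DSS instance is selected by the context pointer.
    ccall((:ctx_Circuit_SetActiveBus, libdss_capi), Cint, (Ptr{Cvoid}, Cstring), ctx, BusName)
end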

> With multiple dispatch, it's easy to add an optional context that controls the DSS instance.

Thanks @tshort, I like the idea. It should be easy enough for users to adopt it too.


PMeira commented Jul 13, 2022

I should have mentioned that in DSS Python, to avoid rewriting the code, I wrapped the ctx_* functions with partial to pre-bind the context:

https://github.com/dss-extensions/dss_python/blob/6e08549f1495e63ebdd72830d5ec8ec1d814e740/dss/_cffi_api_util.py#L67

For C# and C++, it was easier to migrate to the ctx_* functions for other reasons.

For compatibility, an initial context is always created and is accessible via ctx_Get_Prime(). We can use that here and migrate to ctx_*, so that we would be free to remove the old API later (I'm handling ODD.py and DSS_MATLAB in the coming days).
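
A similar pre-binding could be done in Julia. Below is a minimal sketch, assuming the low-level Lib bindings expose ctx_Get_Prime and ctx_Circuit_SetActiveBus as described above, plus some context wrapper type with a ctx field (here called DSSContext, purely illustrative); the error-checking (@checked/@checked_ctx) machinery is omitted:

# Hypothetical sketch: pre-bind the prime (default) context so the existing
# high-level API keeps working while calling the ctx_* functions internally.
const PRIME_CTX = Lib.ctx_Get_Prime()   # initial context created by the engine

"""Set active bus name (uses the prime context)"""
function SetActiveBus(BusName::String)::Int
    return Lib.ctx_Circuit_SetActiveBus(PRIME_CTX, Cstring(pointer(BusName)))
end

"""Set active bus name on a specific DSS context"""
function SetActiveBus(dss::DSSContext, BusName::String)::Int
    return Lib.ctx_Circuit_SetActiveBus(dss.ctx, Cstring(pointer(BusName)))
end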


PMeira commented Dec 15, 2022

I was finally looking into this, so I tested the two approaches below. This is large enough that we could merge #88 as a checkpoint, and open another PR when DSS C-API 0.12.2 is ready.

The first version simply updates all functions that need a context to receive it, and then adds a method that pre-binds the default (null pointer) context:

"""Array of strings containing all Load names"""
function AllNames(dss::DSSContext)::Vector{String}
    return get_string_array(Lib.Loads_Get_AllNames, dss.ctx)
end
AllNames() = AllNames(DSS_DEFAULT_CTX)

DSS_DEFAULT_CTX just wraps C_NULL, which is now checked in the engine to select the instance. This keeps compatibility with the Parallel API from the official OpenDSS -- I don't think anyone uses that in Julia, but it's good to keep it. There is not much performance difference, if any.
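
For reference, a minimal sketch of the wrapper behind this; the names follow the snippet above, but the exact definition is an assumption:

# Hypothetical definitions matching the usage above.
struct DSSContext
    ctx::Ptr{Cvoid}   # opaque pointer to a DSS engine instance
end

# Default context: C_NULL tells the engine to use its prime/default instance.
const DSS_DEFAULT_CTX = DSSContext(C_NULL)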

Alternatively, we could use a macro to modify the function: it adds the context argument, updates the @checked macro, and replaces the default context with that argument:

# Requires `postwalk` from MacroTools (e.g. `using MacroTools: postwalk`).
macro ctxify(ex::Expr)
    ex_org = copy(ex)
    # Insert `dss::DSSContext` as the first argument of the function signature.
    fargs = ex.args[1].args[1].args
    ex.args[1].args[1].args = [fargs[1]; :(dss::DSSContext); fargs[2:end]]
    local found = false

    # Rewrite the body: C_NULL_CTX -> dss.ctx, @checked -> @checked_ctx.
    ex_new = postwalk(ex) do x
        if (x isa Symbol)
            if (String(x) == "C_NULL_CTX")
                found = true
                return :(dss.ctx)
            elseif (String(x) == "@checked")
                found = true
                return Symbol("@checked_ctx")
            end
        end
        return x
    end
    if !found
        # Nothing to rewrite: emit only the original definition.
        return quote
            @__doc__ $ex_org
        end |> esc
    end
    # Emit both the original definition and the context-aware variant.
    return quote
        @__doc__ $ex_org

        $ex_new
    end |> esc
end

#...

"""Array of strings containing all Load names"""
@ctxify function AllNames()::Vector{String}
    return get_string_array(Lib.Loads_Get_AllNames, C_NULL_CTX)
end
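
For reference, the @ctxify definition above would expand to roughly this pair of methods, i.e. the same shape as in the first approach:

"""Array of strings containing all Load names"""
function AllNames()::Vector{String}
    return get_string_array(Lib.Loads_Get_AllNames, C_NULL_CTX)
end

function AllNames(dss::DSSContext)::Vector{String}
    return get_string_array(Lib.Loads_Get_AllNames, dss.ctx)
end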

Since the macro uses postwalk from MacroTools, that would add a new dependency.
It feels a bit weird to me, so I think I'm going with the AllNames() = AllNames(DSS_DEFAULT_CTX) approach.
Either way we avoid code duplication, and it should be easy to switch between the approaches using a regular expression on the source code. So, if anyone has a clear preference on this, please let me know.

Next steps would be to add Julia-specific docs and some examples for both the official Parallel API and multi-threading. A quick one would be to adapt custom_8760_pmap.jl to use multi-threading. I'm adding general notes to https://github.com/dss-extensions/dss-extensions/blob/main/multithreading.md

Here, one thing we could recommend is leaving the default engine empty, so that if the user accidentally forgets to pass an instance explicitly, it results in an error sooner or later.

Sample code
using OpenDSSDirect
using OpenDSSDirect.Lib # for SolveModes

println("nthreads:", Threads.nthreads())

Basic.AllowChangeDir(false)

engines = [Basic.NewContext() for _=1:5]

losses = zeros(5)
mults = zeros(5)
@time Threads.@threads for idx in 1:length(engines)
    e = engines[idx]
    dss(e, "redirect ../electricdss-tst/Version8/Distrib/EPRITestCircuits/ckt5/Master_ckt5.dss")
    Solution.LoadMult(e, 1.0 + 0.5 * idx / length(engines))
    Solution.Mode(e, Lib.SolveModes_Daily)
    Solution.Number(e, 1)
    local this_losses = 0;
    for n in 1:1000
        Solution.Solve(e)
        this_losses += real(Circuit.Losses(e))
    end
    mults[idx] = Solution.LoadMult(e)
    losses[idx] = this_losses
end

println("Losses using multiple instances")
println(mults)
println(losses * 1e-6)


losses_main = zeros(5)
mults_main = zeros(5)
@time for idx in 1:length(engines)
    dss("redirect ../electricdss-tst/Version8/Distrib/EPRITestCircuits/ckt5/Master_ckt5.dss")
    Solution.LoadMult(1.0 + 0.5 * idx / length(engines))
    Solution.Mode(Lib.SolveModes_Daily)
    Solution.Number(1)
    local this_losses = 0;
    for n in 1:1000
        Solution.Solve()
        this_losses += real(Circuit.Losses())
    end
    mults_main[idx] = Solution.LoadMult()
    losses_main[idx] = this_losses
end

println("Losses using the main instance")
println(mults_main)
println(losses_main * 1e-6)

@assert losses_main == losses
Sample output with 5 threads
nthreads:5
  8.644027 seconds (870.07 k allocations: 45.927 MiB, 3.69% compilation time)
Losses using multiple instances
[1.1, 1.2, 1.3, 1.4, 1.5]
[314.67863318496796, 367.5434911094195, 429.1441954164126, 494.55577034953353, 561.4847526324147]
 26.255574 seconds (50.11 k allocations: 2.829 MiB)
Losses using the main instance
[1.1, 1.2, 1.3, 1.4, 1.5]
[314.67863318496796, 367.5434911094195, 429.1441954164126, 494.55577034953353, 561.4847526324147]
