API to access desired periods of timesets? #307
Until now, timesets have been considered period agnostic, so they can be defined with reference to a variable period.

Maybe we should add the concept of a default period that could be declared during the timeset definition, which would then be used if you call `set_timeset` without an explicit period.

Would that work?
I'm okay with keeping timesets period agnostic. I was thinking more along the lines of a complement to timesets, but I'm not certain how that would look. Maybe something like:

```ruby
# dut.rb
def initialize
  timeset :t1, ...
  timeset :t2, ...
  set_timing(t1: 40, t2: 60)

  dut.timing
  #=> {t1: 40, t2: 60}

  dut.set_timeset(:t1)
  # ...
end
```

Though a default period on the timeset would also work and is probably more straightforward than what I have above. However, I think we should start to discourage setting timeset periods at the point of timeset selection.
Yeah, I agree with your comments about moving away from defining the period at the point of timeset selection. Why don't we just give the timeset class a `period_in_ns` attribute?

```ruby
# To set the default period when defining the timeset
timeset :func do |t|
  t.period_in_ns = 100
end

# To change it (or set it in the first place) later on
dut.timeset(:func).period_in_ns = 80

# We should still keep around the ability to provide a final override at selection time
# for legacy compatibility
dut.set_timeset(:func, 40)

# Though this would be the intended approach and fine if all documentation drops references
# to the above form in preference for this one:
dut.set_timeset(:func)
```
Okay, that's fine too!
I realized why the legacy definition is the way it is: if you're not going anywhere near a simulation and all you want to do is generate an ATE pattern, then you don't need a full timeset definition; all you need is the name of the timeset and the period (so that Origen can convert time-based delays to cycles). So, I don't think we should add a deprecation warning to the legacy API and force users to define a timeset before they reference it — it would be an unnecessary overhead in many cases.
Doesn't that still have the problem where trying to change the period of the same timeset mid-pattern will cause inconsistencies? If all you want is ATE pattern generation, the timeset definition would still look pretty simple, and we could make it support options too (if it doesn't already), e.g.:

```ruby
timeset(:ex, period_in_ns: 40)
```

I still think it's a good idea to register the timesets beforehand and not set the period during the course of the pattern, even for just ATE generation.

Edit: to add on a bit, I think the overhead can be made small enough to warrant the change, and it provides better application structure and consistency. Plus, you never know: someone may start an app as just ATE generation but later move to timing sheet generation or simulation, and then have to go update the timesets.
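To make the register-beforehand idea concrete, here is a minimal sketch in plain Ruby. This is not the actual Origen API: `TimesetRegistry`, `Timeset`, and their methods are made-up names for illustration only, showing timesets declared up front with an optional per-timeset period.

```ruby
# Hypothetical sketch (NOT the Origen API): timesets are declared once,
# optionally with a period, so ATE-only flows still get their timing
# from one place instead of hard-coding it at selection time.
class TimesetRegistry
  Timeset = Struct.new(:name, :period_in_ns)

  def initialize
    @timesets = {}
  end

  # Register a timeset; period_in_ns is optional so period-agnostic
  # definitions remain possible.
  def timeset(name, period_in_ns: nil)
    @timesets[name] = Timeset.new(name, period_in_ns)
  end

  # Look up a previously registered timeset by name.
  def [](name)
    @timesets.fetch(name)
  end
end

reg = TimesetRegistry.new
reg.timeset(:ex, period_in_ns: 40)
reg[:ex].period_in_ns  #=> 40
```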
Has this problem ever actually occurred in practice, though?
I doubt that this has ever occurred in practice, but I don't think we should wait until it does to protect against something we know won't behave as expected. Raising an error or warning if the period is set again after it has already been defined may be the best approach. That gives us the best of both worlds, and OrigenSim can watch for the timeset not being defined and raise its own error.
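A hedged sketch of what that guard could look like, in plain Ruby. `GuardedDut`, `period_of`, and the exact error behavior are assumptions for illustration, not existing Origen behavior: the legacy name-plus-period form still works for unregistered timesets, but a conflicting override of an already declared period raises.

```ruby
# Hypothetical guard (assumption, not existing Origen behavior): allow the
# legacy period override at selection time, but raise if it would silently
# change a period that was already declared on the timeset.
class GuardedDut
  def initialize
    @periods = {}  # declared periods by timeset name
    @current = nil
  end

  # Declare a timeset up front, optionally with its period.
  def timeset(name, period_in_ns: nil)
    @periods[name] = period_in_ns
  end

  # Select a timeset; the optional override supports the legacy API.
  def set_timeset(name, override = nil)
    declared = @periods[name]
    if declared && override && declared != override
      raise ArgumentError,
            "timeset #{name} already declared with #{declared}ns, " \
            "cannot override to #{override}ns"
    end
    # Legacy path: an unregistered timeset picks up the override period.
    @periods[name] = override if declared.nil? && override
    @current = name
  end

  def period_of(name)
    @periods[name]
  end
end
```

With this shape, `dut.timeset(:func, period_in_ns: 100)` followed by `dut.set_timeset(:func, 40)` would raise, while `dut.set_timeset(:legacy, 40)` on an undeclared timeset still works as today.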
Hello,

I've been continuing to work on running direct text source through OrigenSim (from either `.atp` or `.avc` source) and have run into something that I think could use an 'official solution', if it doesn't already have one that I'm just missing.

I can set up all the timesets and query them from Origen fine, which allows the simulator to drive, compare, etc. as expected. Setting these is usually one of the first things that happens in `dut.startup`, but when running a direct source through the simulator I'm bypassing `startup` (and any other non-setup code), leaving out any invocations of `set_timeset` and leaving it up to me to set the timeset based on what the pattern has.

That's fine, and is expected behavior, but most of the time (and the examples on the webpage show this as well) the timeset is set with a hard-coded `period_in_ns` value. This means that there's no way for me, as the simulator (or testers, if we get to timing sheet generation), to know what the desired `period_in_ns` is.

An easy enough workaround that I've done internally is to just require an `attr_reader` on the `dut` called `#{timeset_name}_period_in_ns`, which is looked up when encountered. For example, to switch to the `intram` timeset, I'll require an `attr_reader` called `intram_period_in_ns` to simulate correctly.

This works fine, but isn't an officially supported thing. Is there currently some sort of API we should be using to declare timeset periods, or is this something we should add? Since a single timeset on the tester can't change without revalidation anyway (the period of `intram` won't randomly change from 40 ns to 60 ns in the course of a single pattern, I mean), I don't think this gets in the way of anything.

I can check in my workspace and point to some example usage if needed.

The tl;dr of this: how can I, or how should I, access the desired `period_in_ns` of a timeset without having first set that timeset?

Thanks!
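For illustration, here is a minimal self-contained sketch of that workaround. `MyDut` and `lookup_period` are hypothetical names invented for this example; only the `#{timeset_name}_period_in_ns` / `intram_period_in_ns` convention comes from the description above. The idea is that the simulator derives the reader name from the timeset name and looks it up on the DUT:

```ruby
# Sketch of the attr_reader workaround (hypothetical names): the DUT
# exposes one reader per timeset following the
# "#{timeset_name}_period_in_ns" naming convention.
class MyDut
  attr_reader :intram_period_in_ns

  def initialize
    @intram_period_in_ns = 40
  end
end

# What the simulator side might do when it hits a timeset switch in the
# source pattern: build the reader name and query the DUT for it.
def lookup_period(dut, timeset_name)
  reader = "#{timeset_name}_period_in_ns"
  unless dut.respond_to?(reader)
    raise "No period declared for timeset #{timeset_name}"
  end
  dut.public_send(reader)
end

dut = MyDut.new
lookup_period(dut, :intram)  #=> 40
```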