Added balances for Yield Yak vaults #5737
Conversation
I'm expecting some issues with this PR but would like to discuss the approach. I opened a thread in Discord here but haven't heard anything back yet. I've not used the
i believe i responded yesterday. unfortunately, i don't think your approach here will work: we don't allow direct access to the db, so
Thanks @jeff-dude. I could be wrong on this, but the issue I see with what you're describing is that I need the result of the query as a jinja variable (specifically, a list) that I can then loop over to create the "main" query. If I were to have a separate spell and select from that using
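For illustration, the pattern being described is roughly the following — a sketch only, where the vault list, addresses, and column names are placeholders:

```sql
-- sketch: loop over a jinja list to build the "main" query
-- (addresses are dummy placeholders, not real Yield Yak vaults)
{% set vaults = [
    '0x0000000000000000000000000000000000000001',
    '0x0000000000000000000000000000000000000002'
] %}

{% for vault in vaults %}
select
    {{ vault }} as vault_address
{% if not loop.last %}
union all
{% endif %}
{% endfor %}
```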
@jeff-dude, do you know if the "no direct access to db" would also mean that the
Given the "QUERY HAS TOO MANY STAGES" error that's coming now, I'm going to try to break up some of the balances model/macro first into separate Deposits, Withdraws and Reinvests models/macros which can then hopefully be combined to make the balances model/macro without this error. |
…h combine to give balances, and included APY calculation for reinvests model
fantastic code here, thank you for applying all the best practices (and even teaching me some more)!
left a few thoughts below prior to finalizing and merging
{{ source(blockchain, 'transactions') }} t
ON t.hash = c.tx_hash
even if we can't apply an incremental filter, we should be able to enhance the join. can you add `block_number` to the join conditions? `transactions` is also partitioned on `block_date` -- we could likely add the condition `block_date = cast(date_trunc('day', block_time) as date)` (assuming `block_date` doesn't exist in the decoded tables; if it does, then simply join on `block_date`).
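A sketch of what that enhanced join might look like — the `vault_calls` relation and its columns are assumed for illustration, not taken from the actual diff:

```sql
-- sketch: join keys beyond tx_hash let dunesql prune partitions on transactions
select
    c.tx_hash,
    t.block_number,
    t.block_time
from vault_calls c
inner join {{ source(blockchain, 'transactions') }} t
    on t.hash = c.tx_hash
    and t.block_number = c.block_number                               -- added join key
    and t.block_date = cast(date_trunc('day', c.block_time) as date)  -- partition pruning
```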
{{ source(blockchain, 'transactions') }} t
ON t.hash = c.tx_hash
also, can it be an inner join? that's more efficient on the dunesql engine. if an inner join isn't possible, it's usually best to first read from the larger of the two tables, then left/right join to the smaller tables as needed.
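And a sketch of the fallback ordering if the inner join turns out not to be possible — `vault_calls` is again a hypothetical stand-in for the smaller relation:

```sql
-- fallback when an inner join isn't possible: read the larger table first
select
    t.hash,
    c.vault_address
from {{ source(blockchain, 'transactions') }} t   -- larger table read first
left join vault_calls c                           -- hypothetical smaller relation
    on c.tx_hash = t.hash
    and c.block_number = t.block_number
```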
also, there is some complex incremental logic here. are you aware of the output tables built via the CI pipelines attached to this PR? you can query them on dune to test the data and ensure all looks good prior to merge.
…s merge columns/unique keys
…n for deposits/withdraws/reinvests
@jeff-dude, no I'm not sure on this part. I know about the tables generated by `dbt compile` and have been testing those to make sure the output looks correct, but as far as I could tell, that's for the "initial model". Where do I find the attached CI pipelines/tables to check the incremental logic is working correctly?
yep, check out these docs, hopefully they help (if not, let me know, we can modify those docs as needed):
i'll let this sit for a little bit, in case you want to query those CI output tables for data quality. let me know when ready to merge 🤝
I think the 24 hr window passed. To trigger the actions again I was just going to "Update branch", but it's been showing "Checking for ability to merge automatically..." for a while now. Is there any other way to rerun the gh action?
i was able to merge in main. i think there is a universal gh issue right now, so it's been slow today. other ways to do it:
Thanks for the tips! I've been able to review the CI tables and I'm happy with how it all looks, so am happy for you to merge when ready 🤝
Thank you for contributing to Spellbook!
Thank you for taking the time to submit code in Spellbook. A few things to consider:
Best practices
To speed up your development process in PRs, keep these tips in mind:
- Utilize the dev setup described in the readme and run `dbt compile`; the compiled SQL in the `target/` directory can be copied and run on Dune directly for initial query testing
- Utilize a `WHERE` filter for only ~7 days of history on large source tables, i.e. `ethereum.transactions` (a sketch of such a filter follows this list)
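A minimal sketch of that kind of development-only filter, assuming a raw source table:

```sql
-- development-only filter: keep ~7 days of history while iterating
-- (remove before finalizing the model)
select *
from {{ source('ethereum', 'transactions') }}
where block_time > now() - interval '7' day
```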
Incremental model setup
- Use `COALESCE()` as needed on key column(s), otherwise the tests may fail on duplicates (a sketch follows)
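A sketch of what that can look like in a model's key columns — the model and column names here are hypothetical:

```sql
-- hypothetical incremental model keyed on (tx_hash, evt_index)
select
    tx_hash,
    coalesce(evt_index, 0) as evt_index,  -- a null key column would surface as duplicates in tests
    amount
from {{ ref('yield_yak_avalanche_deposits') }}
```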
🪄 Use the built CI tables for testing 🪄
Once the CI steps `run initial models` and `test initial models` complete, you can query the built CI tables and any test errors on Dune. There will be a schema that looks like this: `test_schema.git_dunesql_4da8bae_sudoswap_v2_base_pool_creations`
Leverage these tables to perform QA testing on Dune query editor -- or even full test dashboards!
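For example, a quick QA query against the example CI table named above (the schema hash varies per PR run):

```sql
-- spot-check a CI-built table in the Dune query editor
select *
from test_schema.git_dunesql_4da8bae_sudoswap_v2_base_pool_creations
limit 100
```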
Spellbook contribution docs
The docs directory has been implemented to answer as many questions as possible. Please take the time to reference each `.md` file within this directory to understand how to efficiently contribute & why the repo is designed as it is 🪄
Example questions to be answered:
Please navigate through the docs directory to find as much info as you can.
Note: happy to take PRs to improve the docs, let us know 🤝