Implementation status dashboard #6075
base: master
Conversation
License: MIT Signed-off-by: Christian Couder <[email protected]>
@Kubuxu and @daviddias are you ok to review this? Or do you think someone else should review? Note that I just added a commit to improve the dashboard output (https://gist.github.com/chriscool/1c07b5944a6fc058cf677a04fec62eec) by using the following legend items in the output: 🍏 Passed 🍋 Didn't run 🍅 Failed 🌰 No test implemented
So, the output looks nice, but there's a lot of cryptic, uncommented, text-munging shell and Perl script in this PR.
- Could we consider something like JUnit instead of prove? We already patch sharness to support JUnit.
- Perl is, for better or worse, notoriously hard to read and a dying language (for that reason). This is going to be hard to maintain. If we're going to keep the Perl, we'll need to document it thoroughly.
- Instead of parsing the text to figure out which commands we've run, we should probably just inject a fake IPFS binary that logs commands to a file before calling the actual ipfs command. That would be a lot simpler, more reliable, and easier to maintain.
- What's our end goal here? Do we just want a list of supported commands? Sharness isn't really going to be able to give us any reliable information on "in progress", and we should be able to get everything else by asking js-ipfs/go-ipfs. I guess this last question is for @daviddias.
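The fake-binary approach suggested above could be sketched roughly like this. Everything here is illustrative, not from the PR: the paths, the REAL_IPFS and IPFS_CMD_LOG variables, and the log location are all hypothetical.

```shell
# Install a fake `ipfs` early in PATH that records each subcommand
# before delegating to the real binary. All names are hypothetical.
mkdir -p /tmp/fakebin
cat > /tmp/fakebin/ipfs <<'EOF'
#!/bin/sh
# Append the first argument (the ipfs subcommand) to a log file,
# then hand everything over to the real binary.
printf '%s\n' "$1" >> "${IPFS_CMD_LOG:-/tmp/ipfs-commands.log}"
exec "${REAL_IPFS:-/usr/local/bin/ipfs}" "$@"
EOF
chmod +x /tmp/fakebin/ipfs
# Tests would then run with: PATH=/tmp/fakebin:$PATH
```

After a test run, the log file would contain one line per invoked subcommand, which is the command inventory the parsing scripts currently try to reconstruct from the test text.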
First some background (that I should have added in the description of this PR): The scripts in this PR are part of the Sharness tests. In fact they could perhaps even be moved to Sharness itself as they are not specific to IPFS. And I think it's better to keep them this way as much as possible. In fact it is already planned to have all the Sharness tests in a separate repo (https://github.com/ipfs/ipfs-sharness-tests/) and to have this repo be a submodule in the go-ipfs and js-ipfs repos.
Yeah, I will try to comment the Perl code more. I tried to reduce the amount of Perl code, but for some things like hash maps and transforms using regexps it's much simpler than doing it in shell (especially if we want it to run with many different shells not just bash). I could also have used sed instead of Perl for regexps but this would not likely have made things more readable.
I don't know the advantages of JUnit over prove, where this patching is done, or for what purpose. Anyway, prove is part of the Perl ecosystem, and Perl is part of the Git/Sharness ecosystem, so using prove and Perl is more likely to work out of the box everywhere Git and Sharness are installed.
Ok, I'll improve the documentation and comments in some subsequent commits. Perl is still used a lot for admin scripts, especially when something portable is required. Also, the scripts in this PR are likely to be maintained by the infrastructure team, which is more likely to be familiar with Perl than JUnit.
If the IPFS Sharness tests are used not just to test go-ipfs but also to test other implementations that are just starting to be developed, then calling the actual ipfs command under test is very likely to fail very early in many tests, which means the following instructions in those tests won't be run (as most tests consist of chained ipfs commands). If we decide to inject a fake ipfs binary that logs commands and then always calls the go-ipfs binary instead of the actual ipfs command under test, this could work better, but it would depend on always having and carrying around a good, as up-to-date as possible, go-ipfs binary for that purpose, which is a significant maintenance burden.
I am not sure what "in progress" and "asking js-ipfs/go-ipfs" mean here. Anyway, I thought that we wanted something like https://github.com/ipfs/ipfs/blob/master/IMPLEMENTATION_STATUS.md and I think that this PR gives something like that automatically, with a bit more detail.
We patch it here: We can use prove, but having to include a custom prove file parser is icky. The advantage of JUnit is:
Could we try upstreaming them? That would make me feel much better about this.
I see. So you want to generate a list of ipfs commands we could test to compare it with a list of commands that actually work. We could also use
Unless I'm mistaken, the point here is to generate a list of things that:
We can get 1 versus 3 by comparing lists of supported commands. I'm guessing these scripts currently try to get (2) by running the sharness tests and looking for commands that claim to be supported but aren't.
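Comparing lists of supported commands, as described above, could be a plain sorted-list diff. A sketch with made-up command lists (the file names and commands are illustrative):

```shell
# Compare the full command list against the commands whose tests pass.
# Both inputs must be sorted for comm(1); the names are illustrative.
printf 'add\ncat\nget\nls\n' | sort > /tmp/all_cmds
printf 'add\ncat\n'          | sort > /tmp/passing_cmds
comm -23 /tmp/all_cmds /tmp/passing_cmds   # commands with no passing test
# prints: get
#         ls
```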
About JUnit, I would be ok with adding a JUnit output, but I don't think the way it is patched right now is very nice. It should probably be up-streamed into Sharness too. Also the test output from Sharness is in the TAP (Test Anything Protocol https://testanything.org/) format, so a TAP to JUnit converter could be a completely separate project that we could just use instead of patching Sharness. About prove, it has some interesting options like
Yeah, I think it would be a good idea to upstream the scripts in this PR.
With the result from the scripts in this PR we can easily see the following:
And I think this is enough to get a good idea about the overall progress of an ipfs implementation and the Sharness test infrastructure. Deciding between "not" or "partially" or "fully" implemented cannot really be the goal right now because for example:
If all the tests were implemented and could easily be run for all the commands we want, and if they were all reported by
There are reasons I created this (admittedly hacky) patch vs just using a TAP converter:
(The last 2 points may be bugs / wrong implementation in sharness.)
About timing information, the Git test framework (from which Sharness was initially created) has an extension with a lot of functionality around performance tests. It can even send performance test results to a codespeed server (https://github.com/tobami/codespeed). If we really need performance tests, it could be worth it to port that extension to Sharness. I agree that it should be possible to fix the last 2 points, and there is now
See https://github.com/git/git/tree/master/t/perf for the performance test extension in Git. |
Ok, but that's definitely not what ipfs-inactive/ipfs-sharness-tests#6 is asking for which is why I was so confused. But yeah, I see how this is an implementation status. However, I believe we'll need to be able to set expected failures based on the go-ipfs implementation we're testing instead of the current
We could combine a list of commands supported by all implementations and then just assume that if the flag exists, it's supported. It's not perfect but, as you say, tests aren't going to give us any better information. As I see it, the TODOs are:
Ok, I will work on the TODOs you listed. Thanks!
The goal of this PR is to make it possible to have an implementation status dashboard for any IPFS implementation, as described in: ipfs-inactive/ipfs-sharness-tests#6
More concretely the goal is to get something like for example:
An example of a resulting dashboard produced by code in this PR is:
https://gist.github.com/chriscool/1c07b5944a6fc058cf677a04fec62eec
This is done by using prove, as prove can write .prove files that tell which tests passed or failed.
The main way to concretely perform the above steps is to use the Makefile provided by this PR.
cc @scout @daviddias