If test fails in hooks, send test 'name' as original test to reporters instead of including 'before each hook for....' text in test 'name' #6654
Comments
Any updates on this? I'm facing the exact same issue. Failures in hooks are changing the name and className, which breaks reporting in CI.

Also running into this issue with internal test-monitoring tooling that parses Cypress test output.

I don't see us changing the name of the test in this case - this is what we use to display in the Command Log and stdout, in order to be more descriptive about where the error is thrown and make debugging quicker. Perhaps there is some way to change the name that is sent directly to reporters. I'm not sure.

Any updates on this? Also running into this issue. The only workaround at the moment is to avoid using before or beforeEach hooks, but that's not a solution. :(

I'm running into this as well. My dashboard is full of "tests" with a zero percent pass rate.

Other test frameworks (like pytest) fail with the test name when a setup/teardown fails. That makes it easy to post to an external management tool that automatically maps your jUnit file to tests, so I agree that only the test names should be used in these cases. I am coding a workaround to edit the jUnit file before it is posted, but it would be nice to have this resolved directly in Cypress, or at least as an option.
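The "edit the jUnit file before posting" workaround mentioned above can be sketched roughly like this. This is a minimal, hedged example, not the commenter's actual script: the `" hook for "` pattern is an assumption based on the mocha-junit-reporter output shown later in this thread, and `originalTestName` is a hypothetical helper name.

```javascript
// Hedged workaround sketch: before uploading a JUnit report, map testcase
// names that mocha generated for hook failures (e.g.
//   'example "before each" hook for "should pass"')
// back to the original test name. Assumes mocha's hook-title pattern.
function originalTestName(reportedName) {
  // Capture the quoted test title at the end of a '… hook for "<title>"' name.
  const match = reportedName.match(/" hook for "([^"]+)"$/);
  return match ? match[1] : reportedName;
}
```

If you post-process the raw XML text rather than decoded attribute values, note that the inner quotes will appear as `&quot;`, so the pattern would need to match those entities instead of literal quotes.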
I am also facing the same issue, and because of it I am not able to upload the generated .xml files into TestRail.

We are also facing this issue. Any solutions or workarounds?
I just ran a simple test to verify how mocha handles reporting this (i.e., without Cypress), and the reporting is the same.

Command: `mocha --reporter mocha-junit-reporter 'scripts/unit/example-spec.js'`

Spec with a failing `beforeEach` (the undefined `expect` is the intentional source of the error):

```js
describe('example', () => {
  beforeEach(() => {
    expect(true).to.be.false
  })
  it('should pass', () => {
    expect(true).to.be.true
  })
})
```

Generated report:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="Mocha Tests" time="0.0000" tests="7" failures="1">
  <testsuite name="Root Suite" timestamp="2023-04-07T17:58:58" tests="0" time="0.0000" failures="0">
  </testsuite>
  <testsuite name="example" timestamp="2023-04-07T17:58:58" tests="7" file="<>/scripts/unit/example-spec.js" time="0.0000" failures="1">
    <testcase name="example &quot;before each&quot; hook for &quot;should pass&quot;" time="0.0000" classname="&quot;before each&quot; hook for &quot;should pass&quot;">
      <failure message="expect is not defined" type="ReferenceError"><![CDATA[ReferenceError: expect is not defined
    at Context.<anonymous> (scripts/unit/example-spec.js:6:5)
    at processImmediate (node:internal/timers:466:21)]]></failure>
    </testcase>
  </testsuite>
</testsuites>
```

This is not a Cypress-specific issue, since this is how mocha itself reports this information.
Switching to cypress-circleci-reporter (from
Current behavior:

When you're using the default JUnit reporter, as shown in https://on.cypress.io/reporters#Command-line-1, with a test similar to the one described below, the `xml` output lists the test under its own name. So far so good.

The problem starts when an error happens in `beforeEach` (e.g., `#element` is no longer on the screen). Then the `name` within `testcase` changes to `test "before each" hook for "can do stuff"` because of the error in `beforeEach`, even though the test is exactly the same as before. This makes it impossible for tools that ingest these reports to unequivocally identify tests.
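The inline report snippets from the original issue did not survive the copy. As a hedged reconstruction (the test name `can do stuff` and suite name `test` come from the quoted string above; the timing and the `failure` element are placeholders), the same test's `testcase` element before and after the `beforeEach` failure would look roughly like:

```xml
<!-- Passing run: the testcase carries the plain test name -->
<testcase name="test can do stuff" classname="can do stuff" time="0.1000"/>

<!-- beforeEach failed: the same test is now reported under a hook title -->
<testcase name="test &quot;before each&quot; hook for &quot;can do stuff&quot;"
          classname="&quot;before each&quot; hook for &quot;can do stuff&quot;"
          time="0.1000">
  <failure message="(placeholder)" type="Error"/>
</testcase>
```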
Desired behavior:

The `name` in the `testcase` should be constant whether the test fails or not.

Test code to reproduce

With a failing `beforeEach`, run with:

Versions

Running on macOS 10.15.3