exit code for spec failures is the same as exit code for any errors #154

Closed
SpComb opened this issue Apr 25, 2019 · 2 comments
SpComb commented Apr 25, 2019

When running Jasmine tests in a CI environment, it would be useful to distinguish between test failures and any other errors in test execution. Currently, the Jasmine exit status seems to always be 1, for both spec failures and unhandled Node.js exceptions.

Ideally (in Jenkins terms), things like syntax errors in the test specs should result in a red FAILURE status, whereas spec failures should result in a yellow UNSTABLE status with associated JUnit test results.

This would be easy to implement if the exit code for spec failures were e.g. 2, instead of the default exit code 1 that Node uses for unhandled errors (the uncaughtException process event).
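The CI side of that proposal could be sketched as a small shell helper. This is hypothetical: Jasmine does not currently emit exit code 2, and the status names are Jenkins terminology only.

```shell
# Hypothetical CI step, assuming the proposed convention:
# exit 0 = all specs passed, exit 2 = spec failures, anything else = tool error.
classify_status() {
  case "$1" in
    0) echo "SUCCESS" ;;
    2) echo "UNSTABLE" ;;   # spec failures: publish JUnit results, keep going
    *) echo "FAILURE" ;;    # runner/tooling error: abort the build
  esac
}

# Usage (after running the suite):
#   node_modules/.bin/jasmine --config=jasmine.json
#   classify_status $?
```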

Exit code with spec failures

$ node_modules/.bin/jasmine --config=jasmine.json || echo "EXIT CODE $?"
Started
F..........

Failures:
1) ...
  Message:
    Expected ...
  Stack:
    Error: ...

11 specs, 1 failure
Finished in 3.14 seconds

EXIT CODE 1

This is presumably from the jasmine runner, and could be customized to use a different exit code?

jasmine-npm/lib/jasmine.js, lines 217 to 222 at dd00f4b:
if (passed) {
  jasmineRunner.exit(0);
}
else {
  jasmineRunner.exit(1);
}
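The change suggested in this issue could look roughly like the following. This is a sketch of the proposal, not the actual jasmine-npm code; the function name is invented for illustration.

```javascript
// Hypothetical sketch: map the suite result to a distinct exit code for
// spec failures, leaving 1 free for Node's default uncaughtException exit.
function exitCodeFor(passed) {
  // 0 = all specs passed, 2 = one or more spec failures (proposed).
  return passed ? 0 : 2;
}
```

The runner would then call something like `jasmineRunner.exit(exitCodeFor(passed))` instead of hard-coding 0 and 1.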

Exit code with e.g. syntax error

$ echo this is a syntax error >> jasmine.json 
$ node_modules/.bin/jasmine --config=jasmine.json || echo "EXIT CODE $?"
/build/node_modules/jasmine/lib/jasmine.js:109
    if(configFilePath || e.code != 'MODULE_NOT_FOUND') { throw e; }
                                                         ^

SyntaxError: /build/e2e-tests/jasmine.json: Unexpected token t in JSON at position 216
    at JSON.parse (<anonymous>)
    at Object.Module._extensions..json (internal/modules/cjs/loader.js:708:27)
    at Module.load (internal/modules/cjs/loader.js:599:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:538:12)
    at Function.Module._load (internal/modules/cjs/loader.js:530:3)
    at Module.require (internal/modules/cjs/loader.js:637:17)
    at require (internal/modules/cjs/helpers.js:22:18)
    at Jasmine.loadConfigFile (/build/node_modules/jasmine/lib/jasmine.js:106:18)
    at runJasmine (/build/node_modules/jasmine/lib/command.js:111:11)
    at Command.run (/build/node_modules/jasmine/lib/command.js:55:9)
EXIT CODE 1

This is presumably Node's default uncaughtException -> exit 1 behaviour.

@slackersoft (Member) commented:
This seems plausible on the surface, but I have a few questions and concerns I'd like to talk about more before we go forward with this kind of change:

  • Windows builds (VSOnline, TFS), in my experience, don't default to checking exit status, so users may have handwritten checks that look specifically for 1, making this potentially a breaking change.
  • This will only catch errors thrown before Jasmine starts loading and evaluating your specs; any syntax errors in your spec files should end up being caught by Jasmine and reported as a failure in your suite.
  • A more meta-level concern: Are failures in your test expectations actually lower priority than a failure in the tool (or why else do you want the status to be different)?

I'm open to doing something about this, but wanted to make sure everyone involved understands the issues and limitations with a potential implementation before putting more effort in.

Hope this helps. Thanks for using Jasmine!


SpComb commented Oct 21, 2019

> Windows builds (VSOnline, TFS), in my experience, don't default to checking exit status, so users may have handwritten checks that look specifically for 1, making this potentially a breaking change.

For Karma + Jenkins, for example, this seems to have been handled by failOnFailingTestSuite: false, which gives a zero exit code on reported spec failures: https://www.bountysource.com/issues/8313366-configurable-whether-process-should-exit-with-non-zero-exit-code-when-tests-failed -> https://github.com/karma-runner/karma/blob/259be0d2d0d20366b1974ae8921de3163bf2c2c8/lib/browser_collection.js#L55-L57

Making this a documented configuration option that exits with code 0 in the !passed case would work.
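That option could be sketched as follows. The flag name is borrowed from Karma and is purely hypothetical here; Jasmine has no such configuration setting today.

```javascript
// Sketch of the suggested opt-in behaviour (hypothetical config flag,
// name borrowed from Karma's failOnFailingTestSuite):
function exitCode(passed, config) {
  if (passed) return 0;
  // With the flag explicitly disabled, spec failures are still reported
  // but do not fail the CI step.
  return config.failOnFailingTestSuite === false ? 0 : 1;
}
```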

> A more meta-level concern: Are failures in your test expectations actually lower priority than a failure in the tool (or why else do you want the status to be different)?

Spec expectation failures are a normal part of the development workflow; they're expected, and the CI tooling needs to be able to identify and cleanly report them to the developer who submitted the build, so that they can easily be identified and fixed.

For complex CI builds with multiple spec suites (backend/frontend, jasmine/others), it's also desirable to continue the build in the case of normal spec failures, so that all of the spec failures can be reported to the developer, rather than stopping the entire build and only reporting the first spec suite failure.

The way this is currently handled is to ignore the Jasmine exit code and rely on picking up spec failures from e.g. JUnit reports. If the test runner is broken enough that it doesn't actually write out any JUnit report, then the build may pass without reporting the error.
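The failure mode of that workaround can be demonstrated in isolation (here `false` stands in for a crashed Jasmine run; no real runner is invoked):

```shell
# `|| true` swallows any non-zero exit code, so a crashed runner looks the
# same as a clean run unless a separate report-file check follows.
suite_status() {
  false || true        # runner exit code deliberately ignored
  echo $?              # always 0, so the CI step "succeeds"
}
```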

Test tooling failures caused by e.g. the build machine running out of disk space are not normal or expected, and they should (ideally) immediately abort the entire build. They're also not the developer's fault, and they're not necessarily the ones responsible for fixing those issues, so having a separate build status makes more sense.

This issue was closed.