diff --git a/CHANGELOG.md b/CHANGELOG.md index e2586f9e6da1..4c58714c8f05 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -2,16 +2,22 @@ ### Features -- `[jest-resolve]` Add support for `packageFilter` on custom resolver ([#10393](https://github.com/facebook/jest/pull/10393)) - ### Fixes -- `[pretty-format]` Handle `tagName` not being a string ([#10397](https://github.com/facebook/jest/pull/10397)) - ### Chore & Maintenance ### Performance +## 26.4.0 + +### Features + +- `[jest-resolve]` Add support for `packageFilter` on custom resolver ([#10393](https://github.com/facebook/jest/pull/10393)) + +### Fixes + +- `[pretty-format]` Handle `tagName` not being a string ([#10397](https://github.com/facebook/jest/pull/10397)) + ## 26.3.0 ### Features diff --git a/website/versioned_docs/version-26.4/Configuration.md b/website/versioned_docs/version-26.4/Configuration.md new file mode 100644 index 000000000000..0a516470117a --- /dev/null +++ b/website/versioned_docs/version-26.4/Configuration.md @@ -0,0 +1,1268 @@ +--- +id: version-26.4-configuration +title: Configuring Jest +original_id: configuration +--- + +Jest's configuration can be defined in the `package.json` file of your project, or through a `jest.config.js` file or through the `--config ` option. If you'd like to use your `package.json` to store Jest's config, the `"jest"` key should be used on the top level so Jest will know how to find your settings: + +```json +{ + "name": "my-project", + "jest": { + "verbose": true + } +} +``` + +Or through JavaScript: + +```js +// jest.config.js +//Sync object +module.exports = { + verbose: true, +}; + +//Or async function +module.exports = async () => { + return { + verbose: true, + }; +}; +``` + +Please keep in mind that the resulting configuration must be JSON-serializable. + +When using the `--config` option, the JSON file must not contain a "jest" key: + +```json +{ + "bail": 1, + "verbose": true +} +``` + +## Options + +These options let you control Jest's behavior in your `package.json` file. The Jest philosophy is to work great by default, but sometimes you just need more configuration power. + +### Defaults + +You can retrieve Jest's default options to expand them if needed: + +```js +// jest.config.js +const {defaults} = require('jest-config'); +module.exports = { + // ... + moduleFileExtensions: [...defaults.moduleFileExtensions, 'ts', 'tsx'], + // ... +}; +``` + + + +--- + +## Reference + +### `automock` [boolean] + +Default: `false` + +This option tells Jest that all imported modules in your tests should be mocked automatically. All modules used in your tests will have a replacement implementation, keeping the API surface. + +Example: + +```js +// utils.js +export default { + authorize: () => { + return 'token'; + }, + isAuthorized: secret => secret === 'wizard', +}; +``` + +```js +//__tests__/automocking.test.js +import utils from '../utils'; + +test('if utils mocked automatically', () => { + // Public methods of `utils` are now mock functions + expect(utils.authorize.mock).toBeTruthy(); + expect(utils.isAuthorized.mock).toBeTruthy(); + + // You can provide them with your own implementation + // or pass the expected return value + utils.authorize.mockReturnValue('mocked_token'); + utils.isAuthorized.mockReturnValue(true); + + expect(utils.authorize()).toBe('mocked_token'); + expect(utils.isAuthorized('not_wizard')).toBeTruthy(); +}); +``` + +_Note: Node modules are automatically mocked when you have a manual mock in place (e.g.: `__mocks__/lodash.js`). 
More info [here](manual-mocks.html#mocking-node-modules)._ + +_Note: Core modules, like `fs`, are not mocked by default. They can be mocked explicitly, like `jest.mock('fs')`._ + +### `bail` [number | boolean] + +Default: `0` + +By default, Jest runs all tests and produces all errors into the console upon completion. The bail config option can be used here to have Jest stop running tests after `n` failures. Setting bail to `true` is the same as setting bail to `1`. + +### `cacheDirectory` [string] + +Default: `"/tmp/"` + +The directory where Jest should store its cached dependency information. + +Jest attempts to scan your dependency tree once (up-front) and cache it in order to ease some of the filesystem raking that needs to happen while running tests. This config option lets you customize where Jest stores that cache data on disk. + +### `clearMocks` [boolean] + +Default: `false` + +Automatically clear mock calls and instances before every test. Equivalent to calling `jest.clearAllMocks()` before each test. This does not remove any mock implementation that may have been provided. + +### `collectCoverage` [boolean] + +Default: `false` + +Indicates whether the coverage information should be collected while executing the test. Because these retrofits all executed files with coverage collection statements, it may significantly slow down your tests. + +### `collectCoverageFrom` [array] + +Default: `undefined` + +An array of [glob patterns](https://github.com/jonschlinkert/micromatch) indicating a set of files for which coverage information should be collected. If a file matches the specified glob pattern, coverage information will be collected for it even if no tests exist for this file and it's never required in the test suite. + +Example: + +```json +{ + "collectCoverageFrom": [ + "**/*.{js,jsx}", + "!**/node_modules/**", + "!**/vendor/**" + ] +} +``` + +This will collect coverage information for all the files inside the project's `rootDir`, except the ones that match `**/node_modules/**` or `**/vendor/**`. + +_Note: This option requires `collectCoverage` to be set to true or Jest to be invoked with `--coverage`._ + +
+ Help: + If you are seeing coverage output such as... + +``` +=============================== Coverage summary =============================== +Statements : Unknown% ( 0/0 ) +Branches : Unknown% ( 0/0 ) +Functions : Unknown% ( 0/0 ) +Lines : Unknown% ( 0/0 ) +================================================================================ +Jest: Coverage data for global was not found. +``` + +Most likely your glob patterns are not matching any files. Refer to the [micromatch](https://github.com/jonschlinkert/micromatch) documentation to ensure your globs are compatible. + +
+ +### `coverageDirectory` [string] + +Default: `undefined` + +The directory where Jest should output its coverage files. + +### `coveragePathIgnorePatterns` [array\] + +Default: `["/node_modules/"]` + +An array of regexp pattern strings that are matched against all file paths before executing the test. If the file path matches any of the patterns, coverage information will be skipped. + +These pattern strings match against the full path. Use the `` string token to include the path to your project's root directory to prevent it from accidentally ignoring all of your files in different environments that may have different root directories. Example: `["/build/", "/node_modules/"]`. + +### `coverageProvider` [string] + +Indicates which provider should be used to instrument code for coverage. Allowed values are `babel` (default) or `v8`. + +Note that using `v8` is considered experimental. This uses V8's builtin code coverage rather than one based on Babel. It is not as well tested, and it has also improved in the last few releases of Node. Using the latest versions of node (v14 at the time of this writing) will yield better results. + +### `coverageReporters` [array\] + +Default: `["json", "lcov", "text", "clover"]` + +A list of reporter names that Jest uses when writing coverage reports. Any [istanbul reporter](https://github.com/istanbuljs/istanbuljs/tree/master/packages/istanbul-reports/lib) can be used. + +_Note: Setting this option overwrites the default values. Add `"text"` or `"text-summary"` to see a coverage summary in the console output._ + +_Note: You can pass additional options to the istanbul reporter using the tuple form. For example:_ + +```json +["json", ["lcov", {"projectRoot": "../../"}]] +``` + +For the additional information about the options object shape you can refer to `CoverageReporterWithOptions` type in the [type definitions](https://github.com/facebook/jest/tree/master/packages/jest-types/src/Config.ts). + +### `coverageThreshold` [object] + +Default: `undefined` + +This will be used to configure minimum threshold enforcement for coverage results. Thresholds can be specified as `global`, as a [glob](https://github.com/isaacs/node-glob#glob-primer), and as a directory or file path. If thresholds aren't met, jest will fail. Thresholds specified as a positive number are taken to be the minimum percentage required. Thresholds specified as a negative number represent the maximum number of uncovered entities allowed. + +For example, with the following configuration jest will fail if there is less than 80% branch, line, and function coverage, or if there are more than 10 uncovered statements: + +```json +{ + ... + "jest": { + "coverageThreshold": { + "global": { + "branches": 80, + "functions": 80, + "lines": 80, + "statements": -10 + } + } + } +} +``` + +If globs or paths are specified alongside `global`, coverage data for matching paths will be subtracted from overall coverage and thresholds will be applied independently. Thresholds for globs are applied to all files matching the glob. If the file specified by path is not found, an error is returned. + +For example, with the following configuration: + +```json +{ + ... 
+ "jest": { + "coverageThreshold": { + "global": { + "branches": 50, + "functions": 50, + "lines": 50, + "statements": 50 + }, + "./src/components/": { + "branches": 40, + "statements": 40 + }, + "./src/reducers/**/*.js": { + "statements": 90 + }, + "./src/api/very-important-module.js": { + "branches": 100, + "functions": 100, + "lines": 100, + "statements": 100 + } + } + } +} +``` + +Jest will fail if: + +- The `./src/components` directory has less than 40% branch or statement coverage. +- One of the files matching the `./src/reducers/**/*.js` glob has less than 90% statement coverage. +- The `./src/api/very-important-module.js` file has less than 100% coverage. +- Every remaining file combined has less than 50% coverage (`global`). + +### `dependencyExtractor` [string] + +Default: `undefined` + +This option allows the use of a custom dependency extractor. It must be a node module that exports an object with an `extract` function. E.g.: + +```javascript +const fs = require('fs'); +const crypto = require('crypto'); + +module.exports = { + extract(code, filePath, defaultExtract) { + const deps = defaultExtract(code, filePath); + // Scan the file and add dependencies in `deps` (which is a `Set`) + return deps; + }, + getCacheKey() { + return crypto + .createHash('md5') + .update(fs.readFileSync(__filename)) + .digest('hex'); + }, +}; +``` + +The `extract` function should return an iterable (`Array`, `Set`, etc.) with the dependencies found in the code. + +That module can also contain a `getCacheKey` function to generate a cache key to determine if the logic has changed and any cached artifacts relying on it should be discarded. + +### `displayName` [string, object] + +default: `undefined` + +Allows for a label to be printed alongside a test while it is running. This becomes more useful in multi-project repositories where there can be many jest configuration files. This visually tells which project a test belongs to. Here are sample valid values. + +```js +module.exports = { + displayName: 'CLIENT', +}; +``` + +or + +```js +module.exports = { + displayName: { + name: 'CLIENT', + color: 'blue', + }, +}; +``` + +As a secondary option, an object with the properties `name` and `color` can be passed. This allows for a custom configuration of the background color of the displayName. `displayName` defaults to white when its value is a string. Jest uses [chalk](https://github.com/chalk/chalk) to provide the color. As such, all of the valid options for colors supported by chalk are also supported by jest. + +### `errorOnDeprecated` [boolean] + +Default: `false` + +Make calling deprecated APIs throw helpful error messages. Useful for easing the upgrade process. + +### `extraGlobals` [array\] + +Default: `undefined` + +Test files run inside a [vm](https://nodejs.org/api/vm.html), which slows calls to global context properties (e.g. `Math`). With this option you can specify extra properties to be defined inside the vm for faster lookups. + +For example, if your tests call `Math` often, you can pass it by setting `extraGlobals`. + +```json +{ + ... + "jest": { + "extraGlobals": ["Math"] + } +} +``` + +### `forceCoverageMatch` [array\] + +Default: `['']` + +Test files are normally ignored from collecting code coverage. With this option, you can overwrite this behavior and include otherwise ignored files in code coverage. 
+ +For example, if you have tests in source files named with `.t.js` extension as following: + +```javascript +// sum.t.js + +export function sum(a, b) { + return a + b; +} + +if (process.env.NODE_ENV === 'test') { + test('sum', () => { + expect(sum(1, 2)).toBe(3); + }); +} +``` + +You can collect coverage from those files with setting `forceCoverageMatch`. + +```json +{ + ... + "jest": { + "forceCoverageMatch": ["**/*.t.js"] + } +} +``` + +### `globals` [object] + +Default: `{}` + +A set of global variables that need to be available in all test environments. + +For example, the following would create a global `__DEV__` variable set to `true` in all test environments: + +```json +{ + ... + "jest": { + "globals": { + "__DEV__": true + } + } +} +``` + +Note that, if you specify a global reference value (like an object or array) here, and some code mutates that value in the midst of running a test, that mutation will _not_ be persisted across test runs for other test files. In addition, the `globals` object must be json-serializable, so it can't be used to specify global functions. For that, you should use `setupFiles`. + +### `globalSetup` [string] + +Default: `undefined` + +This option allows the use of a custom global setup module which exports an async function that is triggered once before all test suites. This function gets Jest's `globalConfig` object as a parameter. + +_Note: A global setup module configured in a project (using multi-project runner) will be triggered only when you run at least one test from this project._ + +_Note: Any global variables that are defined through `globalSetup` can only be read in `globalTeardown`. You cannot retrieve globals defined here in your test suites._ + +_Note: While code transformation is applied to the linked setup-file, Jest will **not** transform any code in `node_modules`. This is due to the need to load the actual transformers (e.g. `babel` or `typescript`) to perform transformation._ + +Example: + +```js +// setup.js +module.exports = async () => { + // ... + // Set reference to mongod in order to close the server during teardown. + global.__MONGOD__ = mongod; +}; +``` + +```js +// teardown.js +module.exports = async function () { + await global.__MONGOD__.stop(); +}; +``` + +### `globalTeardown` [string] + +Default: `undefined` + +This option allows the use of a custom global teardown module which exports an async function that is triggered once after all test suites. This function gets Jest's `globalConfig` object as a parameter. + +_Note: A global teardown module configured in a project (using multi-project runner) will be triggered only when you run at least one test from this project._ + +_Note: The same caveat concerning transformation of `node_modules` as for `globalSetup` applies to `globalTeardown`._ + +### `maxConcurrency` [number] + +Default: `5` + +A number limiting the number of tests that are allowed to run at the same time when using `test.concurrent`. Any test above this limit will be queued and executed once a slot is released. + +### `moduleDirectories` [array\] + +Default: `["node_modules"]` + +An array of directory names to be searched recursively up from the requiring module's location. Setting this option will _override_ the default, if you wish to still search `node_modules` for packages include it along with any other options: `["node_modules", "bower_components"]` + +### `moduleFileExtensions` [array\] + +Default: `["js", "json", "jsx", "ts", "tsx", "node"]` + +An array of file extensions your modules use. 
If you require modules without specifying a file extension, these are the extensions Jest will look for, in left-to-right order. + +We recommend placing the extensions most commonly used in your project on the left, so if you are using TypeScript, you may want to consider moving "ts" and/or "tsx" to the beginning of the array. + +### `moduleNameMapper` [object\>] + +Default: `null` + +A map from regular expressions to module names or to arrays of module names that allow to stub out resources, like images or styles with a single module. + +Modules that are mapped to an alias are unmocked by default, regardless of whether automocking is enabled or not. + +Use `` string token to refer to [`rootDir`](#rootdir-string) value if you want to use file paths. + +Additionally, you can substitute captured regex groups using numbered backreferences. + +Example: + +```json +{ + "moduleNameMapper": { + "^image![a-zA-Z0-9$_-]+$": "GlobalImageStub", + "^[./a-zA-Z0-9$_-]+\\.png$": "/RelativeImageStub.js", + "module_name_(.*)": "/substituted_module_$1.js", + "assets/(.*)": [ + "/images/$1", + "/photos/$1", + "/recipes/$1" + ] + } +} +``` + +The order in which the mappings are defined matters. Patterns are checked one by one until one fits. The most specific rule should be listed first. This is true for arrays of module names as well. + +_Note: If you provide module name without boundaries `^$` it may cause hard to spot errors. E.g. `relay` will replace all modules which contain `relay` as a substring in its name: `relay`, `react-relay` and `graphql-relay` will all be pointed to your stub._ + +### `modulePathIgnorePatterns` [array\] + +Default: `[]` + +An array of regexp pattern strings that are matched against all module paths before those paths are to be considered 'visible' to the module loader. If a given module's path matches any of the patterns, it will not be `require()`-able in the test environment. + +These pattern strings match against the full path. Use the `` string token to include the path to your project's root directory to prevent it from accidentally ignoring all of your files in different environments that may have different root directories. Example: `["/build/"]`. + +### `modulePaths` [array\] + +Default: `[]` + +An alternative API to setting the `NODE_PATH` env variable, `modulePaths` is an array of absolute paths to additional locations to search when resolving modules. Use the `` string token to include the path to your project's root directory. Example: `["/app/"]`. + +### `notify` [boolean] + +Default: `false` + +Activates notifications for test results. + +**Beware:** Jest uses [node-notifier](https://github.com/mikaelbr/node-notifier) to display desktop notifications. On Windows, it creates a new start menu entry on the first use and not display the notification. Notifications will be properly displayed on subsequent runs + +### `notifyMode` [string] + +Default: `failure-change` + +Specifies notification mode. Requires `notify: true`. + +#### Modes + +- `always`: always send a notification. +- `failure`: send a notification when tests fail. +- `success`: send a notification when tests pass. +- `change`: send a notification when the status changed. +- `success-change`: send a notification when tests pass or once when it fails. +- `failure-change`: send a notification when tests fail or once when it passes. + +### `preset` [string] + +Default: `undefined` + +A preset that is used as a base for Jest's configuration. 
A preset should point to an npm module that has a `jest-preset.json` or `jest-preset.js` file at the root. + +For example, this preset `foo-bar/jest-preset.js` will be configured as follows: + +```json +{ + "preset": "foo-bar" +} +``` + +Presets may also be relative to filesystem paths. + +```json +{ + "preset": "./node_modules/foo-bar/jest-preset.js" +} +``` + +### `prettierPath` [string] + +Default: `'prettier'` + +Sets the path to the [`prettier`](https://prettier.io/) node module used to update inline snapshots. + +### `projects` [array\] + +Default: `undefined` + +When the `projects` configuration is provided with an array of paths or glob patterns, Jest will run tests in all of the specified projects at the same time. This is great for monorepos or when working on multiple projects at the same time. + +```json +{ + "projects": ["", "/examples/*"] +} +``` + +This example configuration will run Jest in the root directory as well as in every folder in the examples directory. You can have an unlimited amount of projects running in the same Jest instance. + +The projects feature can also be used to run multiple configurations or multiple [runners](#runner-string). For this purpose, you can pass an array of configuration objects. For example, to run both tests and ESLint (via [jest-runner-eslint](https://github.com/jest-community/jest-runner-eslint)) in the same invocation of Jest: + +```json +{ + "projects": [ + { + "displayName": "test" + }, + { + "displayName": "lint", + "runner": "jest-runner-eslint", + "testMatch": ["/**/*.js"] + } + ] +} +``` + +_Note: When using multi-project runner, it's recommended to add a `displayName` for each project. This will show the `displayName` of a project next to its tests._ + +### `reporters` [array\] + +Default: `undefined` + +Use this configuration option to add custom reporters to Jest. A custom reporter is a class that implements `onRunStart`, `onTestStart`, `onTestResult`, `onRunComplete` methods that will be called when any of those events occurs. + +If custom reporters are specified, the default Jest reporters will be overridden. To keep default reporters, `default` can be passed as a module name. + +This will override default reporters: + +```json +{ + "reporters": ["/my-custom-reporter.js"] +} +``` + +This will use custom reporter in addition to default reporters that Jest provides: + +```json +{ + "reporters": ["default", "/my-custom-reporter.js"] +} +``` + +Additionally, custom reporters can be configured by passing an `options` object as a second argument: + +```json +{ + "reporters": [ + "default", + ["/my-custom-reporter.js", {"banana": "yes", "pineapple": "no"}] + ] +} +``` + +Custom reporter modules must define a class that takes a `GlobalConfig` and reporter options as constructor arguments: + +Example reporter: + +```js +// my-custom-reporter.js +class MyCustomReporter { + constructor(globalConfig, options) { + this._globalConfig = globalConfig; + this._options = options; + } + + onRunComplete(contexts, results) { + console.log('Custom reporter output:'); + console.log('GlobalConfig: ', this._globalConfig); + console.log('Options: ', this._options); + } +} + +module.exports = MyCustomReporter; +// or export default MyCustomReporter; +``` + +Custom reporters can also force Jest to exit with non-0 code by returning an Error from `getLastError()` methods + +```js +class MyCustomReporter { + // ... 
+ getLastError() { + if (this._shouldFail) { + return new Error('my-custom-reporter.js reported an error'); + } + } +} +``` + +For the full list of methods and argument types see `Reporter` interface in [packages/jest-reporters/src/types.ts](https://github.com/facebook/jest/blob/master/packages/jest-reporters/src/types.ts) + +### `resetMocks` [boolean] + +Default: `false` + +Automatically reset mock state before every test. Equivalent to calling `jest.resetAllMocks()` before each test. This will lead to any mocks having their fake implementations removed but does not restore their initial implementation. + +### `resetModules` [boolean] + +Default: `false` + +By default, each test file gets its own independent module registry. Enabling `resetModules` goes a step further and resets the module registry before running each individual test. This is useful to isolate modules for every test so that the local module state doesn't conflict between tests. This can be done programmatically using [`jest.resetModules()`](JestObjectAPI.md#jestresetmodules). + +### `resolver` [string] + +Default: `undefined` + +This option allows the use of a custom resolver. This resolver must be a node module that exports a function expecting a string as the first argument for the path to resolve and an object with the following structure as the second argument: + +```json +{ + "basedir": string, + "defaultResolver": "function(request, options)", + "extensions": [string], + "moduleDirectory": [string], + "paths": [string], + "packageFilter": "function(pkg, pkgdir)", + "rootDir": [string] +} +``` + +The function should either return a path to the module that should be resolved or throw an error if the module can't be found. + +Note: the defaultResolver passed as an option is the Jest default resolver which might be useful when you write your custom one. It takes the same arguments as your custom one, e.g. `(request, options)`. + +For example, if you want to respect Browserify's [`"browser"` field](https://github.com/browserify/browserify-handbook/blob/master/readme.markdown#browser-field), you can use the following configuration: + +```json +{ + ... + "jest": { + "resolver": "browser-resolve" + } +} +``` + +By combining `defaultResolver` and `packageFilter` we can implement a `package.json` "pre-processor" that allows us to change how the default resolver will resolve modules. For example, imagine we want to use the field `"module"` if it is present, otherwise fallback to `"main"`: + +```json +{ + ... + "jest": { + "resolver": "my-module-resolve" + } +} +``` + +```js +// my-module-resolve package + +module.exports = (request, options) => { + // Call the defaultResolver, so we leverage its cache, error handling, etc. + return options.defaultResolver(request, { + ...options, + // Use packageFilter to process parsed `package.json` before the resolution (see https://www.npmjs.com/package/resolve#resolveid-opts-cb) + packageFilter: pkg => { + return { + ...pkg, + // Alter the value of `main` before resolving the package + main: pkg.module || pkg.main, + }; + }, + }); +}; +``` + +### `restoreMocks` [boolean] + +Default: `false` + +Automatically restore mock state before every test. Equivalent to calling `jest.restoreAllMocks()` before each test. This will lead to any mocks having their fake implementations removed and restores their initial implementation. 
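+
+For illustration, here is a minimal sketch of what this setting implies for spies created with `jest.spyOn`, assuming `restoreMocks: true` is set in the config and a hypothetical `../myModule` module that exports a real `getNumber` function:
+
+```js
+// __tests__/restore-mocks.test.js
+const myModule = require('../myModule'); // hypothetical module with a real getNumber()
+
+test('replaces the implementation for this test only', () => {
+  jest.spyOn(myModule, 'getNumber').mockReturnValue(42);
+  expect(myModule.getNumber()).toBe(42);
+});
+
+test('sees the original implementation again', () => {
+  // With `restoreMocks: true`, the spy from the previous test was restored
+  // before this test started, so getNumber is no longer a mock function.
+  expect(jest.isMockFunction(myModule.getNumber)).toBe(false);
+});
+```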
+ +### `rootDir` [string] + +Default: The root of the directory containing your Jest [config file](#) _or_ the `package.json` _or_ the [`pwd`](http://en.wikipedia.org/wiki/Pwd) if no `package.json` is found + +The root directory that Jest should scan for tests and modules within. If you put your Jest config inside your `package.json` and want the root directory to be the root of your repo, the value for this config param will default to the directory of the `package.json`. + +Oftentimes, you'll want to set this to `'src'` or `'lib'`, corresponding to where in your repository the code is stored. + +_Note that using `''` as a string token in any other path-based config settings will refer back to this value. So, for example, if you want your [`setupFiles`](#setupfiles-array) config entry to point at the `env-setup.js` file at the root of your project, you could set its value to `["/env-setup.js"]`._ + +### `roots` [array\] + +Default: `[""]` + +A list of paths to directories that Jest should use to search for files in. + +There are times where you only want Jest to search in a single sub-directory (such as cases where you have a `src/` directory in your repo), but prevent it from accessing the rest of the repo. + +_Note: While `rootDir` is mostly used as a token to be re-used in other configuration options, `roots` is used by the internals of Jest to locate **test files and source files**. This applies also when searching for manual mocks for modules from `node_modules` (`__mocks__` will need to live in one of the `roots`)._ + +_Note: By default, `roots` has a single entry `` but there are cases where you may want to have multiple roots within one project, for example `roots: ["/src/", "/tests/"]`._ + +### `runner` [string] + +Default: `"jest-runner"` + +This option allows you to use a custom runner instead of Jest's default test runner. Examples of runners include: + +- [`jest-runner-eslint`](https://github.com/jest-community/jest-runner-eslint) +- [`jest-runner-mocha`](https://github.com/rogeliog/jest-runner-mocha) +- [`jest-runner-tsc`](https://github.com/azz/jest-runner-tsc) +- [`jest-runner-prettier`](https://github.com/keplersj/jest-runner-prettier) + +_Note: The `runner` property value can omit the `jest-runner-` prefix of the package name._ + +To write a test-runner, export a class with which accepts `globalConfig` in the constructor, and has a `runTests` method with the signature: + +```ts +async runTests( + tests: Array, + watcher: TestWatcher, + onStart: OnTestStart, + onResult: OnTestSuccess, + onFailure: OnTestFailure, + options: TestRunnerOptions, +): Promise +``` + +If you need to restrict your test-runner to only run in serial rather than being executed in parallel your class should have the property `isSerial` to be set as `true`. + +### `setupFiles` [array] + +Default: `[]` + +A list of paths to modules that run some code to configure or set up the testing environment. Each setupFile will be run once per test file. Since every test runs in its own environment, these scripts will be executed in the testing environment immediately before executing the test code itself. + +It's also worth noting that `setupFiles` will execute _before_ [`setupFilesAfterEnv`](#setupfilesafterenv-array). + +### `setupFilesAfterEnv` [array] + +Default: `[]` + +A list of paths to modules that run some code to configure or set up the testing framework before each test file in the suite is executed. 
Since [`setupFiles`](#setupfiles-array) executes before the test framework is installed in the environment, this script file presents you the opportunity of running some code immediately after the test framework has been installed in the environment. + +If you want a path to be [relative to the root directory of your project](#rootdir-string), please include `` inside a path's string, like `"/a-configs-folder"`. + +For example, Jest ships with several plug-ins to `jasmine` that work by monkey-patching the jasmine API. If you wanted to add even more jasmine plugins to the mix (or if you wanted some custom, project-wide matchers for example), you could do so in these modules. + +_Note: `setupTestFrameworkScriptFile` is deprecated in favor of `setupFilesAfterEnv`._ + +Example `setupFilesAfterEnv` array in a jest.config.js: + +```js +module.exports = { + setupFilesAfterEnv: ['./jest.setup.js'], +}; +``` + +Example `jest.setup.js` file + +```js +jest.setTimeout(10000); // in milliseconds +``` + +### `slowTestThreshold` [number] + +Default: `5` + +The number of seconds after which a test is considered as slow and reported as such in the results. + +### `snapshotResolver` [string] + +Default: `undefined` + +The path to a module that can resolve test<->snapshot path. This config option lets you customize where Jest stores snapshot files on disk. + +Example snapshot resolver module: + +```js +module.exports = { + // resolves from test to snapshot path + resolveSnapshotPath: (testPath, snapshotExtension) => + testPath.replace('__tests__', '__snapshots__') + snapshotExtension, + + // resolves from snapshot to test path + resolveTestPath: (snapshotFilePath, snapshotExtension) => + snapshotFilePath + .replace('__snapshots__', '__tests__') + .slice(0, -snapshotExtension.length), + + // Example test path, used for preflight consistency check of the implementation above + testPathForConsistencyCheck: 'some/__tests__/example.test.js', +}; +``` + +### `snapshotSerializers` [array\] + +Default: `[]` + +A list of paths to snapshot serializer modules Jest should use for snapshot testing. + +Jest has default serializers for built-in JavaScript types, HTML elements (Jest 20.0.0+), ImmutableJS (Jest 20.0.0+) and for React elements. See [snapshot test tutorial](TutorialReactNative.md#snapshot-test) for more information. + +Example serializer module: + +```js +// my-serializer-module +module.exports = { + serialize(val, config, indentation, depth, refs, printer) { + return 'Pretty foo: ' + printer(val.foo); + }, + + test(val) { + return val && val.hasOwnProperty('foo'); + }, +}; +``` + +`printer` is a function that serializes a value using existing plugins. + +To use `my-serializer-module` as a serializer, configuration would be as follows: + +```json +{ + ... + "jest": { + "snapshotSerializers": ["my-serializer-module"] + } +} +``` + +Finally tests would look as follows: + +```js +test(() => { + const bar = { + foo: { + x: 1, + y: 2, + }, + }; + + expect(bar).toMatchSnapshot(); +}); +``` + +Rendered snapshot: + +```json +Pretty foo: Object { + "x": 1, + "y": 2, +} +``` + +To make a dependency explicit instead of implicit, you can call [`expect.addSnapshotSerializer`](ExpectAPI.md#expectaddsnapshotserializerserializer) to add a module for an individual test file instead of adding its path to `snapshotSerializers` in Jest configuration. + +More about serializers API can be found [here](https://github.com/facebook/jest/tree/master/packages/pretty-format/README.md#serialize). 
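+
+As a rough sketch of that per-file alternative (via `expect.addSnapshotSerializer`), reusing the `my-serializer-module` example from above:
+
+```js
+// explicit-serializer.test.js
+const serializer = require('my-serializer-module');
+
+// Register the serializer for this test file only, instead of listing it
+// under `snapshotSerializers` in the configuration.
+expect.addSnapshotSerializer(serializer);
+
+test('uses the serializer registered above', () => {
+  expect({foo: 'bar'}).toMatchSnapshot();
+});
+```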
+ +### `testEnvironment` [string] + +Default: `"jsdom"` + +The test environment that will be used for testing. The default environment in Jest is a browser-like environment through [jsdom](https://github.com/jsdom/jsdom). If you are building a node service, you can use the `node` option to use a node-like environment instead. + +By adding a `@jest-environment` docblock at the top of the file, you can specify another environment to be used for all tests in that file: + +```js +/** + * @jest-environment jsdom + */ + +test('use jsdom in this test file', () => { + const element = document.createElement('div'); + expect(element).not.toBeNull(); +}); +``` + +You can create your own module that will be used for setting up the test environment. The module must export a class with `setup`, `teardown` and `runScript` methods. You can also pass variables from this module to your test suites by assigning them to `this.global` object – this will make them available in your test suites as global variables. + +The class may optionally expose an asynchronous `handleTestEvent` method to bind to events fired by [`jest-circus`](https://github.com/facebook/jest/tree/master/packages/jest-circus). Normally, `jest-circus` test runner would pause until a promise returned from `handleTestEvent` gets fulfilled, **except for the next events**: `start_describe_definition`, `finish_describe_definition`, `add_hook`, `add_test` or `error` (for the up-to-date list you can look at [SyncEvent type in the types definitions](https://github.com/facebook/jest/tree/master/packages/jest-types/src/Circus.ts)). That is caused by backward compatibility reasons and `process.on('unhandledRejection', callback)` signature, but that usually should not be a problem for most of the use cases. + +Any docblock pragmas in test files will be passed to the environment constructor and can be used for per-test configuration. If the pragma does not have a value, it will be present in the object with it's value set to an empty string. If the pragma is not present, it will not be present in the object. + +_Note: TestEnvironment is sandboxed. Each test suite will trigger setup/teardown in their own TestEnvironment._ + +Example: + +```js +// my-custom-environment +const NodeEnvironment = require('jest-environment-node'); + +class CustomEnvironment extends NodeEnvironment { + constructor(config, context) { + super(config, context); + this.testPath = context.testPath; + this.docblockPragmas = context.docblockPragmas; + } + + async setup() { + await super.setup(); + await someSetupTasks(this.testPath); + this.global.someGlobalObject = createGlobalObject(); + + // Will trigger if docblock contains @my-custom-pragma my-pragma-value + if (this.docblockPragmas['my-custom-pragma'] === 'my-pragma-value') { + // ... + } + } + + async teardown() { + this.global.someGlobalObject = destroyGlobalObject(); + await someTeardownTasks(); + await super.teardown(); + } + + runScript(script) { + return super.runScript(script); + } + + async handleTestEvent(event, state) { + if (event.name === 'test_start') { + // ... + } + } +} + +module.exports = CustomEnvironment; +``` + +```js +// my-test-suite +let someGlobalObject; + +beforeAll(() => { + someGlobalObject = global.someGlobalObject; +}); +``` + +### `testEnvironmentOptions` [Object] + +Default: `{}` + +Test environment options that will be passed to the `testEnvironment`. The relevant options depend on the environment. 
For example you can override options given to [jsdom](https://github.com/jsdom/jsdom) such as `{userAgent: "Agent/007"}`. + +### `testMatch` [array\] + +(default: `[ "**/__tests__/**/*.[jt]s?(x)", "**/?(*.)+(spec|test).[jt]s?(x)" ]`) + +The glob patterns Jest uses to detect test files. By default it looks for `.js`, `.jsx`, `.ts` and `.tsx` files inside of `__tests__` folders, as well as any files with a suffix of `.test` or `.spec` (e.g. `Component.test.js` or `Component.spec.js`). It will also find files called `test.js` or `spec.js`. + +See the [micromatch](https://github.com/jonschlinkert/micromatch) package for details of the patterns you can specify. + +See also [`testRegex` [string | array\]](#testregex-string--arraystring), but note that you cannot specify both options. + +### `testPathIgnorePatterns` [array\] + +Default: `["/node_modules/"]` + +An array of regexp pattern strings that are matched against all test paths before executing the test. If the test path matches any of the patterns, it will be skipped. + +These pattern strings match against the full path. Use the `` string token to include the path to your project's root directory to prevent it from accidentally ignoring all of your files in different environments that may have different root directories. Example: `["/build/", "/node_modules/"]`. + +### `testRegex` [string | array\] + +Default: `(/__tests__/.*|(\\.|/)(test|spec))\\.[jt]sx?$` + +The pattern or patterns Jest uses to detect test files. By default it looks for `.js`, `.jsx`, `.ts` and `.tsx` files inside of `__tests__` folders, as well as any files with a suffix of `.test` or `.spec` (e.g. `Component.test.js` or `Component.spec.js`). It will also find files called `test.js` or `spec.js`. See also [`testMatch` [array\]](#testmatch-arraystring), but note that you cannot specify both options. + +The following is a visualization of the default regex: + +```bash +├── __tests__ +│ └── component.spec.js # test +│ └── anything # test +├── package.json # not test +├── foo.test.js # test +├── bar.spec.jsx # test +└── component.js # not test +``` + +_Note: `testRegex` will try to detect test files using the **absolute file path**, therefore, having a folder with a name that matches it will run all the files as tests_ + +### `testResultsProcessor` [string] + +Default: `undefined` + +This option allows the use of a custom results processor. This processor must be a node module that exports a function expecting an object with the following structure as the first argument and return it: + +```json +{ + "success": bool, + "startTime": epoch, + "numTotalTestSuites": number, + "numPassedTestSuites": number, + "numFailedTestSuites": number, + "numRuntimeErrorTestSuites": number, + "numTotalTests": number, + "numPassedTests": number, + "numFailedTests": number, + "numPendingTests": number, + "numTodoTests": number, + "openHandles": Array, + "testResults": [{ + "numFailingTests": number, + "numPassingTests": number, + "numPendingTests": number, + "testResults": [{ + "title": string (message in it block), + "status": "failed" | "pending" | "passed", + "ancestorTitles": [string (message in describe blocks)], + "failureMessages": [string], + "numPassingAsserts": number, + "location": { + "column": number, + "line": number + } + }, + ... + ], + "perfStats": { + "start": epoch, + "end": epoch + }, + "testFilePath": absolute path to test file, + "coverage": {} + }, + ... + ] +} +``` + +### `testRunner` [string] + +Default: `jasmine2` + +This option allows the use of a custom test runner. 
The default is jasmine2. A custom test runner can be provided by specifying a path to a test runner implementation. + +The test runner module must export a function with the following signature: + +```ts +function testRunner( + globalConfig: GlobalConfig, + config: ProjectConfig, + environment: Environment, + runtime: Runtime, + testPath: string, +): Promise; +``` + +An example of such function can be found in our default [jasmine2 test runner package](https://github.com/facebook/jest/blob/master/packages/jest-jasmine2/src/index.ts). + +### `testSequencer` [string] + +Default: `@jest/test-sequencer` + +This option allows you to use a custom sequencer instead of Jest's default. `sort` may optionally return a Promise. + +Example: + +Sort test path alphabetically. + +```js +// testSequencer.js +const Sequencer = require('@jest/test-sequencer').default; + +class CustomSequencer extends Sequencer { + sort(tests) { + // Test structure information + // https://github.com/facebook/jest/blob/6b8b1404a1d9254e7d5d90a8934087a9c9899dab/packages/jest-runner/src/types.ts#L17-L21 + const copyTests = Array.from(tests); + return copyTests.sort((testA, testB) => (testA.path > testB.path ? 1 : -1)); + } +} + +module.exports = CustomSequencer; +``` + +Use it in your Jest config file like this: + +```json +{ + "testSequencer": "path/to/testSequencer.js" +} +``` + +### `testTimeout` [number] + +Default: `5000` + +Default timeout of a test in milliseconds. + +### `testURL` [string] + +Default: `http://localhost` + +This option sets the URL for the jsdom environment. It is reflected in properties such as `location.href`. + +### `timers` [string] + +Default: `real` + +Setting this value to `legacy` or `fake` allows the use of fake timers for functions such as `setTimeout`. Fake timers are useful when a piece of code sets a long timeout that we don't want to wait for in a test. + +If the value is `modern`, [`@sinonjs/fake-timers`](https://github.com/sinonjs/fake-timers) will be used as implementation instead of Jest's own legacy implementation. This will be the default fake implementation in Jest 27. + +### `transform` [object\] + +Default: `{"^.+\\.[jt]sx?$": "babel-jest"}` + +A map from regular expressions to paths to transformers. A transformer is a module that provides a synchronous function for transforming source files. For example, if you wanted to be able to use a new language feature in your modules or tests that aren't yet supported by node, you might plug in one of many compilers that compile a future version of JavaScript to a current one. Example: see the [examples/typescript](https://github.com/facebook/jest/blob/master/examples/typescript/package.json#L16) example or the [webpack tutorial](Webpack.md). + +Examples of such compilers include: + +- [Babel](https://babeljs.io/) +- [TypeScript](http://www.typescriptlang.org/) +- [async-to-gen](http://github.com/leebyron/async-to-gen#jest) +- To build your own please visit the [Custom Transformer](TutorialReact.md#custom-transformers) section + +You can pass configuration to a transformer like `{filePattern: ['path-to-transformer', {options}]}` For example, to configure babel-jest for non-default behavior, `{"\\.js$": ['babel-jest', {rootMode: "upward"}]}` + +_Note: a transformer is only run once per file unless the file has changed. 
During the development of a transformer it can be useful to run Jest with `--no-cache` to frequently [delete Jest's cache](Troubleshooting.md#caching-issues)._ + +_Note: when adding additional code transformers, this will overwrite the default config and `babel-jest` is no longer automatically loaded. If you want to use it to compile JavaScript or Typescript, it has to be explicitly defined by adding `{"^.+\\.[jt]sx?$": "babel-jest"}` to the transform property. See [babel-jest plugin](https://github.com/facebook/jest/tree/master/packages/babel-jest#setup)_ + +### `transformIgnorePatterns` [array\] + +Default: `["/node_modules/", "\\.pnp\\.[^\\\/]+$"]` + +An array of regexp pattern strings that are matched against all source file paths before transformation. If the test path matches any of the patterns, it will not be transformed. + +These pattern strings match against the full path. Use the `` string token to include the path to your project's root directory to prevent it from accidentally ignoring all of your files in different environments that may have different root directories. + +Example: `["/bower_components/", "/node_modules/"]`. + +Sometimes it happens (especially in React Native or TypeScript projects) that 3rd party modules are published as untranspiled. Since all files inside `node_modules` are not transformed by default, Jest will not understand the code in these modules, resulting in syntax errors. To overcome this, you may use `transformIgnorePatterns` to allow transpiling such modules. You'll find a good example of this use case in [React Native Guide](https://jestjs.io/docs/en/tutorial-react-native#transformignorepatterns-customization). + +### `unmockedModulePathPatterns` [array\] + +Default: `[]` + +An array of regexp pattern strings that are matched against all modules before the module loader will automatically return a mock for them. If a module's path matches any of the patterns in this list, it will not be automatically mocked by the module loader. + +This is useful for some commonly used 'utility' modules that are almost always used as implementation details almost all the time (like underscore/lo-dash, etc). It's generally a best practice to keep this list as small as possible and always use explicit `jest.mock()`/`jest.unmock()` calls in individual tests. Explicit per-test setup is far easier for other readers of the test to reason about the environment the test will run in. + +It is possible to override this setting in individual tests by explicitly calling `jest.mock()` at the top of the test file. + +### `verbose` [boolean] + +Default: `false` + +Indicates whether each individual test should be reported during the run. All errors will also still be shown on the bottom after execution. Note that if there is only one test file being run it will default to `true`. + +### `watchPathIgnorePatterns` [array\] + +Default: `[]` + +An array of RegExp patterns that are matched against all source file paths before re-running tests in watch mode. If the file path matches any of the patterns, when it is updated, it will not trigger a re-run of tests. + +These patterns match against the full path. Use the `` string token to include the path to your project's root directory to prevent it from accidentally ignoring all of your files in different environments that may have different root directories. Example: `["/node_modules/"]`. + +Even if nothing is specified here, the watcher will ignore changes to any hidden files and directories, i.e. 
files and folders that begin with a dot (`.`). + +### `watchPlugins` [array\] + +Default: `[]` + +This option allows you to use custom watch plugins. Read more about watch plugins [here](watch-plugins). + +Examples of watch plugins include: + +- [`jest-watch-master`](https://github.com/rickhanlonii/jest-watch-master) +- [`jest-watch-select-projects`](https://github.com/rogeliog/jest-watch-select-projects) +- [`jest-watch-suspend`](https://github.com/unional/jest-watch-suspend) +- [`jest-watch-typeahead`](https://github.com/jest-community/jest-watch-typeahead) +- [`jest-watch-yarn-workspaces`](https://github.com/cameronhunter/jest-watch-directories/tree/master/packages/jest-watch-yarn-workspaces) + +_Note: The values in the `watchPlugins` property value can omit the `jest-watch-` prefix of the package name._ + +### `//` [string] + +No default + +This option allows comments in `package.json`. Include the comment text as the value of this key anywhere in `package.json`. + +Example: + +```json +{ + "name": "my-project", + "jest": { + "//": "Comment goes here", + "verbose": true + } +} +``` diff --git a/website/versioned_docs/version-26.4/GlobalAPI.md b/website/versioned_docs/version-26.4/GlobalAPI.md new file mode 100644 index 000000000000..f4aa940b0df8 --- /dev/null +++ b/website/versioned_docs/version-26.4/GlobalAPI.md @@ -0,0 +1,841 @@ +--- +id: version-26.4-api +title: Globals +original_id: api +--- + +In your test files, Jest puts each of these methods and objects into the global environment. You don't have to require or import anything to use them. However, if you prefer explicit imports, you can do `import {describe, expect, test} from '@jest/globals'`. + +## Methods + + + +--- + +## Reference + +### `afterAll(fn, timeout)` + +Runs a function after all the tests in this file have completed. If the function returns a promise or is a generator, Jest waits for that promise to resolve before continuing. + +Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +This is often useful if you want to clean up some global setup state that is shared across tests. + +For example: + +```js +const globalDatabase = makeGlobalDatabase(); + +function cleanUpDatabase(db) { + db.cleanUp(); +} + +afterAll(() => { + cleanUpDatabase(globalDatabase); +}); + +test('can find things', () => { + return globalDatabase.find('thing', {}, results => { + expect(results.length).toBeGreaterThan(0); + }); +}); + +test('can insert a thing', () => { + return globalDatabase.insert('thing', makeThing(), response => { + expect(response.success).toBeTruthy(); + }); +}); +``` + +Here the `afterAll` ensures that `cleanUpDatabase` is called after all tests run. + +If `afterAll` is inside a `describe` block, it runs at the end of the describe block. + +If you want to run some cleanup after every test instead of after all tests, use `afterEach` instead. + +### `afterEach(fn, timeout)` + +Runs a function after each one of the tests in this file completes. If the function returns a promise or is a generator, Jest waits for that promise to resolve before continuing. + +Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +This is often useful if you want to clean up some temporary state that is created by each test. 
+ +For example: + +```js +const globalDatabase = makeGlobalDatabase(); + +function cleanUpDatabase(db) { + db.cleanUp(); +} + +afterEach(() => { + cleanUpDatabase(globalDatabase); +}); + +test('can find things', () => { + return globalDatabase.find('thing', {}, results => { + expect(results.length).toBeGreaterThan(0); + }); +}); + +test('can insert a thing', () => { + return globalDatabase.insert('thing', makeThing(), response => { + expect(response.success).toBeTruthy(); + }); +}); +``` + +Here the `afterEach` ensures that `cleanUpDatabase` is called after each test runs. + +If `afterEach` is inside a `describe` block, it only runs after the tests that are inside this describe block. + +If you want to run some cleanup just once, after all of the tests run, use `afterAll` instead. + +### `beforeAll(fn, timeout)` + +Runs a function before any of the tests in this file run. If the function returns a promise or is a generator, Jest waits for that promise to resolve before running tests. + +Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +This is often useful if you want to set up some global state that will be used by many tests. + +For example: + +```js +const globalDatabase = makeGlobalDatabase(); + +beforeAll(() => { + // Clears the database and adds some testing data. + // Jest will wait for this promise to resolve before running tests. + return globalDatabase.clear().then(() => { + return globalDatabase.insert({testData: 'foo'}); + }); +}); + +// Since we only set up the database once in this example, it's important +// that our tests don't modify it. +test('can find things', () => { + return globalDatabase.find('thing', {}, results => { + expect(results.length).toBeGreaterThan(0); + }); +}); +``` + +Here the `beforeAll` ensures that the database is set up before tests run. If setup was synchronous, you could do this without `beforeAll`. The key is that Jest will wait for a promise to resolve, so you can have asynchronous setup as well. + +If `beforeAll` is inside a `describe` block, it runs at the beginning of the describe block. + +If you want to run something before every test instead of before any test runs, use `beforeEach` instead. + +### `beforeEach(fn, timeout)` + +Runs a function before each of the tests in this file runs. If the function returns a promise or is a generator, Jest waits for that promise to resolve before running the test. + +Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +This is often useful if you want to reset some global state that will be used by many tests. + +For example: + +```js +const globalDatabase = makeGlobalDatabase(); + +beforeEach(() => { + // Clears the database and adds some testing data. + // Jest will wait for this promise to resolve before running tests. + return globalDatabase.clear().then(() => { + return globalDatabase.insert({testData: 'foo'}); + }); +}); + +test('can find things', () => { + return globalDatabase.find('thing', {}, results => { + expect(results.length).toBeGreaterThan(0); + }); +}); + +test('can insert a thing', () => { + return globalDatabase.insert('thing', makeThing(), response => { + expect(response.success).toBeTruthy(); + }); +}); +``` + +Here the `beforeEach` ensures that the database is reset for each test. + +If `beforeEach` is inside a `describe` block, it runs for each test in the describe block. 
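+
+For instance, a small sketch of that scoping behavior, assuming tests run in the order they are declared:
+
+```js
+const hookCalls = [];
+
+beforeEach(() => {
+  hookCalls.push('file-level');
+});
+
+test('only the file-level hook ran before this test', () => {
+  expect(hookCalls).toEqual(['file-level']);
+});
+
+describe('with extra setup', () => {
+  beforeEach(() => {
+    // Runs before each test inside this block, after the file-level hook.
+    hookCalls.push('describe-level');
+  });
+
+  test('both hooks ran before this test', () => {
+    expect(hookCalls.slice(-2)).toEqual(['file-level', 'describe-level']);
+  });
+});
+```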
+ +If you only need to run some setup code once, before any tests run, use `beforeAll` instead. + +### `describe(name, fn)` + +`describe(name, fn)` creates a block that groups together several related tests. For example, if you have a `myBeverage` object that is supposed to be delicious but not sour, you could test it with: + +```js +const myBeverage = { + delicious: true, + sour: false, +}; + +describe('my beverage', () => { + test('is delicious', () => { + expect(myBeverage.delicious).toBeTruthy(); + }); + + test('is not sour', () => { + expect(myBeverage.sour).toBeFalsy(); + }); +}); +``` + +This isn't required - you can write the `test` blocks directly at the top level. But this can be handy if you prefer your tests to be organized into groups. + +You can also nest `describe` blocks if you have a hierarchy of tests: + +```js +const binaryStringToNumber = binString => { + if (!/^[01]+$/.test(binString)) { + throw new CustomError('Not a binary number.'); + } + + return parseInt(binString, 2); +}; + +describe('binaryStringToNumber', () => { + describe('given an invalid binary string', () => { + test('composed of non-numbers throws CustomError', () => { + expect(() => binaryStringToNumber('abc')).toThrowError(CustomError); + }); + + test('with extra whitespace throws CustomError', () => { + expect(() => binaryStringToNumber(' 100')).toThrowError(CustomError); + }); + }); + + describe('given a valid binary string', () => { + test('returns the correct number', () => { + expect(binaryStringToNumber('100')).toBe(4); + }); + }); +}); +``` + +### `describe.each(table)(name, fn, timeout)` + +Use `describe.each` if you keep duplicating the same test suites with different data. `describe.each` allows you to write the test suite once and pass data in. + +`describe.each` is available with two APIs: + +#### 1. `describe.each(table)(name, fn, timeout)` + +- `table`: `Array` of Arrays with the arguments that are passed into the `fn` for each row. + - _Note_ If you pass in a 1D array of primitives, internally it will be mapped to a table i.e. `[1, 2, 3] -> [[1], [2], [3]]` +- `name`: `String` the title of the test suite. + - Generate unique test titles by positionally injecting parameters with [`printf` formatting](https://nodejs.org/api/util.html#util_util_format_format_args): + - `%p` - [pretty-format](https://www.npmjs.com/package/pretty-format). + - `%s`- String. + - `%d`- Number. + - `%i` - Integer. + - `%f` - Floating point value. + - `%j` - JSON. + - `%o` - Object. + - `%#` - Index of the test case. + - `%%` - single percent sign ('%'). This does not consume an argument. +- `fn`: `Function` the suite of tests to be ran, this is the function that will receive the parameters in each row as function arguments. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +describe.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + test(`returns ${expected}`, () => { + expect(a + b).toBe(expected); + }); + + test(`returned value not be greater than ${expected}`, () => { + expect(a + b).not.toBeGreaterThan(expected); + }); + + test(`returned value not be less than ${expected}`, () => { + expect(a + b).not.toBeLessThan(expected); + }); +}); +``` + +#### 2. 
`` describe.each`table`(name, fn, timeout) `` + +- `table`: `Tagged Template Literal` + - First row of variable name column headings separated with `|` + - One or more subsequent rows of data supplied as template literal expressions using `${value}` syntax. +- `name`: `String` the title of the test suite, use `$variable` to inject test data into the suite title from the tagged template expressions. + - To inject nested object values use you can supply a keyPath i.e. `$variable.path.to.value` +- `fn`: `Function` the suite of tests to be ran, this is the function that will receive the test data object. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +describe.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('$a + $b', ({a, b, expected}) => { + test(`returns ${expected}`, () => { + expect(a + b).toBe(expected); + }); + + test(`returned value not be greater than ${expected}`, () => { + expect(a + b).not.toBeGreaterThan(expected); + }); + + test(`returned value not be less than ${expected}`, () => { + expect(a + b).not.toBeLessThan(expected); + }); +}); +``` + +### `describe.only(name, fn)` + +Also under the alias: `fdescribe(name, fn)` + +You can use `describe.only` if you want to run only one describe block: + +```js +describe.only('my beverage', () => { + test('is delicious', () => { + expect(myBeverage.delicious).toBeTruthy(); + }); + + test('is not sour', () => { + expect(myBeverage.sour).toBeFalsy(); + }); +}); + +describe('my other beverage', () => { + // ... will be skipped +}); +``` + +### `describe.only.each(table)(name, fn)` + +Also under the aliases: `fdescribe.each(table)(name, fn)` and `` fdescribe.each`table`(name, fn) `` + +Use `describe.only.each` if you want to only run specific tests suites of data driven tests. + +`describe.only.each` is available with two APIs: + +#### `describe.only.each(table)(name, fn)` + +```js +describe.only.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + test(`returns ${expected}`, () => { + expect(a + b).toBe(expected); + }); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` describe.only.each`table`(name, fn) `` + +```js +describe.only.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + test('passes', () => { + expect(a + b).toBe(expected); + }); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `describe.skip(name, fn)` + +Also under the alias: `xdescribe(name, fn)` + +You can use `describe.skip` if you do not want to run a particular describe block: + +```js +describe('my beverage', () => { + test('is delicious', () => { + expect(myBeverage.delicious).toBeTruthy(); + }); + + test('is not sour', () => { + expect(myBeverage.sour).toBeFalsy(); + }); +}); + +describe.skip('my other beverage', () => { + // ... will be skipped +}); +``` + +Using `describe.skip` is often a cleaner alternative to temporarily commenting out a chunk of tests. + +### `describe.skip.each(table)(name, fn)` + +Also under the aliases: `xdescribe.each(table)(name, fn)` and `` xdescribe.each`table`(name, fn) `` + +Use `describe.skip.each` if you want to stop running a suite of data driven tests. 
+ +`describe.skip.each` is available with two APIs: + +#### `describe.skip.each(table)(name, fn)` + +```js +describe.skip.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + test(`returns ${expected}`, () => { + expect(a + b).toBe(expected); // will not be ran + }); +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` describe.skip.each`table`(name, fn) `` + +```js +describe.skip.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + test('will not be ran', () => { + expect(a + b).toBe(expected); // will not be ran + }); +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `test(name, fn, timeout)` + +Also under the alias: `it(name, fn, timeout)` + +All you need in a test file is the `test` method which runs a test. For example, let's say there's a function `inchesOfRain()` that should be zero. Your whole test could be: + +```js +test('did not rain', () => { + expect(inchesOfRain()).toBe(0); +}); +``` + +The first argument is the test name; the second argument is a function that contains the expectations to test. The third argument (optional) is `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +> Note: If a **promise is returned** from `test`, Jest will wait for the promise to resolve before letting the test complete. Jest will also wait if you **provide an argument to the test function**, usually called `done`. This could be handy when you want to test callbacks. See how to test async code [here](TestingAsyncCode.md#callbacks). + +For example, let's say `fetchBeverageList()` returns a promise that is supposed to resolve to a list that has `lemon` in it. You can test this with: + +```js +test('has lemon in it', () => { + return fetchBeverageList().then(list => { + expect(list).toContain('lemon'); + }); +}); +``` + +Even though the call to `test` will return right away, the test doesn't complete until the promise resolves as well. + +### `test.concurrent(name, fn, timeout)` + +Also under the alias: `it.concurrent(name, fn, timeout)` + +Use `test.concurrent` if you want the test to run concurrently. + +> Note: `test.concurrent` is considered experimental - see [here])https://github.com/facebook/jest/labels/Area%3A%20Concurrent) for details on missing features and other issues + +The first argument is the test name; the second argument is an asynchronous function that contains the expectations to test. The third argument (optional) is `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +``` +test.concurrent('addition of 2 numbers', async () => { + expect(5 + 3).toBe(8); +}); + +test.concurrent('subtraction 2 numbers', async () => { + expect(5 - 3).toBe(2); +}); +``` + +> Note: Use `maxConcurrency` in configuration to prevents Jest from executing more than the specified amount of tests at the same time + +### `test.concurrent.each(table)(name, fn, timeout)` + +Also under the alias: `it.concurrent.each(table)(name, fn, timeout)` + +Use `test.concurrent.each` if you keep duplicating the same test with different data. `test.each` allows you to write the test once and pass data in, the tests are all run asynchronously. + +`test.concurrent.each` is available with two APIs: + +#### 1. 
`test.concurrent.each(table)(name, fn, timeout)` + +- `table`: `Array` of Arrays with the arguments that are passed into the test `fn` for each row. + - _Note_ If you pass in a 1D array of primitives, internally it will be mapped to a table i.e. `[1, 2, 3] -> [[1], [2], [3]]` +- `name`: `String` the title of the test block. + - Generate unique test titles by positionally injecting parameters with [`printf` formatting](https://nodejs.org/api/util.html#util_util_format_format_args): + - `%p` - [pretty-format](https://www.npmjs.com/package/pretty-format). + - `%s`- String. + - `%d`- Number. + - `%i` - Integer. + - `%f` - Floating point value. + - `%j` - JSON. + - `%o` - Object. + - `%#` - Index of the test case. + - `%%` - single percent sign ('%'). This does not consume an argument. +- `fn`: `Function` the test to be ran, this is the function that will receive the parameters in each row as function arguments, **this will have to be an asynchronous function**. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +test.concurrent.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + expect(a + b).toBe(expected); +}); +``` + +#### 2. `` test.concurrent.each`table`(name, fn, timeout) `` + +- `table`: `Tagged Template Literal` + - First row of variable name column headings separated with `|` + - One or more subsequent rows of data supplied as template literal expressions using `${value}` syntax. +- `name`: `String` the title of the test, use `$variable` to inject test data into the test title from the tagged template expressions. + - To inject nested object values use you can supply a keyPath i.e. `$variable.path.to.value` +- `fn`: `Function` the test to be ran, this is the function that will receive the test data object, **this will have to be an asynchronous function**. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +test.concurrent.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + expect(a + b).toBe(expected); +}); +``` + +### `test.concurrent.only.each(table)(name, fn)` + +Also under the alias: `it.concurrent.only.each(table)(name, fn)` + +Use `test.concurrent.only.each` if you want to only run specific tests with different test data concurrently. + +`test.concurrent.only.each` is available with two APIs: + +#### `test.concurrent.only.each(table)(name, fn)` + +```js +test.concurrent.only.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', async (a, b, expected) => { + expect(a + b).toBe(expected); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` test.only.each`table`(name, fn) `` + +```js +test.concurrent.only.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', async ({a, b, expected}) => { + expect(a + b).toBe(expected); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `test.concurrent.skip.each(table)(name, fn)` + +Also under the alias: `it.concurrent.skip.each(table)(name, fn)` + +Use `test.concurrent.skip.each` if you want to stop running a collection of asynchronous data driven tests. 
+ +`test.concurrent.skip.each` is available with two APIs: + +#### `test.concurrent.skip.each(table)(name, fn)` + +```js +test.concurrent.skip.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', async (a, b, expected) => { + expect(a + b).toBe(expected); // will not be ran +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` test.concurrent.skip.each`table`(name, fn) `` + +```js +test.concurrent.skip.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', async ({a, b, expected}) => { + expect(a + b).toBe(expected); // will not be ran +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `test.each(table)(name, fn, timeout)` + +Also under the alias: `it.each(table)(name, fn)` and `` it.each`table`(name, fn) `` + +Use `test.each` if you keep duplicating the same test with different data. `test.each` allows you to write the test once and pass data in. + +`test.each` is available with two APIs: + +#### 1. `test.each(table)(name, fn, timeout)` + +- `table`: `Array` of Arrays with the arguments that are passed into the test `fn` for each row. + - _Note_ If you pass in a 1D array of primitives, internally it will be mapped to a table i.e. `[1, 2, 3] -> [[1], [2], [3]]` +- `name`: `String` the title of the test block. + - Generate unique test titles by positionally injecting parameters with [`printf` formatting](https://nodejs.org/api/util.html#util_util_format_format_args): + - `%p` - [pretty-format](https://www.npmjs.com/package/pretty-format). + - `%s`- String. + - `%d`- Number. + - `%i` - Integer. + - `%f` - Floating point value. + - `%j` - JSON. + - `%o` - Object. + - `%#` - Index of the test case. + - `%%` - single percent sign ('%'). This does not consume an argument. +- `fn`: `Function` the test to be ran, this is the function that will receive the parameters in each row as function arguments. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +test.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + expect(a + b).toBe(expected); +}); +``` + +#### 2. `` test.each`table`(name, fn, timeout) `` + +- `table`: `Tagged Template Literal` + - First row of variable name column headings separated with `|` + - One or more subsequent rows of data supplied as template literal expressions using `${value}` syntax. +- `name`: `String` the title of the test, use `$variable` to inject test data into the test title from the tagged template expressions. + - To inject nested object values use you can supply a keyPath i.e. `$variable.path.to.value` +- `fn`: `Function` the test to be ran, this is the function that will receive the test data object. +- Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait for each row before aborting. _Note: The default timeout is 5 seconds._ + +Example: + +```js +test.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + expect(a + b).toBe(expected); +}); +``` + +### `test.only(name, fn, timeout)` + +Also under the aliases: `it.only(name, fn, timeout)`, and `fit(name, fn, timeout)` + +When you are debugging a large test file, you will often only want to run a subset of tests. 
You can use `.only` to specify which tests are the only ones you want to run in that test file. + +Optionally, you can provide a `timeout` (in milliseconds) for specifying how long to wait before aborting. _Note: The default timeout is 5 seconds._ + +For example, let's say you had these tests: + +```js +test.only('it is raining', () => { + expect(inchesOfRain()).toBeGreaterThan(0); +}); + +test('it is not snowing', () => { + expect(inchesOfSnow()).toBe(0); +}); +``` + +Only the "it is raining" test will run in that test file, since it is run with `test.only`. + +Usually you wouldn't check code using `test.only` into source control - you would use it for debugging, and remove it once you have fixed the broken tests. + +### `test.only.each(table)(name, fn)` + +Also under the aliases: `it.only.each(table)(name, fn)`, `fit.each(table)(name, fn)`, `` it.only.each`table`(name, fn) `` and `` fit.each`table`(name, fn) `` + +Use `test.only.each` if you want to only run specific tests with different test data. + +`test.only.each` is available with two APIs: + +#### `test.only.each(table)(name, fn)` + +```js +test.only.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + expect(a + b).toBe(expected); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` test.only.each`table`(name, fn) `` + +```js +test.only.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + expect(a + b).toBe(expected); +}); + +test('will not be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `test.skip(name, fn)` + +Also under the aliases: `it.skip(name, fn)`, `xit(name, fn)`, and `xtest(name, fn)` + +When you are maintaining a large codebase, you may sometimes find a test that is temporarily broken for some reason. If you want to skip running this test, but you don't want to delete this code, you can use `test.skip` to specify some tests to skip. + +For example, let's say you had these tests: + +```js +test('it is raining', () => { + expect(inchesOfRain()).toBeGreaterThan(0); +}); + +test.skip('it is not snowing', () => { + expect(inchesOfSnow()).toBe(0); +}); +``` + +Only the "it is raining" test will run, since the other test is run with `test.skip`. + +You could comment the test out, but it's often a bit nicer to use `test.skip` because it will maintain indentation and syntax highlighting. + +### `test.skip.each(table)(name, fn)` + +Also under the aliases: `it.skip.each(table)(name, fn)`, `xit.each(table)(name, fn)`, `xtest.each(table)(name, fn)`, `` it.skip.each`table`(name, fn) ``, `` xit.each`table`(name, fn) `` and `` xtest.each`table`(name, fn) `` + +Use `test.skip.each` if you want to stop running a collection of data driven tests. 
+ +`test.skip.each` is available with two APIs: + +#### `test.skip.each(table)(name, fn)` + +```js +test.skip.each([ + [1, 1, 2], + [1, 2, 3], + [2, 1, 3], +])('.add(%i, %i)', (a, b, expected) => { + expect(a + b).toBe(expected); // will not be ran +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +#### `` test.skip.each`table`(name, fn) `` + +```js +test.skip.each` + a | b | expected + ${1} | ${1} | ${2} + ${1} | ${2} | ${3} + ${2} | ${1} | ${3} +`('returns $expected when $a is added $b', ({a, b, expected}) => { + expect(a + b).toBe(expected); // will not be ran +}); + +test('will be ran', () => { + expect(1 / 0).toBe(Infinity); +}); +``` + +### `test.todo(name)` + +Also under the alias: `it.todo(name)` + +Use `test.todo` when you are planning on writing tests. These tests will be highlighted in the summary output at the end so you know how many tests you still need todo. + +_Note_: If you supply a test callback function then the `test.todo` will throw an error. If you have already implemented the test and it is broken and you do not want it to run, then use `test.skip` instead. + +#### API + +- `name`: `String` the title of the test plan. + +Example: + +```js +const add = (a, b) => a + b; + +test.todo('add should be associative'); +``` diff --git a/website/versioned_docs/version-26.4/SnapshotTesting.md b/website/versioned_docs/version-26.4/SnapshotTesting.md new file mode 100644 index 000000000000..776b3524e1bd --- /dev/null +++ b/website/versioned_docs/version-26.4/SnapshotTesting.md @@ -0,0 +1,319 @@ +--- +id: version-26.4-snapshot-testing +title: Snapshot Testing +original_id: snapshot-testing +--- + +Snapshot tests are a very useful tool whenever you want to make sure your UI does not change unexpectedly. + +A typical snapshot test case renders a UI component, takes a snapshot, then compares it to a reference snapshot file stored alongside the test. The test will fail if the two snapshots do not match: either the change is unexpected, or the reference snapshot needs to be updated to the new version of the UI component. + +## Snapshot Testing with Jest + +A similar approach can be taken when it comes to testing your React components. Instead of rendering the graphical UI, which would require building the entire app, you can use a test renderer to quickly generate a serializable value for your React tree. Consider this [example test](https://github.com/facebook/jest/blob/master/examples/snapshot/__tests__/link.react.test.js) for a [Link component](https://github.com/facebook/jest/blob/master/examples/snapshot/Link.react.js): + +```javascript +import React from 'react'; +import Link from '../Link.react'; +import renderer from 'react-test-renderer'; + +it('renders correctly', () => { + const tree = renderer + .create(Facebook) + .toJSON(); + expect(tree).toMatchSnapshot(); +}); +``` + +The first time this test is run, Jest creates a [snapshot file](https://github.com/facebook/jest/blob/master/examples/snapshot/__tests__/__snapshots__/link.react.test.js.snap) that looks like this: + +```javascript +exports[`renders correctly 1`] = ` + + Facebook + +`; +``` + +The snapshot artifact should be committed alongside code changes, and reviewed as part of your code review process. Jest uses [pretty-format](https://github.com/facebook/jest/tree/master/packages/pretty-format) to make snapshots human-readable during code review. On subsequent test runs, Jest will compare the rendered output with the previous snapshot. If they match, the test will pass. 
If they don't match, either the test runner found a bug in your code (in this case, it's `` component) that should be fixed, or the implementation has changed and the snapshot needs to be updated. + +> Note: The snapshot is directly scoped to the data you render – in our example it's `` component with page prop passed to it. This implies that even if any other file has missing props (Say, `App.js`) in the `` component, it will still pass the test as the test doesn't know the usage of `` component and it's scoped only to the `Link.react.js`. Also, Rendering the same component with different props in other snapshot tests will not affect the first one, as the tests don't know about each other. + +More information on how snapshot testing works and why we built it can be found on the [release blog post](https://jestjs.io/blog/2016/07/27/jest-14.html). We recommend reading [this blog post](http://benmccormick.org/2016/09/19/testing-with-jest-snapshots-first-impressions/) to get a good sense of when you should use snapshot testing. We also recommend watching this [egghead video](https://egghead.io/lessons/javascript-use-jest-s-snapshot-testing-feature?pl=testing-javascript-with-jest-a36c4074) on Snapshot Testing with Jest. + +### Updating Snapshots + +It's straightforward to spot when a snapshot test fails after a bug has been introduced. When that happens, go ahead and fix the issue and make sure your snapshot tests are passing again. Now, let's talk about the case when a snapshot test is failing due to an intentional implementation change. + +One such situation can arise if we intentionally change the address the Link component in our example is pointing to. + +```javascript +// Updated test case with a Link to a different address +it('renders correctly', () => { + const tree = renderer + .create(Instagram) + .toJSON(); + expect(tree).toMatchSnapshot(); +}); +``` + +In that case, Jest will print this output: + +![](/img/content/failedSnapshotTest.png) + +Since we just updated our component to point to a different address, it's reasonable to expect changes in the snapshot for this component. Our snapshot test case is failing because the snapshot for our updated component no longer matches the snapshot artifact for this test case. + +To resolve this, we will need to update our snapshot artifacts. You can run Jest with a flag that will tell it to re-generate snapshots: + +```bash +jest --updateSnapshot +``` + +Go ahead and accept the changes by running the above command. You may also use the equivalent single-character `-u` flag to re-generate snapshots if you prefer. This will re-generate snapshot artifacts for all failing snapshot tests. If we had any additional failing snapshot tests due to an unintentional bug, we would need to fix the bug before re-generating snapshots to avoid recording snapshots of the buggy behavior. + +If you'd like to limit which snapshot test cases get re-generated, you can pass an additional `--testNamePattern` flag to re-record snapshots only for those tests that match the pattern. + +You can try out this functionality by cloning the [snapshot example](https://github.com/facebook/jest/tree/master/examples/snapshot), modifying the `Link` component, and running Jest. 
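For example, to re-record only the snapshots produced by the `Link` tests above, you could combine the two flags. The pattern below is only an illustration; adjust it to match your own test names:

```bash
jest --updateSnapshot --testNamePattern "renders correctly"
```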
+ +### Interactive Snapshot Mode + +Failed snapshots can also be updated interactively in watch mode: + +![](/img/content/interactiveSnapshot.png) + +Once you enter Interactive Snapshot Mode, Jest will step you through the failed snapshots one test at a time and give you the opportunity to review the failed output. + +From here you can choose to update that snapshot or skip to the next: + +![](/img/content/interactiveSnapshotUpdate.gif) + +Once you're finished, Jest will give you a summary before returning back to watch mode: + +![](/img/content/interactiveSnapshotDone.png) + +### Inline Snapshots + +Inline snapshots behave identically to external snapshots (`.snap` files), except the snapshot values are written automatically back into the source code. This means you can get the benefits of automatically generated snapshots without having to switch to an external file to make sure the correct value was written. + +> Inline snapshots are powered by [Prettier](https://prettier.io). To use inline snapshots you must have `prettier` installed in your project. Your Prettier configuration will be respected when writing to test files. +> +> If you have `prettier` installed in a location where Jest can't find it, you can tell Jest how to find it using the [`"prettierPath"`](./Configuration.md#prettierpath-string) configuration property. + +**Example:** + +First, you write a test, calling `.toMatchInlineSnapshot()` with no arguments: + +```javascript +it('renders correctly', () => { + const tree = renderer + .create(Prettier) + .toJSON(); + expect(tree).toMatchInlineSnapshot(); +}); +``` + +The next time you run Jest, `tree` will be evaluated, and a snapshot will be written as an argument to `toMatchInlineSnapshot`: + +```javascript +it('renders correctly', () => { + const tree = renderer + .create(Prettier) + .toJSON(); + expect(tree).toMatchInlineSnapshot(` + + Prettier + +`); +}); +``` + +That's all there is to it! You can even update the snapshots with `--updateSnapshot` or using the `u` key in `--watch` mode. + +### Property Matchers + +Often there are fields in the object you want to snapshot which are generated (like IDs and Dates). If you try to snapshot these objects, they will force the snapshot to fail on every run: + +```javascript +it('will fail every time', () => { + const user = { + createdAt: new Date(), + id: Math.floor(Math.random() * 20), + name: 'LeBron James', + }; + + expect(user).toMatchSnapshot(); +}); + +// Snapshot +exports[`will fail every time 1`] = ` +Object { + "createdAt": 2018-05-19T23:36:09.816Z, + "id": 3, + "name": "LeBron James", +} +`; +``` + +For these cases, Jest allows providing an asymmetric matcher for any property. These matchers are checked before the snapshot is written or tested, and then saved to the snapshot file instead of the received value: + +```javascript +it('will check the matchers and pass', () => { + const user = { + createdAt: new Date(), + id: Math.floor(Math.random() * 20), + name: 'LeBron James', + }; + + expect(user).toMatchSnapshot({ + createdAt: expect.any(Date), + id: expect.any(Number), + }); +}); + +// Snapshot +exports[`will check the matchers and pass 1`] = ` +Object { + "createdAt": Any, + "id": Any, + "name": "LeBron James", +} +`; +``` + +Any given value that is not a matcher will be checked exactly and saved to the snapshot: + +```javascript +it('will check the values and pass', () => { + const user = { + createdAt: new Date(), + name: 'Bond... 
James Bond', + }; + + expect(user).toMatchSnapshot({ + createdAt: expect.any(Date), + name: 'Bond... James Bond', + }); +}); + +// Snapshot +exports[`will check the values and pass 1`] = ` +Object { + "createdAt": Any, + "name": 'Bond... James Bond', +} +`; +``` + +## Best Practices + +Snapshots are a fantastic tool for identifying unexpected interface changes within your application – whether that interface is an API response, UI, logs, or error messages. As with any testing strategy, there are some best-practices you should be aware of, and guidelines you should follow, in order to use them effectively. + +### 1. Treat snapshots as code + +Commit snapshots and review them as part of your regular code review process. This means treating snapshots as you would any other type of test or code in your project. + +Ensure that your snapshots are readable by keeping them focused, short, and by using tools that enforce these stylistic conventions. + +As mentioned previously, Jest uses [`pretty-format`](https://yarnpkg.com/en/package/pretty-format) to make snapshots human-readable, but you may find it useful to introduce additional tools, like [`eslint-plugin-jest`](https://yarnpkg.com/en/package/eslint-plugin-jest) with its [`no-large-snapshots`](https://github.com/jest-community/eslint-plugin-jest/blob/master/docs/rules/no-large-snapshots.md) option, or [`snapshot-diff`](https://yarnpkg.com/en/package/snapshot-diff) with its component snapshot comparison feature, to promote committing short, focused assertions. + +The goal is to make it easy to review snapshots in pull requests, and fight against the habit of regenerating snapshots when test suites fail instead of examining the root causes of their failure. + +### 2. Tests should be deterministic + +Your tests should be deterministic. Running the same tests multiple times on a component that has not changed should produce the same results every time. You're responsible for making sure your generated snapshots do not include platform specific or other non-deterministic data. + +For example, if you have a [Clock](https://github.com/facebook/jest/blob/master/examples/snapshot/Clock.react.js) component that uses `Date.now()`, the snapshot generated from this component will be different every time the test case is run. In this case we can [mock the Date.now() method](MockFunctions.md) to return a consistent value every time the test is run: + +```js +Date.now = jest.fn(() => 1482363367071); +``` + +Now, every time the snapshot test case runs, `Date.now()` will return `1482363367071` consistently. This will result in the same snapshot being generated for this component regardless of when the test is run. + +### 3. Use descriptive snapshot names + +Always strive to use descriptive test and/or snapshot names for snapshots. The best names describe the expected snapshot content. This makes it easier for reviewers to verify the snapshots during review, and for anyone to know whether or not an outdated snapshot is the correct behavior before updating. + +For example, compare: + +```js +exports[` should handle some test case`] = `null`; + +exports[` should handle some other test case`] = ` +
+<div>
+  Alan Turing
+</div>
+`; +``` + +To: + +```js +exports[` should render null`] = `null`; + +exports[` should render Alan Turing`] = ` +
+<div>
+  Alan Turing
+</div>
+`; +``` + +Since the later describes exactly what's expected in the output, it's more clear to see when it's wrong: + +```js +exports[` should render null`] = ` +
+<div>
+  Alan Turing
+</div>
+`; + +exports[` should render Alan Turing`] = `null`; +``` + +## Frequently Asked Questions + +### Are snapshots written automatically on Continuous Integration (CI) systems? + +No, as of Jest 20, snapshots in Jest are not automatically written when Jest is run in a CI system without explicitly passing `--updateSnapshot`. It is expected that all snapshots are part of the code that is run on CI and since new snapshots automatically pass, they should not pass a test run on a CI system. It is recommended to always commit all snapshots and to keep them in version control. + +### Should snapshot files be committed? + +Yes, all snapshot files should be committed alongside the modules they are covering and their tests. They should be considered part of a test, similar to the value of any other assertion in Jest. In fact, snapshots represent the state of the source modules at any given point in time. In this way, when the source modules are modified, Jest can tell what changed from the previous version. It can also provide a lot of additional context during code review in which reviewers can study your changes better. + +### Does snapshot testing only work with React components? + +[React](TutorialReact.md) and [React Native](TutorialReactNative.md) components are a good use case for snapshot testing. However, snapshots can capture any serializable value and should be used anytime the goal is testing whether the output is correct. The Jest repository contains many examples of testing the output of Jest itself, the output of Jest's assertion library as well as log messages from various parts of the Jest codebase. See an example of [snapshotting CLI output](https://github.com/facebook/jest/blob/master/e2e/__tests__/console.test.ts) in the Jest repo. + +### What's the difference between snapshot testing and visual regression testing? + +Snapshot testing and visual regression testing are two distinct ways of testing UIs, and they serve different purposes. Visual regression testing tools take screenshots of web pages and compare the resulting images pixel by pixel. With Snapshot testing values are serialized, stored within text files, and compared using a diff algorithm. There are different trade-offs to consider and we listed the reasons why snapshot testing was built in the [Jest blog](https://jestjs.io/blog/2016/07/27/jest-14.html#why-snapshot-testing). + +### Does snapshot testing replace unit testing? + +Snapshot testing is only one of more than 20 assertions that ship with Jest. The aim of snapshot testing is not to replace existing unit tests, but to provide additional value and make testing painless. In some scenarios, snapshot testing can potentially remove the need for unit testing for a particular set of functionalities (e.g. React components), but they can work together as well. + +### What is the performance of snapshot testing regarding speed and size of the generated files? + +Jest has been rewritten with performance in mind, and snapshot testing is not an exception. Since snapshots are stored within text files, this way of testing is fast and reliable. Jest generates a new file for each test file that invokes the `toMatchSnapshot` matcher. The size of the snapshots is pretty small: For reference, the size of all snapshot files in the Jest codebase itself is less than 300 KB. + +### How do I resolve conflicts within snapshot files? + +Snapshot files must always represent the current state of the modules they are covering. 
Therefore, if you are merging two branches and encounter a conflict in the snapshot files, you can either resolve the conflict manually or update the snapshot file by running Jest and inspecting the result. + +### Is it possible to apply test-driven development principles with snapshot testing? + +Although it is possible to write snapshot files manually, that is usually not approachable. Snapshots help to figure out whether the output of the modules covered by tests is changed, rather than giving guidance to design the code in the first place. + +### Does code coverage work with snapshot testing? + +Yes, as well as with any other test. diff --git a/website/versions.json b/website/versions.json index ea2d30236e1d..995c100b74ee 100644 --- a/website/versions.json +++ b/website/versions.json @@ -1,4 +1,5 @@ [ + "26.4", "26.2", "26.0", "25.x",