sasjs test

The sasjs test command triggers deployed SAS unit tests for execution, and collects the test results in JSON, JUnit XML, LCOV and CSV formats.

Results are also displayed in the console, as follows:

(screenshot: sas tests console output)

Tests are compiled & deployed as services (STPs in SAS 9, Jobs in Viya, Stored Programs in SASjs/server). In this way, every test is completely isolated with its own SAS session.

To create a test, simply create a file with the same name as the Job / Service / Macro being tested, but with a .test.sas extension. If you have multiple tests, you can add a .test.[integer].sas extension, and the tests will proceed according to the integers provided.

You can send back one or more test results in a single program by creating a table called work.test_results with the following entries:

test_description          test_result  test_comments
Some description          PASS         Some run related comment
Another test description  FAIL         some explanation of the failure

The results should be sent back using the precompiled webout macro (which could be invoked in your termProgram entry):

/* do some tests */
data some_sas_tests;
  set whatever_you_like;
run;

/* create a test_results table */
data work.test_results;
  /* mandatory values */
  test_description="some description";
  test_result="PASS"; /* or FAIL */
  /* optional value */
  test_comments="We did this & that happened";
run;

/* send it back with the precompiled webout macro */
%webout(OBJ, test_results)

Examples of tests for SAS Macros are available in the SASjs/CORE library. There are also a series of assertion macros available.

Test Locations

Tests will only be compiled if they exist in a folder listed in one of the following sasjsconfig arrays:


sasjs test <filteringString> --source <testFlowPath> --outDirectory <folderPath> -t <targetName> --ignoreFail
  • Providing filteringString is optional. If not present, all tests mentioned in the test flow file will be executed.
  • Providing the source flag is optional. If not present, the CLI will use the test flow located at sasjsbuild/testFlow.json (created when running sasjs build).
  • Providing the outDirectory flag is optional. If not present, the CLI will save outputs into the temporary sasjsresults folder.
  • Providing the ignore fail flag (--ignoreFail or -iF) is optional. If present, the CLI will return exit code 0 even if tests fail. Useful when the requirement is not to make the CI pipeline fail.


Execute all tests for the default target:

sasjs test

Execute all tests in the macros folder:

sasjs test /macros/

Execute all tests starting with "mv_" and save the output in the 'myresults' folder:

sasjs test mv_ --outDirectory /somedir/myresults

Prevent command fail (for example in CI Pipeline):

sasjs test --ignoreFail


Test configuration can be provided at root or target level. Configuration example:

"testConfig": {
  "initProgram": "sasjs/tests/",
  "termProgram": "sasjs/tests/",
  "macroVars": {
    "testVar": "testValue"
  "testSetUp": "sasjs/tests/",
  "testTearDown": "sasjs/tests/"
  • testSetUp will be executed prior to all tests
  • testTearDown will be executed after all tests have finished
  • initProgram is inserted at the start of every test
  • termProgram is inserted at the end of every test
  • macroVars are defined at the start of every test
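The ordering above can be sketched as follows. This is an illustrative Python sketch, not the SASjs CLI implementation; the function and parameter names are hypothetical, and it assumes simple line concatenation:

```python
# Illustrative sketch only: demonstrates the documented ordering
# (macroVars first, then initProgram, then the test body, then termProgram).
def compile_test(test_body, macro_vars, init_program, term_program):
    """Assemble a compiled test program in the documented order."""
    macro_var_lines = "\n".join(
        f"%let {name}={value};" for name, value in macro_vars.items()
    )
    return "\n".join([macro_var_lines, init_program, test_body, term_program])

compiled = compile_test(
    test_body="/* the test itself */",
    macro_vars={"testVar": "testValue"},
    init_program="/* initProgram contents */",
    term_program="/* termProgram contents */",
)
print(compiled)
```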

File Name Convention

Only file names that match the following pattern will be considered tests:

  <name>.test.sas
  <name>.test.<testInteger>.sas

  • Providing a test integer is optional. If provided, the tests will be executed according to numerical order - eg first and second.


A SAS Service, Job or Macro is considered covered if there is a test file with the same filename, for example:

  services/some_service.sas
  services/some_service.test.sas
  jobs/some_job.sas
  macros/some_macro.test.sas

In the example above, some_service will be considered covered, some_job will be considered not covered, and some_macro.test will be considered as a standalone test.

Overall coverage is displayed, along with a group summary for Jobs, Services and Macros.
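The coverage rule can be sketched as follows. This is a hypothetical Python helper using the example names from this page, not the CLI's actual implementation:

```python
# Classify items per the documented rule: an item is covered when a matching
# <name>.test file exists; a test with no matching item is standalone.
def coverage(items, test_files):
    """items: compiled object names; test_files: test names ending '.test'."""
    tested = {t[: -len(".test")] for t in test_files if t.endswith(".test")}
    covered = sorted(set(items) & tested)
    not_covered = sorted(set(items) - tested)
    standalone = sorted(tested - set(items))
    return covered, not_covered, standalone

print(coverage(["some_service", "some_job"], ["some_service.test", "some_macro.test"]))
# (['some_service'], ['some_job'], ['some_macro'])
```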

(screenshot: sas test coverage summary)


We are planning a more 'intelligent' coverage system that can detect whether a macro / service / job was executed as part of the test suite. If this would be helpful to your project, do get in touch!

Test Body

An example of a test that provides a result:

data work.test_results;
  test_description="some description";
  test_result="PASS";
  test_comments="We did this & that happened";
run;
%webout(OBJ, test_results)

Providing the test_results table with a test_result variable is required, in order for the frontend to determine if the test is a PASS or FAIL. The webout() macro definition will be deployed as precode in the compiled test, and is essentially a wrapper around the relevant SASjs Core webout macro, chosen according to the serverType of the target.

Test Flow

SAS unit tests will be executed one after another. Execution order is described in sasjsbuild/testFlow.json which is created as part of compilation process (sasjs compile).

Test Results

By default test results will be saved in the sasjsresults folder. An example of sasjsresults folder structure:

├── logs
│  ├── macros_some_macro.test.1.log
│  ├── macros_some_macro.test.log
│  ├── services_some_service.test.log
│  ├── jobs_some_job.test.log
│  ├── testteardown.log
│  └── testsetup.log
├── testResults.csv
├── testResults.json
└── testResults.xml

Results are saved in CSV, JSON and JUnit XML formats.
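Because testResults.xml uses the standard JUnit XML format, any JUnit-aware tool can read it. The sketch below parses a minimal, made-up JUnit document with Python's standard library and counts passes and failures; the sample content is illustrative, not real sasjs output:

```python
import xml.etree.ElementTree as ET

# Minimal illustrative JUnit XML document (not actual sasjs output).
SAMPLE = """<testsuites>
  <testsuite name="macros" tests="2" failures="1">
    <testcase name="some_macro.test.1"/>
    <testcase name="some_macro.test.2">
      <failure message="assertion failed"/>
    </testcase>
  </testsuite>
</testsuites>"""

def summarise(junit_xml):
    """Count passing and failing testcases in a JUnit XML document."""
    root = ET.fromstring(junit_xml)
    results = {"pass": 0, "fail": 0}
    for case in root.iter("testcase"):
        # A testcase with a <failure> child counts as a failure.
        if case.find("failure") is not None:
            results["fail"] += 1
        else:
            results["pass"] += 1
    return results

print(summarise(SAMPLE))  # {'pass': 1, 'fail': 1}
```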

Assertion Macros

A number of ready-made assertion macros are available in the SASjs Core library.

Running SAS Tests with SASjs

In order to run tests, take the following steps:

  1. Provide tests configuration (testConfig) in the sasjs/sasjsconfig.json file
  2. Create test files in services, jobs or macro folders (with a .test.sas extension).
  3. Execute sasjs cbd -t <targetName> to compile and deploy the tests as isolated SAS web services
  4. Execute sasjs test -t <targetName>
  5. Visit the local sasjsresults folder to review results.

To assist with debugging, all logs are captured, and we generate a URL so that you can easily click and re-run any particular individual test.

CSV Format: (screenshot: sas test results in CSV)

JSON Format: (screenshot: sas test results in JSON)

JUnit XML Format: (screenshot: sas test results in XML)

Console Output: (screenshot: sas test results in the console)

Demo Video

A demonstration of sasjs test by the developer, Yury Shkoda, was provided at a Sasensei SID event in July 2021 - available below.