% Testing the JDK

## Using "make test" (the run-test framework)

This new way of running tests is developer-centric. It assumes that you have
built a JDK locally and want to test it. Running common test targets is simple,
and more complex ad-hoc combinations of tests are possible. The user interface
is forgiving, and clearly reports errors it cannot resolve.

The main target `test` uses the jdk-image as the tested product. There is
also an alternate target `exploded-test` that uses the exploded image
instead. Not all tests will run successfully on the exploded image, but using
this target can greatly improve rebuild times for certain workflows.

Previously, `make test` was used to invoke an old system for running tests, and
`make run-test` was used for the new test framework. For backward compatibility
with scripts and muscle memory, `run-test` (and variants like
`exploded-run-test` or `run-test-tier1`) are kept as aliases.

Some example command-lines:

    $ make test-tier1
    $ make test-jdk_lang JTREG="JOBS=8"
    $ make test TEST=jdk_lang
    $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1"
    $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"
    $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java"
    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
    $ make exploded-test TEST=tier2

### Configuration

To be able to run JTReg tests, `configure` needs to know where to find the
JTReg test framework. If it is not picked up automatically by configure, use
the `--with-jtreg=<path to jtreg home>` option to point to the JTReg framework.
Note that this option should point to the JTReg home, i.e. the top directory,
containing `lib/jtreg.jar` etc. (An alternative is to set the `JT_HOME`
environment variable to point to the JTReg home before running `configure`.)

To be able to run microbenchmarks, `configure` needs to know where to find
the JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
containing the core JMH and transitive dependencies. The recommended dependencies
can be retrieved by running `sh make/devkit/createJMHBundle.sh`, after which
`--with-jmh=build/jmh/jars` should work.
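
As a concrete illustration, a configure invocation that enables both JTReg
tests and microbenchmarks could look like the following (the JTReg path is just
a placeholder for wherever your JTReg installation lives):

    $ sh make/devkit/createJMHBundle.sh
    $ bash configure --with-jtreg=/path/to/jtreg --with-jmh=build/jmh/jars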

## Test selection

All functionality is available using the `test` make target. In this use case,
the test or tests to be executed are controlled using the `TEST` variable. To
speed up subsequent test runs with no source code changes, `test-only` can be
used instead, which does not depend on the source and test image build.

For some common top-level tests, direct make targets have been generated. This
includes all JTReg test groups, the hotspot gtest, and custom tests (if
present). This means that `make test-tier1` is equivalent to `make test
TEST="tier1"`, but the former is more tab-completion friendly. For more complex
test runs, the `test TEST="x"` solution needs to be used.

The test specifications given in `TEST` are parsed into fully qualified test
descriptors, which clearly and unambiguously show which tests will be run. As an
example, `:tier1` will expand to `jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1
jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1
jtreg:$(TOPDIR)/test/nashorn:tier1 jtreg:$(TOPDIR)/test/jaxp:tier1`. You can
always submit a list of fully qualified test descriptors in the `TEST` variable
if you want to shortcut the parser.
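
For instance, if you already know which descriptors you want, they can be
passed directly with the `jtreg:` prefix, bypassing most of the parsing (the
chosen groups are only examples):

    $ make test TEST="jtreg:test/jdk:tier1 jtreg:test/langtools:tier1"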

### JTReg

JTReg tests can be selected either by picking a JTReg test group, or a selection
of files or directories containing JTReg tests.

JTReg test groups can be specified either without a test root, e.g. `:tier1`
(or `tier1`, the initial colon is optional), or with, e.g. `hotspot:tier1`,
`test/jdk:jdk_util` or `$(TOPDIR)/test/hotspot/jtreg:hotspot_all`. The test
root can be specified either as an absolute path, or a path relative to the
JDK top directory, or the `test` directory. For simplicity, the hotspot
JTReg test root, which really is `hotspot/jtreg`, can be abbreviated as
just `hotspot`.

When specified without a test root, all matching groups from all test roots
will be added. Otherwise, only the group from the specified test root will be
added.

Individual JTReg tests or directories containing JTReg tests can also be
specified, like `test/hotspot/jtreg/native_sanity/JniVersion.java` or
`hotspot/jtreg/native_sanity`. Just like for test root selection, you can
either specify an absolute path (which can even point to JTReg tests outside
the source tree), or a path relative to either the JDK top directory or the
`test` directory. `hotspot` can be used as an alias for `hotspot/jtreg` here as
well.

As long as the test groups or test paths can be uniquely resolved, you do not
need to enter the `jtreg:` prefix. If this is not possible, or if you want to
use a fully qualified test descriptor, add `jtreg:`, e.g.
`jtreg:test/hotspot/jtreg/native_sanity`.
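
Putting these selection rules together, command lines like the following should
all be accepted (reusing the paths that already appear above as examples):

    $ make test TEST="hotspot:tier1"
    $ make test TEST="hotspot/jtreg/native_sanity"
    $ make test TEST="jtreg:test/hotspot/jtreg/native_sanity/JniVersion.java"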

### Gtest

Since the Hotspot Gtest suite is so quick, the default is to run all tests.
This is specified by just `gtest`, or as a fully qualified test descriptor
`gtest:all`.

If you want, you can single out an individual test or a group of tests, for
instance `gtest:LogDecorations` or `gtest:LogDecorations.level_test_vm`. This
can be particularly useful if you want to run a shaky test repeatedly.

For Gtest, there is a separate test suite for each JVM variant. The JVM variant
is defined by adding `/<variant>` to the test descriptor, e.g.
`gtest:Log/client`. If you specify no variant, gtest will run once for each JVM
variant present (e.g. server, client). So if you only have the server JVM
present, then `gtest:all` will be equivalent to `gtest:all/server`.
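
For example, running a single Gtest test case against only the server variant,
repeated a few times, might look like this (combining the selection syntax
above with the `GTEST` keywords described later):

    $ make test TEST="gtest:LogDecorations.level_test_vm/server" GTEST="REPEAT=5"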

### Microbenchmarks

Which microbenchmarks to run is selected using a regular expression
following the `micro:` test descriptor, e.g., `micro:java.lang.reflect`. This
delegates the test selection to JMH, meaning package name, class name and even
benchmark method names can be used to select tests.

Using special characters like `|` in the regular expression is possible, but
needs to be escaped multiple times: `micro:ArrayCopy\\\\\|reflect`.
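
As an illustration, the regular expression can be made more or less specific;
both of the following forms follow the same pattern (the second, more specific
name is purely hypothetical and depends on which benchmarks exist in your tree):

    $ make test TEST="micro:java.lang.reflect"
    $ make test TEST="micro:java.lang.reflect.MethodInvoke" MICRO="FORK=1"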

### Special tests

A handful of odd tests that are not covered by any other testing framework are
accessible using the `special:` test descriptor. Currently, this includes
`failure-handler` and `make`.

  * Failure handler testing is run using `special:failure-handler` or just
    `failure-handler` as test descriptor.

  * Tests for the build system, including both makefiles and related
    functionality, are run using `special:make` or just `make` as test
    descriptor. This is equivalent to `special:make:all`.

    A specific make test can be run by supplying it as argument, e.g.
    `special:make:idea`. As a special syntax, this can also be expressed as
    `make-idea`, which allows for command lines such as `make test-make-idea`
    (see the examples after this list).

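For instance, the special test suites can be invoked like this (both spellings
of the make test are shown, following the syntax described above):

    $ make test TEST="special:failure-handler"
    $ make test TEST="special:make:idea"
    $ make test-make-idea
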
## Test results and summary

At the end of the test run, a summary of all tests run will be presented. This
will have a consistent look, regardless of what test suites were used. This is
a sample summary:

    ==============================
    Test summary
    ==============================
       TEST                                          TOTAL  PASS  FAIL ERROR
    >> jtreg:jdk/test:tier1                           1867  1865     2     0 <<
       jtreg:langtools/test:tier1                     4711  4711     0     0
       jtreg:nashorn/test:tier1                        133   133     0     0
    ==============================
    TEST FAILURE

Tests where the number of TOTAL tests does not equal the number of PASSed tests
will be considered a test failure. These are marked with the `>> ... <<` marker
for easy identification.

The classification of non-passed tests differs a bit between test suites. In
the summary, ERROR is used as a catch-all for tests that neither passed nor are
classified as failed by the framework. This might indicate a test framework
error, a timeout or other problems.

In case of test failures, `make test` will exit with a non-zero exit value.

All tests have their results stored in `build/$BUILD/test-results/$TEST_ID`,
where TEST_ID is a path-safe conversion from the fully qualified test
descriptor, e.g. for `jtreg:jdk/test:tier1` the TEST_ID is
`jtreg_jdk_test_tier1`. This path is also printed in the log at the end of the
test run.

Additional work data is stored in `build/$BUILD/test-support/$TEST_ID`. For
some frameworks, this directory might contain information that is useful in
determining the cause of a failed test.

## Test suite control

It is possible to control various aspects of the test suites using make control
variables.

These variables use a keyword=value approach to allow multiple values to be
set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
concurrency level to 1 and the timeout factor to 8. This is equivalent to
setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
means that the `JTREG` variable is parsed and verified for correctness, so
`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
would just pass unnoticed.

To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
normally eats `;`, the recommended usage is to write the assignment inside
quotes, e.g. `JTREG="...;..."`. This will also make sure spaces are preserved,
as in `JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"`.

(Other ways are possible, e.g. using backslash: `JTREG=JOBS=1\;TIMEOUT_FACTOR=8`.
Also, as a special technique, the string `%20` will be replaced with space for
certain options, e.g. `JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug`.
This can be useful if you have layers of scripts and have trouble getting
proper quoting of command line arguments through.)
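
To make the equivalence concrete, the following two invocations should behave
identically (`VERBOSE` is one of the JTReg keywords described below):

    $ make test TEST=jdk_lang JTREG="JOBS=4;VERBOSE=summary"
    $ make test TEST=jdk_lang JTREG_JOBS=4 JTREG_VERBOSE=summary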

As far as possible, the names of the keywords have been standardized between
test suites.

### General keywords (TEST_OPTS)

Some keywords are valid across different test suites. If you want to run tests
from multiple test suites, or simply do not want to care about which
test-suite-specific control variable to use, you can use the general TEST_OPTS
control variable.

There are also some keywords that apply globally to the test runner system,
not to any specific test suite. These are also available as TEST_OPTS keywords.
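
For example, rather than working out whether a setting should go into `JTREG`,
`GTEST` or `MICRO`, it can be passed once through TEST_OPTS (a sketch; the
values are arbitrary):

    $ make test TEST=tier1 TEST_OPTS="JOBS=4;JAVA_OPTIONS=-XshowSettings"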

#### JOBS

Currently only applies to JTReg.

#### TIMEOUT_FACTOR

Currently only applies to JTReg.

#### JAVA_OPTIONS

Applies to JTReg, GTest and Micro.

#### VM_OPTIONS

Applies to JTReg, GTest and Micro.

#### AOT_MODULES

Applies to JTReg and GTest.

#### JCOV

This keyword applies globally to the test runner system. If set to `true`, it
enables JCov coverage reporting for all tests run. To be useful, the JDK under
test must be run with a JDK built with JCov instrumentation (`configure
--with-jcov=<path to directory containing lib/jcov.jar>`, `make jcov-image`).

The simplest way to run tests with JCov coverage report is to use the special
target `jcov-test` instead of `test`, e.g. `make jcov-test TEST=jdk_lang`. This
will make sure the JCov image is built, and that JCov reporting is enabled.
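
A minimal end-to-end sketch, assuming JCov has been unpacked to a placeholder
path on your machine:

    $ bash configure --with-jcov=/path/to/jcov
    $ make jcov-test TEST=jdk_lang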

The JCov report is stored in `build/$BUILD/test-results/jcov-output/report`.

Please note that running with JCov reporting can be very memory intensive.

#### JCOV_DIFF_CHANGESET

While collecting code coverage with JCov, it is also possible to find coverage
for only recently changed code. JCOV_DIFF_CHANGESET specifies a source
revision. A textual report will be generated showing coverage of the diff
between the specified revision and the repository tip.

The report is stored in the
`build/$BUILD/test-results/jcov-output/diff_coverage_report` file.

### JTReg keywords

#### JOBS
The test concurrency (`-concurrency`).

Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
JOBS, except for Hotspot, where the default is *number of CPU cores/2*,
but never more than *memory size in GB/2*.

#### TIMEOUT_FACTOR
The timeout factor (`-timeoutFactor`).

Defaults to 4.

#### TEST_MODE
The test mode (`agentvm` or `othervm`).

Defaults to `agentvm`.

#### ASSERT
Enable asserts (`-ea -esa`, or none).

Set to `true` or `false`. If true, adds `-ea -esa`. Defaults to true, except
for hotspot.

#### VERBOSE
The verbosity level (`-verbose`).

Defaults to `fail,error,summary`.

#### RETAIN
What test data to retain (`-retain`).

Defaults to `fail,error`.

#### MAX_MEM
Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).

Limit memory consumption for the JTReg test framework and the VM under test.
Set to 0 to disable the limits.

Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).

#### KEYWORDS

JTReg keywords sent to JTReg using `-k`. Please be careful in making sure that
spaces and special characters (like `!`) are properly quoted. To avoid some
issues, the special value `%20` can be used instead of space.
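
For instance, to restrict a run to tests that carry a particular jtreg keyword,
something like the following could be used (both the test group and the
`headful` keyword are just illustrations; substitute whatever keyword your
tests declare with `@key`):

    $ make test TEST=jdk_awt JTREG="KEYWORDS=headful"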

#### EXTRA_PROBLEM_LISTS

Use one or more additional problem list files, in addition to the default
ProblemList.txt located at the JTReg test roots.

If multiple file names are specified, they should be separated by space (or, to
help avoid quoting issues, the special value `%20`).

The file names should be either absolute, or relative to the JTReg test root of
the tests to be run.
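
As an example, adding an extra exclude list located at the test root might look
like this (the file name is purely illustrative; point it at whatever list you
maintain):

    $ make test TEST=jdk_lang JTREG="EXTRA_PROBLEM_LISTS=ProblemList-mine.txt"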

#### RUN_PROBLEM_LISTS

Use the problem lists to select tests instead of excluding them.

Set to `true` or `false`.
If `true`, JTReg will use the `-match:` option, otherwise `-exclude:` will be
used. Default is `false`.

#### OPTIONS
Additional options to the JTReg test framework.

Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.

#### JAVA_OPTIONS
Additional Java options for running test classes (sent to JTReg as
`-javaoption`).

#### VM_OPTIONS
Additional Java options to be used when compiling and running classes (sent to
JTReg as `-vmoption`).

This option is only needed in special circumstances. To pass Java options to
your test classes, use `JAVA_OPTIONS`.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

#### RETRY_COUNT

Retry failed tests up to a set number of times. Defaults to 0.

### Gtest keywords

#### REPEAT
The number of times to repeat the tests (`--gtest_repeat`).

Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
problem.
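
For example, hammering on a single unstable Gtest case until it fails could
look like this (combining REPEAT with the OPTIONS keyword described next):

    $ make test TEST="gtest:LogDecorations" GTEST="REPEAT=-1;OPTIONS=--gtest_break_on_failure"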

#### OPTIONS
Additional options to the Gtest test framework.

Use `GTEST="OPTIONS=--help"` to see all available Gtest options.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

### Microbenchmark keywords

#### FORK
Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.

#### ITER
Number of measurement iterations per fork. Same as specifying `-i <num>`.

#### TIME
Amount of time to spend in each measurement iteration, in seconds. Same as
specifying `-r <num>`.

#### WARMUP_ITER
Number of warmup iterations to run before the measurement phase in each fork.
Same as specifying `-wi <num>`.

#### WARMUP_TIME
Amount of time to spend in each warmup iteration. Same as specifying `-w <num>`.

#### RESULTS_FORMAT
Specify to have the test run save a log of the values. Accepts the same values
as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.

#### VM_OPTIONS
Additional VM arguments to provide to forked off VMs. Same as `-jvmArgs <args>`.

#### OPTIONS
Additional arguments to send to JMH.
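
Combining several of these keywords, a shortened exploratory run that saves
JSON results might look like this (the values shown are arbitrary):

    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2;ITER=3;TIME=1;RESULTS_FORMAT=json"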

## Notes for Specific Tests

### Docker Tests

Docker tests with default parameters may fail on systems with glibc versions not
compatible with the one used in the default docker image (e.g., Oracle Linux 7.6 for x86).
For example, they pass on Ubuntu 16.04 but fail on Ubuntu 18.04 if run like this on x86:

    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"

To run these tests correctly on Ubuntu 18.04, additional parameters selecting a
suitable docker image must be supplied using `JAVA_OPTIONS`:

    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu -Djdk.test.docker.image.version=latest"

### Non-US locale

If your locale is non-US, some tests are likely to fail. To work around this,
you can set the locale to US. On Unix platforms, simply setting `LANG="en_US"`
in the environment before running tests should work. On Windows, setting
`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but
not all, test cases. For example:

    $ export LANG="en_US" && make test TEST=...
    $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...

### PKCS11 Tests

It is highly recommended to use the latest NSS version when running PKCS11
tests. An improper NSS version may lead to unexpected failures which are hard
to diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may
fail on Ubuntu 18.04 with the default NSS version in the system. To run these
tests correctly, the system property `test.nss.lib.paths` is required on
Ubuntu 18.04 to specify the alternative NSS lib directories. For example:

    $ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"

For more notes about the PKCS11 tests, please refer to test/jdk/sun/security/pkcs11/README.

### Client UI Tests

Some Client UI tests use key sequences which may be reserved by the operating
system. Usually this causes the test to fail, so it is highly recommended to
disable system key shortcuts prior to testing. The steps to access and disable
system key shortcuts for various platforms are provided below.

#### MacOS
Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
select or deselect desired shortcut.

For example, test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java fails
on MacOS because it uses the `CTRL + F1` key sequence to show or hide a tooltip
message, but the key combination is reserved by the operating system. To run
the test correctly, the default global key shortcut should be disabled using
the steps described above: deselect the "Turn keyboard access on or off"
option, which is responsible for the `CTRL + F1` combination.

#### Linux
Open the Activities overview and start typing Settings; Choose Settings, click Devices,
then click Keyboard; set or override desired shortcut.

#### Windows
Type `gpedit` in the Search and then click Edit group policy; navigate to
User Configuration -> Administrative Templates -> Windows Components -> File Explorer;
in the right-side pane look for "Turn off Windows key hotkeys" and double click on it;
enable or disable hotkeys.

Note: a restart is required to make the settings take effect.

---
# Override some definitions in the global css file that are not optimal for
# this document.
header-includes:
 - '<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>'
---