linux-kselftest-kunit-5.17-rc1

This KUnit update for Linux 5.17-rc1 consists of several fixes and
 enhancements. A few highlights:
 
 - New --kconfig_add option allows easily tweaking kunitconfigs
 - make build subcommand can reconfigure if needed
 - doesn't error on tests without test plans
 - doesn't crash if no parameters are generated
 - defaults --jobs to # of CPUs
 - reports test parameter results as (K)TAP subtests
 -----BEGIN PGP SIGNATURE-----
 
 iQIzBAABCgAdFiEEPZKym/RZuOCGeA/kCwJExA0NQxwFAmHY3T4ACgkQCwJExA0N
 QxwpSA//ZuAuMvAjedj0lgCBU5ocBQAHs7RsTmo6n3ORdTgZ/hjWF9dyyAgvIcb1
 x+BW2M0KXVvpsl5UEuyWz1jQAc1aT4DCMJp/vUYeuwDXqtPxioZhJ9XeGtT+pBDy
 L6GoJeZYQXIGGnRigF0QDY9gQsmvGMQFSJ/NIADeU7XUqlyZlLMgWWa2fO3OKYw+
 33nUBFgObyElGwikyvjACiG+jSZgq9a0eWW1mdZ06sLa7Z+cZvsAyBa4bSdvoQt3
 9s+3JAEHzQEDBwwRt2na6p18m3AA5vi8xyeu7Xz/0agv17TSPuKofx0L7F60sIQW
 oAyHQkHSj9X9s67kjCobu3TlswwsOaB4TEIOolHoqHjrwRPrQGcE4gddyVPGvs52
 3Iu8lAgiCUjNbXKMcEismjrqWe8o4ICk+uVRnAOWjGT4zF/XmAtXnwM4ddZmoFZM
 mS/UmJscoTSV8wxN0QHcZw6TADvX+QNmdOMe3AlQMhhsIklmaWFg5Pf91QafbjST
 yBkXPoqbFlfpKUJ7oCzK3MvvmFzhBOTMIO2lWTSlMPR5xIw/wUR9Go0rKBCm29rf
 YPgwvM1RPkyY+37ZTbPqgpX0oIw5VTRteYdMJTDUzyO4nqSWCp8QYeIKUT/4YJqc
 mY7+wNdqhuXHdvVbsPvObeWqw7DDYZySVf2QJeta7dycBcMYKcE=
 =vGqB
 -----END PGP SIGNATURE-----

Merge tag 'linux-kselftest-kunit-5.17-rc1' of git://git.kernel.org/pub/scm/linux/kernel/git/shuah/linux-kselftest

Pull KUnit updates from Shuah Khan:
 "This consists of several fixes and enhancements. A few highlights:

   - New --kconfig_add option allows easily tweaking kunitconfigs

   - make build subcommand can reconfigure if needed

   - doesn't error on tests without test plans

   - doesn't crash if no parameters are generated

   - defaults --jobs to # of CPUs

   - reports test parameter results as (K)TAP subtests"
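The new --jobs default tracks how many CPUs the process may actually use, instead of a fixed value. A minimal, Linux-only sketch of that logic (the helper name matches the `get_default_jobs()` this pull adds to kunit.py):

```python
import os

def get_default_jobs() -> int:
    # Count only the CPUs this process is allowed to run on
    # (respects taskset/cgroup affinity), not every CPU in the machine.
    return len(os.sched_getaffinity(0))

print(get_default_jobs())
```

`os.sched_getaffinity` is only available on Linux, which is fine for a kernel build tool; a portable script would fall back to `os.cpu_count()`.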

* tag 'linux-kselftest-kunit-5.17-rc1' of git://git.kernel.org/pub/scm/linux/kernel/git/shuah/linux-kselftest:
  kunit: tool: Default --jobs to number of CPUs
  kunit: tool: fix newly introduced typechecker errors
  kunit: tool: make `build` subcommand also reconfigure if needed
  kunit: tool: delete kunit_parser.TestResult type
  kunit: tool: use dataclass instead of collections.namedtuple
  kunit: tool: suggest using decode_stacktrace.sh on kernel crash
  kunit: tool: reconfigure when the used kunitconfig changes
  kunit: tool: revamp message for invalid kunitconfig
  kunit: tool: add --kconfig_add to allow easily tweaking kunitconfigs
  kunit: tool: move Kconfig read_from_file/parse_from_string to package-level
  kunit: tool: print parsed test results fully incrementally
  kunit: Report test parameter results as (K)TAP subtests
  kunit: Don't crash if no parameters are generated
  kunit: tool: Report an error if any test has no subtests
  kunit: tool: Do not error on tests without test plans
  kunit: add run_checks.py script to validate kunit changes
  Documentation: kunit: remove claims that kunit is a mocking framework
  kunit: tool: fix --json output for skipped tests
This commit is contained in:
Linus Torvalds 2022-01-10 12:16:48 -08:00
commit bf4eebf8cf
13 changed files with 476 additions and 200 deletions

Documentation/dev-tools/kunit/api/index.rst:

@@ -12,5 +12,4 @@ following sections:
 Documentation/dev-tools/kunit/api/test.rst

- - documents all of the standard testing API excluding mocking
-   or mocking related features.
+ - documents all of the standard testing API.

Documentation/dev-tools/kunit/api/test.rst:

@@ -4,8 +4,7 @@
 Test API
 ========

-This file documents all of the standard testing API excluding mocking or mocking
-related features.
+This file documents all of the standard testing API.

 .. kernel-doc:: include/kunit/test.h
    :internal:

Documentation/dev-tools/kunit/index.rst:

@@ -19,7 +19,7 @@ KUnit - Unit Testing for the Linux Kernel
 What is KUnit?
 ==============

-KUnit is a lightweight unit testing and mocking framework for the Linux kernel.
+KUnit is a lightweight unit testing framework for the Linux kernel.

 KUnit is heavily inspired by JUnit, Python's unittest.mock, and
 Googletest/Googlemock for C++. KUnit provides facilities for defining unit test

Documentation/dev-tools/kunit/start.rst:

@@ -50,10 +50,10 @@ It'll warn you if you haven't included the dependencies of the options you're
 using.

 .. note::
-   Note that removing something from the ``.kunitconfig`` will not trigger a
-   rebuild of the ``.config`` file: the configuration is only updated if the
-   ``.kunitconfig`` is not a subset of ``.config``. This means that you can use
-   other tools (such as make menuconfig) to adjust other config options.
+   If you change the ``.kunitconfig``, kunit.py will trigger a rebuild of the
+   ``.config`` file. But you can edit the ``.config`` file directly or with
+   tools like ``make menuconfig O=.kunit``. As long as its a superset of
+   ``.kunitconfig``, kunit.py won't overwrite your changes.

 Running the tests (KUnit Wrapper)

lib/kunit/test.c:

@@ -504,25 +504,28 @@ int kunit_run_tests(struct kunit_suite *suite)
 		struct kunit_result_stats param_stats = { 0 };
 		test_case->status = KUNIT_SKIPPED;

-		if (test_case->generate_params) {
+		if (!test_case->generate_params) {
+			/* Non-parameterised test. */
+			kunit_run_case_catch_errors(suite, test_case, &test);
+			kunit_update_stats(&param_stats, test.status);
+		} else {
 			/* Get initial param. */
 			param_desc[0] = '\0';
 			test.param_value = test_case->generate_params(NULL, param_desc);
-		}
+			kunit_log(KERN_INFO, &test, KUNIT_SUBTEST_INDENT KUNIT_SUBTEST_INDENT
+				  "# Subtest: %s", test_case->name);

-		do {
-			kunit_run_case_catch_errors(suite, test_case, &test);
+			while (test.param_value) {
+				kunit_run_case_catch_errors(suite, test_case, &test);

-			if (test_case->generate_params) {
 				if (param_desc[0] == '\0') {
 					snprintf(param_desc, sizeof(param_desc),
 						 "param-%d", test.param_index);
 				}

 				kunit_log(KERN_INFO, &test,
-					  KUNIT_SUBTEST_INDENT
-					  "# %s: %s %d - %s",
-					  test_case->name,
+					  KUNIT_SUBTEST_INDENT KUNIT_SUBTEST_INDENT
+					  "%s %d - %s",
 					  kunit_status_to_ok_not_ok(test.status),
 					  test.param_index + 1, param_desc);
@@ -530,11 +533,11 @@ int kunit_run_tests(struct kunit_suite *suite)
 				param_desc[0] = '\0';
 				test.param_value = test_case->generate_params(test.param_value, param_desc);
 				test.param_index++;
-			}

-			kunit_update_stats(&param_stats, test.status);
-		} while (test.param_value);
+				kunit_update_stats(&param_stats, test.status);
+			}
+		}

 		kunit_print_test_stats(&test, param_stats);

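The reworked logging in lib/kunit/test.c nests each parameter's result one indent level deeper and drops the redundant test-case name from every line. A rough Python sketch of the resulting (K)TAP line shapes, assuming KUNIT_SUBTEST_INDENT is four spaces (its usual definition in include/kunit/test.h):

```python
SUBTEST_INDENT = "    "  # assumed value of KUNIT_SUBTEST_INDENT

def subtest_header(test_name: str) -> str:
    # Printed once per parameterised test case, at double indent.
    return SUBTEST_INDENT * 2 + "# Subtest: " + test_name

def param_result(ok: bool, index: int, desc: str) -> str:
    # Each parameter now reports as its own (K)TAP test line.
    status = "ok" if ok else "not ok"
    return SUBTEST_INDENT * 2 + f"{status} {index} - {desc}"

print(subtest_header("example_params_test"))
print(param_result(True, 1, "param-0"))
```

This is what lets kunit.py's parser count every parameter as a subtest instead of folding them into one result.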
tools/testing/kunit/kunit.py:

@@ -15,38 +15,57 @@ import time

 assert sys.version_info >= (3, 7), "Python version is too old"

-from collections import namedtuple
+from dataclasses import dataclass
 from enum import Enum, auto
-from typing import Iterable, Sequence, List
+from typing import Any, Iterable, Sequence, List, Optional

 import kunit_json
 import kunit_kernel
 import kunit_parser

-KunitResult = namedtuple('KunitResult', ['status','result','elapsed_time'])
-
-KunitConfigRequest = namedtuple('KunitConfigRequest',
-                                ['build_dir', 'make_options'])
-KunitBuildRequest = namedtuple('KunitBuildRequest',
-                               ['jobs', 'build_dir', 'alltests',
-                                'make_options'])
-KunitExecRequest = namedtuple('KunitExecRequest',
-                              ['timeout', 'build_dir', 'alltests',
-                               'filter_glob', 'kernel_args', 'run_isolated'])
-KunitParseRequest = namedtuple('KunitParseRequest',
-                               ['raw_output', 'build_dir', 'json'])
-KunitRequest = namedtuple('KunitRequest', ['raw_output','timeout', 'jobs',
-                                           'build_dir', 'alltests', 'filter_glob',
-                                           'kernel_args', 'run_isolated', 'json', 'make_options'])
-
-KernelDirectoryPath = sys.argv[0].split('tools/testing/kunit/')[0]
-
 class KunitStatus(Enum):
     SUCCESS = auto()
     CONFIG_FAILURE = auto()
     BUILD_FAILURE = auto()
     TEST_FAILURE = auto()

+@dataclass
+class KunitResult:
+    status: KunitStatus
+    result: Any
+    elapsed_time: float
+
+@dataclass
+class KunitConfigRequest:
+    build_dir: str
+    make_options: Optional[List[str]]
+
+@dataclass
+class KunitBuildRequest(KunitConfigRequest):
+    jobs: int
+    alltests: bool
+
+@dataclass
+class KunitParseRequest:
+    raw_output: Optional[str]
+    build_dir: str
+    json: Optional[str]
+
+@dataclass
+class KunitExecRequest(KunitParseRequest):
+    timeout: int
+    alltests: bool
+    filter_glob: str
+    kernel_args: Optional[List[str]]
+    run_isolated: Optional[str]
+
+@dataclass
+class KunitRequest(KunitExecRequest, KunitBuildRequest):
+    pass
+
+KernelDirectoryPath = sys.argv[0].split('tools/testing/kunit/')[0]
+
 def get_kernel_root_path() -> str:
     path = sys.argv[0] if not __file__ else __file__
     parts = os.path.realpath(path).split('tools/testing/kunit')
@@ -91,6 +110,14 @@ def build_tests(linux: kunit_kernel.LinuxSourceTree,
                'built kernel successfully',
                build_end - build_start)

+def config_and_build_tests(linux: kunit_kernel.LinuxSourceTree,
+                           request: KunitBuildRequest) -> KunitResult:
+    config_result = config_tests(linux, request)
+    if config_result.status != KunitStatus.SUCCESS:
+        return config_result
+
+    return build_tests(linux, request)
+
 def _list_tests(linux: kunit_kernel.LinuxSourceTree, request: KunitExecRequest) -> List[str]:
     args = ['kunit.action=list']
     if request.kernel_args:
@@ -121,8 +148,7 @@ def _suites_from_test_list(tests: List[str]) -> List[str]:

-def exec_tests(linux: kunit_kernel.LinuxSourceTree, request: KunitExecRequest,
-               parse_request: KunitParseRequest) -> KunitResult:
+def exec_tests(linux: kunit_kernel.LinuxSourceTree, request: KunitExecRequest) -> KunitResult:
     filter_globs = [request.filter_glob]
     if request.run_isolated:
         tests = _list_tests(linux, request)
@@ -147,17 +173,23 @@ def exec_tests(linux: kunit_kernel.LinuxSourceTree, request: KunitExecRequest) -> KunitResult:
             filter_glob=filter_glob,
             build_dir=request.build_dir)

-        result = parse_tests(parse_request, run_result)
+        result = parse_tests(request, run_result)
         # run_kernel() doesn't block on the kernel exiting.
         # That only happens after we get the last line of output from `run_result`.
         # So exec_time here actually contains parsing + execution time, which is fine.
         test_end = time.time()
         exec_time += test_end - test_start

-        test_counts.add_subtest_counts(result.result.test.counts)
+        test_counts.add_subtest_counts(result.result.counts)
+
+    if len(filter_globs) == 1 and test_counts.crashed > 0:
+        bd = request.build_dir
+        print('The kernel seems to have crashed; you can decode the stack traces with:')
+        print('$ scripts/decode_stacktrace.sh {}/vmlinux {} < {} | tee {}/decoded.log | {} parse'.format(
+                bd, bd, kunit_kernel.get_outfile_path(bd), bd, sys.argv[0]))

     kunit_status = _map_to_overall_status(test_counts.get_status())
-    return KunitResult(status=kunit_status, result=result.result, elapsed_time=exec_time)
+    return KunitResult(status=kunit_status, result=result, elapsed_time=exec_time)

 def _map_to_overall_status(test_status: kunit_parser.TestStatus) -> KunitStatus:
     if test_status in (kunit_parser.TestStatus.SUCCESS, kunit_parser.TestStatus.SKIPPED):
@@ -168,14 +200,12 @@ def _map_to_overall_status(test_status: kunit_parser.TestStatus) -> KunitStatus:
 def parse_tests(request: KunitParseRequest, input_data: Iterable[str]) -> KunitResult:
     parse_start = time.time()

-    test_result = kunit_parser.TestResult(kunit_parser.TestStatus.SUCCESS,
-                                          kunit_parser.Test(),
-                                          'Tests not Parsed.')
+    test_result = kunit_parser.Test()

     if request.raw_output:
         # Treat unparsed results as one passing test.
-        test_result.test.status = kunit_parser.TestStatus.SUCCESS
-        test_result.test.counts.passed = 1
+        test_result.status = kunit_parser.TestStatus.SUCCESS
+        test_result.counts.passed = 1

         output: Iterable[str] = input_data
         if request.raw_output == 'all':
@@ -193,7 +223,7 @@ def parse_tests(request: KunitParseRequest, input_data: Iterable[str]) -> KunitResult:
     if request.json:
         json_obj = kunit_json.get_json_result(
-                    test_result=test_result,
+                    test=test_result,
                     def_config='kunit_defconfig',
                     build_dir=request.build_dir,
                     json_path=request.json)
@@ -211,27 +241,15 @@ def run_tests(linux: kunit_kernel.LinuxSourceTree,
           request: KunitRequest) -> KunitResult:
     run_start = time.time()

-    config_request = KunitConfigRequest(request.build_dir,
-                                        request.make_options)
-    config_result = config_tests(linux, config_request)
+    config_result = config_tests(linux, request)
     if config_result.status != KunitStatus.SUCCESS:
         return config_result

-    build_request = KunitBuildRequest(request.jobs, request.build_dir,
-                                      request.alltests,
-                                      request.make_options)
-    build_result = build_tests(linux, build_request)
+    build_result = build_tests(linux, request)
     if build_result.status != KunitStatus.SUCCESS:
         return build_result

-    exec_request = KunitExecRequest(request.timeout, request.build_dir,
-                                    request.alltests, request.filter_glob,
-                                    request.kernel_args, request.run_isolated)
-    parse_request = KunitParseRequest(request.raw_output,
-                                      request.build_dir,
-                                      request.json)
-
-    exec_result = exec_tests(linux, exec_request, parse_request)
+    exec_result = exec_tests(linux, request)

     run_end = time.time()
@@ -264,6 +282,9 @@ def massage_argv(argv: Sequence[str]) -> Sequence[str]:
             return f'{arg}={pseudo_bool_flag_defaults[arg]}'
     return list(map(massage_arg, argv))

+def get_default_jobs() -> int:
+    return len(os.sched_getaffinity(0))
+
 def add_common_opts(parser) -> None:
     parser.add_argument('--build_dir',
                         help='As in the make command, it specifies the build '
@@ -280,6 +301,10 @@ def add_common_opts(parser) -> None:
                         ' If given a directory, (e.g. lib/kunit), "/.kunitconfig" '
                         'will get automatically appended.',
                         metavar='kunitconfig')
+    parser.add_argument('--kconfig_add',
+                        help='Additional Kconfig options to append to the '
+                        '.kunitconfig, e.g. CONFIG_KASAN=y. Can be repeated.',
+                        action='append')

     parser.add_argument('--arch',
                         help=('Specifies the architecture to run tests under. '
@@ -310,7 +335,7 @@ def add_build_opts(parser) -> None:
     parser.add_argument('--jobs',
                         help='As in the make command, "Specifies the number of '
                         'jobs (commands) to run simultaneously."',
-                        type=int, default=8, metavar='jobs')
+                        type=int, default=get_default_jobs(), metavar='jobs')

 def add_exec_opts(parser) -> None:
     parser.add_argument('--timeout',
@@ -398,20 +423,21 @@ def main(argv, linux=None):
         if not linux:
             linux = kunit_kernel.LinuxSourceTree(cli_args.build_dir,
                     kunitconfig_path=cli_args.kunitconfig,
+                    kconfig_add=cli_args.kconfig_add,
                     arch=cli_args.arch,
                     cross_compile=cli_args.cross_compile,
                     qemu_config_path=cli_args.qemu_config)

-        request = KunitRequest(cli_args.raw_output,
-                               cli_args.timeout,
-                               cli_args.jobs,
-                               cli_args.build_dir,
-                               cli_args.alltests,
-                               cli_args.filter_glob,
-                               cli_args.kernel_args,
-                               cli_args.run_isolated,
-                               cli_args.json,
-                               cli_args.make_options)
+        request = KunitRequest(build_dir=cli_args.build_dir,
+                               make_options=cli_args.make_options,
+                               jobs=cli_args.jobs,
+                               alltests=cli_args.alltests,
+                               raw_output=cli_args.raw_output,
+                               json=cli_args.json,
+                               timeout=cli_args.timeout,
+                               filter_glob=cli_args.filter_glob,
+                               kernel_args=cli_args.kernel_args,
+                               run_isolated=cli_args.run_isolated)
         result = run_tests(linux, request)
         if result.status != KunitStatus.SUCCESS:
             sys.exit(1)
@@ -423,12 +449,13 @@ def main(argv, linux=None):
         if not linux:
             linux = kunit_kernel.LinuxSourceTree(cli_args.build_dir,
                     kunitconfig_path=cli_args.kunitconfig,
+                    kconfig_add=cli_args.kconfig_add,
                     arch=cli_args.arch,
                     cross_compile=cli_args.cross_compile,
                     qemu_config_path=cli_args.qemu_config)

-        request = KunitConfigRequest(cli_args.build_dir,
-                                     cli_args.make_options)
+        request = KunitConfigRequest(build_dir=cli_args.build_dir,
+                                     make_options=cli_args.make_options)
         result = config_tests(linux, request)
         kunit_parser.print_with_timestamp((
             'Elapsed time: %.3fs\n') % (
@@ -439,15 +466,16 @@ def main(argv, linux=None):
         if not linux:
             linux = kunit_kernel.LinuxSourceTree(cli_args.build_dir,
                     kunitconfig_path=cli_args.kunitconfig,
+                    kconfig_add=cli_args.kconfig_add,
                     arch=cli_args.arch,
                     cross_compile=cli_args.cross_compile,
                     qemu_config_path=cli_args.qemu_config)

-        request = KunitBuildRequest(cli_args.jobs,
-                                    cli_args.build_dir,
-                                    cli_args.alltests,
-                                    cli_args.make_options)
-        result = build_tests(linux, request)
+        request = KunitBuildRequest(build_dir=cli_args.build_dir,
+                                    make_options=cli_args.make_options,
+                                    jobs=cli_args.jobs,
+                                    alltests=cli_args.alltests)
+        result = config_and_build_tests(linux, request)
         kunit_parser.print_with_timestamp((
             'Elapsed time: %.3fs\n') % (
                 result.elapsed_time))
@@ -457,20 +485,20 @@ def main(argv, linux=None):
         if not linux:
             linux = kunit_kernel.LinuxSourceTree(cli_args.build_dir,
                     kunitconfig_path=cli_args.kunitconfig,
+                    kconfig_add=cli_args.kconfig_add,
                     arch=cli_args.arch,
                     cross_compile=cli_args.cross_compile,
                     qemu_config_path=cli_args.qemu_config)

-        exec_request = KunitExecRequest(cli_args.timeout,
-                                        cli_args.build_dir,
-                                        cli_args.alltests,
-                                        cli_args.filter_glob,
-                                        cli_args.kernel_args,
-                                        cli_args.run_isolated)
-        parse_request = KunitParseRequest(cli_args.raw_output,
-                                          cli_args.build_dir,
-                                          cli_args.json)
-        result = exec_tests(linux, exec_request, parse_request)
+        exec_request = KunitExecRequest(raw_output=cli_args.raw_output,
+                                        build_dir=cli_args.build_dir,
+                                        json=cli_args.json,
+                                        timeout=cli_args.timeout,
+                                        alltests=cli_args.alltests,
+                                        filter_glob=cli_args.filter_glob,
+                                        kernel_args=cli_args.kernel_args,
+                                        run_isolated=cli_args.run_isolated)
+        result = exec_tests(linux, exec_request)
         kunit_parser.print_with_timestamp((
             'Elapsed time: %.3fs\n') % (result.elapsed_time))
         if result.status != KunitStatus.SUCCESS:
@@ -482,9 +510,9 @@ def main(argv, linux=None):
         else:
             with open(cli_args.file, 'r', errors='backslashreplace') as f:
                 kunit_output = f.read().splitlines()
-        request = KunitParseRequest(cli_args.raw_output,
-                                    None,
-                                    cli_args.json)
+        request = KunitParseRequest(raw_output=cli_args.raw_output,
+                                    build_dir='',
+                                    json=cli_args.json)
         result = parse_tests(request, kunit_output)
         if result.status != KunitStatus.SUCCESS:
             sys.exit(1)

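The namedtuple-to-dataclass conversion in kunit.py lets the request types share fields through inheritance and be constructed with keyword arguments, which is what makes passing one `KunitRequest` everywhere possible. A condensed, self-contained illustration of the same pattern (field names taken from the diff):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class KunitConfigRequest:
    build_dir: str
    make_options: Optional[List[str]]

@dataclass
class KunitBuildRequest(KunitConfigRequest):
    # Inherits build_dir/make_options; adds build-specific fields.
    jobs: int
    alltests: bool

# Keyword construction is order-independent, and a build request can be
# passed to any function that only needs the config-request fields.
req = KunitBuildRequest(build_dir=".kunit", make_options=None,
                        jobs=8, alltests=False)
print(req.build_dir, req.jobs)
```

One subtlety of this design: dataclass inheritance requires every subclass field to follow the parent's fields, which is why the real patch orders fields as it does.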
tools/testing/kunit/kunit_config.py:

@@ -62,33 +62,34 @@ class Kconfig(object):
 		for entry in self.entries():
 			f.write(str(entry) + '\n')

-	def parse_from_string(self, blob: str) -> None:
-		"""Parses a string containing KconfigEntrys and populates this Kconfig."""
-		self._entries = []
-		is_not_set_matcher = re.compile(CONFIG_IS_NOT_SET_PATTERN)
-		config_matcher = re.compile(CONFIG_PATTERN)
-		for line in blob.split('\n'):
-			line = line.strip()
-			if not line:
-				continue
+def parse_file(path: str) -> Kconfig:
+	with open(path, 'r') as f:
+		return parse_from_string(f.read())

-			match = config_matcher.match(line)
-			if match:
-				entry = KconfigEntry(match.group(1), match.group(2))
-				self.add_entry(entry)
-				continue
+def parse_from_string(blob: str) -> Kconfig:
+	"""Parses a string containing Kconfig entries."""
+	kconfig = Kconfig()
+	is_not_set_matcher = re.compile(CONFIG_IS_NOT_SET_PATTERN)
+	config_matcher = re.compile(CONFIG_PATTERN)
+	for line in blob.split('\n'):
+		line = line.strip()
+		if not line:
+			continue

-			empty_match = is_not_set_matcher.match(line)
-			if empty_match:
-				entry = KconfigEntry(empty_match.group(1), 'n')
-				self.add_entry(entry)
-				continue
+		match = config_matcher.match(line)
+		if match:
+			entry = KconfigEntry(match.group(1), match.group(2))
+			kconfig.add_entry(entry)
+			continue

-			if line[0] == '#':
-				continue
-			else:
-				raise KconfigParseError('Failed to parse: ' + line)
+		empty_match = is_not_set_matcher.match(line)
+		if empty_match:
+			entry = KconfigEntry(empty_match.group(1), 'n')
+			kconfig.add_entry(entry)
+			continue

-	def read_from_file(self, path: str) -> None:
-		with open(path, 'r') as f:
-			self.parse_from_string(f.read())
+		if line[0] == '#':
+			continue
+		else:
+			raise KconfigParseError('Failed to parse: ' + line)
+	return kconfig

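The kunit_config.py refactor turns the parser into module-level functions that return a fresh Kconfig instead of mutating `self`. A stripped-down sketch of the same parsing flow over a plain dict; the regexes here are approximations, not the exact CONFIG_PATTERN/CONFIG_IS_NOT_SET_PATTERN constants from the file:

```python
import re
from typing import Dict

# Assumed approximations of the patterns used in kunit_config.py.
CONFIG_RE = re.compile(r'^CONFIG_(\w+)=(\S+)$')
NOT_SET_RE = re.compile(r'^# CONFIG_(\w+) is not set$')

def parse_from_string(blob: str) -> Dict[str, str]:
    entries: Dict[str, str] = {}
    for line in blob.split('\n'):
        line = line.strip()
        if not line:
            continue
        m = CONFIG_RE.match(line)
        if m:
            entries[m.group(1)] = m.group(2)
            continue
        m = NOT_SET_RE.match(line)
        if m:
            entries[m.group(1)] = 'n'  # "is not set" is recorded as 'n'
            continue
        if line[0] != '#':  # other comments are ignored
            raise ValueError('Failed to parse: ' + line)
    return entries

print(parse_from_string('CONFIG_KUNIT=y\n# CONFIG_KASAN is not set\n'))
```

Returning a new object also makes `--kconfig_add` trivial to implement: parse the extra lines into a second config and merge it in.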
tools/testing/kunit/kunit_json.py:

@@ -11,7 +11,7 @@ import os

 import kunit_parser

-from kunit_parser import Test, TestResult, TestStatus
+from kunit_parser import Test, TestStatus
 from typing import Any, Dict, Optional

 JsonObj = Dict[str, Any]
@@ -30,6 +30,8 @@ def _get_group_json(test: Test, def_config: str,
 			test_case = {"name": subtest.name, "status": "FAIL"}
 			if subtest.status == TestStatus.SUCCESS:
 				test_case["status"] = "PASS"
+			elif subtest.status == TestStatus.SKIPPED:
+				test_case["status"] = "SKIP"
 			elif subtest.status == TestStatus.TEST_CRASHED:
 				test_case["status"] = "ERROR"
 			test_cases.append(test_case)
@@ -48,9 +50,9 @@ def _get_group_json(test: Test, def_config: str,
 	}
 	return test_group

-def get_json_result(test_result: TestResult, def_config: str,
+def get_json_result(test: Test, def_config: str,
 		build_dir: Optional[str], json_path: str) -> str:
-	test_group = _get_group_json(test_result.test, def_config, build_dir)
+	test_group = _get_group_json(test, def_config, build_dir)
 	test_group["name"] = "KUnit Test Group"
 	json_obj = json.dumps(test_group, indent=4)
 	if json_path != 'stdout':

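The new SKIPPED branch in kunit_json.py is what fixes --json output for skipped tests, which previously fell through to FAIL. The resulting status mapping, as a standalone sketch (the enum here only mirrors the statuses kunit_parser defines):

```python
from enum import Enum, auto

class TestStatus(Enum):  # mirrors the statuses used by kunit_parser
    SUCCESS = auto()
    SKIPPED = auto()
    TEST_CRASHED = auto()
    FAILURE = auto()

def json_status(status: TestStatus) -> str:
    # FAIL is the default; SKIPPED now maps to SKIP instead of FAIL.
    if status == TestStatus.SUCCESS:
        return "PASS"
    if status == TestStatus.SKIPPED:
        return "SKIP"
    if status == TestStatus.TEST_CRASHED:
        return "ERROR"
    return "FAIL"

print(json_status(TestStatus.SKIPPED))
```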
@ -21,6 +21,7 @@ import qemu_config
KCONFIG_PATH = '.config' KCONFIG_PATH = '.config'
KUNITCONFIG_PATH = '.kunitconfig' KUNITCONFIG_PATH = '.kunitconfig'
OLD_KUNITCONFIG_PATH = 'last_used_kunitconfig'
DEFAULT_KUNITCONFIG_PATH = 'tools/testing/kunit/configs/default.config' DEFAULT_KUNITCONFIG_PATH = 'tools/testing/kunit/configs/default.config'
BROKEN_ALLCONFIG_PATH = 'tools/testing/kunit/configs/broken_on_uml.config' BROKEN_ALLCONFIG_PATH = 'tools/testing/kunit/configs/broken_on_uml.config'
OUTFILE_PATH = 'test.log' OUTFILE_PATH = 'test.log'
@ -116,8 +117,7 @@ class LinuxSourceTreeOperationsQemu(LinuxSourceTreeOperations):
self._extra_qemu_params = qemu_arch_params.extra_qemu_params self._extra_qemu_params = qemu_arch_params.extra_qemu_params
def make_arch_qemuconfig(self, base_kunitconfig: kunit_config.Kconfig) -> None: def make_arch_qemuconfig(self, base_kunitconfig: kunit_config.Kconfig) -> None:
kconfig = kunit_config.Kconfig() kconfig = kunit_config.parse_from_string(self._kconfig)
kconfig.parse_from_string(self._kconfig)
base_kunitconfig.merge_in_entries(kconfig) base_kunitconfig.merge_in_entries(kconfig)
def start(self, params: List[str], build_dir: str) -> subprocess.Popen: def start(self, params: List[str], build_dir: str) -> subprocess.Popen:
@ -180,6 +180,9 @@ def get_kconfig_path(build_dir) -> str:
def get_kunitconfig_path(build_dir) -> str: def get_kunitconfig_path(build_dir) -> str:
return get_file_path(build_dir, KUNITCONFIG_PATH) return get_file_path(build_dir, KUNITCONFIG_PATH)
def get_old_kunitconfig_path(build_dir) -> str:
return get_file_path(build_dir, OLD_KUNITCONFIG_PATH)
def get_outfile_path(build_dir) -> str: def get_outfile_path(build_dir) -> str:
return get_file_path(build_dir, OUTFILE_PATH) return get_file_path(build_dir, OUTFILE_PATH)
@ -206,6 +209,7 @@ def get_source_tree_ops_from_qemu_config(config_path: str,
# exists as a file. # exists as a file.
module_path = '.' + os.path.join(os.path.basename(QEMU_CONFIGS_DIR), os.path.basename(config_path)) module_path = '.' + os.path.join(os.path.basename(QEMU_CONFIGS_DIR), os.path.basename(config_path))
spec = importlib.util.spec_from_file_location(module_path, config_path) spec = importlib.util.spec_from_file_location(module_path, config_path)
assert spec is not None
config = importlib.util.module_from_spec(spec) config = importlib.util.module_from_spec(spec)
# See https://github.com/python/typeshed/pull/2626 for context. # See https://github.com/python/typeshed/pull/2626 for context.
assert isinstance(spec.loader, importlib.abc.Loader) assert isinstance(spec.loader, importlib.abc.Loader)
@ -225,6 +229,7 @@ class LinuxSourceTree(object):
build_dir: str, build_dir: str,
load_config=True, load_config=True,
kunitconfig_path='', kunitconfig_path='',
kconfig_add: Optional[List[str]]=None,
arch=None, arch=None,
cross_compile=None, cross_compile=None,
qemu_config_path=None) -> None: qemu_config_path=None) -> None:
@ -249,8 +254,11 @@ class LinuxSourceTree(object):
if not os.path.exists(kunitconfig_path): if not os.path.exists(kunitconfig_path):
shutil.copyfile(DEFAULT_KUNITCONFIG_PATH, kunitconfig_path) shutil.copyfile(DEFAULT_KUNITCONFIG_PATH, kunitconfig_path)
self._kconfig = kunit_config.Kconfig() self._kconfig = kunit_config.parse_file(kunitconfig_path)
self._kconfig.read_from_file(kunitconfig_path) if kconfig_add:
kconfig = kunit_config.parse_from_string('\n'.join(kconfig_add))
self._kconfig.merge_in_entries(kconfig)
def clean(self) -> bool: def clean(self) -> bool:
try: try:
@ -262,17 +270,18 @@ class LinuxSourceTree(object):
def validate_config(self, build_dir) -> bool: def validate_config(self, build_dir) -> bool:
kconfig_path = get_kconfig_path(build_dir) kconfig_path = get_kconfig_path(build_dir)
validated_kconfig = kunit_config.Kconfig() validated_kconfig = kunit_config.parse_file(kconfig_path)
validated_kconfig.read_from_file(kconfig_path) if self._kconfig.is_subset_of(validated_kconfig):
if not self._kconfig.is_subset_of(validated_kconfig): return True
invalid = self._kconfig.entries() - validated_kconfig.entries() invalid = self._kconfig.entries() - validated_kconfig.entries()
message = 'Provided Kconfig is not contained in validated .config. Following fields found in kunitconfig, ' \ message = 'Not all Kconfig options selected in kunitconfig were in the generated .config.\n' \
'but not in .config: %s' % ( 'This is probably due to unsatisfied dependencies.\n' \
', '.join([str(e) for e in invalid]) 'Missing: ' + ', '.join([str(e) for e in invalid])
) if self._arch == 'um':
logging.error(message) message += '\nNote: many Kconfig options aren\'t available on UML. You can try running ' \
return False 'on a different architecture with something like "--arch=x86_64".'
-        return True
+        logging.error(message)
+        return False

     def build_config(self, build_dir, make_options) -> bool:
         kconfig_path = get_kconfig_path(build_dir)
@@ -285,25 +294,38 @@ class LinuxSourceTree(object):
         except ConfigError as e:
             logging.error(e)
             return False
-        return self.validate_config(build_dir)
+        if not self.validate_config(build_dir):
+            return False
+
+        old_path = get_old_kunitconfig_path(build_dir)
+        if os.path.exists(old_path):
+            os.remove(old_path)  # write_to_file appends to the file
+        self._kconfig.write_to_file(old_path)
+        return True
+
+    def _kunitconfig_changed(self, build_dir: str) -> bool:
+        old_path = get_old_kunitconfig_path(build_dir)
+        if not os.path.exists(old_path):
+            return True
+
+        old_kconfig = kunit_config.parse_file(old_path)
+        return old_kconfig.entries() != self._kconfig.entries()

     def build_reconfig(self, build_dir, make_options) -> bool:
         """Creates a new .config if it is not a subset of the .kunitconfig."""
         kconfig_path = get_kconfig_path(build_dir)
-        if os.path.exists(kconfig_path):
-            existing_kconfig = kunit_config.Kconfig()
-            existing_kconfig.read_from_file(kconfig_path)
-            self._ops.make_arch_qemuconfig(self._kconfig)
-            if not self._kconfig.is_subset_of(existing_kconfig):
-                print('Regenerating .config ...')
-                os.remove(kconfig_path)
-                return self.build_config(build_dir, make_options)
-            else:
-                return True
-        else:
+        if not os.path.exists(kconfig_path):
             print('Generating .config ...')
             return self.build_config(build_dir, make_options)
+
+        existing_kconfig = kunit_config.parse_file(kconfig_path)
+        self._ops.make_arch_qemuconfig(self._kconfig)
+        if self._kconfig.is_subset_of(existing_kconfig) and not self._kunitconfig_changed(build_dir):
+            return True
+        print('Regenerating .config ...')
+        os.remove(kconfig_path)
+        return self.build_config(build_dir, make_options)

     def build_kernel(self, alltests, jobs, build_dir, make_options) -> bool:
         try:
             if alltests:
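The `build_reconfig()`/`_kunitconfig_changed()` logic above boils down to set comparisons on Kconfig entries. A minimal standalone sketch of that decision (helper names here are illustrative, not the tool's API; the real code uses `kunit_config.parse_file()` and `Kconfig.entries()`, with slightly richer subset semantics):

```python
# Sketch: decide whether .config must be regenerated, mimicking the
# build_reconfig() logic above. All names are illustrative.

def parse_entries(text: str) -> set:
    """Parse 'CONFIG_FOO=y' lines into a set of (name, value) pairs."""
    entries = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        name, _, value = line.partition('=')
        entries.add((name, value))
    return entries

def needs_reconfig(old_kunitconfig: str, kunitconfig: str, dot_config: str) -> bool:
    """True if a rebuild is needed: either the kunitconfig is no longer
    a subset of the generated .config, or the kunitconfig itself changed
    since the last build (the case the old code missed)."""
    want = parse_entries(kunitconfig)
    have = parse_entries(dot_config)
    changed = parse_entries(old_kunitconfig) != want
    return not want.issubset(have) or changed
```

The `changed` check is what makes removing an option from `.kunitconfig` trigger a rebuild, mirroring the `test_build_reconfig_remove_option` scenario added later in this series.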

diff --git a/tools/testing/kunit/kunit_parser.py b/tools/testing/kunit/kunit_parser.py

@@ -12,14 +12,11 @@

 from __future__ import annotations
 import re
-from collections import namedtuple
-from datetime import datetime
+import datetime
 from enum import Enum, auto
 from functools import reduce
 from typing import Iterable, Iterator, List, Optional, Tuple

-TestResult = namedtuple('TestResult', ['status','test','log'])
-
 class Test(object):
     """
     A class to represent a test parsed from KTAP results. All KTAP
@@ -168,42 +165,51 @@ class TestCounts:

 class LineStream:
     """
     A class to represent the lines of kernel output.
-    Provides a peek()/pop() interface over an iterator of
+    Provides a lazy peek()/pop() interface over an iterator of
     (line#, text).
     """
     _lines: Iterator[Tuple[int, str]]
     _next: Tuple[int, str]
+    _need_next: bool
     _done: bool

     def __init__(self, lines: Iterator[Tuple[int, str]]):
         """Creates a new LineStream that wraps the given iterator."""
         self._lines = lines
         self._done = False
+        self._need_next = True
         self._next = (0, '')
-        self._get_next()

     def _get_next(self) -> None:
-        """Advances the LineSteam to the next line."""
+        """Advances the LineSteam to the next line, if necessary."""
+        if not self._need_next:
+            return
+
         try:
             self._next = next(self._lines)
         except StopIteration:
             self._done = True
+        finally:
+            self._need_next = False

     def peek(self) -> str:
         """Returns the current line, without advancing the LineStream.
         """
+        self._get_next()
         return self._next[1]

     def pop(self) -> str:
         """Returns the current line and advances the LineStream to
         the next line.
         """
-        n = self._next
-        self._get_next()
-        return n[1]
+        s = self.peek()
+        if self._done:
+            raise ValueError(f'LineStream: going past EOF, last line was {s}')
+        self._need_next = True
+        return s

     def __bool__(self) -> bool:
         """Returns True if stream has more lines."""
+        self._get_next()
         return not self._done

 # Only used by kunit_tool_test.py.
@@ -216,6 +222,7 @@ class LineStream:

     def line_number(self) -> int:
         """Returns the line number of the current line."""
+        self._get_next()
         return self._next[0]
 # Parsing helper methods:

@@ -340,8 +347,8 @@ def parse_test_plan(lines: LineStream, test: Test) -> bool:
     """
     Parses test plan line and stores the expected number of subtests in
     test object. Reports an error if expected count is 0.
-    Returns False and reports missing test plan error if fails to parse
-    test plan.
+    Returns False and sets expected_count to None if there is no valid test
+    plan.

     Accepted format:
     - '1..[number of subtests]'
@@ -356,14 +363,10 @@ def parse_test_plan(lines: LineStream, test: Test) -> bool:
     match = TEST_PLAN.match(lines.peek())
     if not match:
         test.expected_count = None
-        test.add_error('missing plan line!')
         return False
     test.log.append(lines.pop())
     expected_count = int(match.group(1))
     test.expected_count = expected_count
-    if expected_count == 0:
-        test.status = TestStatus.NO_TESTS
-        test.add_error('0 tests run!')
     return True
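The accepted plan format is just `1..N`; a sketch of the matching step (regex approximated from the accepted format described above, not copied from the tool):

```python
import re
from typing import Optional

# Approximation of the parser's test-plan matching: a (K)TAP test plan
# line looks like '1..N', giving the expected number of subtests.
TEST_PLAN = re.compile(r'^\s*1\.\.([0-9]+)')

def parse_plan_line(line: str) -> Optional[int]:
    """Return the expected subtest count, or None if the line is not a
    valid test plan (which, after this change, is no longer an error)."""
    match = TEST_PLAN.match(line)
    if not match:
        return None
    return int(match.group(1))
```

Note that `1..0` still parses to a count of zero; the "0 tests run!" error is now raised later, in `parse_test()`, once it is known that no subtests were actually seen.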
 TEST_RESULT = re.compile(r'^(ok|not ok) ([0-9]+) (- )?([^#]*)( # .*)?$')

@@ -514,7 +517,7 @@ ANSI_LEN = len(red(''))

 def print_with_timestamp(message: str) -> None:
     """Prints message with timestamp at beginning."""
-    print('[%s] %s' % (datetime.now().strftime('%H:%M:%S'), message))
+    print('[%s] %s' % (datetime.datetime.now().strftime('%H:%M:%S'), message))

 def format_test_divider(message: str, len_message: int) -> str:
     """
@@ -590,6 +593,8 @@ def format_test_result(test: Test) -> str:
         return (green('[PASSED] ') + test.name)
     elif test.status == TestStatus.SKIPPED:
         return (yellow('[SKIPPED] ') + test.name)
+    elif test.status == TestStatus.NO_TESTS:
+        return (yellow('[NO TESTS RUN] ') + test.name)
     elif test.status == TestStatus.TEST_CRASHED:
         print_log(test.log)
         return (red('[CRASHED] ') + test.name)
@@ -732,6 +737,7 @@ def parse_test(lines: LineStream, expected_num: int, log: List[str]) -> Test:
         # test plan
         test.name = "main"
         parse_test_plan(lines, test)
+        parent_test = True
     else:
         # If KTAP/TAP header is not found, test must be subtest
         # header or test result line so parse attempt to parser
@@ -745,7 +751,7 @@ def parse_test(lines: LineStream, expected_num: int, log: List[str]) -> Test:
     expected_count = test.expected_count
     subtests = []
     test_num = 1
-    while expected_count is None or test_num <= expected_count:
+    while parent_test and (expected_count is None or test_num <= expected_count):
         # Loop to parse any subtests.
         # Break after parsing expected number of tests or
         # if expected number of tests is unknown break when test
@@ -780,9 +786,15 @@ def parse_test(lines: LineStream, expected_num: int, log: List[str]) -> Test:
             parse_test_result(lines, test, expected_num)
         else:
             test.add_error('missing subtest result line!')
+
+    # Check for there being no tests
+    if parent_test and len(subtests) == 0:
+        test.status = TestStatus.NO_TESTS
+        test.add_error('0 tests run!')
+
     # Add statuses to TestCounts attribute in Test object
     bubble_up_test_results(test)
-    if parent_test:
+    if parent_test and not main:
         # If test has subtests and is not the main test object, print
         # footer.
         print_test_footer(test)
@@ -790,7 +802,7 @@ def parse_test(lines: LineStream, expected_num: int, log: List[str]) -> Test:
         print_test_result(test)
     return test

-def parse_run_tests(kernel_output: Iterable[str]) -> TestResult:
+def parse_run_tests(kernel_output: Iterable[str]) -> Test:
     """
     Using kernel output, extract KTAP lines, parse the lines for test
     results and print condensed test results and summary line.
@@ -799,8 +811,7 @@ def parse_run_tests(kernel_output: Iterable[str]) -> TestResult:
     kernel_output - Iterable object contains lines of kernel output

     Return:
-    TestResult - Tuple containg status of main test object, main test
-            object with all subtests, and log of all KTAP lines.
+    Test - the main test object with all subtests.
     """
     print_with_timestamp(DIVIDER)
     lines = extract_tap_lines(kernel_output)
@@ -814,4 +825,4 @@ def parse_run_tests(kernel_output: Iterable[str]) -> TestResult:
         test.status = test.counts.get_status()
     print_with_timestamp(DIVIDER)
     print_summary_line(test)
-    return TestResult(test.status, test, lines)
+    return test
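With `parse_run_tests()` now returning the main `Test` object, callers read the overall status off the object itself. The "worst status wins" aggregation that `bubble_up_test_results()`/`get_status()` perform can be sketched as follows (illustrative enum ordering; the real tool tracks per-status totals in a `TestCounts` object rather than taking a max):

```python
import enum

class TestStatus(enum.IntEnum):
    # Ordered so that max() picks the "worst" status; this ordering is
    # an assumption for the sketch, not the tool's actual enum.
    SUCCESS = 0
    SKIPPED = 1
    NO_TESTS = 2
    FAILURE = 3
    TEST_CRASHED = 4

def bubble_up(statuses):
    """A parent test takes the worst status among its subtests; with no
    subtests at all, it is NO_TESTS (matching the new parse_test check)."""
    return max(statuses, default=TestStatus.NO_TESTS)
```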

diff --git a/tools/testing/kunit/kunit_tool_test.py b/tools/testing/kunit/kunit_tool_test.py

@@ -13,9 +13,10 @@ import tempfile, shutil # Handling test_tmpdir

 import itertools
 import json
+import os
 import signal
 import subprocess
-import os
+from typing import Iterable

 import kunit_config
 import kunit_parser
@ -50,10 +51,9 @@ class KconfigTest(unittest.TestCase):
self.assertFalse(kconfig1.is_subset_of(kconfig0)) self.assertFalse(kconfig1.is_subset_of(kconfig0))
def test_read_from_file(self): def test_read_from_file(self):
kconfig = kunit_config.Kconfig()
kconfig_path = test_data_path('test_read_from_file.kconfig') kconfig_path = test_data_path('test_read_from_file.kconfig')
kconfig.read_from_file(kconfig_path) kconfig = kunit_config.parse_file(kconfig_path)
expected_kconfig = kunit_config.Kconfig() expected_kconfig = kunit_config.Kconfig()
expected_kconfig.add_entry( expected_kconfig.add_entry(
@ -86,8 +86,7 @@ class KconfigTest(unittest.TestCase):
expected_kconfig.write_to_file(kconfig_path) expected_kconfig.write_to_file(kconfig_path)
actual_kconfig = kunit_config.Kconfig() actual_kconfig = kunit_config.parse_file(kconfig_path)
actual_kconfig.read_from_file(kconfig_path)
self.assertEqual(actual_kconfig.entries(), self.assertEqual(actual_kconfig.entries(),
expected_kconfig.entries()) expected_kconfig.entries())
@@ -179,7 +178,7 @@ class KUnitParserTest(unittest.TestCase):
         with open(empty_log) as file:
             result = kunit_parser.parse_run_tests(
                 kunit_parser.extract_tap_lines(file.readlines()))
-        self.assertEqual(0, len(result.test.subtests))
+        self.assertEqual(0, len(result.subtests))
         self.assertEqual(
             kunit_parser.TestStatus.FAILURE_TO_PARSE_TESTS,
             result.status)
@@ -191,7 +190,10 @@ class KUnitParserTest(unittest.TestCase):
             result = kunit_parser.parse_run_tests(
                 kunit_parser.extract_tap_lines(
                 file.readlines()))
-        self.assertEqual(2, result.test.counts.errors)
+        # A missing test plan is not an error.
+        self.assertEqual(0, result.counts.errors)
+        # All tests should be accounted for.
+        self.assertEqual(10, result.counts.total())
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
@@ -201,11 +203,23 @@ class KUnitParserTest(unittest.TestCase):
         with open(header_log) as file:
             result = kunit_parser.parse_run_tests(
                 kunit_parser.extract_tap_lines(file.readlines()))
-        self.assertEqual(0, len(result.test.subtests))
+        self.assertEqual(0, len(result.subtests))
         self.assertEqual(
             kunit_parser.TestStatus.NO_TESTS,
             result.status)

+    def test_no_tests_no_plan(self):
+        no_plan_log = test_data_path('test_is_test_passed-no_tests_no_plan.log')
+        with open(no_plan_log) as file:
+            result = kunit_parser.parse_run_tests(
+                kunit_parser.extract_tap_lines(file.readlines()))
+        self.assertEqual(0, len(result.subtests[0].subtests[0].subtests))
+        self.assertEqual(
+            kunit_parser.TestStatus.NO_TESTS,
+            result.subtests[0].subtests[0].status)
+        self.assertEqual(1, result.counts.errors)
+
     def test_no_kunit_output(self):
         crash_log = test_data_path('test_insufficient_memory.log')
         print_mock = mock.patch('builtins.print').start()
@@ -214,7 +228,7 @@ class KUnitParserTest(unittest.TestCase):
                 kunit_parser.extract_tap_lines(file.readlines()))
         print_mock.assert_any_call(StrContains('invalid KTAP input!'))
         print_mock.stop()
-        self.assertEqual(0, len(result.test.subtests))
+        self.assertEqual(0, len(result.subtests))
     def test_crashed_test(self):
         crashed_log = test_data_path('test_is_test_passed-crash.log')
@@ -255,10 +269,10 @@
             result.status)
         self.assertEqual(
             "sysctl_test",
-            result.test.subtests[0].name)
+            result.subtests[0].name)
         self.assertEqual(
             "example",
-            result.test.subtests[1].name)
+            result.subtests[1].name)
         file.close()
@@ -269,7 +283,7 @@
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)

     def test_ignores_multiple_prefixes(self):
         prefix_log = test_data_path('test_multiple_prefixes.log')
@@ -278,7 +292,7 @@
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)
     def test_prefix_mixed_kernel_output(self):
         mixed_prefix_log = test_data_path('test_interrupted_tap_output.log')
@@ -287,7 +301,7 @@
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)

     def test_prefix_poundsign(self):
         pound_log = test_data_path('test_pound_sign.log')
@@ -296,7 +310,7 @@
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)
     def test_kernel_panic_end(self):
         panic_log = test_data_path('test_kernel_panic_interrupt.log')
@@ -305,7 +319,7 @@
         self.assertEqual(
             kunit_parser.TestStatus.TEST_CRASHED,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)

     def test_pound_no_prefix(self):
         pound_log = test_data_path('test_pound_no_prefix.log')
@@ -314,7 +328,46 @@
         self.assertEqual(
             kunit_parser.TestStatus.SUCCESS,
             result.status)
-        self.assertEqual('kunit-resource-test', result.test.subtests[0].name)
+        self.assertEqual('kunit-resource-test', result.subtests[0].name)
+
+def line_stream_from_strs(strs: Iterable[str]) -> kunit_parser.LineStream:
+    return kunit_parser.LineStream(enumerate(strs, start=1))
+
+class LineStreamTest(unittest.TestCase):
+
+    def test_basic(self):
+        stream = line_stream_from_strs(['hello', 'world'])
+
+        self.assertTrue(stream, msg='Should be more input')
+        self.assertEqual(stream.line_number(), 1)
+        self.assertEqual(stream.peek(), 'hello')
+        self.assertEqual(stream.pop(), 'hello')
+
+        self.assertTrue(stream, msg='Should be more input')
+        self.assertEqual(stream.line_number(), 2)
+        self.assertEqual(stream.peek(), 'world')
+        self.assertEqual(stream.pop(), 'world')
+
+        self.assertFalse(stream, msg='Should be no more input')
+        with self.assertRaisesRegex(ValueError, 'LineStream: going past EOF'):
+            stream.pop()
+
+    def test_is_lazy(self):
+        called_times = 0
+        def generator():
+            nonlocal called_times
+            for i in range(1,5):
+                called_times += 1
+                yield called_times, str(called_times)
+
+        stream = kunit_parser.LineStream(generator())
+        self.assertEqual(called_times, 0)
+
+        self.assertEqual(stream.pop(), '1')
+        self.assertEqual(called_times, 1)
+
+        self.assertEqual(stream.pop(), '2')
+        self.assertEqual(called_times, 2)
 class LinuxSourceTreeTest(unittest.TestCase):
@@ -336,6 +389,10 @@ class LinuxSourceTreeTest(unittest.TestCase):
             pass
         kunit_kernel.LinuxSourceTree('', kunitconfig_path=dir)

+    def test_kconfig_add(self):
+        tree = kunit_kernel.LinuxSourceTree('', kconfig_add=['CONFIG_NOT_REAL=y'])
+        self.assertIn(kunit_config.KconfigEntry('NOT_REAL', 'y'), tree._kconfig.entries())
+
     def test_invalid_arch(self):
         with self.assertRaisesRegex(kunit_kernel.ConfigError, 'not a valid arch, options are.*x86_64'):
             kunit_kernel.LinuxSourceTree('', arch='invalid')
@@ -356,6 +413,51 @@ class LinuxSourceTreeTest(unittest.TestCase):
         with open(kunit_kernel.get_outfile_path(build_dir), 'rt') as outfile:
             self.assertEqual(outfile.read(), 'hi\nbye\n', msg='Missing some output')

+    def test_build_reconfig_no_config(self):
+        with tempfile.TemporaryDirectory('') as build_dir:
+            with open(kunit_kernel.get_kunitconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y')
+
+            tree = kunit_kernel.LinuxSourceTree(build_dir)
+            mock_build_config = mock.patch.object(tree, 'build_config').start()
+
+            # Should generate the .config
+            self.assertTrue(tree.build_reconfig(build_dir, make_options=[]))
+            mock_build_config.assert_called_once_with(build_dir, [])
+
+    def test_build_reconfig_existing_config(self):
+        with tempfile.TemporaryDirectory('') as build_dir:
+            # Existing .config is a superset, should not touch it
+            with open(kunit_kernel.get_kunitconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y')
+            with open(kunit_kernel.get_old_kunitconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y')
+            with open(kunit_kernel.get_kconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y\nCONFIG_KUNIT_TEST=y')
+
+            tree = kunit_kernel.LinuxSourceTree(build_dir)
+            mock_build_config = mock.patch.object(tree, 'build_config').start()
+
+            self.assertTrue(tree.build_reconfig(build_dir, make_options=[]))
+            self.assertEqual(mock_build_config.call_count, 0)
+
+    def test_build_reconfig_remove_option(self):
+        with tempfile.TemporaryDirectory('') as build_dir:
+            # We removed CONFIG_KUNIT_TEST=y from our .kunitconfig...
+            with open(kunit_kernel.get_kunitconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y')
+            with open(kunit_kernel.get_old_kunitconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y\nCONFIG_KUNIT_TEST=y')
+            with open(kunit_kernel.get_kconfig_path(build_dir), 'w') as f:
+                f.write('CONFIG_KUNIT=y\nCONFIG_KUNIT_TEST=y')
+
+            tree = kunit_kernel.LinuxSourceTree(build_dir)
+            mock_build_config = mock.patch.object(tree, 'build_config').start()
+
+            # ... so we should trigger a call to build_config()
+            self.assertTrue(tree.build_reconfig(build_dir, make_options=[]))
+            mock_build_config.assert_called_once_with(build_dir, [])
+
     # TODO: add more test cases.

@@ -365,7 +467,7 @@ class KUnitJsonTest(unittest.TestCase):
         with open(test_data_path(log_file)) as file:
             test_result = kunit_parser.parse_run_tests(file)
             json_obj = kunit_json.get_json_result(
-                test_result=test_result,
+                test=test_result,
                 def_config='kunit_defconfig',
                 build_dir=None,
                 json_path='stdout')
@@ -383,6 +485,12 @@ class KUnitJsonTest(unittest.TestCase):
             {'name': 'example_simple_test', 'status': 'ERROR'},
             result["sub_groups"][1]["test_cases"][0])

+    def test_skipped_test_json(self):
+        result = self._json_for('test_skip_tests.log')
+        self.assertEqual(
+            {'name': 'example_skip_test', 'status': 'SKIP'},
+            result["sub_groups"][1]["test_cases"][1])
+
     def test_no_tests_json(self):
         result = self._json_for('test_is_test_passed-no_tests_run_with_header.log')
         self.assertEqual(0, len(result['sub_groups']))
@@ -418,8 +526,8 @@ class KUnitMainTest(unittest.TestCase):
     def test_build_passes_args_pass(self):
         kunit.main(['build'], self.linux_source_mock)
-        self.assertEqual(self.linux_source_mock.build_reconfig.call_count, 0)
-        self.linux_source_mock.build_kernel.assert_called_once_with(False, 8, '.kunit', None)
+        self.assertEqual(self.linux_source_mock.build_reconfig.call_count, 1)
+        self.linux_source_mock.build_kernel.assert_called_once_with(False, kunit.get_default_jobs(), '.kunit', None)
         self.assertEqual(self.linux_source_mock.run_kernel.call_count, 0)
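`kunit.get_default_jobs()` replaces the hard-coded `8` jobs here. A sketch of a CPU-count default (assumes Linux's `sched_getaffinity`, which honors CPU affinity masks, unlike `os.cpu_count()`; the real helper may differ in detail):

```python
import os

def get_default_jobs() -> int:
    # Number of CPUs usable by *this* process: respects taskset/cgroup
    # CPU restrictions, whereas os.cpu_count() reports all online CPUs.
    return len(os.sched_getaffinity(0))
```

Using the affinity set rather than the raw CPU count keeps `--jobs` sensible inside containers or on shared build machines.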
     def test_exec_passes_args_pass(self):
@@ -525,8 +633,9 @@ class KUnitMainTest(unittest.TestCase):
     def test_build_builddir(self):
         build_dir = '.kunit'
+        jobs = kunit.get_default_jobs()
         kunit.main(['build', '--build_dir', build_dir], self.linux_source_mock)
-        self.linux_source_mock.build_kernel.assert_called_once_with(False, 8, build_dir, None)
+        self.linux_source_mock.build_kernel.assert_called_once_with(False, jobs, build_dir, None)

     def test_exec_builddir(self):
         build_dir = '.kunit'
@@ -542,6 +651,7 @@ class KUnitMainTest(unittest.TestCase):
         # Just verify that we parsed and initialized it correctly here.
         mock_linux_init.assert_called_once_with('.kunit',
                             kunitconfig_path='mykunitconfig',
+                            kconfig_add=None,
                             arch='um',
                             cross_compile=None,
                             qemu_config_path=None)
@@ -553,6 +663,19 @@ class KUnitMainTest(unittest.TestCase):
         # Just verify that we parsed and initialized it correctly here.
         mock_linux_init.assert_called_once_with('.kunit',
                             kunitconfig_path='mykunitconfig',
+                            kconfig_add=None,
+                            arch='um',
+                            cross_compile=None,
+                            qemu_config_path=None)
+
+    @mock.patch.object(kunit_kernel, 'LinuxSourceTree')
+    def test_run_kconfig_add(self, mock_linux_init):
+        mock_linux_init.return_value = self.linux_source_mock
+        kunit.main(['run', '--kconfig_add=CONFIG_KASAN=y', '--kconfig_add=CONFIG_KCSAN=y'])
+        # Just verify that we parsed and initialized it correctly here.
+        mock_linux_init.assert_called_once_with('.kunit',
+                            kunitconfig_path=None,
+                            kconfig_add=['CONFIG_KASAN=y', 'CONFIG_KCSAN=y'],
                             arch='um',
                             cross_compile=None,
                             qemu_config_path=None)
@@ -569,7 +692,7 @@ class KUnitMainTest(unittest.TestCase):
         self.linux_source_mock.run_kernel.return_value = ['TAP version 14', 'init: random output'] + want

         got = kunit._list_tests(self.linux_source_mock,
-                     kunit.KunitExecRequest(300, '.kunit', False, 'suite*', None, 'suite'))
+                     kunit.KunitExecRequest(None, '.kunit', None, 300, False, 'suite*', None, 'suite'))
         self.assertEqual(got, want)
@@ -584,7 +707,7 @@ class KUnitMainTest(unittest.TestCase):
         # Should respect the user's filter glob when listing tests.
         mock_tests.assert_called_once_with(mock.ANY,
-                     kunit.KunitExecRequest(300, '.kunit', False, 'suite*.test*', None, 'suite'))
+                     kunit.KunitExecRequest(None, '.kunit', None, 300, False, 'suite*.test*', None, 'suite'))
         self.linux_source_mock.run_kernel.assert_has_calls([
             mock.call(args=None, build_dir='.kunit', filter_glob='suite.test*', timeout=300),
             mock.call(args=None, build_dir='.kunit', filter_glob='suite2.test*', timeout=300),
@@ -597,7 +720,7 @@ class KUnitMainTest(unittest.TestCase):
         # Should respect the user's filter glob when listing tests.
         mock_tests.assert_called_once_with(mock.ANY,
-                     kunit.KunitExecRequest(300, '.kunit', False, 'suite*', None, 'test'))
+                     kunit.KunitExecRequest(None, '.kunit', None, 300, False, 'suite*', None, 'test'))
         self.linux_source_mock.run_kernel.assert_has_calls([
             mock.call(args=None, build_dir='.kunit', filter_glob='suite.test1', timeout=300),
             mock.call(args=None, build_dir='.kunit', filter_glob='suite.test2', timeout=300),

diff --git a/tools/testing/kunit/run_checks.py b/tools/testing/kunit/run_checks.py (new file)

@@ -0,0 +1,81 @@
#!/usr/bin/env python3
# SPDX-License-Identifier: GPL-2.0
#
# This file runs some basic checks to verify kunit works.
# It is only of interest if you're making changes to KUnit itself.
#
# Copyright (C) 2021, Google LLC.
# Author: Daniel Latypov <dlatypov@google.com>
from concurrent import futures
import datetime
import os
import shutil
import subprocess
import sys
import textwrap
from typing import Dict, List, Sequence, Tuple
ABS_TOOL_PATH = os.path.abspath(os.path.dirname(__file__))
TIMEOUT = datetime.timedelta(minutes=5).total_seconds()
commands: Dict[str, Sequence[str]] = {
'kunit_tool_test.py': ['./kunit_tool_test.py'],
'kunit smoke test': ['./kunit.py', 'run', '--kunitconfig=lib/kunit', '--build_dir=kunit_run_checks'],
'pytype': ['/bin/sh', '-c', 'pytype *.py'],
'mypy': ['/bin/sh', '-c', 'mypy *.py'],
}
# The user might not have mypy or pytype installed, skip them if so.
# Note: you can install both via `$ pip install mypy pytype`
necessary_deps : Dict[str, str] = {
'pytype': 'pytype',
'mypy': 'mypy',
}
def main(argv: Sequence[str]) -> None:
if argv:
raise RuntimeError('This script takes no arguments')
future_to_name: Dict[futures.Future, str] = {}
executor = futures.ThreadPoolExecutor(max_workers=len(commands))
for name, argv in commands.items():
if name in necessary_deps and shutil.which(necessary_deps[name]) is None:
print(f'{name}: SKIPPED, {necessary_deps[name]} not in $PATH')
continue
f = executor.submit(run_cmd, argv)
future_to_name[f] = name
has_failures = False
print(f'Waiting on {len(future_to_name)} checks ({", ".join(future_to_name.values())})...')
for f in futures.as_completed(future_to_name.keys()):
name = future_to_name[f]
ex = f.exception()
if not ex:
print(f'{name}: PASSED')
continue
has_failures = True
if isinstance(ex, subprocess.TimeoutExpired):
print(f'{name}: TIMED OUT')
elif isinstance(ex, subprocess.CalledProcessError):
print(f'{name}: FAILED')
else:
print(f'{name}: unexpected exception: {ex}')
continue
output = ex.output
if output:
print(textwrap.indent(output.decode(), '> '))
executor.shutdown()
if has_failures:
sys.exit(1)
def run_cmd(argv: Sequence[str]):
subprocess.check_output(argv, stderr=subprocess.STDOUT, cwd=ABS_TOOL_PATH, timeout=TIMEOUT)
if __name__ == '__main__':
main(sys.argv[1:])

diff --git a/tools/testing/kunit/test_data/test_is_test_passed-no_tests_no_plan.log b/tools/testing/kunit/test_data/test_is_test_passed-no_tests_no_plan.log (new file)

@@ -0,0 +1,7 @@
TAP version 14
1..1
# Subtest: suite
1..1
# Subtest: case
ok 1 - case # SKIP
ok 1 - suite
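For illustration, a nested KTAP log like the one above can be scanned with a toy matcher (not the real parser; the regexes are approximations):

```python
import re

# A copy of the sample log: one suite containing one skipped case.
KTAP_LOG = """TAP version 14
1..1
  # Subtest: suite
  1..1
    # Subtest: case
  ok 1 - case # SKIP
ok 1 - suite
"""

def count_subtest_headers(log: str) -> int:
    """Count '# Subtest:' headers, one per nested test group."""
    return len(re.findall(r'# Subtest:', log))

def result_lines(log: str):
    """Yield (passed, name, skipped) for each '(not )ok N - name' line."""
    pattern = r'^\s*(ok|not ok) \d+ - ([^#\n]+?)(\s*# SKIP.*)?$'
    for m in re.finditer(pattern, log, re.M):
        yield (m.group(1) == 'ok', m.group(2).strip(), m.group(3) is not None)
```

Note the inner `case` group reports `ok ... # SKIP`: under the new parser semantics it counts as skipped, while a group whose plan yields no subtests at all is what triggers the new NO_TESTS status.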