linux/arch/arm64/lib
Peter Collingbourne 1cbdf60bd1 kasan: arm64: support specialized outlined tag mismatch checks
Moving the tag-based ASAN checks into separate, outlined functions
gives a significant code size improvement. Unlike the existing
CONFIG_KASAN_OUTLINE mode, these functions have a custom calling
convention that preserves most registers and is specialized to the
register containing the address and to the type of access; as a
result we can eliminate the code size and performance overhead that a
standard calling convention such as AAPCS would impose on these
functions.
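
For illustration only: with outlined checks the compiler replaces the
inline shadow load/compare sequence at each instrumented access with a
call to a small per-register, per-access-size helper. A hedged sketch
of such a call site, assuming the __hwasan_check_* symbol naming used
by userspace HWASan outlined checks (the exact symbols and access-info
encoding are compiler-generated and may differ):

	bl	__hwasan_check_x1_2	// hypothetical helper: check the
					// pointer in x1 for a 4-byte load;
					// unlike an AAPCS call it is assumed
					// to clobber only x16/x17
	ldr	w0, [x1]		// the instrumented access itself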

This change depends on a separate series of changes to Clang [1] to
support outlined checks in the kernel, although the change works fine
without them (we just don't get outlined checks). This is because the
flag -mllvm -hwasan-inline-all-checks=0 has no effect until the Clang
changes land. The flag was introduced in the Clang 9.0 timeframe as
part of the support for outlined checks in userspace, and because our
minimum Clang version is 10.0 we can pass it unconditionally.

Outlined checks require a new runtime function with a custom calling
convention. Add this function to arch/arm64/lib.
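
A minimal sketch of the shape of that entry point (not the verbatim
patch: the real kasan_sw_tags.S saves and restores every register the
custom calling convention promises to preserve, which is what makes
the call cheaper for the instrumented code than AAPCS), assuming it
forwards the address and access-info arguments to the C reporting
helper kasan_tag_mismatch():

	SYM_CODE_START(__hwasan_tag_mismatch)
		stp	x29, x30, [sp, #-16]!	/* illustrative frame only */
		/* x0 = faulting address, x1 = access info from the check */
		mov	x2, x30			/* return address for the report */
		bl	kasan_tag_mismatch
		ldp	x29, x30, [sp], #16
		ret
	SYM_CODE_END(__hwasan_tag_mismatch)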

I measured the code size of defconfig + tag-based KASAN, as well
as boot time (i.e. time to init launch) on a DragonBoard 845c with
an Android arm64 GKI kernel. The results are below:

                               code size (bytes)   boot time
CONFIG_KASAN_INLINE=y before            92824064       6.18s
CONFIG_KASAN_INLINE=y after             38822400       6.65s
CONFIG_KASAN_OUTLINE=y                  39215616      11.48s

We can see straight away that specialized outlined checks beat the
existing CONFIG_KASAN_OUTLINE=y on both code size and boot time
for tag-based ASAN.

As for the comparison between CONFIG_KASAN_INLINE=y before and after:
we saw similar performance numbers in userspace [2], and since the
performance overhead is minimal compared to the overhead of tag-based
ASAN itself, as well as to the code size improvement, we decided to
simply replace the inlined checks with the specialized outlined
checks, without an option to select between them; that is what I have
implemented in this patch.

Signed-off-by: Peter Collingbourne <pcc@google.com>
Acked-by: Andrey Konovalov <andreyknvl@gmail.com>
Reviewed-by: Mark Rutland <mark.rutland@arm.com>
Tested-by: Mark Rutland <mark.rutland@arm.com>
Link: https://linux-review.googlesource.com/id/I1a30036c70ab3c3ee78d75ed9b87ef7cdc3fdb76
Link: [1] https://reviews.llvm.org/D90426
Link: [2] https://reviews.llvm.org/D56954
Link: https://lore.kernel.org/r/20210526174927.2477847-3-pcc@google.com
Signed-off-by: Will Deacon <will@kernel.org>
2021-05-26 23:31:26 +01:00
clear_page.S arm64: lib: Annotate {clear, copy}_page() as position-independent 2021-03-19 12:01:19 +00:00
clear_user.S arm64: uaccess cleanup macro naming 2020-12-02 19:49:11 +00:00
copy_from_user.S arm64: uaccess cleanup macro naming 2020-12-02 19:49:11 +00:00
copy_in_user.S arm64: uaccess cleanup macro naming 2020-12-02 19:49:11 +00:00
copy_page.S arm64: lib: Annotate {clear, copy}_page() as position-independent 2021-03-19 12:01:19 +00:00
copy_template.S treewide: Replace GPLv2 boilerplate/reference with SPDX - rule 234 2019-06-19 17:09:07 +02:00
copy_to_user.S arm64: uaccess cleanup macro naming 2020-12-02 19:49:11 +00:00
crc32.S arm64: lib: Consistently enable crc32 extension 2020-04-28 14:36:32 +01:00
csum.c arm64: csum: Disable KASAN for do_csum() 2020-04-15 21:36:41 +01:00
delay.c treewide: Replace GPLv2 boilerplate/reference with SPDX - rule 234 2019-06-19 17:09:07 +02:00
error-inject.c arm64: Add support for function error injection 2019-08-07 13:53:09 +01:00
kasan_sw_tags.S kasan: arm64: support specialized outlined tag mismatch checks 2021-05-26 23:31:26 +01:00
Makefile kasan: arm64: support specialized outlined tag mismatch checks 2021-05-26 23:31:26 +01:00
memchr.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
memcmp.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
memcpy.S arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S 2020-10-30 08:32:31 +00:00
memmove.S arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S 2020-10-30 08:32:31 +00:00
memset.S arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S 2020-10-30 08:32:31 +00:00
mte.S arm64: kasan: simplify and inline MTE functions 2021-02-26 09:41:03 -08:00
strchr.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
strcmp.S arm64: fix spelling mistake "ca not" -> "cannot" 2020-03-17 18:22:40 +00:00
strlen.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
strncmp.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
strnlen.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
strrchr.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
tishift.S arm64: lib: Use modern annotations for assembly functions 2020-01-08 12:23:02 +00:00
uaccess_flushcache.c arm64: uaccess: simplify __copy_user_flushcache() 2020-12-02 19:49:10 +00:00
xor-neon.c treewide: Replace GPLv2 boilerplate/reference with SPDX - rule 500 2019-06-19 17:09:55 +02:00