const std = @import("std");
const builtin = std.builtin;
const Builder = std.build.Builder;
const tests = @import("test/tests.zig");
const BufMap = std.BufMap;
const warn = std.debug.warn;
const mem = std.mem;
const ArrayList = std.ArrayList;
const io = std.io;
const fs = std.fs;
const InstallDirectoryOptions = std.build.InstallDirectoryOptions;
const assert = std.debug.assert;

const zig_version = std.builtin.Version{ .major = 0, .minor = 9, .patch = 0 };

pub fn build(b: *Builder) !void {
    b.setPreferredReleaseMode(.ReleaseFast);
    const mode = b.standardReleaseOptions();
    const target = b.standardTargetOptions(.{});
    const single_threaded = b.option(bool, "single-threaded", "Build artifacts that run in single threaded mode") orelse false;
    const use_zig_libcxx = b.option(bool, "use-zig-libcxx", "If libc++ is needed, use zig's bundled version, don't try to integrate with the system") orelse false;

    var docgen_exe = b.addExecutable("docgen", "doc/docgen.zig");
    docgen_exe.single_threaded = single_threaded;

    const rel_zig_exe = try fs.path.relative(b.allocator, b.build_root, b.zig_exe);
    const langref_out_path = fs.path.join(
        b.allocator,
        &[_][]const u8{ b.cache_root, "langref.html" },
    ) catch unreachable;
    var docgen_cmd = docgen_exe.run();
    docgen_cmd.addArgs(&[_][]const u8{
        rel_zig_exe,
        "doc" ++ fs.path.sep_str ++ "langref.html.in",
        langref_out_path,
    });
    docgen_cmd.step.dependOn(&docgen_exe.step);

    const docs_step = b.step("docs", "Build documentation");
    docs_step.dependOn(&docgen_cmd.step);

    const toolchain_step = b.step("test-toolchain", "Run the tests for the toolchain");

    var test_stage2 = b.addTest("src/test.zig");
    test_stage2.setBuildMode(mode);
    test_stage2.addPackagePath("test_cases", "test/cases.zig");
    test_stage2.single_threaded = single_threaded;

    const fmt_build_zig = b.addFmt(&[_][]const u8{"build.zig"});

    const skip_debug = b.option(bool, "skip-debug", "Main test suite skips debug builds") orelse false;
    const skip_release = b.option(bool, "skip-release", "Main test suite skips release builds") orelse false;
    const skip_release_small = b.option(bool, "skip-release-small", "Main test suite skips release-small builds") orelse skip_release;
    const skip_release_fast = b.option(bool, "skip-release-fast", "Main test suite skips release-fast builds") orelse skip_release;
    const skip_release_safe = b.option(bool, "skip-release-safe", "Main test suite skips release-safe builds") orelse skip_release;
    const skip_non_native = b.option(bool, "skip-non-native", "Main test suite skips non-native builds") orelse false;
    const skip_libc = b.option(bool, "skip-libc", "Main test suite skips tests that link libc") orelse false;
    const skip_compile_errors = b.option(bool, "skip-compile-errors", "Main test suite skips compile error tests") orelse false;
    const skip_run_translated_c = b.option(bool, "skip-run-translated-c", "Main test suite skips run-translated-c tests") orelse false;
    const skip_stage2_tests = b.option(bool, "skip-stage2-tests", "Main test suite skips self-hosted compiler tests") orelse false;
    const skip_install_lib_files = b.option(bool, "skip-install-lib-files", "Do not copy lib/ files to installation prefix") orelse false;

    const only_install_lib_files = b.option(bool, "lib-files-only", "Only install library files") orelse false;
    const is_stage1 = b.option(bool, "stage1", "Build the stage1 compiler, put stage2 behind a feature flag") orelse false;
    const omit_stage2 = b.option(bool, "omit-stage2", "Do not include stage2 behind a feature flag inside stage1") orelse false;
    const static_llvm = b.option(bool, "static-llvm", "Disable integration with system-installed LLVM, Clang, LLD, and libc++") orelse false;
    const enable_llvm = b.option(bool, "enable-llvm", "Build self-hosted compiler with LLVM backend enabled") orelse (is_stage1 or static_llvm);
    const llvm_has_m68k = b.option(
        bool,
        "llvm-has-m68k",
        "Whether LLVM has the experimental target m68k enabled",
    ) orelse false;
    const llvm_has_csky = b.option(
        bool,
        "llvm-has-csky",
        "Whether LLVM has the experimental target csky enabled",
    ) orelse false;
    const llvm_has_ve = b.option(
        bool,
        "llvm-has-ve",
        "Whether LLVM has the experimental target ve enabled",
    ) orelse false;
    const llvm_has_arc = b.option(
        bool,
        "llvm-has-arc",
        "Whether LLVM has the experimental target arc enabled",
    ) orelse false;
    const enable_macos_sdk = b.option(bool, "enable-macos-sdk", "Run tests requiring presence of macOS SDK and frameworks") orelse false;
    const config_h_path_option = b.option([]const u8, "config_h", "Path to the generated config.h");

    if (!skip_install_lib_files) {
        b.installDirectory(InstallDirectoryOptions{
            .source_dir = "lib",
            .install_dir = .lib,
            .install_subdir = "zig",
            .exclude_extensions = &[_][]const u8{
                "README.md",
                ".z.0",
                ".z.9",
                ".gz",
                "rfc1951.txt",
            },
            .blank_extensions = &[_][]const u8{
                "test.zig",
            },
        });
    }

    if (only_install_lib_files)
        return;

    const tracy = b.option([]const u8, "tracy", "Enable Tracy integration. Supply path to Tracy source");
    const link_libc = b.option(bool, "force-link-libc", "Force self-hosted compiler to link libc") orelse enable_llvm;
    const strip = b.option(bool, "strip", "Omit debug information") orelse false;

    const mem_leak_frames: u32 = b.option(u32, "mem-leak-frames", "How many stack frames to print when a memory leak occurs. Tests get 2x this amount.") orelse blk: {
        if (strip) break :blk @as(u32, 0);
        if (mode != .Debug) break :blk 0;
        break :blk 4;
    };
    const main_file = if (is_stage1) "src/stage1.zig" else "src/main.zig";

    var exe = b.addExecutable("zig", main_file);
    exe.strip = strip;
    exe.install();
    exe.setBuildMode(mode);
    exe.setTarget(target);

    if (!skip_stage2_tests) {
        toolchain_step.dependOn(&exe.step);
    }

    b.default_step.dependOn(&exe.step);
    exe.single_threaded = single_threaded;

    if (target.isWindows() and target.getAbi() == .gnu) {
        // LTO is currently broken on mingw, this can be removed when it's fixed.
        exe.want_lto = false;
        test_stage2.want_lto = false;
    }

    const exe_options = b.addOptions();
    exe.addOptions("build_options", exe_options);
    exe_options.addOption(u32, "mem_leak_frames", mem_leak_frames);
    exe_options.addOption(bool, "skip_non_native", skip_non_native);
    exe_options.addOption(bool, "have_llvm", enable_llvm);
    exe_options.addOption(bool, "llvm_has_m68k", llvm_has_m68k);
    exe_options.addOption(bool, "llvm_has_csky", llvm_has_csky);
    exe_options.addOption(bool, "llvm_has_ve", llvm_has_ve);
    exe_options.addOption(bool, "llvm_has_arc", llvm_has_arc);

    if (enable_llvm) {
        const cmake_cfg = if (static_llvm) null else findAndParseConfigH(b, config_h_path_option);
        if (is_stage1) {
            exe.addIncludeDir("src");
            exe.addIncludeDir("deps/SoftFloat-3e/source/include");
            test_stage2.addIncludeDir("src");
            test_stage2.addIncludeDir("deps/SoftFloat-3e/source/include");

            // This is intentionally a dummy path. stage1.zig tries to @import("compiler_rt") in case
            // of being built by cmake. But when built by zig it's gonna get a compiler_rt so that
            // is pointless.
            exe.addPackagePath("compiler_rt", "src/empty.zig");

            exe.defineCMacro("ZIG_LINK_MODE", "Static");
            test_stage2.defineCMacro("ZIG_LINK_MODE", "Static");

            const softfloat = b.addStaticLibrary("softfloat", null);
            softfloat.setBuildMode(.ReleaseFast);
            softfloat.setTarget(target);
            softfloat.addIncludeDir("deps/SoftFloat-3e-prebuilt");
            softfloat.addIncludeDir("deps/SoftFloat-3e/source/8086");
            softfloat.addIncludeDir("deps/SoftFloat-3e/source/include");
            softfloat.addCSourceFiles(&softfloat_sources, &[_][]const u8{ "-std=c99", "-O3" });
            softfloat.single_threaded = single_threaded;
            exe.linkLibrary(softfloat);
            test_stage2.linkLibrary(softfloat);
            exe.addCSourceFiles(&stage1_sources, &exe_cflags);
            exe.addCSourceFiles(&optimized_c_sources, &[_][]const u8{ "-std=c99", "-O3" });
            test_stage2.addCSourceFiles(&stage1_sources, &exe_cflags);
            test_stage2.addCSourceFiles(&optimized_c_sources, &[_][]const u8{ "-std=c99", "-O3" });
        }

        if (cmake_cfg) |cfg| {
            // Inside this code path, we have to coordinate with system packaged LLVM, Clang, and LLD.
            // That means we also have to rely on stage1 compiled c++ files. We parse config.h to find
            // the information passed on to us from cmake.
            if (cfg.cmake_prefix_path.len > 0) {
                b.addSearchPrefix(cfg.cmake_prefix_path);
            }

            try addCmakeCfgOptionsToExe(b, cfg, exe, use_zig_libcxx);
            try addCmakeCfgOptionsToExe(b, cfg, test_stage2, use_zig_libcxx);
        } else {
            // Here we are -Denable-llvm but no cmake integration.
            try addStaticLlvmOptionsToExe(exe);
            try addStaticLlvmOptionsToExe(test_stage2);
        }
    }
    if (link_libc) {
        exe.linkLibC();
        test_stage2.linkLibC();
    }
    const enable_logging = b.option(bool, "log", "Whether to enable logging") orelse false;

    const opt_version_string = b.option([]const u8, "version-string", "Override Zig version string. Default is to find out with git.");
    const version = if (opt_version_string) |version| version else v: {
        const version_string = b.fmt("{d}.{d}.{d}", .{ zig_version.major, zig_version.minor, zig_version.patch });

        var code: u8 = undefined;
        const git_describe_untrimmed = b.execAllowFail(&[_][]const u8{
            "git", "-C", b.build_root, "describe", "--match", "*.*.*", "--tags",
        }, &code, .Ignore) catch {
            break :v version_string;
        };
        const git_describe = mem.trim(u8, git_describe_untrimmed, " \n\r");

        switch (mem.count(u8, git_describe, "-")) {
            0 => {
                // Tagged release version (e.g. 0.8.0).
                if (!mem.eql(u8, git_describe, version_string)) {
                    std.debug.print("Zig version '{s}' does not match Git tag '{s}'\n", .{ version_string, git_describe });
                    std.process.exit(1);
                }
                break :v version_string;
            },
            2 => {
                // Untagged development build (e.g. 0.8.0-684-gbbe2cca1a).
                var it = mem.split(u8, git_describe, "-");
                const tagged_ancestor = it.next() orelse unreachable;
                const commit_height = it.next() orelse unreachable;
                const commit_id = it.next() orelse unreachable;

                const ancestor_ver = try std.builtin.Version.parse(tagged_ancestor);
                if (zig_version.order(ancestor_ver) != .gt) {
                    std.debug.print("Zig version '{}' must be greater than tagged ancestor '{}'\n", .{ zig_version, ancestor_ver });
                    std.process.exit(1);
                }

                // Check that the commit hash is prefixed with a 'g' (a Git convention).
                if (commit_id.len < 1 or commit_id[0] != 'g') {
                    std.debug.print("Unexpected `git describe` output: {s}\n", .{git_describe});
                    break :v version_string;
                }

                // The version is reformatted in accordance with the https://semver.org specification.
                break :v b.fmt("{s}-dev.{s}+{s}", .{ version_string, commit_height, commit_id[1..] });
            },
            else => {
                std.debug.print("Unexpected `git describe` output: {s}\n", .{git_describe});
                break :v version_string;
            },
        }
    };
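    // Illustrative (hypothetical) examples of how the version detection above maps
    // `git describe` output to the final version string, assuming zig_version is 0.9.0:
    //   git unavailable / not a checkout        -> "0.9.0"
    //   "0.9.0" (tag matches zig_version)       -> "0.9.0"
    //   "0.8.0-684-gbbe2cca1a" (dev build)      -> "0.9.0-dev.684+bbe2cca1a"
    // Any other `git describe` shape falls back to the plain version string.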
    exe_options.addOption([:0]const u8, "version", try b.allocator.dupeZ(u8, version));

    const semver = try std.SemanticVersion.parse(version);
    exe_options.addOption(std.SemanticVersion, "semver", semver);

    exe_options.addOption(bool, "enable_logging", enable_logging);
    exe_options.addOption(bool, "enable_tracy", tracy != null);
    exe_options.addOption(bool, "is_stage1", is_stage1);
    exe_options.addOption(bool, "omit_stage2", omit_stage2);
    if (tracy) |tracy_path| {
        const client_cpp = fs.path.join(
            b.allocator,
            &[_][]const u8{ tracy_path, "TracyClient.cpp" },
        ) catch unreachable;

        // On mingw, we need to opt into windows 7+ to get some features required by tracy.
        const tracy_c_flags: []const []const u8 = if (target.isWindows() and target.getAbi() == .gnu)
            &[_][]const u8{ "-DTRACY_ENABLE=1", "-fno-sanitize=undefined", "-D_WIN32_WINNT=0x601" }
        else
            &[_][]const u8{ "-DTRACY_ENABLE=1", "-fno-sanitize=undefined" };

        exe.addIncludeDir(tracy_path);
        exe.addCSourceFile(client_cpp, tracy_c_flags);
        if (!enable_llvm) {
            exe.linkSystemLibrary("c++");
        }
        exe.linkLibC();

        if (target.isWindows()) {
            exe.linkSystemLibrary("dbghelp");
            exe.linkSystemLibrary("ws2_32");
        }
    }
    const test_filter = b.option([]const u8, "test-filter", "Skip tests that do not match filter");

    const is_wine_enabled = b.option(bool, "enable-wine", "Use Wine to run cross compiled Windows tests") orelse false;
    const is_qemu_enabled = b.option(bool, "enable-qemu", "Use QEMU to run cross compiled foreign architecture tests") orelse false;
    const is_wasmtime_enabled = b.option(bool, "enable-wasmtime", "Use Wasmtime to enable and run WASI libstd tests") orelse false;
    const is_darling_enabled = b.option(bool, "enable-darling", "[Experimental] Use Darling to run cross compiled macOS tests") orelse false;
    const glibc_multi_dir = b.option([]const u8, "enable-foreign-glibc", "Provide directory with glibc installations to run cross compiled tests that link glibc");

    const test_stage2_options = b.addOptions();
    test_stage2.addOptions("build_options", test_stage2_options);

    test_stage2_options.addOption(bool, "enable_logging", enable_logging);
    test_stage2_options.addOption(bool, "skip_non_native", skip_non_native);
    test_stage2_options.addOption(bool, "skip_compile_errors", skip_compile_errors);
    test_stage2_options.addOption(bool, "is_stage1", is_stage1);
    test_stage2_options.addOption(bool, "omit_stage2", omit_stage2);
    test_stage2_options.addOption(bool, "have_llvm", enable_llvm);
    test_stage2_options.addOption(bool, "llvm_has_m68k", llvm_has_m68k);
    test_stage2_options.addOption(bool, "llvm_has_csky", llvm_has_csky);
    test_stage2_options.addOption(bool, "llvm_has_ve", llvm_has_ve);
    test_stage2_options.addOption(bool, "llvm_has_arc", llvm_has_arc);
    test_stage2_options.addOption(bool, "enable_qemu", is_qemu_enabled);
    test_stage2_options.addOption(bool, "enable_wine", is_wine_enabled);
    test_stage2_options.addOption(bool, "enable_wasmtime", is_wasmtime_enabled);
    test_stage2_options.addOption(u32, "mem_leak_frames", mem_leak_frames * 2);
    test_stage2_options.addOption(bool, "enable_darling", is_darling_enabled);
    test_stage2_options.addOption(?[]const u8, "glibc_multi_install_dir", glibc_multi_dir);
    test_stage2_options.addOption([:0]const u8, "version", try b.allocator.dupeZ(u8, version));
    test_stage2_options.addOption(std.SemanticVersion, "semver", semver);
    const test_stage2_step = b.step("test-stage2", "Run the stage2 compiler tests");
    test_stage2_step.dependOn(&test_stage2.step);
    if (!skip_stage2_tests) {
        toolchain_step.dependOn(test_stage2_step);
    }

    var chosen_modes: [4]builtin.Mode = undefined;
    var chosen_mode_index: usize = 0;
    if (!skip_debug) {
        chosen_modes[chosen_mode_index] = builtin.Mode.Debug;
        chosen_mode_index += 1;
    }
    if (!skip_release_safe) {
        chosen_modes[chosen_mode_index] = builtin.Mode.ReleaseSafe;
        chosen_mode_index += 1;
    }
    if (!skip_release_fast) {
        chosen_modes[chosen_mode_index] = builtin.Mode.ReleaseFast;
        chosen_mode_index += 1;
    }
    if (!skip_release_small) {
        chosen_modes[chosen_mode_index] = builtin.Mode.ReleaseSmall;
        chosen_mode_index += 1;
    }
    const modes = chosen_modes[0..chosen_mode_index];
    // run stage1 `zig fmt` on this build.zig file just to make sure it works
    toolchain_step.dependOn(&fmt_build_zig.step);
    const fmt_step = b.step("test-fmt", "Run zig fmt against build.zig to make sure it works");
    fmt_step.dependOn(&fmt_build_zig.step);

    toolchain_step.dependOn(tests.addPkgTests(
        b,
        test_filter,
        "test/behavior.zig",
        "behavior",
        "Run the behavior tests",
        modes,
        false, // skip_single_threaded
        skip_non_native,
        skip_libc,
        is_wine_enabled,
        is_qemu_enabled,
        is_wasmtime_enabled,
        is_darling_enabled,
        glibc_multi_dir,
    ));

    toolchain_step.dependOn(tests.addPkgTests(
        b,
        test_filter,
        "lib/std/special/compiler_rt.zig",
        "compiler-rt",
        "Run the compiler_rt tests",
        modes,
        true, // skip_single_threaded
        skip_non_native,
        true, // skip_libc
        is_wine_enabled,
        is_qemu_enabled,
        is_wasmtime_enabled,
        is_darling_enabled,
        glibc_multi_dir,
    ));

    toolchain_step.dependOn(tests.addPkgTests(
        b,
        test_filter,
        "lib/std/special/c.zig",
        "minilibc",
        "Run the mini libc tests",
        modes,
        true, // skip_single_threaded
        skip_non_native,
        true, // skip_libc
        is_wine_enabled,
        is_qemu_enabled,
        is_wasmtime_enabled,
        is_darling_enabled,
        glibc_multi_dir,
    ));

    toolchain_step.dependOn(tests.addCompareOutputTests(b, test_filter, modes));
    toolchain_step.dependOn(tests.addStandaloneTests(b, test_filter, modes, skip_non_native, enable_macos_sdk, target));
    toolchain_step.dependOn(tests.addStackTraceTests(b, test_filter, modes));
    toolchain_step.dependOn(tests.addCliTests(b, test_filter, modes));
    toolchain_step.dependOn(tests.addAssembleAndLinkTests(b, test_filter, modes));
    toolchain_step.dependOn(tests.addRuntimeSafetyTests(b, test_filter, modes));
    toolchain_step.dependOn(tests.addTranslateCTests(b, test_filter));
    if (!skip_run_translated_c) {
        toolchain_step.dependOn(tests.addRunTranslatedCTests(b, test_filter, target));
    }
    // tests for this feature are disabled until we have the self-hosted compiler available
    // toolchain_step.dependOn(tests.addGenHTests(b, test_filter));

    const std_step = tests.addPkgTests(
        b,
        test_filter,
        "lib/std/std.zig",
        "std",
        "Run the standard library tests",
        modes,
        false,
        skip_non_native,
        skip_libc,
        is_wine_enabled,
        is_qemu_enabled,
        is_wasmtime_enabled,
        is_darling_enabled,
        glibc_multi_dir,
    );

    const test_step = b.step("test", "Run all the tests");
    test_step.dependOn(toolchain_step);
    test_step.dependOn(std_step);
    test_step.dependOn(docs_step);
}

const exe_cflags = [_][]const u8{
    "-std=c++14",
    "-D__STDC_CONSTANT_MACROS",
    "-D__STDC_FORMAT_MACROS",
    "-D__STDC_LIMIT_MACROS",
    "-D_GNU_SOURCE",
    "-fvisibility-inlines-hidden",
    "-fno-exceptions",
    "-fno-rtti",
    "-Werror=type-limits",
    "-Wno-missing-braces",
    "-Wno-comment",
};
fn addCmakeCfgOptionsToExe(
    b: *Builder,
    cfg: CMakeConfig,
    exe: *std.build.LibExeObjStep,
    use_zig_libcxx: bool,
) !void {
    exe.addObjectFile(fs.path.join(b.allocator, &[_][]const u8{
        cfg.cmake_binary_dir,
        "zigcpp",
        b.fmt("{s}{s}{s}", .{ exe.target.libPrefix(), "zigcpp", exe.target.staticLibSuffix() }),
    }) catch unreachable);
    assert(cfg.lld_include_dir.len != 0);
    exe.addIncludeDir(cfg.lld_include_dir);
    addCMakeLibraryList(exe, cfg.clang_libraries);
    addCMakeLibraryList(exe, cfg.lld_libraries);
    addCMakeLibraryList(exe, cfg.llvm_libraries);

    if (use_zig_libcxx) {
        exe.linkLibCpp();
    } else {
        const need_cpp_includes = true;

        // System -lc++ must be used because in this code path we are attempting to link
        // against system-provided LLVM, Clang, LLD.
        if (exe.target.getOsTag() == .linux) {
            // First we try to static link against gcc libstdc++. If that doesn't work,
            // we fall back to -lc++ and cross our fingers.
            addCxxKnownPath(b, cfg, exe, "libstdc++.a", "", need_cpp_includes) catch |err| switch (err) {
                error.RequiredLibraryNotFound => {
                    exe.linkSystemLibrary("c++");
                },
                else => |e| return e,
            };
            exe.linkSystemLibrary("unwind");
        } else if (exe.target.isFreeBSD()) {
            try addCxxKnownPath(b, cfg, exe, "libc++.a", null, need_cpp_includes);
            exe.linkSystemLibrary("pthread");
        } else if (exe.target.getOsTag() == .openbsd) {
            try addCxxKnownPath(b, cfg, exe, "libc++.a", null, need_cpp_includes);
            try addCxxKnownPath(b, cfg, exe, "libc++abi.a", null, need_cpp_includes);
        } else if (exe.target.isDarwin()) {
            exe.linkSystemLibrary("c++");
        }
    }

    if (cfg.dia_guids_lib.len != 0) {
        exe.addObjectFile(cfg.dia_guids_lib);
    }
}
fn addStaticLlvmOptionsToExe(
    exe: *std.build.LibExeObjStep,
) !void {
    // Adds the Zig C++ sources which both stage1 and stage2 need.
    //
    // We need this because otherwise zig_clang_cc1_main.cpp ends up pulling
    // in a dependency on llvm::cfg::Update<llvm::BasicBlock*>::dump() which is
    // unavailable when LLVM is compiled in Release mode.
    const zig_cpp_cflags = exe_cflags ++ [_][]const u8{"-DNDEBUG=1"};
    exe.addCSourceFiles(&zig_cpp_sources, &zig_cpp_cflags);

    for (clang_libs) |lib_name| {
        exe.linkSystemLibrary(lib_name);
    }
    for (lld_libs) |lib_name| {
        exe.linkSystemLibrary(lib_name);
    }
    for (llvm_libs) |lib_name| {
        exe.linkSystemLibrary(lib_name);
    }

    // This means we rely on clang-or-zig-built LLVM, Clang, LLD libraries.
    exe.linkSystemLibrary("c++");

    if (exe.target.getOs().tag == .windows) {
        exe.linkSystemLibrary("version");
        exe.linkSystemLibrary("uuid");
        exe.linkSystemLibrary("ole32");
    }
}
fn addCxxKnownPath(
    b: *Builder,
    ctx: CMakeConfig,
    exe: *std.build.LibExeObjStep,
    objname: []const u8,
    errtxt: ?[]const u8,
    need_cpp_includes: bool,
) !void {
    const path_padded = try b.exec(&[_][]const u8{
        ctx.cxx_compiler,
        b.fmt("-print-file-name={s}", .{objname}),
    });
    const path_unpadded = mem.tokenize(u8, path_padded, "\r\n").next().?;
    if (mem.eql(u8, path_unpadded, objname)) {
        if (errtxt) |msg| {
            warn("{s}", .{msg});
        } else {
            warn("Unable to determine path to {s}\n", .{objname});
        }
        return error.RequiredLibraryNotFound;
    }
    exe.addObjectFile(path_unpadded);

    // TODO a way to integrate with system c++ include files here
    // cc -E -Wp,-v -xc++ /dev/null
    if (need_cpp_includes) {
        // I used these temporarily for testing something but we obviously need a
        // more general purpose solution here.
        //exe.addIncludeDir("/nix/store/fvf3qjqa5qpcjjkq37pb6ypnk1mzhf5h-gcc-9.3.0/lib/gcc/x86_64-unknown-linux-gnu/9.3.0/../../../../include/c++/9.3.0");
        //exe.addIncludeDir("/nix/store/fvf3qjqa5qpcjjkq37pb6ypnk1mzhf5h-gcc-9.3.0/lib/gcc/x86_64-unknown-linux-gnu/9.3.0/../../../../include/c++/9.3.0/x86_64-unknown-linux-gnu");
        //exe.addIncludeDir("/nix/store/fvf3qjqa5qpcjjkq37pb6ypnk1mzhf5h-gcc-9.3.0/lib/gcc/x86_64-unknown-linux-gnu/9.3.0/../../../../include/c++/9.3.0/backward");
    }
}

fn addCMakeLibraryList(exe: *std.build.LibExeObjStep, list: []const u8) void {
    var it = mem.tokenize(u8, list, ";");
    while (it.next()) |lib| {
        if (mem.startsWith(u8, lib, "-l")) {
            exe.linkSystemLibrary(lib["-l".len..]);
        } else {
            exe.addObjectFile(lib);
        }
    }
}
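// Illustrative (hypothetical) input for addCMakeLibraryList above. CMake emits
// semicolon-separated library lists, e.g.:
//   "-lpthread;/usr/lib/llvm/libLLVMCore.a"
// "-l"-prefixed entries are linked as system libraries (here: "pthread"),
// while bare paths are added as object files.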
const CMakeConfig = struct {
    cmake_binary_dir: []const u8,
    cmake_prefix_path: []const u8,
    cxx_compiler: []const u8,
    lld_include_dir: []const u8,
    lld_libraries: []const u8,
    clang_libraries: []const u8,
    llvm_libraries: []const u8,
    dia_guids_lib: []const u8,
};

const max_config_h_bytes = 1 * 1024 * 1024;

fn findAndParseConfigH(b: *Builder, config_h_path_option: ?[]const u8) ?CMakeConfig {
    const config_h_text: []const u8 = if (config_h_path_option) |config_h_path| blk: {
        break :blk fs.cwd().readFileAlloc(b.allocator, config_h_path, max_config_h_bytes) catch unreachable;
    } else blk: {
        // TODO this should stop looking for config.h once it detects we hit the
        // zig source root directory.
        var check_dir = fs.path.dirname(b.zig_exe).?;
        while (true) {
            var dir = fs.cwd().openDir(check_dir, .{}) catch unreachable;
            defer dir.close();

            break :blk dir.readFileAlloc(b.allocator, "config.h", max_config_h_bytes) catch |err| switch (err) {
                error.FileNotFound => {
                    const new_check_dir = fs.path.dirname(check_dir);
                    if (new_check_dir == null or mem.eql(u8, new_check_dir.?, check_dir)) {
                        return null;
                    }
                    check_dir = new_check_dir.?;
                    continue;
                },
                else => unreachable,
            };
        } else unreachable; // TODO should not need `else unreachable`.
    };

    var ctx: CMakeConfig = .{
        .cmake_binary_dir = undefined,
        .cmake_prefix_path = undefined,
        .cxx_compiler = undefined,
        .lld_include_dir = undefined,
        .lld_libraries = undefined,
        .clang_libraries = undefined,
        .llvm_libraries = undefined,
        .dia_guids_lib = undefined,
    };

    const mappings = [_]struct { prefix: []const u8, field: []const u8 }{
        .{
            .prefix = "#define ZIG_CMAKE_BINARY_DIR ",
            .field = "cmake_binary_dir",
        },
        .{
            .prefix = "#define ZIG_CMAKE_PREFIX_PATH ",
            .field = "cmake_prefix_path",
        },
        .{
            .prefix = "#define ZIG_CXX_COMPILER ",
            .field = "cxx_compiler",
        },
        .{
            .prefix = "#define ZIG_LLD_INCLUDE_PATH ",
            .field = "lld_include_dir",
        },
        .{
            .prefix = "#define ZIG_LLD_LIBRARIES ",
            .field = "lld_libraries",
        },
        .{
            .prefix = "#define ZIG_CLANG_LIBRARIES ",
            .field = "clang_libraries",
        },
        .{
            .prefix = "#define ZIG_LLVM_LIBRARIES ",
            .field = "llvm_libraries",
        },
        .{
            .prefix = "#define ZIG_DIA_GUIDS_LIB ",
            .field = "dia_guids_lib",
        },
    };

    var lines_it = mem.tokenize(u8, config_h_text, "\r\n");
    while (lines_it.next()) |line| {
        inline for (mappings) |mapping| {
            if (mem.startsWith(u8, line, mapping.prefix)) {
                var it = mem.split(u8, line, "\"");
                _ = it.next().?; // skip the stuff before the quote
                const quoted = it.next().?; // the stuff inside the quote
                @field(ctx, mapping.field) = toNativePathSep(b, quoted);
            }
        }
    }
    return ctx;
}

fn toNativePathSep(b: *Builder, s: []const u8) []u8 {
    const duplicated = mem.dupe(b.allocator, u8, s) catch unreachable;
    for (duplicated) |*byte| switch (byte.*) {
        '/' => byte.* = fs.path.sep,
        else => {},
    };
    return duplicated;
}
const softfloat_sources = [ _ ] [ ] const u8 {
" deps/SoftFloat-3e/source/8086/f128M_isSignalingNaN.c " ,
" deps/SoftFloat-3e/source/8086/s_commonNaNToF128M.c " ,
" deps/SoftFloat-3e/source/8086/s_commonNaNToF16UI.c " ,
" deps/SoftFloat-3e/source/8086/s_commonNaNToF32UI.c " ,
" deps/SoftFloat-3e/source/8086/s_commonNaNToF64UI.c " ,
" deps/SoftFloat-3e/source/8086/s_f128MToCommonNaN.c " ,
" deps/SoftFloat-3e/source/8086/s_f16UIToCommonNaN.c " ,
" deps/SoftFloat-3e/source/8086/s_f32UIToCommonNaN.c " ,
" deps/SoftFloat-3e/source/8086/s_f64UIToCommonNaN.c " ,
" deps/SoftFloat-3e/source/8086/s_propagateNaNF128M.c " ,
" deps/SoftFloat-3e/source/8086/s_propagateNaNF16UI.c " ,
" deps/SoftFloat-3e/source/8086/softfloat_raiseFlags.c " ,
" deps/SoftFloat-3e/source/f128M_add.c " ,
" deps/SoftFloat-3e/source/f128M_div.c " ,
" deps/SoftFloat-3e/source/f128M_eq.c " ,
" deps/SoftFloat-3e/source/f128M_eq_signaling.c " ,
" deps/SoftFloat-3e/source/f128M_le.c " ,
" deps/SoftFloat-3e/source/f128M_le_quiet.c " ,
" deps/SoftFloat-3e/source/f128M_lt.c " ,
" deps/SoftFloat-3e/source/f128M_lt_quiet.c " ,
" deps/SoftFloat-3e/source/f128M_mul.c " ,
" deps/SoftFloat-3e/source/f128M_mulAdd.c " ,
" deps/SoftFloat-3e/source/f128M_rem.c " ,
" deps/SoftFloat-3e/source/f128M_roundToInt.c " ,
" deps/SoftFloat-3e/source/f128M_sqrt.c " ,
" deps/SoftFloat-3e/source/f128M_sub.c " ,
" deps/SoftFloat-3e/source/f128M_to_f16.c " ,
" deps/SoftFloat-3e/source/f128M_to_f32.c " ,
" deps/SoftFloat-3e/source/f128M_to_f64.c " ,
" deps/SoftFloat-3e/source/f128M_to_i32.c " ,
" deps/SoftFloat-3e/source/f128M_to_i32_r_minMag.c " ,
" deps/SoftFloat-3e/source/f128M_to_i64.c " ,
" deps/SoftFloat-3e/source/f128M_to_i64_r_minMag.c " ,
" deps/SoftFloat-3e/source/f128M_to_ui32.c " ,
" deps/SoftFloat-3e/source/f128M_to_ui32_r_minMag.c " ,
" deps/SoftFloat-3e/source/f128M_to_ui64.c " ,
" deps/SoftFloat-3e/source/f128M_to_ui64_r_minMag.c " ,
" deps/SoftFloat-3e/source/f16_add.c " ,
" deps/SoftFloat-3e/source/f16_div.c " ,
" deps/SoftFloat-3e/source/f16_eq.c " ,
" deps/SoftFloat-3e/source/f16_isSignalingNaN.c " ,
" deps/SoftFloat-3e/source/f16_lt.c " ,
" deps/SoftFloat-3e/source/f16_mul.c " ,
" deps/SoftFloat-3e/source/f16_mulAdd.c " ,
" deps/SoftFloat-3e/source/f16_rem.c " ,
" deps/SoftFloat-3e/source/f16_roundToInt.c " ,
" deps/SoftFloat-3e/source/f16_sqrt.c " ,
" deps/SoftFloat-3e/source/f16_sub.c " ,
" deps/SoftFloat-3e/source/f16_to_f128M.c " ,
" deps/SoftFloat-3e/source/f16_to_f64.c " ,
" deps/SoftFloat-3e/source/f32_to_f128M.c " ,
" deps/SoftFloat-3e/source/f64_to_f128M.c " ,
" deps/SoftFloat-3e/source/f64_to_f16.c " ,
" deps/SoftFloat-3e/source/i32_to_f128M.c " ,
" deps/SoftFloat-3e/source/s_add256M.c " ,
" deps/SoftFloat-3e/source/s_addCarryM.c " ,
" deps/SoftFloat-3e/source/s_addComplCarryM.c " ,
" deps/SoftFloat-3e/source/s_addF128M.c " ,
" deps/SoftFloat-3e/source/s_addM.c " ,
" deps/SoftFloat-3e/source/s_addMagsF16.c " ,
" deps/SoftFloat-3e/source/s_addMagsF32.c " ,
" deps/SoftFloat-3e/source/s_addMagsF64.c " ,
" deps/SoftFloat-3e/source/s_approxRecip32_1.c " ,
" deps/SoftFloat-3e/source/s_approxRecipSqrt32_1.c " ,
" deps/SoftFloat-3e/source/s_approxRecipSqrt_1Ks.c " ,
" deps/SoftFloat-3e/source/s_approxRecip_1Ks.c " ,
" deps/SoftFloat-3e/source/s_compare128M.c " ,
" deps/SoftFloat-3e/source/s_compare96M.c " ,
" deps/SoftFloat-3e/source/s_countLeadingZeros16.c " ,
" deps/SoftFloat-3e/source/s_countLeadingZeros32.c " ,
" deps/SoftFloat-3e/source/s_countLeadingZeros64.c " ,
" deps/SoftFloat-3e/source/s_countLeadingZeros8.c " ,
" deps/SoftFloat-3e/source/s_eq128.c " ,
" deps/SoftFloat-3e/source/s_invalidF128M.c " ,
" deps/SoftFloat-3e/source/s_isNaNF128M.c " ,
" deps/SoftFloat-3e/source/s_le128.c " ,
" deps/SoftFloat-3e/source/s_lt128.c " ,
" deps/SoftFloat-3e/source/s_mul128MTo256M.c " ,
" deps/SoftFloat-3e/source/s_mul64To128M.c " ,
" deps/SoftFloat-3e/source/s_mulAddF128M.c " ,
" deps/SoftFloat-3e/source/s_mulAddF16.c " ,
" deps/SoftFloat-3e/source/s_mulAddF32.c " ,
" deps/SoftFloat-3e/source/s_mulAddF64.c " ,
" deps/SoftFloat-3e/source/s_negXM.c " ,
" deps/SoftFloat-3e/source/s_normRoundPackMToF128M.c " ,
" deps/SoftFloat-3e/source/s_normRoundPackToF16.c " ,
" deps/SoftFloat-3e/source/s_normRoundPackToF32.c " ,
" deps/SoftFloat-3e/source/s_normRoundPackToF64.c " ,
" deps/SoftFloat-3e/source/s_normSubnormalF128SigM.c " ,
" deps/SoftFloat-3e/source/s_normSubnormalF16Sig.c " ,
" deps/SoftFloat-3e/source/s_normSubnormalF32Sig.c " ,
" deps/SoftFloat-3e/source/s_normSubnormalF64Sig.c " ,
" deps/SoftFloat-3e/source/s_remStepMBy32.c " ,
" deps/SoftFloat-3e/source/s_roundMToI64.c " ,
" deps/SoftFloat-3e/source/s_roundMToUI64.c " ,
" deps/SoftFloat-3e/source/s_roundPackMToF128M.c " ,
" deps/SoftFloat-3e/source/s_roundPackToF16.c " ,
" deps/SoftFloat-3e/source/s_roundPackToF32.c " ,
" deps/SoftFloat-3e/source/s_roundPackToF64.c " ,
" deps/SoftFloat-3e/source/s_roundToI32.c " ,
" deps/SoftFloat-3e/source/s_roundToI64.c " ,
" deps/SoftFloat-3e/source/s_roundToUI32.c " ,
" deps/SoftFloat-3e/source/s_roundToUI64.c " ,
" deps/SoftFloat-3e/source/s_shiftLeftM.c " ,
" deps/SoftFloat-3e/source/s_shiftNormSigF128M.c " ,
" deps/SoftFloat-3e/source/s_shiftRightJam256M.c " ,
" deps/SoftFloat-3e/source/s_shiftRightJam32.c " ,
" deps/SoftFloat-3e/source/s_shiftRightJam64.c " ,
" deps/SoftFloat-3e/source/s_shiftRightJamM.c " ,
" deps/SoftFloat-3e/source/s_shiftRightM.c " ,
" deps/SoftFloat-3e/source/s_shortShiftLeft64To96M.c " ,
" deps/SoftFloat-3e/source/s_shortShiftLeftM.c " ,
" deps/SoftFloat-3e/source/s_shortShiftRightExtendM.c " ,
" deps/SoftFloat-3e/source/s_shortShiftRightJam64.c " ,
" deps/SoftFloat-3e/source/s_shortShiftRightJamM.c " ,
" deps/SoftFloat-3e/source/s_shortShiftRightM.c " ,
" deps/SoftFloat-3e/source/s_sub1XM.c " ,
" deps/SoftFloat-3e/source/s_sub256M.c " ,
" deps/SoftFloat-3e/source/s_subM.c " ,
" deps/SoftFloat-3e/source/s_subMagsF16.c " ,
" deps/SoftFloat-3e/source/s_subMagsF32.c " ,
" deps/SoftFloat-3e/source/s_subMagsF64.c " ,
" deps/SoftFloat-3e/source/s_tryPropagateNaNF128M.c " ,
" deps/SoftFloat-3e/source/softfloat_state.c " ,
" deps/SoftFloat-3e/source/ui32_to_f128M.c " ,
" deps/SoftFloat-3e/source/ui64_to_f128M.c " ,
} ;
const stage1_sources = [ _ ] [ ] const u8 {
" src/stage1/analyze.cpp " ,
" src/stage1/astgen.cpp " ,
" src/stage1/bigfloat.cpp " ,
" src/stage1/bigint.cpp " ,
" src/stage1/buffer.cpp " ,
" src/stage1/codegen.cpp " ,
" src/stage1/dump_analysis.cpp " ,
" src/stage1/errmsg.cpp " ,
" src/stage1/error.cpp " ,
" src/stage1/heap.cpp " ,
" src/stage1/ir.cpp " ,
" src/stage1/ir_print.cpp " ,
" src/stage1/mem.cpp " ,
" src/stage1/os.cpp " ,
" src/stage1/parser.cpp " ,
" src/stage1/range_set.cpp " ,
" src/stage1/stage1.cpp " ,
" src/stage1/target.cpp " ,
" src/stage1/tokenizer.cpp " ,
" src/stage1/util.cpp " ,
" src/stage1/softfloat_ext.cpp " ,
} ;
const optimized_c_sources = [ _ ] [ ] const u8 {
" src/stage1/parse_f128.c " ,
} ;
const zig_cpp_sources = [ _ ] [ ] const u8 {
// These are planned to stay even when we are self-hosted.
" src/zig_llvm.cpp " ,
" src/zig_clang.cpp " ,
" src/zig_llvm-ar.cpp " ,
" src/zig_clang_driver.cpp " ,
" src/zig_clang_cc1_main.cpp " ,
" src/zig_clang_cc1as_main.cpp " ,
// https://github.com/ziglang/zig/issues/6363
" src/windows_sdk.cpp " ,
} ;
const clang_libs = [ _ ] [ ] const u8 {
" clangFrontendTool " ,
" clangCodeGen " ,
" clangFrontend " ,
" clangDriver " ,
" clangSerialization " ,
" clangSema " ,
" clangStaticAnalyzerFrontend " ,
" clangStaticAnalyzerCheckers " ,
" clangStaticAnalyzerCore " ,
" clangAnalysis " ,
" clangASTMatchers " ,
" clangAST " ,
" clangParse " ,
" clangSema " ,
" clangBasic " ,
" clangEdit " ,
" clangLex " ,
" clangARCMigrate " ,
" clangRewriteFrontend " ,
" clangRewrite " ,
" clangCrossTU " ,
" clangIndex " ,
" clangToolingCore " ,
} ;
const lld_libs = [ _ ] [ ] const u8 {
" lldDriver " ,
" lldMinGW " ,
" lldELF " ,
" lldCOFF " ,
" lldMachO " ,
" lldWasm " ,
" lldReaderWriter " ,
" lldCore " ,
" lldYAML " ,
" lldCommon " ,
} ;
// This list can be re-generated with `llvm-config --libfiles` and then
// reformatted using your favorite text editor. Note that we do not execute
// `llvm-config` here because we are cross compiling. Also, LLVMTableGen is
// omitted from these libs.
const llvm_libs = [ _ ] [ ] const u8 {
" LLVMWindowsManifest " ,
" LLVMXRay " ,
" LLVMLibDriver " ,
" LLVMDlltoolDriver " ,
" LLVMCoverage " ,
" LLVMLineEditor " ,
" LLVMXCoreDisassembler " ,
" LLVMXCoreCodeGen " ,
" LLVMXCoreDesc " ,
" LLVMXCoreInfo " ,
" LLVMX86Disassembler " ,
" LLVMX86AsmParser " ,
" LLVMX86CodeGen " ,
" LLVMX86Desc " ,
" LLVMX86Info " ,
" LLVMWebAssemblyDisassembler " ,
" LLVMWebAssemblyAsmParser " ,
" LLVMWebAssemblyCodeGen " ,
" LLVMWebAssemblyDesc " ,
" LLVMWebAssemblyUtils " ,
" LLVMWebAssemblyInfo " ,
" LLVMSystemZDisassembler " ,
" LLVMSystemZAsmParser " ,
" LLVMSystemZCodeGen " ,
" LLVMSystemZDesc " ,
" LLVMSystemZInfo " ,
" LLVMSparcDisassembler " ,
" LLVMSparcAsmParser " ,
" LLVMSparcCodeGen " ,
" LLVMSparcDesc " ,
" LLVMSparcInfo " ,
" LLVMRISCVDisassembler " ,
" LLVMRISCVAsmParser " ,
" LLVMRISCVCodeGen " ,
" LLVMRISCVDesc " ,
" LLVMRISCVInfo " ,
" LLVMPowerPCDisassembler " ,
" LLVMPowerPCAsmParser " ,
" LLVMPowerPCCodeGen " ,
" LLVMPowerPCDesc " ,
" LLVMPowerPCInfo " ,
" LLVMNVPTXCodeGen " ,
" LLVMNVPTXDesc " ,
" LLVMNVPTXInfo " ,
" LLVMMSP430Disassembler " ,
" LLVMMSP430AsmParser " ,
" LLVMMSP430CodeGen " ,
" LLVMMSP430Desc " ,
" LLVMMSP430Info " ,
" LLVMMipsDisassembler " ,
" LLVMMipsAsmParser " ,
" LLVMMipsCodeGen " ,
" LLVMMipsDesc " ,
" LLVMMipsInfo " ,
" LLVMLanaiDisassembler " ,
" LLVMLanaiCodeGen " ,
" LLVMLanaiAsmParser " ,
" LLVMLanaiDesc " ,
" LLVMLanaiInfo " ,
" LLVMHexagonDisassembler " ,
" LLVMHexagonCodeGen " ,
" LLVMHexagonAsmParser " ,
" LLVMHexagonDesc " ,
" LLVMHexagonInfo " ,
" LLVMBPFDisassembler " ,
" LLVMBPFAsmParser " ,
2021-04-15 17:43:39 +00:00
" LLVMBPFCodeGen " ,
" LLVMBPFDesc " ,
" LLVMBPFInfo " ,
" LLVMAVRDisassembler " ,
" LLVMAVRAsmParser " ,
2021-04-15 17:43:39 +00:00
" LLVMAVRCodeGen " ,
" LLVMAVRDesc " ,
" LLVMAVRInfo " ,
" LLVMARMDisassembler " ,
" LLVMARMAsmParser " ,
2021-04-15 17:43:39 +00:00
" LLVMARMCodeGen " ,
" LLVMARMDesc " ,
" LLVMARMUtils " ,
" LLVMARMInfo " ,
" LLVMAMDGPUDisassembler " ,
" LLVMAMDGPUAsmParser " ,
2021-04-15 17:43:39 +00:00
" LLVMAMDGPUCodeGen " ,
" LLVMAMDGPUDesc " ,
" LLVMAMDGPUUtils " ,
" LLVMAMDGPUInfo " ,
" LLVMAArch64Disassembler " ,
2021-04-15 17:43:39 +00:00
" LLVMAArch64AsmParser " ,
" LLVMAArch64CodeGen " ,
2021-04-15 17:43:39 +00:00
" LLVMAArch64Desc " ,
" LLVMAArch64Utils " ,
" LLVMAArch64Info " ,
" LLVMOrcJIT " ,
" LLVMMCJIT " ,
" LLVMJITLink " ,
" LLVMInterpreter " ,
" LLVMExecutionEngine " ,
" LLVMRuntimeDyld " ,
2021-10-02 19:40:07 +00:00
" LLVMOrcTargetProcess " ,
" LLVMOrcShared " ,
" LLVMDWP " ,
2021-04-15 17:43:39 +00:00
" LLVMSymbolize " ,
" LLVMDebugInfoPDB " ,
" LLVMDebugInfoGSYM " ,
" LLVMOption " ,
" LLVMObjectYAML " ,
" LLVMMCA " ,
" LLVMMCDisassembler " ,
" LLVMLTO " ,
" LLVMPasses " ,
" LLVMCFGuard " ,
2021-04-15 17:43:39 +00:00
" LLVMCoroutines " ,
" LLVMObjCARCOpts " ,
" LLVMipo " ,
" LLVMVectorize " ,
" LLVMLinker " ,
" LLVMInstrumentation " ,
" LLVMFrontendOpenMP " ,
" LLVMFrontendOpenACC " ,
" LLVMExtensions " ,
" LLVMDWARFLinker " ,
" LLVMGlobalISel " ,
2021-04-15 17:43:39 +00:00
" LLVMMIRParser " ,
" LLVMAsmPrinter " ,
2021-10-02 19:40:07 +00:00
" LLVMDebugInfoMSF " ,
" LLVMDebugInfoDWARF " ,
2021-04-15 17:43:39 +00:00
" LLVMSelectionDAG " ,
" LLVMCodeGen " ,
2021-04-15 17:43:39 +00:00
" LLVMIRReader " ,
" LLVMAsmParser " ,
" LLVMInterfaceStub " ,
" LLVMFileCheck " ,
" LLVMFuzzMutate " ,
" LLVMTarget " ,
" LLVMScalarOpts " ,
" LLVMInstCombine " ,
" LLVMAggressiveInstCombine " ,
" LLVMTransformUtils " ,
" LLVMBitWriter " ,
" LLVMAnalysis " ,
" LLVMProfileData " ,
" LLVMObject " ,
" LLVMTextAPI " ,
" LLVMMCParser " ,
" LLVMMC " ,
" LLVMDebugInfoCodeView " ,
2021-04-15 17:43:39 +00:00
" LLVMBitReader " ,
" LLVMCore " ,
" LLVMRemarks " ,
" LLVMBitstreamReader " ,
" LLVMBinaryFormat " ,
" LLVMSupport " ,
" LLVMDemangle " ,
} ;