The Zig build system is still missing documentation, and for a lot of people this is a reason not to use it. Others search for recipes to build their project, but also struggle with the build system.
This series is an attempt to give an in-depth introduction to the build system and how to use it.
To get started, you should check out the first article, which gives an overview of and introduction to the build system. In this chapter, we're going to tackle composition of several projects as well as preparing a release.
Disclaimer
I will expect you to have at least some basic experience with Zig already, as I will not explain the syntax or semantics of the Zig language. I will also link to several places in the standard library source, so you can see where all of this comes from. I recommend reading the source of the build system, as most of it is self-explanatory once you start digging for the functions you see in the build script. Everything is implemented in the standard library; there is no hidden build magic happening.
Note
From here on, I will always just provide a minimal build.zig that explains what is necessary to solve a single problem. If you want to learn how to glue all these files together into a nice and comfy build file, read the first article.
Composite projects
There are a lot of simple projects out there that consist of only a single executable. But as soon as one starts to write a library, it has to be tested, and it's typical to write one or more example applications. Complexity also rises when people start to use external packages, C libraries, generated code and so on.
This article tries to cover all of these use cases and will explain how to compose several programs and libraries with build.zig.
Packages
But what are packages? A package in the Zig world is a Zig source tree that can be consumed by another project. A package can be imported similarly to how files are imported, by using the @import statement:
// this is "main.zig"
const std = @import("std"); // imports the "std" package
const ihex = @import("ihex"); // imports the "ihex" package
const tools = @import("tools.zig"); // imports the file "tools.zig"
pub fn main() !void {
const data = try tools.loadFile("foo.ihex");
const hex_file = try ihex.parse(data);
std.debug.print("foo.ihex = {}\n", .{ hex_file });
}
In this case, we import two packages (std and ihex) and use one other local file, tools.zig. But how do these import statements differ semantically?
Not much, actually! File imports are just using relative paths to include other Zig files. Packages however use names. These names are given on the command line like this:
zig build-exe --pkg-begin ihex ihex.zig --pkg-end main.zig
The first argument to --pkg-begin is the name of the package. This is what we can later import from main.zig. The second argument is the file that will be imported. This is pretty neat, as it allows us to import a source tree by name without knowing the path to it. It also allows us to store the package wherever we want, even outside of our source tree.
The cool thing is that packages can also be nested, and their names are only locally visible to a single source tree. This means that a package foo can import another package called foo which uses totally different files. This is done by nesting --pkg-begin … --pkg-end declarations inside each other.
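As a minimal sketch of such a nested invocation (the paths here are purely hypothetical), assuming a package foo that itself depends on a differently-rooted package also called foo, the command line would look roughly like this:
zig build-exe \
    --pkg-begin foo libs/foo/foo.zig \
        --pkg-begin foo libs/foo/vendor/other-foo/foo.zig \
        --pkg-end \
    --pkg-end \
    main.zig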
Libraries
But Zig also knows the term library. Didn't we just talk about external code already? Well, in the Zig world, a library is a precompiled static or dynamic library, exactly like in the C/C++ world. Libraries usually come with header files that can be included (be it .h or .zig) and a binary file we can link against (typically .a, .lib, .so or .dll).
Common examples of such libraries are zlib or SDL.
Contrary to packages, a library has to be linked by either
- (static libraries) passing the file name on the command line, or
- (dynamic libraries) using -L to add the folder of the library to the search path and -l to actually link it.
From Zig, we then need to import the library's headers, either by using a package (if the headers are in Zig) or by using @cImport for C headers.
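For the C-header case, a minimal sketch of such an import could look like this (assuming the libcurl headers used later in this article are on the include path):
// main.zig: bring the C declarations from libcurl into Zig
const c = @cImport({
    @cInclude("curl/curl.h");
});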
Tooling
If our projects grow more and more, there will be a point when the use of tools is required in the build process. These tools typically perform some of these tasks:
- Generating some code (e.g. parser generators, serializers, or library headers)
- Bundling the application (e.g. generating an APK, ...)
- Creating asset packs
- ...
With Zig, we have the power to not only utilize existing tools in the build process, but also compile our own (or even external) tools for the current host and run them.
But how do we do all of this in build.zig?
Adding packages
Adding packages is typically done with the function addPackage on our LibExeObjStep. This function takes a std.build.Pkg structure that describes what the package looks like:
pub const Pkg = struct {
    name: []const u8,
    source: FileSource,
    dependencies: ?[]const Pkg = null,
};
As we can see, it has three members:
- name is the package name we can use with @import()
- source is a FileSource that defines the root file of the package. This is typically just a path to your file, like vendor/zig-args/args.zig
- dependencies is an optional slice of packages this package requires. If we use more complex packages, this is often required.
This is a personal recommendation: I usually create a struct/namespace called pkgs at the top of my build.zig that looks kinda like this:
const pkgs = struct {
    const args = std.build.Pkg{
        .name = "args",
        .source = .{ .path = "libs/args/args.zig" },
        .dependencies = &[_]std.build.Pkg{},
    };

    const interface = std.build.Pkg{
        .name = "interface",
        .source = .{ .path = "libs/interface.zig/interface.zig" },
        .dependencies = &[_]std.build.Pkg{},
    };

    const lola = std.build.Pkg{
        .name = "lola",
        .source = .{ .path = "src/library/main.zig" },
        .dependencies = &[_]std.build.Pkg{
            interface,
        },
    };
};
This way I can see all packages used in this build file at one central point.
To add these packages, we simply add them to our LibExeObjSteps like this:
const exe = b.addExecutable("lola", "src/frontend/main.zig");
exe.addPackage(pkgs.lola);
exe.addPackage(pkgs.args);
...
If you only use one or two packages, it's also a good pattern to just declare them locally:
const exe = b.addExecutable("ftz", "src/main.zig");
exe.addPackage(.{
.name = "args",
.source = .{ .path = "./deps/args/args.zig" },
});
exe.addPackage(.{
.name = "network",
.source = .{ .path = "./deps/network/network.zig" },
});
You can also use addPackagePath, which will construct the package for you. Imho, the version with addPackage is cleaner, though.
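For comparison, a minimal sketch of the addPackagePath variant (reusing the hypothetical paths from above) would be:
const exe = b.addExecutable("ftz", "src/main.zig");
exe.addPackagePath("args", "./deps/args/args.zig");
exe.addPackagePath("network", "./deps/network/network.zig");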
Adding libraries
Adding libraries is comparatively easy, but we need to configure more paths.
Note: We covered most of this in the previous article, but let's go over it again quickly:
Let's assume we want to link libcurl into our project, as we want to download some files.
System libraries
For unixoid systems, we can usually just use our system package manager and link against the system library. This is done by calling linkSystemLibrary, which will use pkg-config to figure out all paths on its own:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("url2stdout", "src/main.zig");
    exe.linkLibC();
    exe.linkSystemLibrary("curl");
    exe.install();
}
For Linux systems this is the preferred way of linking external libraries.
Local libraries
But you can also link a library that you vendor as binaries. For this, we need to call several functions. But first, let's take a look at what such a library might look like:
./vendor/libcurl
├── include
│ └── curl
│ ├── curl.h
│ ├── curlver.h
│ ├── easy.h
│ ├── mprintf.h
│ ├── multi.h
│ ├── options.h
│ ├── stdcheaders.h
│ ├── system.h
│ ├── typecheck-gcc.h
│ └── urlapi.h
├── lib
│ ├── libcurl.a
│ ├── libcurl.so
│ └── ...
├── bin
│ └── ...
└── share
└── ...
What we can see here is that the path vendor/libcurl/include contains our headers, and the folder vendor/libcurl/lib contains both a static library (libcurl.a) and a shared/dynamic one (libcurl.so).
Linking dynamically
To link libcurl, we need to add the include path first, then provide Zig with a prefix to the library and the library name:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.addLibPath("vendor/libcurl/lib");
    exe.linkSystemLibraryName("curl");
    exe.install();
}
addIncludeDir adds the folder to the search path so Zig will find the curl/curl.h file. Note that we could also pass "vendor/libcurl/include/curl" here, but you should usually check what your library actually wants.
addLibPath will do the same for library files. This means that Zig will now also search the folder "vendor/libcurl/lib" for libraries.
Finally, linkSystemLibraryName will tell Zig to search for a library named "curl". If you've been paying attention, you'll notice that the file in the listing above is called libcurl.so and not curl.so. On unixoid systems it's common to prefix library files with lib, so you don't pass that prefix to the linker. On Windows, the library would've been called curl.lib or similar.
Linking statically
When we want to link a library statically, we have to do that a bit differently:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.addObjectFile("vendor/libcurl/lib/libcurl.a");
    exe.install();
}
The call to addIncludeDir didn't change, but suddenly we don't call a function with link in its name anymore? You might already know this, but: static libraries are actually just a collection of object files. On Windows this is pretty similar; afaik MSVC also uses the same toolset.
Thus, static libraries are just passed into the linker like object files via addObjectFile and will be unpacked by it.
Note: Most static libraries have some transitive dependencies. In the case of my libcurl build, those are nghttp2, zstd, z and pthread, which we then need to link manually again:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.addObjectFile("vendor/libcurl/lib/libcurl.a");
    exe.linkSystemLibrary("nghttp2");
    exe.linkSystemLibrary("zstd");
    exe.linkSystemLibrary("z");
    exe.linkSystemLibrary("pthread");
    exe.install();
}
We can continue linking more and more libraries statically, pulling in the full dependency tree.
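If we vendored those transitive dependencies as static archives as well, a minimal sketch (the dependency paths here are hypothetical) could look like this:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.addObjectFile("vendor/libcurl/lib/libcurl.a");
    // hypothetical vendored archives for the transitive dependencies:
    exe.addObjectFile("vendor/nghttp2/lib/libnghttp2.a");
    exe.addObjectFile("vendor/zstd/lib/libzstd.a");
    exe.addObjectFile("vendor/zlib/lib/libz.a");
    // pthread stays a system library, as it ships with the C runtime
    exe.linkSystemLibrary("pthread");
    exe.install();
}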
Linking a library by source
But we also have a very different way of linking libraries with the Zig toolchain:
We can just compile them ourselves!
This gives us the benefit that cross-compiling our programs becomes much easier. For this, we need to convert the library's build files into our build.zig. This typically requires a pretty good understanding of both build.zig and the build system your library uses. But let's assume the library is super simple and just consists of a bunch of C files:
pub fn build(b: *std.build.Builder) void {
    const cflags = [_][]const u8{};

    const curl = b.addSharedLibrary("curl", null, .unversioned);
    curl.addCSourceFile("vendor/libcurl/src/tool_main.c", &cflags);
    curl.addCSourceFile("vendor/libcurl/src/tool_msgs.c", &cflags);
    curl.addCSourceFile("vendor/libcurl/src/tool_dirhie.c", &cflags);
    curl.addCSourceFile("vendor/libcurl/src/tool_doswin.c", &cflags);

    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.linkLibrary(curl);
    exe.install();
}
With this, we can use both addSharedLibrary and addStaticLibrary to add libraries to our LibExeObjStep.
This is especially convenient, as we can use setTarget and setBuildMode to compile from everywhere to everywhere.
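A minimal sketch of what that could look like, reusing the vendored-curl idea from above (the source list is shortened and purely illustrative):
pub fn build(b: *std.build.Builder) void {
    // let the user choose: zig build -Dtarget=aarch64-linux -Drelease-safe
    const target = b.standardTargetOptions(.{});
    const mode = b.standardReleaseOptions();

    const curl = b.addStaticLibrary("curl", null);
    curl.linkLibC();
    curl.addCSourceFile("vendor/libcurl/src/tool_main.c", &[_][]const u8{});
    curl.setTarget(target);
    curl.setBuildMode(mode);

    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addIncludeDir("vendor/libcurl/include");
    exe.linkLibrary(curl);
    exe.setTarget(target);
    exe.setBuildMode(mode);
    exe.install();
}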
Using tools
Using tools in your workflow is typically required when you need some precompilation in the form of bison, flex, protobuf or others. Other use cases for tooling are transforming the output file into a different format (e.g. firmware images) or bundling your final application.
System tools
Using pre-installed system tools is quite easy, just create yourself a new step with addSystemCommand:
pub fn build(b: *std.build.Builder) void {
    const cmd = b.addSystemCommand(&[_][]const u8{
        "flex",
        "--outfile=lines.c",
        "lines.l",
    });

    const exe = b.addExecutable("chapter-3", null);
    exe.linkLibC();
    exe.addCSourceFile("lines.c", &[_][]const u8{});
    exe.install();

    exe.step.dependOn(&cmd.step);
}
Here you can see that we just pass an array of options into addSystemCommand that reflects our command line invocation. After that, we create our executable file as we are already used to and just add a step dependency on our cmd by using dependOn.
We can also go the other way round and print a nice little bit of info about our program when we compile it:
pub fn build(b: *std.build.Builder) void {
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.install();

    const cmd = b.addSystemCommand(&[_][]const u8{"size"});
    cmd.addArtifactArg(exe);

    b.getInstallStep().dependOn(&cmd.step);
}
size is a neat tool that will output information about the code size of our executable; the output might look like this:
text data bss dec hex filename
12377 620 104 13101 332d /chapter-3/zig-cache/o/558561c5f79d7773de9744645235aa0d/chapter-3
As you can see, we use addArtifactArg here, as addSystemCommand will just return a std.build.RunStep. This allows us to incrementally build our full command line, composed of any LibExeObjStep output, FileSource or just verbatim arguments.
Fresh-made tools
The cool thing is: we can obtain a std.build.RunStep from a LibExeObjStep as well:
const std = @import("std");
pub fn build(b: *std.build.Builder) void {
const game = b.addExecutable("game", "src/game.zig");
const pack_tool = b.addExecutable("pack", "tools/pack.zig");
const precompilation = pack_tool.run(); // returns *RunStep
precompilation.addArtifactArg(game);
precompilation.addArg("assets.zip");
const pack_step = b.step("pack", "Packs the game and assets together");
pack_step.dependOn(&precompilation.step);
}
This build script will first compile an executable named pack. This executable will then be called with the file of our game and assets.zip as command line arguments.
When invoking zig build pack, we now compile and run tools/pack.zig. This is pretty cool, as we can also compile the tools we need from scratch. For the best dev experience, you can even compile "external" tools like bison from source, thus having no dependencies on the system!
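To sketch what that could look like, here is a minimal example with a hypothetical code generator built from vendored C sources (all paths and file names are made up; a real tool like bison would need far more sources and configuration):
const std = @import("std");

pub fn build(b: *std.build.Builder) void {
    // build the hypothetical generator from vendored C sources for the host
    const gen_tool = b.addExecutable("lexgen", null);
    gen_tool.linkLibC();
    gen_tool.addCSourceFile("vendor/lexgen/src/main.c", &[_][]const u8{});

    // run the freshly built tool during the build to generate parser.c
    const gen_step = gen_tool.run();
    gen_step.addArg("--output=parser.c");
    gen_step.addArg("grammar.l");

    // consume the generated file in our actual application
    const exe = b.addExecutable("chapter-3", "src/main.zig");
    exe.linkLibC();
    exe.addCSourceFile("parser.c", &[_][]const u8{});
    exe.step.dependOn(&gen_step.step);
    exe.install();
}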
Putting it all together
All of this can be intimidating at first, but if we look at a larger example of a build.zig, we can see that a good build file structure will help us a lot.
The following build script will compile a fictional tool that parses an input file via a lexer generated by flex, then uses curl to connect to a server and deliver some files there. The project will be bundled into a single zip file when we invoke zig build deploy. A normal zig build invocation will only prepare a local debug install that isn't packed.
const std = @import("std");
pub fn build(b: *std.build.Builder) void {
const mode = b.standardReleaseOptions();
const target = b.standardTargetOptions(.{});
// Generates the lex-based parser
const parser_gen = b.addSystemCommand(&[_][]const u8{
"flex",
"--outfile=review-parser.c",
"review-parser.l",
});
// Our application
const exe = b.addExecutable("upload-review", "src/main.zig");
{
exe.step.dependOn(&parser_gen.step);
exe.addCSourceFile("review-parser.c", &[_][]const u8{});
// add zig-args to parse arguments
exe.addPackage(.{
.name = "args-parser",
.source = .{ .path = "vendor/zig-args/args.zig" },
});
// add libcurl for uploading
exe.addIncludeDir("vendor/libcurl/include");
exe.addObjectFile("vendor/libcurl/lib/libcurl.a");
exe.setBuildMode(mode);
exe.setTarget(target);
exe.linkLibC();
exe.install();
}
// Our test suite
const test_step = b.step("test", "Runs the test suite");
{
const test_suite = b.addTest("src/tests.zig");
test_suite.step.dependOn(&parser_gen.step);
test_suite.addCSourceFile("review-parser.c", &[_][]const u8{});
// add libcurl for uploading
test_suite.addIncludeDir("vendor/libcurl/include");
test_suite.addObjectFile("vendor/libcurl/lib/libcurl.a");
test_suite.linkLibC();
test_step.dependOn(&test_suite.step);
}
const deploy_step = b.step("deploy", "Creates an application bundle");
{
// compile the app bundler
const deploy_tool = b.addExecutable("deploy", "tools/deploy.zig");
{
deploy_tool.linkLibC();
deploy_tool.linkSystemLibrary("libzip");
}
const bundle_app = deploy_tool.run();
bundle_app.addArg("app-bundle.zip");
bundle_app.addArtifactArg(exe);
bundle_app.addArg("resources/index.htm");
bundle_app.addArg("resources/style.css");
deploy_step.dependOn(&bundle_app.step);
}
}
As you can see, it's a lot of code, but with the use of blocks, we can structure the build script into logical groups.
In case you're wondering why we don't set a target for deploy_tool and test_suite: both are meant to run on the host platform, not on the target machine.
The deploy_tool could also get a fixed release build mode, as we want the bundler to go fast even when we build a debug build of our application.
Conclusion
After this wall of text, you should now be able to build pretty much any project you want. We have learned how to compile Zig applications, how to add any kind of external library to them, and even how to postprocess our application for release management.
We can also build C and C++ projects with a tiny bit of work and deploy them everywhere; you don't have to use zig build for Zig projects only.
Even if we mix projects, tools and everything else, a single build.zig file can satisfy our needs. But you will soon notice: build files get repetitive quickly, and some packages or libraries require quite a bit of code to set up properly.
So look out for the next article, where we will learn how to modularize our build.zig file, create convenient SDKs for Zig and even make our own build steps!
As always, keep on hacking!
Latest comments (15)
Circular imports in build.zig
I have found circular imports to be very useful in build.zig.
Awesome article! It helped me understand the Zig build system a lot better.
Btw, AFAIK there is an ongoing attempt to add package management to the Zig build system. I believe it works kind of like how Golang gets its dependencies.
Waiting for part 4 on that!
Is there some way to leverage Zig's fantastic cross-compilation functionality for dependencies (other C libraries) which currently use automake-style builds? Or is writing a build.zig for each of those libraries the only way?
I guess this is the wrong place for bug reports ;)
You should make a bug report on github.com/ziglang/zig/ for that, as the docgen is currently in the making
cc @kristoff
Do you happen to know of any articles on custom build options? I've found pieces from formats, Discord, etc, but I'm having trouble putting it all together.
Here for same question! Want to make an application use AVX512 instructions (or not) depending on some options. @nsmryan got any leads?
You can set features_add in the CrossTarget struct to achieve this :)
Thank you for this article! It helped me a lot in understanding Zig packages.
BTW, it looks like the Pkg structure field "path" is now called "source" github.com/ziglang/zig/blob/master....
I have a build script for a library I am working on. It creates the library with
However, when I try to use @import("interface") in src/lib.zig, the compiler says:
Do you know what the reason could be? When I use the exact same addPackage call on an exe, it works just fine.
Is the first .zig here a typo?
Nope!
github.com/alexnask/interface.zig
Thanks for clarifying.
Nice job Felix, thanks. I'm sure this series will be useful to a lot of people!
I hope so!