Zant (Zig-Ant) is an open-source SDK for deploying optimized neural networks (NNs) on microcontrollers.
- Lack of DL Support: Devices like TI Sitara, Raspberry Pi Pico, or ARM Cortex-M lack comprehensive DL libraries.
- Open-source: An end-to-end, open-source solution for NN deployment and optimization.
- Research-Inspired: Implements optimization inspired by MIT's Han Lab research.
- Academic Collaboration: Developed in collaboration with institutions like Politecnico di Milano.
- Real-time Optimizations: Quantization, pruning, buffer optimization.
- Cross-platform Compatibility: ARM Cortex-M, RISC-V, and others.
- Modular and Easy Integration: Clear APIs, examples, and extensive documentation.
- Edge AI: Real-time anomaly detection, predictive maintenance.
- IoT & Autonomous Systems: Lightweight AI models for drones, robots, vehicles, IoT devices.
- March 5: MNIST inference on Raspberry Pi Pico 2.
- April 30: Efficient YOLO on Raspberry Pi Pico 2.
- Shape Tracker implementation.
- Frontend GUI for library interaction.
- `im2tensor` for image preprocessing.
- Enhanced code generation (flash vs. RAM execution).
- Expanded ONNX compatibility.
- Advanced pruning and quantization support.
- Expanded microcontroller compatibility.
- Model execution benchmarking tools.
- Improved real-time inference capabilities.
- Install the latest Zig compiler.
- Improve Zig proficiency via Ziglings.
- Run the project with `zig build run`.
- Add new tests to `build.zig/test_list`.
- Regular tests: `zig build test --summary all`
- Heavy computational tests: `zig build test -Dheavy --summary all`
Document code following Zig's doc-comment conventions.
```shell
zig build codegen -Dmodel=model_name [-Dlog -Duser_tests=user_tests.json]
```
Generated code will be placed in:

```
generated/model_name/
├── lib_{model_name}.zig
├── test_{model_name}.zig
└── user_tests.json
```
Test the generated code:

```shell
zig build test-codegen -Dmodel=model_name
```
Build the static library:

```shell
zig build lib -Dmodel=model_name -Dtarget={arch} -Dcpu={cpu}
```
Linking with CMake:

```cmake
target_link_libraries(your_project PUBLIC path/to/libzant.a)
```
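For context, a minimal consuming `CMakeLists.txt` might look like the following sketch (the project name and library path are placeholders, not paths defined by Zant):

```cmake
cmake_minimum_required(VERSION 3.16)
project(your_project C)

add_executable(your_project main.c)

# Placeholder path: point this at the libzant.a produced by `zig build lib`.
target_link_libraries(your_project PUBLIC ${CMAKE_SOURCE_DIR}/libs/libzant.a)
```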
To set a custom log function from your C code:

```c
extern void setLogFunction(void (*log_function)(uint8_t *string));
```
Key Build Commands:

- Standard build and test:

  ```shell
  zig build
  zig build test --summary all
  ```

- Run code generation:

  ```shell
  zig build codegen -Dmodel=model_name [-Dlog -Duser_tests=path/to/tests.json]
  ```

- Compile the static library:

  ```shell
  zig build lib -Dmodel=model_name -Dtarget=target_arch -Dcpu=specific_cpu
  ```

- Generate ONNX oneOperation models:

  ```shell
  zig build test-codegen-gen
  ```

- Build and test the generated ONNX oneOperation models:

  ```shell
  zig build test-codegen
  ```

Build options:

- `-Dtrace_allocator=true|false`: Use a tracing allocator for debugging.
- `-Dlog=true|false`: Enable detailed logging during code generation.
- `-Duser_tests=path/to/user_tests.json`: Specify custom tests.
We are committed to enhancing our Continuous Integration/Continuous Deployment (CI/CD) pipeline to ensure the robustness, reliability, and performance of Zant across all supported platforms. Key improvements include:

- Hardware-in-the-Loop (HIL) Testing: Integrate a hardware test bench with connected microcontrollers (e.g., Raspberry Pi Pico, ARM Cortex-M) into the CI/CD pipeline to validate real-world performance and compatibility.
- Profiling in CI/CD: Automatically profile generated models during the pipeline to measure execution time, memory usage, and power consumption on target hardware.
- Daily Fuzzing Tests: Run fuzzing tests daily within the CI/CD pipeline to identify edge-case bugs and ensure model stability under unexpected inputs.
- Multi-Platform Build Matrix: Test builds across a variety of architectures (ARM, RISC-V) and configurations in parallel to catch platform-specific issues early.
- Automated Benchmarking: Include performance benchmarking in the pipeline to track inference speed and resource usage over time, ensuring optimizations don't regress.
- Code Coverage Reporting: Generate and publish code coverage metrics with every CI run to maintain high test quality.
- Containerized CI Environment: Use Docker containers to standardize the CI/CD environment, ensuring consistent builds and tests across all contributors.
- Follow our Docker guide.
Join us on GitHub and shape the future of tinyML!
All contributors. Let's grow together!