Microsoft researchers claim they have developed the largest 1-bit AI model, or "bitnet," to date. Called BitNet b1.58 2B4T, it is openly available under an MIT license and can run on CPUs, including Apple's M2.
Bitnets are essentially compressed models designed to run on lightweight hardware. In standard models, the weights (the values that define a model's internal structure) are often quantized so that the model performs well on a wide range of machines. Quantizing reduces the number of bits, the smallest units a computer can handle, needed to represent each weight, allowing the model to run on chips with less memory.
Bitnets quantize the weights to just three values: -1, 0, and 1. In theory, that makes them far more memory- and compute-efficient than most models today.
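To make the idea concrete, here is a minimal sketch of ternary weight quantization in Python. It uses the "absmean" scaling scheme described in the BitNet research papers; it is an illustration of the general technique, not Microsoft's actual implementation, and the function name is hypothetical.

```python
import numpy as np

def ternary_quantize(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize a full-precision weight matrix to the values -1, 0, and 1.

    Absmean scheme: divide by the mean absolute value of the weights,
    then round and clip into the ternary set {-1, 0, 1}.
    """
    scale = np.abs(weights).mean()  # absmean scaling factor
    quantized = np.clip(np.round(weights / scale), -1, 1)
    return quantized, scale

# Example: a small full-precision weight matrix
w = np.array([[0.9, -0.05, -1.3],
              [0.02, 0.6, -0.4]])
q, s = ternary_quantize(w)
# Every entry of q is now -1, 0, or 1. Storage per weight drops from
# 32 bits (float32) to about 1.58 bits, i.e. log2(3) -- hence "b1.58".
```

At inference time, multiplications by ternary weights reduce to additions and subtractions (plus one rescale by `s`), which is where the memory and compute savings come from.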
Microsoft researchers say BitNet b1.58 2B4T is the first bitnet with 2 billion parameters, "parameters" being largely synonymous with "weights." It was trained on a dataset of 4 trillion tokens, equivalent to about 33 million books by one estimate, and it outperforms traditional models of similar sizes, the researchers claim.
To be clear, BitNet b1.58 2B4T doesn't sweep the floor with rival 2-billion-parameter models, but it seemingly holds its own. According to the researchers' testing, the model beats Meta's Llama 3.2 1B, Google's Gemma 3 1B, and Alibaba's Qwen 2.5 1.5B on benchmarks including GSM8K (a collection of grade-school-level math problems) and PIQA (which tests physical commonsense reasoning skills).
Perhaps more impressively, BitNet b1.58 2B4T is faster than other models of its size (in some cases, twice the speed) while using a fraction of the memory.
However, there is a catch.
Achieving that performance requires using bitnet.cpp, Microsoft's custom framework, which currently supports only certain hardware. GPUs, which dominate the AI infrastructure landscape, aren't on the list of supported chips.
That means bitnets may hold promise, particularly for resource-constrained devices, but compatibility is a big sticking point and will likely remain one.