Microsoft's Native 1-Bit LLM Could Bring Efficient GenAI to Everyday CPUs

In a recent paper, Microsoft researchers described BitNet b1.58 2B4T, the first LLM to be natively trained using "1-bit" weights (technically ternary, or 1-trit: each weight takes one of the three values -1, 0, or +1, about 1.58 bits of information), rather than being quantized from a model trained with floating-point weights. According to Microsoft, the model delivers performance comparable to full-precision LLMs of similar size at a fraction of the computation cost and hardware requirements.
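To make the "1.58-bit" idea concrete, here is a minimal sketch of absmean ternary quantization, the scheme described in the BitNet b1.58 papers for mapping float weights to {-1, 0, +1}. The function name and NumPy implementation are illustrative, not Microsoft's code.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-5):
    """Quantize a float weight tensor to ternary values {-1, 0, +1}.

    Sketch of the absmean scheme from the BitNet b1.58 paper:
    scale by the mean absolute weight, then round and clip to [-1, 1].
    """
    gamma = np.mean(np.abs(w)) + eps           # per-tensor scale factor
    w_q = np.clip(np.round(w / gamma), -1, 1)  # ternary weights
    return w_q.astype(np.int8), gamma          # compact storage + scale

# Usage: the original tensor is approximated as w ≈ gamma * w_q
w = np.random.randn(4, 4)
w_q, gamma = absmean_ternary_quantize(w)
```

Because each weight carries only one of three values (log2(3) ≈ 1.58 bits), matrix multiplications reduce largely to additions and subtractions, which is what makes CPU inference attractive.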

By Sergio De Simone