Databricks launches DBRX, an open-source LLM built on a mixture-of-experts (MoE) architecture that outperforms existing open models on standard benchmarks and lets enterprises build customized reasoning capabilities on their own data.

Databricks has launched DBRX, an open-source large language model (LLM) that outperforms established open-source models such as Llama 2 and Mixtral-8x7B, as well as OpenAI's GPT-3.5, on standard benchmarks. Built on a mixture-of-experts (MoE) architecture, DBRX is designed to let enterprises build customized reasoning capabilities on their own data while retaining full control over that data. The model is optimized for efficiency and was trained using NVIDIA DGX Cloud.
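Because DBRX is released as open weights, it can be pulled down and run like other open models. A minimal sketch of loading the instruction-tuned variant with Hugging Face Transformers is shown below; the model ID `databricks/dbrx-instruct`, the need for `trust_remote_code`, and the hardware assumptions are based on the public release rather than this announcement, and the full model is large enough that multi-GPU setups or quantization are typically required.

```python
# Minimal sketch: running DBRX Instruct via Hugging Face Transformers.
# Assumptions: the model ID "databricks/dbrx-instruct", trust_remote_code=True,
# and access to enough GPU memory (DBRX is a large MoE model).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dbrx-instruct"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",   # spread layers across available GPUs
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Build a chat-formatted prompt and generate a short completion.
messages = [
    {"role": "user", "content": "Summarize the mixture-of-experts idea in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```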

March 27, 2024
