Inception, a G42 company focused on developing advanced AI models and applications as a service, has launched JAIS 70B, the latest addition to its large language model (LLM) lineup.
The 70-billion-parameter model is designed for developers of Arabic-based natural language processing (NLP) solutions and aims to accelerate the integration of Generative AI services across sectors, enhancing capabilities in customer service, content creation, and data analysis.
The company has also introduced a comprehensive suite of JAIS foundation and fine-tuned models: 20 models across eight sizes, ranging from 590M to 70B parameters, trained on up to 1.6T tokens of Arabic, English, and code data, with fine-tuned variants optimized for chat applications.
The release spans compact, compute-efficient models for targeted applications, including the first Arabic-centric model small enough to run on a laptop, alongside larger models built for enterprise-level precision.
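For developers weighing the smaller models, the minimal sketch below shows how a compact, Arabic-capable chat model of this kind could be loaded and queried with the Hugging Face transformers library. The model identifier is an assumption for illustration only; the actual repository names, licenses, and loading options should be taken from the publisher's model cards.

```python
# Minimal sketch, assuming the compact JAIS-family chat models are published on
# Hugging Face; "inceptionai/jais-family-590m-chat" is a hypothetical repository
# name used here purely for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inceptionai/jais-family-590m-chat"  # assumed ID; substitute the real one

# Releases like this often ship custom model code, hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Arabic prompt: "Write a short welcome message for our customers."
prompt = "اكتب رسالة ترحيب قصيرة لعملائنا."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```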
AI’s proven value-adding capabilities
According to Dr. Andrew Jackson, CEO of Inception, AI is now a proven value-adding force, and large language models have been at the forefront of the AI adoption spike. He stated that JAIS was created to preserve Arabic heritage, culture, and language, and to democratize access to AI. By releasing JAIS 70B and this new family of models, he added, Inception is reinforcing its commitment to delivering the highest-quality AI foundation models for Arabic-speaking nations.
Training JAIS from scratch
For her part, Neha Sengupta, principal applied scientist at Inception, stated that for models of up to 30 billion parameters the team successfully trained JAIS from scratch, producing models that consistently outperformed adapted models in the community. However, she acknowledged that for models of 70 billion parameters and above, the computational complexity and environmental impact of training from scratch were significant.