Only 400 million parameters
Efficient training and deployment
Comparable performance to larger models
Available on Hugging Face for commercial use
Fine-tuning the 8B model on a 94B-token dataset
Applying depth and width pruning techniques
Fine-tuning the pruned model with NeMo-Aligner
Evaluating model capabilities
Instruction Following
Role Playing
Retrieval Augmented Generation (RAG)
Function Calling
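The depth and width pruning step mentioned above can be illustrated with a minimal NumPy sketch. This is a toy example under stated assumptions, not the actual pipeline: real pruning workflows typically score neurons and layers by activation-based importance measured on calibration data, while here both heuristics (weight-norm for width, mean absolute weight for depth) are simple stand-ins, and all function names are illustrative.

```python
import numpy as np

def width_prune(weight, keep_ratio=0.5):
    """Width pruning sketch: score each neuron (output row) by its
    L2 weight norm and keep only the highest-scoring fraction."""
    scores = np.linalg.norm(weight, axis=1)          # one score per neuron
    n_keep = max(1, int(weight.shape[0] * keep_ratio))
    keep = np.sort(np.argsort(scores)[-n_keep:])     # indices of kept neurons
    return weight[keep], keep

def depth_prune(layers, n_keep):
    """Depth pruning sketch: drop whole layers, scored here by a toy
    proxy (mean absolute weight) rather than a measured importance."""
    scores = [np.abs(w).mean() for w in layers]
    keep = sorted(np.argsort(scores)[-n_keep:])
    return [layers[i] for i in keep]

# Toy "model": 4 layers, each an 8-neuron weight matrix.
rng = np.random.default_rng(0)
layers = [rng.normal(size=(8, 16)) for _ in range(4)]

pruned_layers = depth_prune(layers, n_keep=2)        # 4 layers -> 2 layers
pruned_w, kept = width_prune(pruned_layers[0], 0.5)  # 8 neurons -> 4 neurons
print(len(pruned_layers), pruned_w.shape)            # 2 (4, 16)
```

After pruning this way, the shrunken model is retrained (in the workflow described above, fine-tuned and then aligned with NeMo-Aligner) to recover the quality lost when parameters were removed.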
@AIbase