
MultiNLI

Natural Language Inference Benchmark

Performance Over Time

Showing 63 results | Metric: Matched — accuracy on MultiNLI's matched (in-genre) evaluation set; the benchmark also defines a mismatched (out-of-genre) set, not shown here.

Top Performing Models

| Rank | Model | Paper | Matched | Date | Code |
|------|-------|-------|---------|------|------|
| 1 | UnitedSynT5 (3B) | First Train to Generate, then Generate to Train: UnitedSynT5 for Few-Shot NLI | 92.60 | 2024-12-12 | - |
| 2 | T5 | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | 92.00 | 2019-11-08 | namisan/mt-dnn, microsoft/MT-DNN, archinetai/smart-pytorch |
| 3 | T5-XXL 11B (fine-tuned) | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | 92.00 | 2019-10-23 | huggingface/transformers, PaddlePaddle/PaddleNLP, google-research/text-to-text-transfer-transformer |
| 4 | T5-11B | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | 91.70 | 2019-10-23 | huggingface/transformers, PaddlePaddle/PaddleNLP, google-research/text-to-text-transfer-transformer |
| 5 | T5-3B | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | 91.40 | 2019-10-23 | huggingface/transformers, PaddlePaddle/PaddleNLP, google-research/text-to-text-transfer-transformer |
| 6 | ALBERT | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | 91.30 | 2019-09-26 | huggingface/transformers, tensorflow/models, PaddlePaddle/PaddleNLP |
| 7 | DeBERTa (large) | DeBERTa: Decoding-enhanced BERT with Disentangled Attention | 91.10 | 2020-06-05 | huggingface/transformers, microsoft/DeBERTa, osu-nlp-group/mind2web |
| 8 | Adv-RoBERTa ensemble | StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | 91.10 | 2019-08-13 | - |
| 9 | SMARTRoBERTa | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | 91.10 | 2019-11-08 | namisan/mt-dnn, microsoft/MT-DNN, archinetai/smart-pytorch |
| 10 | RoBERTa | RoBERTa: A Robustly Optimized BERT Pretraining Approach | 90.80 | 2019-07-26 | huggingface/transformers, pytorch/fairseq, PaddlePaddle/PaddleNLP |
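The Matched column above is plain accuracy over MultiNLI's matched (in-genre) evaluation split: each example is a premise/hypothesis pair labeled entailment, neutral, or contradiction, and a model is scored by the fraction it labels correctly. A minimal sketch of that computation, using hand-written toy examples and a trivial majority-class baseline as hypothetical stand-ins for the real dataset and a trained NLI model:

```python
# Sketch of the "Matched" metric: accuracy over (premise, hypothesis, gold)
# triples from the matched split. The examples and the baseline "model" here
# are hypothetical stand-ins, not the actual MultiNLI data.

LABELS = ("entailment", "neutral", "contradiction")  # MultiNLI's 3-way scheme

matched_examples = [
    ("A man is playing a guitar.", "A person is making music.", "entailment"),
    ("A man is playing a guitar.", "The man is asleep.", "contradiction"),
    ("A man is playing a guitar.", "He wrote this song.", "neutral"),
]

def majority_baseline(premise: str, hypothesis: str) -> str:
    """Hypothetical model: always predicts the majority class."""
    return "entailment"

def matched_accuracy(model, examples) -> float:
    """Fraction of examples where the model's label matches the gold label."""
    correct = sum(model(p, h) == gold for p, h, gold in examples)
    return correct / len(examples)

print(f"{matched_accuracy(majority_baseline, matched_examples):.2%}")
```

A real evaluation would swap in the matched validation/test split (e.g. via the `multi_nli` dataset on the Hugging Face Hub) and a fine-tuned classifier; the scoring function stays the same.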

All Papers (63)