📊 Showing 6 results | 📏 Metric: Accuracy
Rank | Model | Paper | Accuracy (%) | Date | Code |
---|---|---|---|---|---|
1 | BioLinkBERT (large) | LinkBERT: Pretraining Language Models with Document Links | 94.80 | 2022-03-29 | 📦 michiyasunaga/LinkBERT |
2 | GAL 120B (zero-shot) | Galactica: A Large Language Model for Science | 94.30 | 2022-11-16 | 📦 paperswithcode/galai |
3 | BioLinkBERT (base) | LinkBERT: Pretraining Language Models with Document Links | 91.40 | 2022-03-29 | 📦 michiyasunaga/LinkBERT |
4 | BLOOM (zero-shot) | Galactica: A Large Language Model for Science | 91.40 | 2022-11-16 | 📦 paperswithcode/galai |
5 | PubMedBERT (uncased) | Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | 87.56 | 2020-07-31 | 📦 bionlu-coling2024/biomed-ner-intent_detection 📦 rohanshad/cmr_transformer |
6 | OPT (zero-shot) | Galactica: A Large Language Model for Science | 81.40 | 2022-11-16 | 📦 paperswithcode/galai |