Malteos


Average Model Cost: $0.0000

Number of Runs: 45,165

Models by this creator

scincl

The scincl model performs feature extraction, producing embedding vectors from input text; no further description was provided by the creator.
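As a minimal sketch of feature extraction with this model, assuming it is published on the Hugging Face Hub as malteos/scincl and follows the title-[SEP]-abstract input convention used by SciNCL-style document encoders (the model id and example papers are assumptions, not documented usage):

```python
from transformers import AutoTokenizer, AutoModel

# Assumed Hugging Face model id, based on the creator and model name in this listing.
tokenizer = AutoTokenizer.from_pretrained("malteos/scincl")
model = AutoModel.from_pretrained("malteos/scincl")

papers = [
    {"title": "BERT", "abstract": "We introduce a new language representation model called BERT."},
    {"title": "Attention Is All You Need", "abstract": "The dominant sequence transduction models are based on recurrent or convolutional networks."},
]

# Concatenate title and abstract with the [SEP] token (SPECTER/SciNCL-style input).
title_abs = [p["title"] + tokenizer.sep_token + (p.get("abstract") or "") for p in papers]

inputs = tokenizer(title_abs, padding=True, truncation=True, max_length=512, return_tensors="pt")
result = model(**inputs)

# Use the [CLS] token's final hidden state as each paper's feature vector.
embeddings = result.last_hidden_state[:, 0, :]
print(embeddings.shape)  # (num_papers, hidden_size)
```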

Cost: $-/run

Runs: 42.3K

Platform: Huggingface

PubMedNCL

PubMedNCL is a pretrained language model for document representations of biomedical papers. It is based on PubMedBERT, a BERT model pretrained on abstracts and full texts from PubMedCentral, and is fine-tuned via citation neighborhood contrastive learning, as introduced by SciNCL. For how to use the pretrained model, see the sketch below.

References: "Neighborhood Contrastive Learning for Scientific Document Representations with Citation Embeddings" (EMNLP 2022) and "Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing".

License: MIT
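The following is a minimal usage sketch for the "how to use" note above, assuming the Hugging Face id malteos/PubMedNCL and the same title-[SEP]-abstract input convention as SciNCL; the model id, example texts, and similarity step are illustrative assumptions rather than documented usage:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed Hugging Face model id; check the model card for the exact name.
model_name = "malteos/PubMedNCL"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

papers = [
    {"title": "Paper A title", "abstract": "Abstract of the first biomedical paper."},
    {"title": "Paper B title", "abstract": "Abstract of the second biomedical paper."},
]

# Title and abstract joined by the [SEP] token, following the SciNCL input convention.
texts = [p["title"] + tokenizer.sep_token + p["abstract"] for p in papers]
inputs = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    out = model(**inputs)

# [CLS] embeddings serve as the document representations.
embeddings = out.last_hidden_state[:, 0, :]

# Example downstream use: cosine similarity between the two papers.
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(float(similarity))
```

Taking the [CLS] token's final hidden state as the document vector mirrors the SciNCL/SPECTER convention; mean pooling over tokens would be a possible alternative, but is not what the card describes.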

Cost: $-/run

Runs: 105

Platform: Huggingface

bloom-6b4-clp-german-oasst-v0.1

bloom-6b4-clp-german-oasst-v0.1 is an instruction-fine-tuned German language model (6B parameters; early alpha version); a usage sketch is given below.

Base model: malteos/bloom-6b4-clp-german (Ostendorff and Rehm, 2023)

Trained on: 20B additional German tokens (Wikimedia dumps and OSCAR 2023), OpenAssistant/oasst1 (German subset), LEL-A/translated_german_alpaca_validation, and LEL-A's version of deepset/germandpr

Chat demo: https://opengptx.dfki.de/chat/

Please note that this is a research prototype and may not be suitable for extensive use.

How to cite: If you are using our code or models, please cite our paper.

License: BigScience BLOOM RAIL 1.0

Acknowledgements: This model was trained during the Helmholtz GPU Hackathon 2023. We gratefully thank the organizers for hosting this event and for the computing resources provided.
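A minimal generation sketch, assuming the model is available on the Hugging Face Hub as malteos/bloom-6b4-clp-german-oasst-v0.1 and that a plain German instruction works as a prompt; the exact prompt template used in fine-tuning is not documented in this listing:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed Hugging Face model id, matching the model name in this listing.
model_name = "malteos/bloom-6b4-clp-german-oasst-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # ~6B parameters; half precision keeps memory manageable
    device_map="auto",          # requires the `accelerate` package
)

# Assumed prompt format: a plain German instruction
# ("Explain in two sentences what a language model is.").
prompt = "Erkläre in zwei Sätzen, was ein Sprachmodell ist."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Half precision and device_map="auto" are practical choices for a model of this size, not requirements stated on the model card.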

Cost: $-/run

Runs: 45

Platform: Huggingface
