Svalabs

Rank:

Average Model Cost: $0.0000

Number of Runs: 4,951

Models by this creator

gbert-large-zeroshot-nli

svalabs

Platform did not provide a description for this model.

$-/run

2.1K

Huggingface

twitter-xlm-roberta-bitcoin-sentiment

This model is mainly focused on extracting sentiment from tweets about Bitcoin. It was trained on manually labeled data annotated with Rubrix (https://www.rubrix.ml/); the training set contained approximately 500 samples, with a further 500 test samples. cardiffnlp/twitter-xlm-roberta-base-sentiment (https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment) was used both as a weak classifier for labeling and as the base model for fine-tuning.
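A minimal sketch of running the fine-tuned model, assuming the standard Hugging Face `transformers` text-classification pipeline applies and that the label set (negative/neutral/positive) is inherited from the cardiffnlp base model:

```python
# Sketch: Bitcoin tweet sentiment with the fine-tuned model.
# Assumes the standard `transformers` text-classification pipeline;
# the label set is assumed to follow the cardiffnlp base model.

LABELS = ("negative", "neutral", "positive")

def top_label(scores: dict) -> str:
    """Return the label with the highest score from a {label: score} dict."""
    return max(scores, key=scores.get)

def classify_tweets(tweets):
    """Classify a list of tweets with the fine-tuned sentiment model.

    `transformers` is imported lazily so the helper above works without
    the library installed; calling this downloads the model weights.
    """
    from transformers import pipeline  # pip install transformers
    clf = pipeline(
        "text-classification",
        model="svalabs/twitter-xlm-roberta-bitcoin-sentiment",
    )
    return [(t, clf(t)[0]["label"]) for t in tweets]

# Example (downloads the model on first call):
# classify_tweets(["Bitcoin to the moon!", "BTC is crashing hard."])
```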

$-/run

43

Huggingface

mt5-large-german-query-gen-v1

svalabs/mt5-large-german-query-gen-v1 is a German doc2query model for document expansion: it generates plausible search queries for a document, which can be indexed alongside the document text to boost retrieval. The usage code on the model card follows doc2query/msmarco-14langs-mt5-base-v1. References: 'Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks'; 'MS MARCO: A Human Generated MAchine Reading COmprehension Dataset'; 'GermanQuAD and GermanDPR: Improving Non-English Question Answering and Passage Retrieval'; google/mt5-large; the mMARCO dataset; doc2query.
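A minimal sketch of doc2query-style document expansion with this model, assuming the usage pattern of doc2query/msmarco-14langs-mt5-base-v1 (sampled seq2seq generation via `transformers`); the generation parameters here are illustrative, not the model card's exact settings:

```python
# Sketch: doc2query document expansion. Generated queries are appended
# to the document text before it is indexed for search.

def dedupe(queries):
    """Drop duplicate and blank generated queries, preserving order."""
    seen, out = set(), []
    for q in (q.strip() for q in queries):
        if q and q not in seen:
            seen.add(q)
            out.append(q)
    return out

def expand_document(document: str, num_queries: int = 3) -> str:
    """Append generated German queries to a document for indexing.

    `transformers` is imported lazily; calling this downloads the
    (large) mT5 model weights.
    """
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    name = "svalabs/mt5-large-german-query-gen-v1"
    tok = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSeq2SeqLM.from_pretrained(name)
    inputs = tok(document, return_tensors="pt",
                 truncation=True, max_length=384)
    outputs = model.generate(
        **inputs, max_length=64, do_sample=True, top_p=0.95,
        num_return_sequences=num_queries,
    )
    queries = dedupe(tok.decode(o, skip_special_tokens=True) for o in outputs)
    return document + " " + " ".join(queries)

# Example (downloads the model on first call):
# expand_document("Python ist eine universelle Programmiersprache.")
```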

$-/run

41

Huggingface

german-gpl-adapted-covid

german-gpl-adapted-covid

svalabs/german-gpl-adapted-covid is a German sentence-transformers model adapted to COVID-related documents using the GPL (Generative Pseudo Labeling) integration of Haystack. svalabs/cross-electra-ms-marco-german-uncased served as the CrossEncoder and svalabs/mt5-large-german-query-gen-v1 generated the training queries. The model card covers usage with sentence-transformers, evaluation results, and training details: a torch.utils.data.dataloader.DataLoader of length 125, sentence_transformers.losses.MarginMSELoss as the loss function, the parameters passed to fit(), the full model architecture, and citing/author information.
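A minimal sketch of the sentence-transformers usage the model card alludes to ("Then you can use the model like this:"), assuming the standard `SentenceTransformer.encode` API; the example sentences are illustrative:

```python
# Sketch: encoding German sentences with the adapted model and comparing
# embeddings with cosine similarity.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def embed(sentences):
    """Encode sentences with the COVID-adapted model.

    `sentence_transformers` is imported lazily (pip install
    sentence-transformers); calling this downloads the model weights.
    """
    from sentence_transformers import SentenceTransformer
    model = SentenceTransformer("svalabs/german-gpl-adapted-covid")
    return model.encode(sentences)

# Example (downloads the model on first call):
# e = embed(["Wie verbreitet sich das Coronavirus?",
#            "Wie wird COVID-19 übertragen?"])
# print(cosine(e[0], e[1]))
```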

$-/run

17

Huggingface
