Jhgan

Rank:

Average Model Cost: $0.0000

Number of Runs: 86,346

Models by this creator

ko-sroberta-multitask

ko-sroberta-multitask is a sentence-similarity model; the listing provides no further details about its architecture or training process.
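The card gives no usage details, but sentence-similarity models published this way are typically loaded through sentence-transformers. A minimal sketch, assuming the Hub id jhgan/ko-sroberta-multitask and that the package is installed:

    # Minimal sketch, assuming the Hub id "jhgan/ko-sroberta-multitask"
    # (pip install sentence-transformers).
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("jhgan/ko-sroberta-multitask")
    embeddings = model.encode(["배고파요.", "밥 먹고 싶어요."])  # "I'm hungry." / "I want to eat."
    print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity of the pair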

Cost: $-/run

Runs: 48.5K

Platform: Huggingface

ko-sbert-multitask

This is a sentence-transformers model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

Usage (Sentence-Transformers)

With the sentence-transformers package installed, the model can be loaded by name and used to encode sentences directly.

Usage (HuggingFace Transformers)

Without sentence-transformers, you can pass your input through the transformer model and then apply the right pooling operation on top of the contextualized word embeddings.

Evaluation Results

Results on the KorSTS evaluation set after multi-task training on the KorSTS and KorNLI training sets:

Cosine Pearson: 84.13
Cosine Spearman: 84.71
Euclidean Pearson: 82.42
Euclidean Spearman: 82.66
Manhattan Pearson: 81.41
Manhattan Spearman: 81.69
Dot Pearson: 80.05
Dot Spearman: 79.69

Training

The model was trained with:
DataLoader: sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader of length 8885, with loss sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss
DataLoader: torch.utils.data.dataloader.DataLoader of length 719, with loss sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss

Citing & Authors

Ham, J., Choe, Y. J., Park, K., Choi, I., & Soh, H. (2020). KorNLI and KorSTS: New benchmark datasets for Korean natural language understanding. arXiv preprint arXiv:2004.03289.
Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. arXiv preprint arXiv:1908.10084.
Reimers, N., & Gurevych, I. (2020). Making monolingual sentence embeddings multilingual using knowledge distillation. EMNLP.
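The usage snippets themselves are not reproduced on this page. A minimal sketch of the Sentence-Transformers path described above, assuming the Hub id jhgan/ko-sbert-multitask:

    # Sentence-Transformers path (pip install sentence-transformers).
    from sentence_transformers import SentenceTransformer

    sentences = ["안녕하세요?", "한국어 문장 임베딩을 위한 버트 모델입니다."]
    model = SentenceTransformer("jhgan/ko-sbert-multitask")
    embeddings = model.encode(sentences)
    print(embeddings.shape)  # (2, 768) -- one 768-d vector per sentence

And the plain HuggingFace Transformers path, applying the pooling step by hand; mean pooling over non-padding tokens is assumed here, matching the usual sentence-transformers setup:

    # HuggingFace Transformers path: encode, then pool token embeddings.
    import torch
    from transformers import AutoModel, AutoTokenizer

    def mean_pooling(model_output, attention_mask):
        # Average token embeddings, masking out padding positions.
        token_embeddings = model_output[0]
        mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
        return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

    tokenizer = AutoTokenizer.from_pretrained("jhgan/ko-sbert-multitask")
    model = AutoModel.from_pretrained("jhgan/ko-sbert-multitask")

    encoded = tokenizer(["안녕하세요?"], padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        output = model(**encoded)
    sentence_embeddings = mean_pooling(output, encoded["attention_mask"])
    print(sentence_embeddings.shape)  # torch.Size([1, 768])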

Cost: $-/run

Runs: 431

Platform: Huggingface

ko-sbert-nli

This is a sentence-transformers model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

Usage (Sentence-Transformers)

With the sentence-transformers package installed, the model can be loaded by name and used to encode sentences directly.

Usage (HuggingFace Transformers)

Without sentence-transformers, you can pass your input through the transformer model and then apply the right pooling operation on top of the contextualized word embeddings.

Evaluation Results

Results on the KorSTS evaluation set after training on the KorNLI training set:

Cosine Pearson: 82.24
Cosine Spearman: 83.16
Euclidean Pearson: 82.19
Euclidean Spearman: 82.31
Manhattan Pearson: 82.18
Manhattan Spearman: 82.30
Dot Pearson: 79.30
Dot Spearman: 78.78

Training

The model was trained with:
DataLoader: sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader of length 8885, with loss sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss

Citing & Authors

Ham, J., Choe, Y. J., Park, K., Choi, I., & Soh, H. (2020). KorNLI and KorSTS: New benchmark datasets for Korean natural language understanding. arXiv preprint arXiv:2004.03289.
Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. arXiv preprint arXiv:1908.10084.
Reimers, N., & Gurevych, I. (2020). Making monolingual sentence embeddings multilingual using knowledge distillation. EMNLP.
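Usage follows the same pattern as ko-sbert-multitask above; only the Hub id changes. A minimal sketch, assuming the id jhgan/ko-sbert-nli:

    # Same pattern as above, assuming the Hub id "jhgan/ko-sbert-nli".
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("jhgan/ko-sbert-nli")
    embeddings = model.encode(["안녕하세요?"])
    print(embeddings.shape)  # (1, 768)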

Cost: $-/run

Runs: 55

Platform: Huggingface
