L3cube-pune
Rank: - · Average Model Cost: $0.0000
Number of Runs: 2,719
Models by this creator
marathi-ner
MahaNER-BERT
MahaNER-BERT is a MahaBERT (l3cube-pune/marathi-bert) model fine-tuned on L3Cube-MahaNER, a Marathi named entity recognition dataset. [dataset link](https://github.com/l3cube-pune/MarathiNLP) More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.06029). IOB model: marathi-ner-iob
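As a minimal sketch (not part of the original card text), the snippet below shows how a fine-tuned NER model like this one can be queried with the Hugging Face transformers pipeline; the model id l3cube-pune/marathi-ner is assumed from the listing slug above.

```python
# Minimal sketch: token classification with the transformers pipeline.
# Assumption: the model is hosted on the Hub as "l3cube-pune/marathi-ner"
# (the slug shown in this listing).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="l3cube-pune/marathi-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

print(ner("पुणे हे महाराष्ट्रातील एक मोठे शहर आहे."))  # example Marathi sentence
```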
$-/run · 1.1K runs · Huggingface
marathi-sentiment-md
$-/run · 221 runs · Huggingface
MarathiSentiment
$-/run · 216 runs · Huggingface
indic-sentence-similarity-sbert
IndicSBERT-STS
This is an IndicSBERT model (l3cube-pune/indic-sentence-bert-nli) trained on the STS dataset of ten major Indian languages. The single model works for English, Hindi, Marathi, Kannada, Tamil, Telugu, Gujarati, Oriya, Punjabi, Malayalam, and Bengali, and it also has cross-lingual capabilities. Released as a part of project MahaNLP: https://github.com/l3cube-pune/MarathiNLP
A generic Indic sentence BERT model is shared here: l3cube-pune/indic-sentence-bert-nli
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2304.11434) (monolingual Indic SBERT paper, multilingual Indic SBERT paper).
Other monolingual similarity models: Marathi Similarity, Hindi Similarity, Kannada Similarity, Telugu Similarity, Malayalam Similarity, Tamil Similarity, Gujarati Similarity, Oriya Similarity, Bengali Similarity, Punjabi Similarity, Indic Similarity (multilingual).
Other monolingual Indic sentence BERT models: Marathi SBERT, Hindi SBERT, Kannada SBERT, Telugu SBERT, Malayalam SBERT, Tamil SBERT, Gujarati SBERT, Oriya SBERT, Bengali SBERT, Punjabi SBERT, Indic SBERT (multilingual).
Usage (Sentence-Transformers): once sentence-transformers is installed, you can encode sentences directly with the model, as in the first half of the sketch below.
Usage (HuggingFace Transformers): without sentence-transformers, you first pass your input through the transformer model and then apply the right pooling operation on top of the contextualized word embeddings, as in the second half of the sketch below.
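The usage sections above originally pointed to code snippets that did not survive the page extraction. The sketch below reconstructs both paths under the assumption that the model is published on the Hugging Face Hub as l3cube-pune/indic-sentence-similarity-sbert (the slug used in this listing); it is a minimal illustration, not the authors' exact snippet.

```python
# Minimal sketch of both usage paths described above.
# Assumption: the Hub id is "l3cube-pune/indic-sentence-similarity-sbert".
import torch
from sentence_transformers import SentenceTransformer
from transformers import AutoModel, AutoTokenizer

sentences = ["A man is eating food.", "एक माणूस जेवण करत आहे."]

# --- Usage (Sentence-Transformers): pip install -U sentence-transformers ---
st_model = SentenceTransformer("l3cube-pune/indic-sentence-similarity-sbert")
embeddings = st_model.encode(sentences)
print(embeddings.shape)

# --- Usage (HuggingFace Transformers): tokenize, run the encoder, then pool ---
def mean_pooling(model_output, attention_mask):
    # Average token embeddings, ignoring padding positions.
    token_embeddings = model_output[0]  # last hidden state
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("l3cube-pune/indic-sentence-similarity-sbert")
model = AutoModel.from_pretrained("l3cube-pune/indic-sentence-similarity-sbert")

encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    model_output = model(**encoded)

sentence_embeddings = mean_pooling(model_output, encoded["attention_mask"])
print(sentence_embeddings.shape)
```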
$-/run · 197 runs · Huggingface
mahahate-bert
MahaHate-BERT
MahaHate-BERT (Marathi hate speech identification) is a MahaBERT (l3cube-pune/marathi-bert) model fine-tuned on L3Cube-MahaHate, a Marathi tweet-based hate speech detection dataset. This is a two-class model with labels hate (LABEL_1) and not hate (LABEL_0). The 4-class model can be found here. [dataset link](https://github.com/l3cube-pune/MarathiNLP) More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2203.13778).
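As a minimal sketch (an assumption, not taken from the card text), the snippet below runs the two-class model through the transformers text-classification pipeline; the model id l3cube-pune/mahahate-bert follows the listing slug above.

```python
# Minimal sketch: binary hate-speech classification via the pipeline API.
# Assumption: the model is hosted on the Hub as "l3cube-pune/mahahate-bert".
from transformers import pipeline

classifier = pipeline("text-classification", model="l3cube-pune/mahahate-bert")

# Per the description above: LABEL_1 = hate, LABEL_0 = not hate.
print(classifier("हा एक उदाहरण मजकूर आहे."))  # placeholder Marathi input
```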
$-/run · 172 runs · Huggingface
hing-bert
$-/run · 159 runs · Huggingface