Bhadresh-savani

Average Model Cost: $0.0000

Number of Runs: 248,317

Models by this creator

bert-base-go-emotion

The bert-base-go-emotion model is a text classification model based on the BERT architecture, fine-tuned to classify the emotions expressed in a given text (such as joy, sadness, anger, surprise, and love). It was trained on a large dataset using the bert-base-uncased configuration and supports basic emotion classification tasks. The original model card also provides the training parameters and evaluation output, and the model can be used through the accompanying Colab notebook.
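As a rough sketch of what such a classifier returns, here is how raw logits are typically turned into an emotion label plus a confidence score. The label list and logit values below are made up for illustration; in practice they would come from the model itself:

```python
import math

# Hypothetical label set and logits; a real run of an emotion classifier
# would produce its own values for its own label set.
labels = ["joy", "sadness", "anger", "surprise", "love"]
logits = [2.1, -0.3, 0.4, -1.2, 0.9]

def softmax(xs):
    """Convert raw logits into probabilities that sum to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
best = max(range(len(labels)), key=lambda i: probs[i])
print(labels[best], round(probs[best], 3))
```

The softmax step is what turns the model's unbounded scores into a probability distribution over the emotion labels.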

Runs: 122.4K · Hugging Face

distilbert-base-uncased-emotion

The distilbert-base-uncased-emotion model is a DistilBERT variant fine-tuned on an emotion dataset collected from Twitter. As a smaller, faster distillation of the popular BERT model, it retains a high level of language understanding while being cheaper to run, and is suited to text classification tasks such as emotion analysis. Its performance on the Twitter emotion dataset is reported in the reference linked from the model card.
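The model cards in this listing compare classifiers by scores such as F1 on the Twitter emotion dataset. For multi-class emotion labels, a commonly reported variant is macro-F1: the F1 score of each class, averaged with equal weight. A minimal sketch with made-up predictions:

```python
def f1_per_class(y_true, y_pred, label):
    """F1 for one class, treating that class as the positive label."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-class F1 scores."""
    return sum(f1_per_class(y_true, y_pred, l) for l in labels) / len(labels)

# Toy labels, purely for illustration.
y_true = ["joy", "anger", "joy", "sadness"]
y_pred = ["joy", "joy", "joy", "sadness"]
print(macro_f1(y_true, y_pred, ["joy", "anger", "sadness"]))
```

Macro averaging penalizes a model that ignores rare classes, which matters for skewed emotion datasets.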

Runs: 115.4K · Hugging Face

bert-base-uncased-emotion

BERT is a bidirectional Transformer encoder architecture pretrained with a masked language modeling (MLM) objective. This model is bert-base-uncased fine-tuned on the emotion dataset (Twitter-Sentiment-Analysis) using the Hugging Face Trainer. The original model card lists the training parameters, a performance comparison with other models on the Twitter emotion dataset, usage instructions, evaluation results, and a Colab notebook for the training procedure (the same notebook as for distilbert, with the model name changed to bert). Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
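The MLM pretraining objective mentioned above can be sketched in plain Python: roughly 15% of input tokens are hidden behind a [MASK] token, and the model is trained to recover the originals. This is a simplified illustration (real BERT also sometimes substitutes random tokens or keeps the original, and uses subword tokenization rather than the naive whitespace split used here):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=1):
    """Replace ~mask_rate of tokens with [MASK]; return masked tokens and targets."""
    rng = random.Random(seed)  # seeded for reproducibility
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets[i] = tok  # the model must predict the original token here
        else:
            masked.append(tok)
    return masked, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
print(masked, targets)
```

The `targets` dict is the supervision signal: the loss is computed only at the masked positions.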

Runs: 4.2K · Hugging Face

albert-base-v2-emotion

ALBERT (A Lite BERT) is a BERT-style architecture with significantly fewer parameters than the original BERT. This model is albert-base-v2 fine-tuned on the emotion dataset (Twitter-Sentiment-Analysis) using the Hugging Face Trainer. The original model card lists the training hyperparameters, a performance comparison with other models on the Twitter emotion dataset, usage instructions, evaluation results, and a Colab notebook covering the training procedure. Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
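Part of ALBERT's parameter reduction comes from factorizing the token embedding matrix: instead of a single vocab_size × hidden matrix, it uses vocab_size × E plus E × hidden with a small intermediate size E (it also shares parameters across layers, not shown here). A back-of-the-envelope calculation with BERT-base-like sizes shows the effect:

```python
# Approximate sizes: BERT-style vocabulary, base hidden width,
# and ALBERT's small embedding size E.
vocab_size, hidden, e = 30000, 768, 128

bert_style = vocab_size * hidden            # one big embedding matrix
albert_style = vocab_size * e + e * hidden  # factorized: vocab -> E, then E -> hidden

print(bert_style, albert_style)
```

The factorized embedding needs roughly a sixth of the parameters of the unfactorized one at these sizes.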

Runs: 1.8K · Hugging Face

electra-base-squad2

An electra-base model for extractive question answering. Language model: electra-base; language: English; downstream task: extractive QA; training data: SQuAD 2.0; evaluation data: SQuAD 2.0; code: see the example in FARM; infrastructure: 1x Tesla V100. Performance was evaluated on the SQuAD 2.0 dev set with the official evaluation script. The model can be used in Transformers, in FARM, or in Haystack; for doing QA at scale (many documents instead of a single paragraph), it can also be loaded in Haystack. Authors: Vaishali Pal (vaishali.pal [at] deepset.ai), Branden Chan (branden.chan [at] deepset.ai), Timo Möller (timo.moeller [at] deepset.ai), Malte Pietsch (malte.pietsch [at] deepset.ai), Tanay Soni (tanay.soni [at] deepset.ai). Note: this model was borrowed from the Haystack model repo in order to add a TensorFlow model.
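Extractive QA models like this one emit a start logit and an end logit per passage token; the predicted answer is the span maximizing start + end subject to start ≤ end, usually with a maximum-length cap. A minimal sketch of that span search, with made-up logits:

```python
def best_span(start_logits, end_logits, max_len=15):
    """Pick (start, end) maximizing start_logits[s] + end_logits[e], with s <= e."""
    best, best_score = (0, 0), float("-inf")
    for s, sl in enumerate(start_logits):
        # Only consider ends within max_len tokens of the start.
        for e in range(s, min(s + max_len, len(end_logits))):
            score = sl + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

# Hypothetical per-token logits for a short passage.
start_logits = [0.1, 2.5, 0.3, 0.0, 0.2]
end_logits   = [0.0, 0.1, 3.0, 0.4, 0.1]
print(best_span(start_logits, end_logits))
```

The s ≤ e constraint is what makes this a valid span search rather than two independent argmaxes, which could otherwise pick an end before the start.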

Runs: 1.8K · Hugging Face

roberta-base-emotion

RoBERTa is BERT pretrained with better hyperparameter choices, hence the name "Robustly optimized BERT". This model is roberta-base fine-tuned on the emotion dataset (Twitter-Sentiment-Analysis) using the Hugging Face Trainer. The original model card lists the hyperparameters, a performance comparison with other models on the Twitter emotion dataset, usage instructions, evaluation results, and a Colab notebook for the training procedure (follow the notebook, changing the model name to roberta). Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
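Fine-tuning any of these encoders for emotion classification minimizes a cross-entropy loss over the label set: the negative log-probability assigned to the true class. A one-example sketch (the probability values are made up and assumed already softmax-normalized):

```python
import math

def cross_entropy(probs, target_index):
    """Negative log-likelihood of the true class."""
    return -math.log(probs[target_index])

# Hypothetical predicted distribution over 6 emotion classes.
probs = [0.05, 0.7, 0.1, 0.05, 0.05, 0.05]
loss = cross_entropy(probs, target_index=1)  # true label is class 1
print(round(loss, 4))
```

The loss goes to zero as the model's probability for the true class approaches 1, and grows without bound as that probability approaches 0, which is what drives the fine-tuning updates.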

Runs: 434 · Hugging Face
