Bhadresh-savani
Rank:
Average Model Cost: $0.0000
Number of Runs: 248,317
Models by this creator
bert-base-go-emotion
The bert-base-go-emotion model is a text classification model based on the BERT architecture, trained to classify the emotions expressed in a given text (such as joy, sadness, anger, surprise, and love). It was trained on a large dataset using the BERT-Base-Uncased configuration and supports basic emotion classification tasks. The training parameters and evaluation output are listed on the model card, and the model can be run through the provided Colab notebook (a usage sketch follows this entry).
$-/run
122.4K
Huggingface
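As with most Hugging Face text-classification models, the emotion classifiers in this list can be loaded through the transformers pipeline API. Below is a minimal sketch, assuming the hub id bhadresh-savani/bert-base-go-emotion (inferred from the creator and model names in this listing); the same pattern works for the other emotion models below by swapping the model id.

```python
# Minimal usage sketch for an emotion classifier from this creator.
# The hub id below is assumed from the creator/model names in this listing.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/bert-base-go-emotion",
    top_k=None,  # return a score for every emotion label, not just the top one
)

print(classifier("I am so happy you came to visit!"))
# -> a list of {"label": ..., "score": ...} dicts, one per emotion label
```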
distilbert-base-uncased-emotion
The distilbert-base-uncased-emotion model is DistilBERT fine-tuned on an emotion dataset collected from Twitter. DistilBERT is a smaller, faster version of BERT that still retains a high level of language understanding. The model can be used for text classification tasks related to emotion analysis; its performance on the Twitter emotion dataset is reported in the referenced evaluation.
$-/run
115.4K
Huggingface
bert-base-uncased-emotion
BERT is a bidirectional Transformer encoder architecture trained with a masked language modeling (MLM) objective. This model is bert-base-uncased fine-tuned on the emotion dataset using the HuggingFace Trainer, with the training parameters listed on the model card; a performance comparison with related models on the Twitter emotion dataset is also provided there. Dataset: Twitter-Sentiment-Analysis. Training procedure: follow the Colab notebook, changing the model name from distilbert to bert. Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
$-/run
4.2K
Huggingface
albert-base-v2-emotion
ALBERT (A Lite BERT) is a BERT variant with significantly fewer parameters than the traditional BERT architecture. This model is albert-base-v2 fine-tuned on the emotion dataset using the HuggingFace Trainer, with the hyperparameters listed on the model card; a performance comparison on the Twitter emotion dataset is also provided there. Dataset: Twitter-Sentiment-Analysis. Training procedure: see the Colab notebook. Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
$-/run
1.8K
Huggingface
electra-base-squad2
electra-base for QA. An electra-base language model for English, fine-tuned for extractive question answering on SQuAD 2.0 and evaluated on the SQuAD 2.0 dev set with the official eval script. Training code is available as an example in FARM, and training ran on a single Tesla V100. The model can be used in Transformers, in FARM, and in haystack; for QA at scale (many documents instead of a single paragraph), it can also be loaded in haystack (a question-answering sketch follows this entry). Authors: Vaishali Pal (vaishali.pal [at] deepset.ai), Branden Chan (branden.chan [at] deepset.ai), Timo Möller (timo.moeller [at] deepset.ai), Malte Pietsch (malte.pietsch [at] deepset.ai), and Tanay Soni (tanay.soni [at] deepset.ai). Note: this model was borrowed from the Haystack model repo in order to add a TensorFlow version.
$-/run
1.8K
Huggingface
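For the QA model, the question-answering pipeline is the natural entry point. A minimal sketch, assuming the hub id bhadresh-savani/electra-base-squad2 (inferred from this listing; the entry notes the model mirrors the deepset/Haystack electra-base-squad2):

```python
# Minimal extractive-QA sketch; the hub id is assumed from this listing.
from transformers import pipeline

qa = pipeline("question-answering", model="bhadresh-savani/electra-base-squad2")

result = qa(
    question="What was the model fine-tuned on?",
    context="The electra-base model was fine-tuned on SQuAD 2.0 for extractive QA.",
)
print(result["answer"], result["score"])  # extracted answer span plus confidence
```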
electra-base-emotion
$-/run
1.5K
Huggingface
distilbert-base-uncased-sentiment-sst2
This model identifies whether a sentence expresses positive or negative sentiment. Dataset: the Stanford Sentiment Treebank (SST-2) from the GLUE benchmark. Evaluation results are reported on the model card (a usage sketch follows this entry).
$-/run
450
Huggingface
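The sentiment model follows the same text-classification pattern as the emotion models above; only the label set differs. A minimal sketch, assuming the hub id bhadresh-savani/distilbert-base-uncased-sentiment-sst2:

```python
# Minimal binary-sentiment sketch; the hub id is assumed from this listing.
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="bhadresh-savani/distilbert-base-uncased-sentiment-sst2",
)

print(sentiment("This movie was a complete waste of time."))
# -> e.g. [{"label": "NEGATIVE", "score": ...}]; label names depend on the model config
```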
roberta-base-emotion
RoBERTa is BERT pretrained with better hyperparameter choices, which is why its authors describe it as a Robustly optimized BERT. This model is roberta-base fine-tuned on the emotion dataset using the HuggingFace Trainer, with the hyperparameters listed on the model card; a performance comparison on the Twitter emotion dataset is also provided there. Dataset: Twitter-Sentiment-Analysis. Training procedure: follow the Colab notebook, changing the model name to roberta. Reference: Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
$-/run
434
Huggingface
electra-base-discriminator-finetuned-conll03-english
Platform did not provide a description for this model.
$-/run
380
Huggingface
distilbert-base-uncased-go-emotion
distilbert-base-uncased fine-tuned for the GoEmotions task. The author notes that the model is not working well. The training parameters, training and evaluation output, and a Colab notebook are provided on the model card.
$-/run
21
Huggingface