Daspartho

Average Model Cost: $0.0000

Number of Runs: 4,815

Models by this creator

prompt-extend

Platform did not provide a description for this model.

Runs: 4.7K
Platform: Huggingface

text-emotion

This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset. It achieves the following results on the evaluation set:

Loss: 0.1414
Accuracy: 0.9367

The model card marks the model description, intended uses & limitations, and training and evaluation data sections as needing more information.

The following hyperparameters were used during training:

learning_rate: 0.0001
train_batch_size: 256
eval_batch_size: 512
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: cosine
lr_scheduler_warmup_ratio: 0.1
num_epochs: 5
mixed_precision_training: Native AMP

Framework versions: Transformers 4.24.0, Pytorch 1.12.1+cu113, Datasets 2.6.1, Tokenizers 0.13.2
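As a rough illustration, these hyperparameters map onto transformers.TrainingArguments as in the minimal sketch below. The output directory name is a placeholder, not taken from the model card, and this assumes the standard Trainer API was used:

```python
from transformers import TrainingArguments

# Minimal sketch: the hyperparameters listed above expressed as
# transformers.TrainingArguments. "output_dir" is an assumed
# placeholder, not taken from the model card.
training_args = TrainingArguments(
    output_dir="text-emotion",        # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=512,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=5,
    fp16=True,                        # "Native AMP" mixed precision
)
```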

Runs: 46
Platform: Huggingface

subreddit-predictor

An NLP model that predicts the subreddit of a post based on its title.

Training: DistilBERT is fine-tuned on subreddit-posts, a dataset of the titles of the top 1,000 posts from each of the top 250 subreddits. For the steps used to build the model, check out the model notebook in the GitHub repo or open it in Colab.

Limitations and bias: Because the model is trained on the top 250 subreddits, it can only categorise posts within those subreddits. Some subreddits have a specific format for their post titles; on r/todayilearned, for example, titles start with "TIL", so the model becomes biased towards mapping "TIL" to r/todayilearned. This could be mitigated by cleaning such terms from the dataset. In some subreddits, such as r/gifs, the post title carries little information, so the model performs poorly on them.
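If the model is published under this creator's Hugging Face namespace, a minimal usage sketch could look like the following. The model id daspartho/subreddit-predictor is an assumption based on the creator and model names on this page, not something the listing confirms:

```python
from transformers import pipeline

# Minimal usage sketch. The model id is assumed from the creator and
# model names on this page, not confirmed by the listing.
predict_subreddit = pipeline(
    "text-classification",
    model="daspartho/subreddit-predictor",
)

# Returns a list of {"label": ..., "score": ...} dicts, where the
# label is the predicted subreddit for the given post title.
print(predict_subreddit("TIL that honey never spoils"))
```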

Runs: 38
Platform: Huggingface
