Martin-ha
Rank:
Average Model Cost: $0.0000
Number of Runs: 136,041
Models by this creator
toxic-comment-model
The toxic-comment-model is a text classification model designed to identify and classify toxic comments. It analyzes a piece of text and scores its level of toxicity, so it can be used to automatically detect and filter toxic comments on online platforms and social media, supporting safer and more inclusive online communities. A minimal usage sketch follows this entry.
Cost: $-/run
Runs: 136.0K
Platform: Hugging Face
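A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the id martin-ha/toxic-comment-model (inferred from the creator and model names above, so it may need adjusting) and that the transformers library is installed:

# Load the toxic-comment classifier and score a couple of example comments.
from transformers import pipeline

classifier = pipeline("text-classification", model="martin-ha/toxic-comment-model")

comments = [
    "Thanks for the helpful explanation!",
    "You are an idiot and nobody wants you here.",
]

# Each prediction is a dict with a label (e.g. toxic vs. non-toxic) and a confidence score.
for comment, prediction in zip(comments, classifier(comments)):
    print(f"{prediction['label']} ({prediction['score']:.2f}): {comment}")

The pipeline downloads the model weights on first use; for bulk moderation you would batch comments through the classifier rather than scoring them one at a time.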
text_image_dual_encoder
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training procedure: The following hyperparameters were used during training (see the optimizer sketch after this entry):
optimizer: AdamW (learning_rate: 0.001, decay: 0.0, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay: 0.001, exclude_from_weight_decay: None)
training_precision: float32
Training metrics: Model history needed
Cost: $-/run
Runs: 0
Platform: Hugging Face
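The hyperparameters listed above map directly onto a Keras AdamW optimizer. A minimal sketch, assuming TensorFlow/Keras (which the optimizer config and float32 training precision suggest); the model architecture and training loop themselves are not documented:

import tensorflow as tf

# Reproduce the reported training precision (float32 is also the Keras default).
tf.keras.mixed_precision.set_global_policy("float32")

# Optimizer configured with the hyperparameters reported in the card above.
optimizer = tf.keras.optimizers.AdamW(
    learning_rate=0.001,
    weight_decay=0.001,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

With the optimizer in hand, training would proceed through the usual model.compile(optimizer=optimizer, ...) and model.fit(...) calls, but the card gives no further details.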
text_encoder_in_dual
Cost: $-/run
Runs: 0
Platform: Hugging Face
vision_encoder_in_dual
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training metrics: Model history needed
Cost: $-/run
Runs: 0
Platform: Hugging Face