Stanford-crfm
Rank:
Average Model Cost: $0.0000
Number of Runs: 12,093
Models by this creator
BioMedLM
BioMedLM is a text-generation model designed to assist with the extraction and summarization of biomedical literature. Trained on a large corpus of biomedical articles, it can generate concise, complete summaries for technical audiences, facilitating information retrieval and synthesis in the field of biomedicine.
$-/run
7.5K
Huggingface
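The description above implies a prompt-driven summarization workflow. A minimal sketch follows, assuming the Hub repo id stanford-crfm/BioMedLM and a hypothetical prompt template; since BioMedLM is a plain causal LM with no fixed instruction format, check the model card before relying on either.

```python
# Sketch of prompting BioMedLM through Hugging Face transformers. The repo id
# "stanford-crfm/BioMedLM" is taken from the listing; the prompt template is an
# illustrative assumption, since BioMedLM is a plain causal LM with no fixed
# instruction format.

def build_summary_prompt(abstract: str) -> str:
    # Hypothetical template: any "summarize" behavior comes from prompting.
    return f"Article: {abstract.strip()}\nSummary:"

def summarize(abstract: str, max_new_tokens: int = 64) -> str:
    # Requires: pip install transformers torch (downloads the checkpoint).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "stanford-crfm/BioMedLM"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(build_summary_prompt(abstract), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Greedy decoding (`do_sample=False`) is used here only to make runs repeatable; sampling settings are a matter of taste for summarization.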
alias-gpt2-small-x21
Model Card for alias-gpt2-small-x21

Model Details
Model Description: More information needed
Developed by: Stanford CRFM
Shared by [Optional]: Stanford CRFM
Model type: Text Generation
Language(s) (NLP): More information needed
License: Apache 2.0
Parent Model: GPT-2
Resources for more information: GitHub Repo

Uses
Direct Use: This model can be used for the task of Text Generation.
Downstream Use [Optional]: More information needed
Out-of-Scope Use: The model should not be used to intentionally create hostile or alienating environments for people.

Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Recommendations
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information needed for further recommendations.

Training Details
Training Data: More information needed
Training Procedure
Preprocessing: More information needed
Speeds, Sizes, Times: More information needed

Evaluation
Testing Data, Factors & Metrics
Testing Data: More information needed
Factors: More information needed
Metrics: More information needed
Results: More information needed

Model Examination: More information needed

Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
Hardware Type: More information needed
Hours used: More information needed
Cloud Provider: More information needed
Compute Region: More information needed
Carbon Emitted: More information needed

Technical Specifications [optional]
Model Architecture and Objective: More information needed
Compute Infrastructure: More information needed
Hardware: More information needed
Software: More information needed

Citation
BibTeX: More information needed
APA: More information needed

Glossary [optional]: More information needed
More Information [optional]: More information needed
Model Card Authors [optional]: Stanford CRFM in collaboration with Ezi Ozoani and the Hugging Face team
Model Card Contact: More information needed

How to Get Started with the Model
Use the code below to get started with the model.
$-/run
1.4K
Huggingface
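The model card's "How to Get Started" section invites a code snippet but the original page omitted it. A minimal sketch, assuming the standard transformers text-generation pipeline and the Hub repo id stanford-crfm/alias-gpt2-small-x21 (a GPT-2-family checkpoint, per the card's Parent Model field):

```python
# Minimal "get started" sketch for alias-gpt2-small-x21; as a GPT-2-family
# checkpoint it loads through the standard transformers text-generation
# pipeline. The decoding settings below are illustrative, not recommendations.

def generation_kwargs(max_new_tokens: int = 50) -> dict:
    # Collected in one place so the settings are easy to adjust.
    return {"max_new_tokens": max_new_tokens, "do_sample": True, "top_p": 0.95}

def generate(prompt: str) -> str:
    # Requires: pip install transformers torch (downloads the checkpoint).
    from transformers import pipeline

    generator = pipeline("text-generation", model="stanford-crfm/alias-gpt2-small-x21")
    return generator(prompt, **generation_kwargs())[0]["generated_text"]
```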
music-small-800k
This is a Small (128M-parameter) Transformer trained for 800k steps on arrival-time-encoded music from the Lakh MIDI dataset. The model was trained with anticipation, the training objective introduced by the Anticipatory Music Transformer. References for the Anticipatory Music Transformer: the paper is available on arXiv; the full model card is available here; code for using this model is available on GitHub; see the accompanying blog post for additional discussion of this model.
$-/run
645
Huggingface
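Since the checkpoint is GPT-style, it can be loaded with plain transformers, but its vocabulary is arrival-time encoded music events rather than text, so turning sampled ids back into MIDI is the job of the authors' code on GitHub. A hedged sketch of the loading step; the repo id stanford-crfm/music-small-800k and the start-token id below are assumptions, so check the full model card:

```python
# Sketch: music-small-800k is a GPT-style checkpoint, so it loads with plain
# transformers, but its vocabulary is arrival-time encoded music events, not
# text; decoding sampled ids back to MIDI is handled by the authors' GitHub
# code. The repo id and the start token below are assumptions.

def sampling_config(max_new_tokens: int = 256, top_p: float = 0.98) -> dict:
    # Event sequences are sampled like text tokens; values are illustrative.
    return {"do_sample": True, "top_p": top_p, "max_new_tokens": max_new_tokens}

def sample_events():
    # Requires: pip install transformers torch (downloads the checkpoint).
    import torch
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained("stanford-crfm/music-small-800k")
    start = torch.tensor([[0]])  # assumed start-of-sequence id; verify on the card
    ids = model.generate(start, **sampling_config())
    return ids  # event ids; decode to MIDI with the linked GitHub code
```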
expanse-gpt2-small-x777
$-/run
606
Huggingface
arwen-gpt2-medium-x21
$-/run
442
Huggingface
celebrimbor-gpt2-medium-x81
Platform did not provide a description for this model.
$-/run
375
Huggingface
beren-gpt2-medium-x49
$-/run
312
Huggingface
battlestar-gpt2-small-x49
Platform did not provide a description for this model.
$-/run
295
Huggingface
levanter-backpacks-test
Levanter Backpacks example. Note that you need to install safetensors together with transformers and torch for this model to work.
$-/run
278
Huggingface
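The install note above can be made concrete. A sketch under two assumptions: that the checkpoint lives at stanford-crfm/levanter-backpacks-test on the Hub, and that, like other Backpack-style models shipping custom modeling code, it needs trust_remote_code=True to load:

```python
# Sketch expanding the install note: the three packages the note names, and an
# assumed load path. Backpack-style models ship custom modeling code on the
# Hub, so trust_remote_code=True is assumed to be required; the repo id
# "stanford-crfm/levanter-backpacks-test" is taken from the listing.

REQUIRED_PACKAGES = ("transformers", "torch", "safetensors")

def pip_install_command() -> str:
    # The command the note above asks you to run first.
    return "pip install " + " ".join(REQUIRED_PACKAGES)

def load_backpack_model():
    # Downloads the checkpoint; install the packages above first.
    from transformers import AutoModelForCausalLM

    return AutoModelForCausalLM.from_pretrained(
        "stanford-crfm/levanter-backpacks-test",
        trust_remote_code=True,  # custom Backpack architecture from the Hub
    )
```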
darkmatter-gpt2-small-x343
darkmatter-gpt2-small-x343
Platform did not provide a description for this model.
$-/run
237
Huggingface