CyberAgent
Rank: -
Average Model Cost: $0.0000
Number of Runs: 80,625
Models by this creator
open-calm-7b
OpenCALM-7B is a decoder-only language model developed by CyberAgent, Inc. It is pre-trained on Japanese datasets (Japanese Wikipedia and Common Crawl) using the GPT-NeoX library and is released under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license, which asks that fine-tuned derivatives credit CyberAgent, Inc.
$-/run
22.5K
Huggingface
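The OpenCALM models are standard causal language models, so they can be loaded with the Hugging Face transformers library. The sketch below is a minimal, hedged example; the repo id `cyberagent/open-calm-7b` and the generation settings are assumptions based on typical transformers usage, not details taken from this page, so check the model's Hugging Face card for the recommended settings.

```python
# Minimal sketch: generating Japanese text with OpenCALM-7B via transformers.
# The repo id and generation parameters are assumptions; consult the model
# card on Hugging Face for the exact recommended configuration.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "cyberagent/open-calm-7b"  # assumed Hugging Face repo id


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation for a Japanese prompt.

    Note: calling this downloads the full model weights (several GB).
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,       # sample rather than greedy-decode
        temperature=0.7,      # assumed setting, not from the model card
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt: "The capital of Japan is"
    print(generate("日本の首都は"))
```

The same pattern applies to the smaller checkpoints (open-calm-large, -3b, -1b, -medium) by swapping the repo id.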
open-calm-large
OpenCALM-Large is a decoder-only, transformer-based language model developed by CyberAgent, Inc., trained on Japanese datasets (Japanese Wikipedia and Common Crawl). It is released under the CC BY-SA 4.0 license and can be used for a variety of Japanese natural language processing tasks.
$-/run
10.2K
Huggingface
open-calm-3b
OpenCALM-3B

Model Description
OpenCALM is a suite of decoder-only language models pre-trained on Japanese datasets, developed by CyberAgent, Inc.

Model Details
Developed by: CyberAgent, Inc.
Model type: Transformer-based Language Model
Language: Japanese
Library: GPT-NeoX
License: Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0). When using this model, please provide appropriate credit to CyberAgent, Inc.

Example attribution notice for fine-tuned derivatives (the "OpenCALM-XX" placeholder stands for whichever OpenCALM model was fine-tuned): "This model is a fine-tuned version of OpenCALM-XX developed by CyberAgent, Inc. The original model is released under the CC BY-SA 4.0 license, and this model is also released under the same CC BY-SA 4.0 license. For more information, please visit: https://creativecommons.org/licenses/by-sa/4.0/" The card provides the same notice in Japanese.

Training Dataset
Wikipedia (ja)
Common Crawl (ja)

Author
Ryosuke Ishigami
$-/run
4.6K
Huggingface
open-calm-1b
OpenCALM-1B is part of the same OpenCALM suite of decoder-only language models pre-trained on Japanese datasets by CyberAgent, Inc. Its model card matches the other OpenCALM checkpoints: a transformer-based Japanese language model built with the GPT-NeoX library, trained on Japanese Wikipedia and Common Crawl, released under the CC BY-SA 4.0 license with appropriate credit to CyberAgent, Inc. (the same example attribution notice for fine-tuned derivatives, in English and Japanese, applies). Author: Ryosuke Ishigami.
$-/run
4.2K
Huggingface
open-calm-medium
OpenCALM-Medium is part of the same OpenCALM suite of decoder-only language models pre-trained on Japanese datasets by CyberAgent, Inc. Its model card matches the other OpenCALM checkpoints: a transformer-based Japanese language model built with the GPT-NeoX library, trained on Japanese Wikipedia and Common Crawl, released under the CC BY-SA 4.0 license with appropriate credit to CyberAgent, Inc. (the same example attribution notice for fine-tuned derivatives, in English and Japanese, applies). Author: Ryosuke Ishigami.
$-/run
2.3K
Huggingface
xlm-roberta-large-jnli-jsick
Japanese Natural Language Inference Model
This cross-encoder was trained using the SentenceTransformers Cross-Encoder class (with the gradient-accumulation PR) and the code from CyberAgentAILab/japanese-nli-model.

Training Data
The model was trained on the JGLUE-JNLI and JSICK datasets. For a given sentence pair, it outputs three scores corresponding to the labels contradiction, entailment, and neutral.
$-/run
52
Huggingface