Alibaba-pai
Rank:
Average Model Cost: $0.0000
Number of Runs: 3,025
Models by this creator
pai-bloom-1b1-text2prompt-sd
Platform did not provide a description for this model.
$-/run
1.1K
Huggingface
pai-diffusion-artist-large-zh
Platform did not provide a description for this model.
$-/run
721
Huggingface
pai-ckbert-base-zh
Chinese Knowledge-enhanced BERT (CKBERT)
Knowledge-enhanced pre-trained language models (KEPLMs) improve context-aware representations by learning from structured relations in knowledge graphs and/or from linguistic knowledge derived from syntactic or dependency analysis. Unlike English, the natural language processing (NLP) community lacks high-performing open-source Chinese KEPLMs to support language understanding applications. For Chinese NLP, we provide three Chinese Knowledge-enhanced BERT (CKBERT) models, named pai-ckbert-base-zh, pai-ckbert-large-zh, and pai-ckbert-huge-zh, from our EMNLP 2022 paper "Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training". This repository is developed based on the EasyNLP framework: https://github.com/alibaba/EasyNLP
Citation
If you find this resource useful, please cite the following papers in your work.
For the EasyNLP framework:
For CKBERT:
$-/run
356
Huggingface
pai-diffusion-food-large-zh
$-/run
350
Huggingface
pai-bert-tiny-zh
$-/run
171
Huggingface
pai-dkplm-medical-base-zh
$-/run
99
Huggingface
pai-diffusion-artist-xlarge-zh
Chinese Diffusion Model (Artist, 768 Resolution)
Brief Introduction
We release a Chinese diffusion model that generates high-quality artistic images from the Chinese prompts you input. The default resolution of this model is 768*768.
Github: EasyNLP
Usage
This model supports diffusers. Please refer to the following code:
Gallery
Notice for Use
Use of the above model is subject to the AIGC Model Open-Source Special Terms. If you want to use this model, please read that document carefully and abide by its terms.
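The code referenced above was not captured in this listing. As a rough sketch of what loading the model through the diffusers library might look like: the Hugging Face model id `alibaba-pai/pai-diffusion-artist-xlarge-zh`, the use of `StableDiffusionPipeline`, and the example prompt are all assumptions, and the heavyweight imports are deferred so nothing is downloaded until the function is actually called.

```python
# Hypothetical sketch: generating an image with pai-diffusion-artist-xlarge-zh
# via diffusers. The model id below is an assumption inferred from the model
# name in this listing; verify it against the actual Hugging Face repository.
MODEL_ID = "alibaba-pai/pai-diffusion-artist-xlarge-zh"


def generate(prompt: str, out_path: str = "out.png") -> None:
    # Deferred imports: torch/diffusers are only required when generating.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16
    )
    pipe = pipe.to("cuda")
    # Chinese prompt in, 768x768 image out (the model's stated default resolution).
    image = pipe(prompt, height=768, width=768).images[0]
    image.save(out_path)


if __name__ == "__main__":
    generate("雾凇 雪景 梦幻 艺术风格")  # "rime, snowscape, dreamy, artistic style"
```

Generation requires a CUDA-capable GPU as written; dropping the `.to("cuda")` call and the `torch_dtype` argument would run on CPU, only much more slowly.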
$-/run
63
Huggingface
pai-ckbert-huge-zh
$-/run
63
Huggingface
pai-ckbert-large-zh
Chinese Knowledge-enhanced BERT (CKBERT)
Knowledge-enhanced pre-trained language models (KEPLMs) improve context-aware representations by learning from structured relations in knowledge graphs and/or from linguistic knowledge derived from syntactic or dependency analysis. Unlike English, the natural language processing (NLP) community lacks high-performing open-source Chinese KEPLMs to support language understanding applications. For Chinese NLP, we provide three Chinese Knowledge-enhanced BERT (CKBERT) models, named pai-ckbert-base-zh, pai-ckbert-large-zh, and pai-ckbert-huge-zh, from our EMNLP 2022 paper "Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training". This repository is developed based on the EasyNLP framework: https://github.com/alibaba/EasyNLP
Citation
If you find this resource useful, please cite the following papers in your work.
For the EasyNLP framework:
For CKBERT:
$-/run
35
Huggingface
pai-dkplm-financial-base-zh
For Chinese natural language processing in specific domains, we provide a Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the financial domain, named pai-dkplm-financial-base-zh, from our AAAI 2021 paper "DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding". This repository is developed based on the EasyNLP framework (https://github.com/alibaba/EasyNLP) from the Alibaba PAI team. If you find this resource useful, please cite the following papers in your work.
$-/run
35
Huggingface