PanGu-α Description

PanGu-α was developed under MindSpore and trained on a cluster of 2048 Ascend AI processors. The MindSpore auto-parallel strategy was applied to scale the training task efficiently to 2048 processors, combining data parallelism with op-level model parallelism. We pretrain PanGu-α on 1.1TB of high-quality Chinese data collected from a wide variety of domains in order to enhance its generalization ability. We test the generation abilities of PanGu-α in different scenarios, including text summarization, question answering, and dialogue generation. We also investigate the effect of model scale on few-shot performance across a broad range of Chinese NLP tasks. The experimental results show that PanGu-α is superior at performing various tasks under zero-shot and few-shot settings.
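
For illustration, the sketch below shows how such a setup might be configured in MindSpore: auto-parallel mode is enabled and a single MatMul is given an op-level sharding strategy. The device count, the 2x4 sharding strategy, and the ShardedDense cell are illustrative assumptions, not the actual PanGu-α training code.

```python
# A minimal sketch (not the actual PanGu-α training script) of MindSpore
# auto-parallelism: data parallelism plus op-level sharding of a MatMul.
import numpy as np
import mindspore as ms
import mindspore.nn as nn
import mindspore.ops as ops
from mindspore import context, Tensor, Parameter
from mindspore.communication import init

# Graph mode on Ascend; init() sets up the collective communication backend.
context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")
init()
# Semi-auto parallel lets each operator carry its own sharding strategy;
# device_num=8 is illustrative (PanGu-α itself used 2048 processors).
context.set_auto_parallel_context(
    parallel_mode=context.ParallelMode.SEMI_AUTO_PARALLEL,
    device_num=8,
    full_batch=True,
)

class ShardedDense(nn.Cell):
    """Dense layer whose MatMul is split across devices (op-level parallelism)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = Parameter(
            Tensor(np.random.randn(in_features, out_features), ms.float16))
        # Assumed 2x4 strategy for 8 devices: split the batch dimension 2 ways
        # (data parallel) and the output dimension 4 ways (model parallel).
        self.matmul = ops.MatMul().shard(((2, 1), (1, 4)))

    def construct(self, x):
        return self.matmul(x, self.weight)
```

Extending this pattern to a full network means assigning a strategy to each parallelizable operator, after which the framework places the sharded computation and inserts the required communication between devices.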

Integrations

No Integrations at this time

Reviews

No User Reviews at this time

Company Details

Company:
Huawei
Year Founded:
1987
Headquarters:
China
Website:
arxiv.org/abs/2104.12369

Product Details

Platforms
SaaS
Windows
Mac
Linux
On-Premises
Type of Training
Documentation

PanGu-α Features and Options