BERT Description
BERT is a language model used to pre-train language representations. Pre-training refers to the process by which BERT is trained on large text corpora such as Wikipedia. The resulting representations can then be fine-tuned for other Natural Language Processing (NLP) tasks, such as sentiment analysis and question answering. With AI Platform Training and BERT, you can train a variety of NLP models in about 30 minutes.
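BERT's pre-training objective is masked language modeling: roughly 15% of input tokens are hidden and the model learns to recover them from context. Below is a minimal, hypothetical sketch of that masking step in plain Python (illustration only; real pipelines operate on subword IDs, not whole words, and the `mask_tokens` helper is an assumption, not part of any library):

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly replace ~mask_prob of tokens with [MASK].

    Returns the masked sequence and a dict mapping each masked
    position to the original token the model must predict.
    """
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # training target for this position
            masked.append(MASK_TOKEN)
        else:
            masked.append(tok)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
```

During pre-training, the model's loss is computed only at the masked positions, which is what lets BERT learn bidirectional context without seeing the answer token directly.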
Pricing
Pricing Starts At:
Free
Free Version:
Yes
Company Details
Company:
Google
Year Founded:
1998
Headquarters:
United States
Website:
cloud.google.com/ai-platform/training/docs/algorithms/bert-start
Product Details
Platforms
SaaS
Type of Training
Documentation