BERT Description

BERT is a method of pre-training language representations. Pre-training refers to the process by which BERT is first trained on a large text source such as Wikipedia. The pre-trained representations can then be applied to other Natural Language Processing (NLP) tasks, such as sentiment analysis and question answering. With AI Platform Training and BERT, you can train a variety of NLP models in about 30 minutes.
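
As a rough illustration of that workflow (pre-train once, then adapt to a downstream task), the sketch below loads a publicly available pre-trained BERT checkpoint and attaches a small sentiment-classification head. The Hugging Face transformers library and the bert-base-uncased checkpoint are illustrative assumptions; the listing itself refers to AI Platform Training's built-in BERT algorithm, which packages this workflow for you.

```python
# Minimal sketch, assuming the Hugging Face transformers library and the
# public "bert-base-uncased" checkpoint (not the AI Platform built-in
# algorithm itself): reuse pre-trained representations, then fine-tune a
# classification head for a downstream task such as sentiment analysis.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # classification head starts untrained
)

# The pre-trained encoder already produces useful representations; the head
# above would still need fine-tuning on labeled sentiment data.
inputs = tokenizer("Training finished in about 30 minutes.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities from the (untrained) head
```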

Pricing

Pricing Starts At:
Free
Free Version:
Yes

Integrations

Reviews

No user reviews yet.

Company Details

Company:
Google
Year Founded:
1998
Headquarters:
United States
Website:
cloud.google.com/ai-platform/training/docs/algorithms/bert-start

Media

BERT Screenshot 1

Product Details

Platforms:
SaaS
Type of Training:
Documentation

BERT Features and Options

Natural Language Processing Software

Co-Reference Resolution
In-Database Text Analytics
Named Entity Recognition
Natural Language Generation (NLG)
Open Source Integrations
Parsing
Part-of-Speech Tagging
Sentence Segmentation
Stemming/Lemmatization
Tokenization
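
For example, the tokenization capability listed above corresponds to BERT's WordPiece vocabulary, which splits out-of-vocabulary words into sub-word pieces. The snippet below is a minimal sketch assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; neither is named in the listing.

```python
# Minimal sketch, assuming the Hugging Face transformers library and the
# public "bert-base-uncased" checkpoint: BERT's WordPiece tokenizer splits
# out-of-vocabulary words into "##"-prefixed sub-word pieces.
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

tokens = tokenizer.tokenize("Fine-tuning BERT for question answering")
print(tokens)  # the WordPiece tokens for the input sentence

# encode() additionally adds the special [CLS] and [SEP] tokens and maps
# each piece to its vocabulary id.
print(tokenizer.encode("Fine-tuning BERT for question answering"))
```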
