RoBERTa Description

RoBERTa builds on BERT's language-masking strategy: the system learns to predict intentionally hidden sections of text within otherwise unannotated language examples. Implemented in PyTorch, RoBERTa modifies key hyperparameters in BERT, including removing BERT's next-sentence pretraining objective and training with much larger mini-batches. This allows RoBERTa to improve on the masked-language-modeling objective compared with BERT, and it leads to better downstream task performance. We also explore training RoBERTa on more data than BERT and for a longer amount of time, using both existing unannotated NLP data sets and CC-News, a novel set drawn from public news articles.
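As a minimal sketch of what the masked-language-modeling objective looks like at inference time, the snippet below uses the Hugging Face transformers library and its public roberta-base checkpoint (both assumptions; neither is named in this listing) to rank candidate fills for a hidden token:

# Minimal sketch of masked-token prediction with a pretrained RoBERTa
# checkpoint. Assumes the Hugging Face `transformers` library and the
# public "roberta-base" model, neither of which is named in this listing.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="roberta-base")

# RoBERTa's mask token is "<mask>"; the model scores candidate fills.
for prediction in unmasker("RoBERTa predicts hidden sections of <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))

Because RoBERTa drops the next-sentence objective, pretraining consists entirely of this fill-in-the-mask task over large unannotated corpora.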

Pricing

Pricing Starts At:
Free
Free Version:
Yes

Integrations

Reviews

No User Reviews.

Company Details

Company:
Meta
Year Founded:
2004
Headquarters:
United States
Website:
ai.facebook.com/blog/roberta-an-optimized-method-for-pretraining-self-supervised-nlp-systems/

Media

RoBERTa Screenshot 1

Product Details

Platforms:
SaaS
Type of Training:
Documentation

RoBERTa Features and Options

RoBERTa Lists

RoBERTa User Reviews
