GPT-NeoX Description

This repository provides an implementation of model-parallel autoregressive transformers on GPUs, built on the DeepSpeed library. It records EleutherAI's framework for training large-scale language models on GPU hardware. The current implementation is based on NVIDIA's Megatron Language Model and augmented with techniques from DeepSpeed as well as novel optimizations. Our goal is to make this repository a central hub for collecting techniques for training large-scale autoregressive language models, thereby accelerating research into large-scale training. We believe that providing these resources can contribute significantly to the progress of language model research.
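For context, training runs in the repository are typically launched through its deepy.py wrapper around the DeepSpeed launcher, passing one or more YAML configuration files that define the model and parallelism settings. The line below is an illustrative sketch using example config names drawn from the repository's documentation; consult the README at github.com/EleutherAI/gpt-neox for the authoritative invocation.

    # launch a training run; config file names here are examples and may differ per release
    python ./deepy.py train.py configs/125M.yml configs/local_setup.yml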

Pricing

Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes

Integrations

Reviews

No user reviews yet.

Company Details

Company:
EleutherAI
Year Founded:
2020
Website:
github.com/EleutherAI/gpt-neox

Media

GPT-NeoX Screenshot 1
Product Details

Platforms:
Web-Based
On-Premises
Types of Training:
Training Docs

GPT-NeoX Features and Options
