GPT-NeoX Description

This repository records EleutherAI's library for training large-scale autoregressive language models on GPUs. It implements model-parallel autoregressive transformers, building on NVIDIA's Megatron Language Model and augmenting it with techniques from the DeepSpeed library along with some novel optimizations. The aim is to provide a central repository for techniques used in training large-scale autoregressive language models, thereby accelerating research into large-scale training.
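As a rough illustration of the DeepSpeed integration mentioned above (a sketch, not taken from this listing: the keys below are standard DeepSpeed configuration options, but the specific values are assumptions and do not reflect GPT-NeoX's shipped configs), a minimal DeepSpeed-style training configuration might look like:

```python
import json

# Minimal sketch of a DeepSpeed-style training configuration.
# The keys (train_batch_size, gradient_accumulation_steps, fp16,
# zero_optimization) are standard DeepSpeed options; the values here
# are illustrative assumptions only.
ds_config = {
    "train_batch_size": 256,            # global batch size across all GPUs
    "gradient_accumulation_steps": 4,   # micro-batches accumulated per step
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {"stage": 1},  # ZeRO optimizer-state partitioning
}

# DeepSpeed accepts such a config as a JSON file or a Python dict,
# e.g. via deepspeed.initialize(model=model, config=ds_config).
print(json.dumps(ds_config, indent=2))
```

In practice the config is usually written to a JSON file and passed to the DeepSpeed launcher; the dict form above is just the easiest shape to show inline.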

Pricing

Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes

Company Details

Company:
EleutherAI
Year Founded:
2020
Website:
github.com/EleutherAI/gpt-neox

Product Details

Platforms
Web-Based
On-Premises
Types of Training
Training Docs
