GPT-NeoX Description
A model-parallel autoregressive transformer implementation on GPUs, based on the DeepSpeed library.
This repository contains EleutherAI's library for training large language models on GPUs. The framework is built on NVIDIA's Megatron Language Model and enhanced with techniques from DeepSpeed, along with some novel optimizations. The repo is intended to be a centralized and accessible place for techniques for training large-scale autoregressive models and to accelerate research into large-scale training.
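As a rough sketch of the workflow: training runs are configured through YAML files and launched with the repository's deepy.py wrapper around the DeepSpeed launcher. The example below is illustrative only; the config file names (125M.yml, local_setup.yml) and the requirements path are assumptions based on the example configs shipped with the repository and may differ between versions.

    # Minimal sketch: set up the environment and start a small training run.
    # Assumes a working CUDA/DeepSpeed installation; file names may vary by version.
    git clone https://github.com/EleutherAI/gpt-neox
    cd gpt-neox
    pip install -r requirements/requirements.txt
    # deepy.py merges the YAML configs (model size, parallelism degrees, data paths)
    # and dispatches train.py across the available GPUs via DeepSpeed.
    python ./deepy.py train.py configs/125M.yml configs/local_setup.yml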
Pricing
Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes
Company Details
Company:
EleutherAI
Year Founded:
2020
Website:
github.com/EleutherAI/gpt-neox
Product Details
Platforms
SaaS
On-Premises
Type of Training
Documentation