LongLLaMA Description

This repository showcases the research preview of LongLLaMA, a large language model capable of handling long contexts of up to 256,000 tokens or more. LongLLaMA is built on the OpenLLaMA foundation and fine-tuned using the Focused Transformer (FoT) method; a companion code-oriented variant, LongLLaMA Code, is built on Code Llama. A smaller 3B base variant of LongLLaMA (not instruction-tuned) is released under a permissive Apache 2.0 license, together with inference code supporting longer contexts, available on Hugging Face. The model weights can serve as a drop-in replacement for LLaMA in existing implementations designed for shorter contexts of up to 2048 tokens. The repository also provides evaluation results and comparisons against the original OpenLLaMA models, giving an overview of LongLLaMA's long-context capabilities.
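
Because the weights are published for use with the Hugging Face transformers library, loading the model looks like loading any other LLaMA-style checkpoint. The snippet below is a minimal sketch rather than an official example: the checkpoint identifier syzymon/long_llama_3b and the use of trust_remote_code=True (to pick up the custom FoT model code shipped with the checkpoint) are assumptions; check the repository for the exact published names.

    import torch
    from transformers import LlamaTokenizer, AutoModelForCausalLM

    # Assumed checkpoint identifier; verify against the repository / Hugging Face Hub.
    MODEL_ID = "syzymon/long_llama_3b"

    tokenizer = LlamaTokenizer.from_pretrained(MODEL_ID)
    # trust_remote_code=True lets transformers load the custom LongLLaMA (FoT) model class
    # bundled with the checkpoint instead of the stock LLaMA implementation.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float32,
        trust_remote_code=True,
    )

    # Standard generation call, identical to how a plain (Open)LLaMA checkpoint would be used.
    prompt = "My name is Julien and I like to"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    outputs = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))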

Pricing

Pricing Starts At:
Free
Free Version:
Yes

Integrations

No Integrations at this time

Reviews

No User Reviews.

Company Details

Company:
LongLLaMA
Website:
github.com/CStanKonrad/long_llama

Media

LongLLaMA Screenshot 1

Product Details

Platforms:
Web-Based, On-Premises
Types of Training:
Training Docs
Customer Support:
Online Support
