DevStack

Together AI

Fast inference for open-source LLMs

Visit Website →
Paid · From: Pay-per-token · Tags: ai, llm, inference, open-source

Overview

Together AI provides fast, low-cost inference for open-source models such as LLaMA and Mixtral. It also offers fine-tuning and dedicated model deployments.

Key Features

  • Open-source LLM inference
  • Fine-tuning
  • Custom deployments
  • Function calling
  • JSON mode
  • OpenAI-compatible API
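
Because the API is OpenAI-compatible, existing OpenAI-style request payloads can be pointed at Together's endpoint with only a base-URL and key change. The sketch below builds (but does not send) such a request; the endpoint URL and model id are assumptions to verify against Together's docs.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint; confirm against Together's API docs.
API_URL = "https://api.together.xyz/v1/chat/completions"

def build_chat_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": "meta-llama/Llama-3-8b-chat-hf",  # illustrative model id
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
        # For JSON mode, OpenAI-style APIs typically accept:
        # "response_format": {"type": "json_object"},
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "Summarize LLaMA in one sentence.",
    os.environ.get("TOGETHER_API_KEY", "demo-key"),
)
# Sending requires a real key: urllib.request.urlopen(req)
print(req.full_url)
```

In practice the official OpenAI Python client works the same way: construct it with `base_url` set to Together's endpoint and your Together key, and the rest of the code stays unchanged.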

Pros

  • Among the lowest-cost LLM inference options
  • OpenAI-compatible API
  • Wide model selection

Cons

  • Open-source models generally less capable than GPT-4
  • Documentation could be better
  • Newer platform
