Groq

The Groq provider gives access to open-source models running on Groq's LPU (Language Processing Unit) inference engine. Models are referenced with the groq: prefix.

Setup

export GROQ_API_KEY="your-api-key"

Base URL: https://api.groq.com/openai/v1
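
Because Groq serves an OpenAI-compatible API at this base URL, a request can be assembled the same way as for any OpenAI-style endpoint. A minimal sketch of the request shape, assuming the environment variable from the setup step above (the build_chat_request helper is illustrative, not part of the library):

```python
import os

BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-compatible chat completion request for Groq.

    Illustrative helper only; any OpenAI-style HTTP client can send
    the resulting url/headers/json as a POST request.
    """
    api_key = os.environ.get("GROQ_API_KEY", "your-api-key")
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            # Groq authenticates with a standard Bearer token.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("mixtral-8x7b-32768", "Hello")
```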

Available Models

Model                 Use Case
mixtral-8x7b-32768    High quality, 32K context window
llama2-70b-4096       Fast inference, 4K context window

Usage

from eval_lib.metrics import AnswerRelevancyMetric

# The "groq:" prefix routes the metric's model calls to the Groq provider.
metric = AnswerRelevancyMetric(model="groq:mixtral-8x7b-32768", threshold=0.7)
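
Conceptually, the provider prefix is split off the model string to pick a backend, leaving the bare model name to send to Groq. A sketch of that routing (parse_model_string is a hypothetical helper, not the library's actual internals):

```python
def parse_model_string(model: str) -> tuple:
    """Split a prefixed model string into (provider, model_name).

    Hypothetical illustration of how "groq:" routing could work;
    the real library may implement this differently.
    """
    provider, sep, name = model.partition(":")
    if not sep:
        # No prefix present: fall back to a default provider.
        return ("default", model)
    return (provider, name)

# "groq:mixtral-8x7b-32768" -> provider "groq", model "mixtral-8x7b-32768"
provider, name = parse_model_string("groq:mixtral-8x7b-32768")
```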