Supported Language Models

UsageGuard supports a wide range of large language models (LLMs) from various providers, allowing you to leverage different AI models while keeping a consistent API and robust safeguards.

Overview

UsageGuard acts as a proxy for multiple LLM providers, offering a unified interface for interacting with various models. This page provides an overview of the supported LLMs, their capabilities, and any provider-specific considerations.

Supported Providers and Models

UsageGuard offers a unified inference API for the language models listed below; an example request follows the table.

| Name | Model Id | Capabilities (Input) | Available | Description |
| --- | --- | --- | --- | --- |
| Mistral Large | mistral.mistral-large-2402-v1:0 | text | Yes | Large language model by Mistral AI. For advanced NLP tasks and complex reasoning. |
| Mistral Small | mistral.mistral-small-2402-v1:0 | text | Yes | Efficient small model by Mistral AI. For lightweight NLP tasks and quick responses. |
| Amazon Titan Text G1 - Express | amazon.titan-text-express-v1 | text | Yes | Fast and efficient text model by Amazon. Ideal for rapid text generation and real-time applications. |
| Amazon Titan Text G1 - Lite | amazon.titan-text-lite-v1 | text | Yes | Lightweight text model by Amazon. Suitable for efficient text processing and mobile applications. |
| Amazon Titan Text G1 - Premier | amazon.titan-text-premier-v1:0 | text | Yes | Premium text model by Amazon. For high-quality text generation and complex NLP tasks. |
| Meta Llama 3.2 1B Instruct | us.meta.llama3-2-1b-instruct-v1:0 | text | Yes | Compact instruction-tuned model. Efficient for instruction following and quick responses. |
| Meta Llama 3.2 3B Instruct | us.meta.llama3-2-3b-instruct-v1:0 | text | Yes | Larger instruction-tuned model. For more complex instruction tasks with better quality. |
| Mixtral 8x7B Instruct | mistral.mixtral-8x7b-instruct-v0:1 | text | Yes | Large mixture-of-experts instruction model by Mistral AI. Suitable for complex instruction-based tasks and educational tools. |
| Mistral 7B Instruct | mistral.mistral-7b-instruct-v0:2 | text | Yes | Smaller instruction-optimized language model. For lightweight educational tools and guided tasks. |
| Meta Llama 3 70B Instruct | meta.llama3-70b-instruct-v1:0 | text | Yes | High-capacity language model for instruction. Ideal for advanced educational platforms and detailed guidance. |
| Meta Llama 3 8B Instruct | meta.llama3-8b-instruct-v1:0 | text | Yes | Efficient language model for instruction. Suitable for general instruction and lightweight educational tools. |
| Anthropic Claude 3 Opus | anthropic.claude-3-opus-20240229-v1:0 | text, image, document | Yes | Most capable Claude 3 model. For complex reasoning and a broad range of demanding tasks. |
| Anthropic Claude 3 Haiku | anthropic.claude-3-haiku-20240307-v1:0 | text, image, document | Yes | Fastest and most compact Claude 3 model. Ideal for quick responses and high-volume, cost-sensitive tasks. |
| Anthropic Claude 3.5 Sonnet | anthropic.claude-3-5-sonnet-20240620-v1:0 | text, image, document | Yes | Upgraded Sonnet-class model with improved reasoning. For high-quality generation across complex tasks. |
| Anthropic Claude 3 Sonnet | anthropic.claude-3-sonnet-20240229-v1:0 | text, image, document | Yes | Balanced Claude 3 model combining intelligence and speed. Suitable for general-purpose and enterprise workloads. |
| Anthropic Claude v2 | anthropic.claude-v2:1 | text | No | General-purpose language model. For versatile applications and general language tasks. |
| OpenAI GPT-3.5 Turbo | gpt-3.5-turbo-0125 | text | Yes | General-purpose language model. Ideal for chatbots, content creation, and language tasks. |
| OpenAI GPT-4o | gpt-4o-2024-05-13 | text, image | Yes | Advanced language understanding and generation. For complex tasks and detailed content creation. |
| OpenAI GPT-4o mini | gpt-4o-mini-2024-07-18 | text, image | Yes | Cost-effective, lightweight language model. Suitable for lightweight tasks, chatbots, and content creation. |
| OpenAI GPT-3.5 Turbo Instruct | gpt-3.5-turbo-instruct | text | Yes | Instruction-optimized language model. For educational tools and guided instructions. |
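
The sketch below is a minimal illustration of calling one of these models through the unified API. The x-connection-id header and the Model Id values come from this page; the endpoint path, payload fields, and Bearer-token authentication are assumptions for illustration only, so consult the API Reference for the exact request shape.

```python
# Illustrative sketch only: the endpoint URL, payload fields, and auth header
# are assumptions; the x-connection-id header and the Model Ids come from the
# table above.
import requests

UG_API_BASE = "https://api.usageguard.example"  # hypothetical base URL
API_KEY = "your-api-key"                        # hypothetical auth scheme
CONNECTION_ID = "your-connection-id"            # the connection that enables the model

def run_inference(model_id: str, prompt: str) -> dict:
    """Send one chat request through the unified API (request shape assumed)."""
    response = requests.post(
        f"{UG_API_BASE}/v1/inference/chat",        # hypothetical endpoint path
        headers={
            "Authorization": f"Bearer {API_KEY}",  # hypothetical auth header
            "x-connection-id": CONNECTION_ID,      # header documented on this page
        },
        json={
            "model": model_id,  # any Model Id from the table above
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Example: target Claude 3.5 Sonnet by its Model Id from the table.
print(run_inference(
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "Summarize what a unified inference API is in one sentence.",
))
```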

Switching Between Models

One of the key benefits of UsageGuard is the ability to easily switch between different LLMs without changing your application code. To switch models:

  1. Create a new connection or edit an existing one
  2. Enable the new model(s) on the connection
  3. Update your API calls to use the new connection ID (for a new connection, add the x-connection-id header, as shown in the sketch below)
  4. UsageGuard handles the rest, including any necessary request transformations
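
As a rough illustration of step 3, switching providers amounts to pointing the same request code at a different connection ID and model ID. The x-connection-id header comes from the steps above; the connection IDs, auth header, and payload shape are placeholders, not UsageGuard's documented values.

```python
# Illustrative only: the x-connection-id header comes from the steps above;
# the connection IDs, auth header, and payload shape are placeholders.
base_headers = {"Authorization": "Bearer your-api-key"}  # hypothetical auth

# Before: a connection with an OpenAI model enabled.
headers = {**base_headers, "x-connection-id": "conn-openai"}  # placeholder ID
payload = {
    "model": "gpt-4o-2024-05-13",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# After: a connection with an Anthropic model enabled. The calling code,
# header structure, and payload shape stay exactly the same.
headers = {**base_headers, "x-connection-id": "conn-anthropic"}  # placeholder ID
payload = {
    "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "messages": [{"role": "user", "content": "Hello!"}],
}
```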

Best Practices

  • Model Selection: Choose the appropriate model based on your specific use case and performance requirements.
  • Cost Management: Monitor your usage and leverage UsageGuard's cost control features to manage expenses.
  • Content Policies: Be aware of each provider's content policies and use UsageGuard's moderation features to ensure compliance.
  • Performance Optimization: Use model-specific best practices for prompt engineering and request formatting to get the best results.

Troubleshooting

If you encounter any issues with a specific model or provider:

  1. Check the Status Page for any known issues or outages
  2. Review the API Reference to learn more about the model's specific parameters
  3. Consult our Error Handling Guide for common issues and solutions; a generic retry sketch follows this list
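
When you suspect a transient issue, a generic retry-with-backoff pattern like the one below can help separate temporary failures from real errors. It is not UsageGuard-specific: the status-code handling and backoff policy are assumptions, and the Error Handling Guide documents the actual error codes and response formats.

```python
# Generic sketch, not UsageGuard-specific: status-code handling and backoff
# policy are assumptions; see the Error Handling Guide for actual error formats.
import time
import requests

def post_with_retries(url: str, headers: dict, payload: dict, attempts: int = 3) -> dict:
    """Retry transient failures with exponential backoff (illustrative policy)."""
    for attempt in range(attempts):
        try:
            response = requests.post(url, headers=headers, json=payload, timeout=30)
            # Assumed: 429 signals rate limiting and is worth retrying with backoff.
            if response.status_code == 429 and attempt < attempts - 1:
                time.sleep(2 ** attempt)
                continue
            response.raise_for_status()  # surface any remaining 4xx/5xx as exceptions
            return response.json()
        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout):
            if attempt == attempts - 1:
                raise                    # give up after the final attempt
            time.sleep(2 ** attempt)
    raise RuntimeError("retries exhausted")  # defensive; not normally reached
```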

If you need further assistance, don't hesitate to contact our support team.

Next Steps

Now that you're familiar with the supported LLMs, you're ready to start using these models in your applications.
