Supported Providers

Specialized Providers

AWS Integration

Local Models

Choosing the Right Provider

Consider these factors when selecting an AI provider:

Performance
  • Fastest inference: Groq, Fireworks
  • Best reasoning: Anthropic Claude, DeepSeek
  • Balanced performance: OpenAI GPT-5, Google Gemini

Cost
  • Free/Low-cost: Local models (Ollama), OpenRouter
  • Pay-per-use: Most cloud providers
  • Enterprise: AWS Bedrock, Anthropic

Privacy & Security
  • Highest privacy: Local models (Ollama, LM Studio)
  • Enterprise-grade: AWS Bedrock
  • SOC 2 compliant: Anthropic, OpenAI, Google

Capabilities
  • Code generation: All providers support coding tasks
  • Multimodal: Google Gemini, GPT-5 Vision
  • Long context: Claude (200K), Gemini (1M+)
  • Function calling: OpenAI, Anthropic, Google
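
To make the trade-offs concrete, the factors above can be sketched as a simple lookup table. The helper below is purely illustrative and not part of CodinIT; the provider names come from the list above, while the priority keys are hypothetical labels chosen for this sketch.

```python
# Hypothetical helper (not part of CodinIT): shortlist providers by priority.
# Provider names come from the factor list above; keys are illustrative.
PROVIDERS_BY_PRIORITY = {
    "speed": ["Groq", "Fireworks"],
    "reasoning": ["Anthropic Claude", "DeepSeek"],
    "cost": ["Ollama (local)", "OpenRouter"],
    "privacy": ["Ollama (local)", "LM Studio"],
    "long_context": ["Claude (200K)", "Gemini (1M+)"],
}

def shortlist(priority: str) -> list[str]:
    """Return candidate providers for a given priority, or raise if unknown."""
    try:
        return PROVIDERS_BY_PRIORITY[priority]
    except KeyError:
        raise ValueError(f"Unknown priority: {priority!r}") from None
```

A table like this is only a starting point; most teams weigh two or three of these factors at once and settle the tie by testing a model on their own workload.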

Quick Start

1. Choose Your Provider
   Select a provider from the list above based on your needs.

2. Get API Credentials
   Sign up with the provider and obtain your API key.

3. Configure in CodinIT
   Add your credentials in CodinIT’s settings under AI Providers.

4. Select Your Model
   Choose the specific model you want to use for your project.
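
Before wiring a key into any settings screen, it helps to confirm it actually reached your environment. The snippet below is a hypothetical sanity check, assuming you stage keys in environment variables (for example `ANTHROPIC_API_KEY` or `OPENAI_API_KEY`); it is not part of CodinIT's own configuration flow.

```python
import os

def require_api_key(env_var: str) -> str:
    # Hypothetical check: read a provider API key from the environment and
    # fail fast with a clear message instead of letting a missing key
    # surface later as an opaque authentication error.
    key = os.environ.get(env_var, "").strip()
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; obtain a key from your provider first"
        )
    return key
```

Failing fast here keeps a typo in a key name from masquerading as a provider outage later.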

Configuration Tips

Multi-Provider Setup: You can configure multiple providers simultaneously and switch between them based on your task requirements.
API Key Security: Your API keys are stored locally and never sent to CodinIT servers. They are only used to communicate directly with your chosen AI provider.
Rate Limits: Each provider has different rate limits. Check your provider’s documentation for details.
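
The Multi-Provider Setup tip above can be pictured as a per-task routing table: each kind of task maps to the provider you prefer for it. The structure below is illustrative only; the task names and provider identifiers are invented for this sketch and do not mirror CodinIT's actual settings schema.

```python
# Illustrative only: a per-task routing table for a multi-provider setup.
# Task names and provider identifiers are examples, not CodinIT's schema.
TASK_ROUTES = {
    "chat": "anthropic",
    "code": "openai",
    "offline": "ollama",
}

def provider_for(task: str, default: str = "anthropic") -> str:
    """Pick the configured provider for a task, falling back to a default."""
    return TASK_ROUTES.get(task, default)
```

A fallback default keeps unrouted tasks working, so adding a new task type never breaks the setup.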

Next Steps