# Docs

> Documentation for CodinIT.dev, an AI-powered development platform for building applications with local and cloud AI models.

## Docs

- [Direct LLM invocation](https://codinit.dev/docs/api-reference/chat-&-llm/direct-llm-invocation.md): Executes a direct call to a language model with optional streaming support.
- [Enhance prompts using AI](https://codinit.dev/docs/api-reference/chat-&-llm/enhance-prompts-using-ai.md): Improves user prompts using AI model suggestions with a streaming response.
- [Get models for a specific provider](https://codinit.dev/docs/api-reference/chat-&-llm/get-models-for-a-specific-provider.md): Returns all available models for the specified provider.
- [List all available models and providers](https://codinit.dev/docs/api-reference/chat-&-llm/list-all-available-models-and-providers.md): Returns a list of all supported AI models and their providers.
- [Stream chat responses with AI models](https://codinit.dev/docs/api-reference/chat-&-llm/stream-chat-responses-with-ai-models.md): Initiates a streaming chat session with the selected AI model and provider. Returns server-sent events.
- [Execute Supabase queries](https://codinit.dev/docs/api-reference/database/execute-supabase-queries.md): Executes SQL queries against a Supabase database.
- [Fetch Supabase API keys](https://codinit.dev/docs/api-reference/database/fetch-supabase-api-keys.md): Retrieves API keys and configuration for a Supabase project.
- [Get Supabase projects list](https://codinit.dev/docs/api-reference/database/get-supabase-projects-list.md): Retrieves all Supabase projects for the authenticated user.
- [Deploy to Netlify](https://codinit.dev/docs/api-reference/deployment/deploy-to-netlify.md): Creates or updates a Netlify site deployment with the provided files.
- [Deploy to Vercel](https://codinit.dev/docs/api-reference/deployment/deploy-to-vercel.md): Creates a new deployment on the Vercel platform.
- [Get Vercel deployment status](https://codinit.dev/docs/api-reference/deployment/get-vercel-deployment-status.md): Retrieves the status of current Vercel deployments.
- [Check MCP server availability](https://codinit.dev/docs/api-reference/mcp/check-mcp-server-availability.md): Verifies that configured MCP servers are available and responsive.
- [Retry MCP server connection](https://codinit.dev/docs/api-reference/mcp/retry-mcp-server-connection.md): Attempts to reconnect to a failed or disconnected MCP server and retrieves available tools.
- [Update MCP configuration](https://codinit.dev/docs/api-reference/mcp/update-mcp-configuration.md): Updates the Model Context Protocol server configuration.
- [Validate MCP server configuration](https://codinit.dev/docs/api-reference/mcp/validate-mcp-server-configuration.md): Validates the configuration for a Model Context Protocol server before applying it.
- [Git proxy for CORS-enabled requests](https://codinit.dev/docs/api-reference/proxy/git-proxy-for-cors-enabled-requests.md): CORS-enabled proxy for isomorphic-git operations.
- [Get application metadata](https://codinit.dev/docs/api-reference/system/get-application-metadata.md): Returns application version, dependencies, and build information.
- [Get application metadata (POST)](https://codinit.dev/docs/api-reference/system/get-application-metadata-post.md): Alternative POST endpoint for application metadata.
- [Get comprehensive diagnostics](https://codinit.dev/docs/api-reference/system/get-comprehensive-diagnostics.md): Returns complete system diagnostics including app info, memory, disk, processes, and git information.
- [Get disk usage information](https://codinit.dev/docs/api-reference/system/get-disk-usage-information.md): Returns disk usage per filesystem/drive.
- [Get disk usage information (POST)](https://codinit.dev/docs/api-reference/system/get-disk-usage-information-post.md): Alternative POST endpoint for disk information.
- [Get Git repository information](https://codinit.dev/docs/api-reference/system/get-git-repository-information.md): Returns local and GitHub repository information including repos, organizations, and activity.
- [Get running process information](https://codinit.dev/docs/api-reference/system/get-running-process-information.md): Returns information about running processes including PID, CPU%, and memory usage.
- [Get running process information (POST)](https://codinit.dev/docs/api-reference/system/get-running-process-information-post.md): Alternative POST endpoint for process information.
- [Get system memory statistics](https://codinit.dev/docs/api-reference/system/get-system-memory-statistics.md): Returns total, free, used, and swap memory information (cross-platform).
- [Get system memory statistics (POST)](https://codinit.dev/docs/api-reference/system/get-system-memory-statistics-post.md): Alternative POST endpoint for memory information.
- [Health check endpoint](https://codinit.dev/docs/api-reference/system/health-check-endpoint.md): Simple health check to verify API availability.
- [Check for application updates](https://codinit.dev/docs/api-reference/utilities/check-for-application-updates.md): Checks if a newer version of the application is available.
- [Check if API key is configured](https://codinit.dev/docs/api-reference/utilities/check-if-api-key-is-configured.md): Verifies whether a specific API key is configured in the environment.
- [Export all configured API keys](https://codinit.dev/docs/api-reference/utilities/export-all-configured-api-keys.md): Returns all configured API keys and provider settings for backup purposes.
- [Fetch GitHub repository template files](https://codinit.dev/docs/api-reference/utilities/fetch-github-repository-template-files.md): Retrieves template files from a GitHub repository for project initialization.
- [Changelog](https://codinit.dev/docs/changelog.md): Track the latest CodinIT releases, feature updates, bug fixes, and improvements with detailed changelogs and version history.
- [Bolt.DIY vs CodinIT](https://codinit.dev/docs/comparisons/bolt-vs-codinit.md): Compare Bolt.DIY vs CodinIT AI coding assistants. Compare features, LLM support, and AI code generation capabilities to choose the best AI-powered IDE for your development needs.
- [Lovable vs CodinIT](https://codinit.dev/docs/comparisons/lovable-vs-codinit.md): Compare Lovable.dev vs CodinIT AI app builders. Detailed comparison of AI code generation, LLM features, and development approaches.
- [Code Editor Panel](https://codinit.dev/docs/features/development/code-editor.md): Where you write and organize your code files.
- [Development](https://codinit.dev/docs/features/development/developers.md): Set up your local development environment with CodinIT using Web Containers or E2B for building and testing applications.
- [Terminal](https://codinit.dev/docs/features/development/terminal.md): Type commands to control your project, just like a computer's command line.
- [WebContainer](https://codinit.dev/docs/features/development/webcontainer.md): Switch between different parts of your app while it's running.
- [Workbench](https://codinit.dev/docs/features/development/workbench.md): Your complete workspace for building apps - write code, see it work, and track changes.
- [Features Overview](https://codinit.dev/docs/features/overview.md): Discover CodinIT's AI code generation, intelligent autocomplete, AI pair programming, and LLM-powered development tools for full-stack applications.
- [Installation](https://codinit.dev/docs/getting-started/installation.md): Install CodinIT AI-powered IDE on Windows, Mac, and Linux. Set up the AI coding assistant with step-by-step installation guide for developers.
- [Model Selection](https://codinit.dev/docs/getting-started/select-your-model.md): Select the best AI coding model for your needs. Compare Claude, GPT-4, Gemini, and DeepSeek for AI-powered development and code generation.
- [Build Your First Project](https://codinit.dev/docs/getting-started/your-first-project.md): Create a modern React application with database integration and deployment in 15 minutes.
- [Cloudflare Pages](https://codinit.dev/docs/integrations/cloudflare.md): Deploy your CodinIT projects to Cloudflare Pages with global CDN, edge computing, and automatic SSL for lightning-fast performance.
- [Netlify](https://codinit.dev/docs/integrations/netlify.md): Deploy your CodinIT projects directly to Netlify with one click using seamless integration and automatic build detection.
- [Vercel](https://codinit.dev/docs/integrations/vercel.md): Deploy AI-generated applications to Vercel directly from CodinIT with automatic optimization & intelligent configuration.
- [Welcome](https://codinit.dev/docs/introduction/welcome.md): CodinIT is an open-source AI coding assistant and AI-powered development environment. Build full-stack applications with AI.
- [Context Window Guide](https://codinit.dev/docs/model-config/context-windows.md): How much the AI can remember at once.
- [Model Comparison & Pricing](https://codinit.dev/docs/model-config/model-comparison.md): Which AI to use and how much it costs.
- [Discussion Mode](https://codinit.dev/docs/prompting/discussion-mode.md): Technical consultant mode for collaborative problem-solving and guidance.
- [Maximize Token Efficiency](https://codinit.dev/docs/prompting/maximize-token-efficiency.md): How to use AI without spending too much.
- [Plan Your App](https://codinit.dev/docs/prompting/plan-your-app.md): Master strategic planning techniques for successful application development with AI-powered architecture and feature planning guidance.
- [Prompt Engineering Guide](https://codinit.dev/docs/prompting/prompt-engineering-guide.md): How to talk to AI to get better code.
- [Prompt Effectively](https://codinit.dev/docs/prompting/prompting-effectively.md): Master the art of clear, effective communication with AI models.
- [Anthropic](https://codinit.dev/docs/providers/anthropic.md): Configure Anthropic Claude models with CodinIT for advanced reasoning and code generation.
- [AWS Bedrock](https://codinit.dev/docs/providers/aws-bedrock.md): How to connect AWS Bedrock AI to CodinIT.
- [Providers](https://codinit.dev/docs/providers/cloud-providers.md): Connect CodinIT AI IDE with 18+ LLM providers including Claude, GPT-4, Gemini, DeepSeek for AI code generation, local inference, and specialized AI coding services.
- [Cohere](https://codinit.dev/docs/providers/cohere.md): Configure Cohere's Command R series models for reasoning, code generation, and multilingual tasks.
- [DeepSeek](https://codinit.dev/docs/providers/deepseek.md): Configure DeepSeek models for coding and reasoning tasks with CodinIT.
- [GitHub Models](https://codinit.dev/docs/providers/github.md): Access OpenAI GPT-4, o1, and other AI models through GitHub's platform.
- [Google Gemini](https://codinit.dev/docs/providers/google.md): Configure GCP Vertex AI to access Gemini and Claude models through Google Cloud.
- [Groq](https://codinit.dev/docs/providers/groq.md): Configure Groq's ultra-fast LPU inference for models from OpenAI, Meta, and DeepSeek.
- [Hugging Face](https://codinit.dev/docs/providers/huggingface.md): Access thousands of open-source AI models through Hugging Face's inference API.
- [Hyperbolic](https://codinit.dev/docs/providers/hyperbolic.md): Access high-performance open-source AI models through Hyperbolic's optimized infrastructure.
- [LM Studio](https://codinit.dev/docs/providers/lmstudio.md): Run AI models locally with LM Studio's user-friendly interface for privacy and offline development.
- [Mistral](https://codinit.dev/docs/providers/mistral-ai.md): Configure Mistral AI models including Codestral for code generation with CodinIT.
- [Moonshot](https://codinit.dev/docs/providers/moonshot.md): Configure Moonshot's Kimi series models for Chinese language and multilingual tasks.
- [Ollama](https://codinit.dev/docs/providers/ollama.md): Run AI models locally with Ollama for privacy and offline access.
- [OpenAI](https://codinit.dev/docs/providers/openai.md): Configure OpenAI models including GPT-5, o3, and o4-mini with CodinIT.
- [OpenAI Compatible](https://codinit.dev/docs/providers/openai-like.md): Connect to any OpenAI-compatible API endpoint including custom deployments and self-hosted models.
- [OpenRouter](https://codinit.dev/docs/providers/openrouter.md): Access multiple AI models through a unified API with OpenRouter.
- [Perplexity](https://codinit.dev/docs/providers/perplexity.md): Configure Perplexity's Sonar models with integrated web search for research tasks.
- [Together AI](https://codinit.dev/docs/providers/togetherai.md): Access hundreds of open-source AI models through Together's optimized platform.
- [xAI (Grok)](https://codinit.dev/docs/providers/xai-grok.md): Configure xAI's Grok models with large context windows and reasoning capabilities.
- [Quickstart](https://codinit.dev/docs/quickstart.md): Get started with CodinIT AI-powered IDE in minutes. Install the AI coding assistant, connect LLM providers like Claude and OpenAI, and start building with AI code generation.
- [Local models setup](https://codinit.dev/docs/running-models-locally/local-model-setup.md): Run AI models locally on your own hardware for enhanced privacy, zero API costs, offline development, and complete data control.
- [FAQ](https://codinit.dev/docs/support/frequently-asked-questions.md): Find answers to common questions about CodinIT AI IDE setup, LLM model selection, AI code generation features, integrations, and AI development troubleshooting.
- [Integration issues](https://codinit.dev/docs/support/integration-issues.md): Solve common issues with CodinIT integrations in your development environment.
- [Troubleshooting](https://codinit.dev/docs/support/troubleshooting.md): Solve common issues with CodinIT AI IDE, LLM providers, code generation errors, and AI-powered development environment problems.

## OpenAPI Specs

- [openapi](https://codinit.dev/docs/openapi.json)

## Optional

- [Blog](https://codinit.dev/blog)
- [LinkedIn](https://www.linkedin.com/company/codinit-dev)