Foundation model companies and API platforms powering the AI ecosystem — from frontier models to inference endpoints.
Access Claude models via API — industry-leading for coding, analysis, and long-context tasks.
The most widely adopted AI API — offering GPT-4o, o1, DALL-E, and Whisper.
Meta's open-source LLM family — the most popular foundation for self-hosted and fine-tuned AI.
The fastest AI inference — custom LPU chips delivering up to 10x faster serving of open-source models.
One API, every model — unified access to OpenAI, Anthropic, Google, Meta, and 200+ more.
Europe's leading AI lab — open-weight models with frontier performance at competitive prices.
Run 200+ open-source models via a fast, affordable API — the one-stop shop for open AI.
Enterprise AI platform specializing in RAG, embeddings, and text understanding.
High-performance inference platform — fast, cheap, and optimized for production workloads.
Jamba model family — hybrid SSM-Transformer architecture for efficient long-context AI.