
CanopyWave Partnership: 90% Off DeepSeek v3.1

Exclusive partnership with CanopyWave brings a massive 90% discount on DeepSeek v3.1, making advanced reasoning capabilities more accessible than ever.


We're thrilled to announce our new partnership with CanopyWave, bringing you an incredible 90% discount on DeepSeek v3.1 – one of the most powerful reasoning models available.

🎉 Massive Savings on Advanced AI

You can now access DeepSeek v3.1 at unprecedented pricing:

Input Pricing

  • $0.03 per million tokens (regularly $0.27)
  • 90% off regular pricing

Output Pricing

  • $0.10 per million tokens (regularly $1.00)
  • 90% off regular pricing
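At the promotional rates above, per-request cost is easy to estimate. A minimal sketch in Python, using the discounted rates listed above (the helper name is illustrative, not part of any API):

```python
# Promotional CanopyWave rates for DeepSeek v3.1, in USD per million tokens
INPUT_RATE = 0.03
OUTPUT_RATE = 0.10

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the discounted rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE + (output_tokens / 1_000_000) * OUTPUT_RATE

# A 10,000-token prompt with a 2,000-token response costs well under a cent:
print(round(estimate_cost(10_000, 2_000), 6))  # 0.0005
```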

🧠 DeepSeek v3.1 Through CanopyWave

canopywave/deepseek-v3.1

Context Window: 128,000 tokens
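Assuming the 128,000-token window is shared between prompt and completion, as is typical, it can help to budget your output length against whatever the prompt already consumes. A hypothetical helper (the function name is illustrative, not part of the gateway API):

```python
CONTEXT_WINDOW = 128_000  # DeepSeek v3.1 context window via CanopyWave

def max_output_budget(prompt_tokens: int, desired_output: int) -> int:
    """Clamp a requested completion length so prompt + output fit the context window."""
    remaining = CONTEXT_WINDOW - prompt_tokens
    if remaining <= 0:
        raise ValueError("Prompt alone exceeds the 128K context window")
    return min(desired_output, remaining)

# A 120K-token prompt leaves at most 8,000 tokens for the response:
print(max_output_budget(120_000, 16_000))  # 8000
```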

🚀 Getting Started

Immediate Access: The CanopyWave provider is available now through LLM Gateway with the promotional pricing applied automatically.

Simple Integration: Use model identifier canopywave/deepseek-v3.1 in your API calls to access DeepSeek v3.1 at the discounted rate.

No Additional Setup: Your existing LLM Gateway API key works seamlessly with the CanopyWave provider.

```typescript
import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: llmgateway("canopywave/deepseek-v3.1"),
  prompt: "Solve this complex problem with advanced reasoning...",
});
```

```bash
curl -X POST https://api.llmgateway.io/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "canopywave/deepseek-v3.1",
    "messages": [{"role": "user", "content": "Hello DeepSeek!"}]
  }'
```

This partnership with CanopyWave demonstrates our commitment to making cutting-edge AI accessible to everyone. Start using canopywave/deepseek-v3.1 today and experience premium reasoning capabilities at game-changing prices.

Try it now in the Playground 🚀