Ep156: LLM Migrations to One Cloud: Coveo's Strategic Move to Amazon Bedrock
About this content
Learn how Coveo automated LLM migration like a "mind transplant," building frameworks to optimize prompts and maintain quality across model changes.
Topics Include:
- AWS and Coveo discuss their Gen-AI innovation using Amazon Bedrock and Nova.
- Coveo faced multi-cloud complexity, data residency requirements, and rising AI costs.
- Coveo indexes enterprise content across hundreds of sources while maintaining security permissions.
- The platform powers search, generative answers, and AI agents across commerce and support.
- CRGA is Coveo's fully managed RAG solution deployed in days, not months.
- Customers see 20-30% case reduction; SAP Concur saves €8 million annually.
- Original architecture used GPT on Azure; migration targeted Nova Lite on Bedrock.
- Infrastructure setup involved guardrails and load testing for 70 billion monthly tokens (see the sketch after this list).
- Migrating LLMs is like a "mind transplant"—prompts must be completely re-optimized.
- Coveo built automated evaluation framework testing 20+ behaviors with each system change.
- Nova Lite improved answer accuracy, reduced hallucinations, and matched GPT-4o Mini performance.
- Migration simplified governance, enabled regional compliance, reduced latency, and lowered costs.
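
For context on the Bedrock side of the migration, the snippet below is a minimal sketch of the kind of call such a move targets: invoking Amazon Nova Lite through the Bedrock Converse API with a guardrail attached. The region, guardrail identifier and version, and the prompt are illustrative placeholders, not details from Coveo's deployment.

# Minimal sketch: calling Amazon Nova Lite via the Bedrock Converse API.
# Region, guardrail ID/version, and the prompt are placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    # Nova Lite model ID; some accounts invoke it via a cross-region
    # inference profile ID such as "us.amazon.nova-lite-v1:0".
    modelId="amazon.nova-lite-v1:0",
    system=[{"text": "Answer only from the provided support articles."}],
    messages=[
        {"role": "user", "content": [{"text": "How do I reset my password?"}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    # Bedrock Guardrails filter inputs and outputs; IDs here are placeholders.
    guardrailConfig={
        "guardrailIdentifier": "example-guardrail-id",
        "guardrailVersion": "1",
    },
)

print(response["output"]["message"]["content"][0]["text"])

As the episode stresses, swapping the model ID is only the start: the "mind transplant" is the prompt re-optimization and automated evaluation work that follows.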
Participants:
- Sebastien Paquet – Vice President, AI Strategy, Coveo
- Yanick Houngbedji – Solutions Architect Canada ISV, Amazon Web Services
See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/