Build an Intelligent Sales Assistant with Azure AI Foundry and Snowflake Cortex. Enterprise AI should be simple, secure, and connected. In this hands-on lab, you will learn how to integrate Azure AI Foundry with Snowflake Cortex using the Snowflake-managed MCP server so your AI agents can query Snowflake data with full governance and privacy controls.
What you’ll do:
- Set up Cortex Search and Cortex Analyst on sample sales data
- Configure a Snowflake-managed MCP server without additional infrastructure
- Create an AI Foundry Agent that uses Model Context Protocol (MCP) to orchestrate Cortex tools
- Chat with the agent to analyze sales conversations, KPIs, and win/loss insights
Why it matters:
- Unified governance with RBAC, policies, and monitoring across tools
- Standards-based interoperability with MCP
- Faster time to value so you can focus on insights, not infrastructure
You’ll build:
✅ Snowflake Search and Analyst services
✅ Managed MCP Server
✅ AI Foundry Agent and client
Get started: https://lnkd.in/g9C_ScBB
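For a feel of what the agent does under the hood, here is a minimal sketch of connecting to a Snowflake-managed MCP server and listing its Cortex tools with the MCP Python SDK. The server URL and bearer token are placeholders, not verified values; in the lab, the AI Foundry Agent manages this connection for you.

```python
# Hedged sketch: connect to a Snowflake-managed MCP server over streamable
# HTTP and list its Cortex tools. The URL pattern and token below are
# placeholders; the quickstart shows the exact values for your account.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVER_URL = "https://<account>.snowflakecomputing.com/<mcp-server-path>"  # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder auth

async def main():
    async with streamablehttp_client(SERVER_URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```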
Snowflake Developers
Software Development
Menlo Park, California 58,308 followers
Build Massive-Scale Data Apps Without Operational Burden #PoweredBySnowflake #SnowflakeBuild
About us
Snowflake delivers the AI Data Cloud — mobilize your data apps with near-unlimited scale and performance. #PoweredbySnowflake
- Website
- https://www.snowflake.com/en/developers/
- Industry
- Software Development
- Company size
- 5,001-10,000 employees
- Headquarters
- Menlo Park, California
- Founded
- 2012
- Specialties
- big data, sql, data cloud, cloud data platform, developers, ai data cloud, agentic ai, ai, and data engineering
Updates
-
You can now use the familiar OpenAI SDK to securely power your AI apps with Snowflake's cutting-edge models. This quickstart demonstrates two key patterns for building external, production-ready AI apps:
- Simple LLM Access: Use the OpenAI SDK to access Snowflake-hosted models like GPT-5 and Claude.
- Advanced Agent Orchestration: Build custom agents with LangGraph that connect to Snowflake's Managed MCP Server via OAuth 2.0.
You'll learn how to combine semantic search, metrics analysis, and natural language to SQL, all without data movement. Get started now with the quickstart from James Cha-Earley: https://lnkd.in/g9DW9Rsf
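A minimal sketch of the first pattern, assuming Snowflake exposes an OpenAI-compatible endpoint as the quickstart describes; the base URL, model identifier, and token below are placeholders to replace with the values from the quickstart:

```python
# Hedged sketch: point the standard OpenAI SDK at a Snowflake-hosted,
# OpenAI-compatible endpoint. base_url, api_key, and model are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://<account>.snowflakecomputing.com/<cortex-endpoint>",  # placeholder
    api_key="<programmatic-access-token>",  # placeholder Snowflake credential
)

response = client.chat.completions.create(
    model="<snowflake-hosted-model>",  # placeholder; pick a model available in your region
    messages=[{"role": "user", "content": "Summarize last quarter's sales wins."}],
)
print(response.choices[0].message.content)
```

Because the endpoint speaks the OpenAI wire format, existing apps built on the OpenAI SDK can switch to Snowflake-hosted models by changing only the base URL, credential, and model name.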
-
Metric governance meets Infrastructure-as-Code! 🛠️ The new snowflake_semantic_view Terraform resource is a game-changer for Snowflake developers, finally bringing IaC to your metrics. Now you can:
- Version control your semantic views using standard HCL.
- Streamline CI/CD with peer review and automated deployment.
- Enforce a single source of truth for all metrics, ensuring BI tools and AI agents operate on trusted, mathematically verified data.
Full details: https://lnkd.in/g7aHhGEd
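Downstream of that Terraform resource, consumers query the governed metrics through the semantic view itself. A rough sketch, assuming Snowflake's SEMANTIC_VIEW query syntax; the view, metric, and dimension names here are illustrative placeholders, not objects from the post:

```python
# Hedged sketch: query a semantic view (the object the Terraform resource
# manages) so every tool reads the same governed metric definitions.
# All object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<db>", schema="<schema>",
)
cur = conn.cursor()
cur.execute("""
    SELECT * FROM SEMANTIC_VIEW(
        sales_metrics                   -- placeholder semantic view name
        METRICS orders.total_revenue    -- placeholder governed metric
        DIMENSIONS orders.order_month   -- placeholder governed dimension
    )
""")
for row in cur.fetchall():
    print(row)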
-
We're excited to announce the availability of Anthropic’s Claude Opus 4.5 for customers to use natively within the secure Snowflake perimeter. Opus 4.5 is Anthropic's new premium frontier model and its most intelligent to date. It sets a new standard for agentic coding and complex problem-solving. You can now leverage this powerful model for a wide range of enterprise use cases:
- Cortex AI Functions: Use familiar SQL with AI_COMPLETE to build cost-efficient AI pipelines and analyze multimodal data directly in your data warehouse.
- Build Enterprise Intelligence Agents: Extend the foundation for Snowflake Intelligence with Opus 4.5 to create advanced, governed AI agents that reason and orchestrate complex analytical workflows.
Learn more: https://lnkd.in/gqjTVxkj
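A minimal sketch of the AI_COMPLETE pattern from Python. The exact Cortex model identifier for Opus 4.5 is an assumption to verify against the Cortex model list for your region, and the table and column names are placeholders:

```python
# Hedged sketch: call a Cortex-hosted model through the AI_COMPLETE SQL
# function. The model id string is an assumption; the table is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(account="<account>", user="<user>", password="<password>")
cur = conn.cursor()
cur.execute("""
    SELECT AI_COMPLETE(
        'claude-opus-4-5',                                      -- assumed model id
        'Classify the sentiment of this ticket: ' || ticket_text
    ) AS sentiment
    FROM support_tickets                                        -- placeholder table
    LIMIT 10
""")
print(cur.fetchall())
```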
-
Streaming just got a massive upgrade! We are pleased to announce the General Availability of our next-generation, high-performance streaming architecture across all major cloud providers: AWS, Azure, and GCP. This new foundation is built on ❄️ Snowpipe Streaming, our real-time API designed to enable developers to ingest data directly from applications into Snowflake tables. This low-latency, row-based approach provides a significant architectural alternative to traditional file-based ingestion, making data available for query in seconds.
The core goal of this next-gen architecture was to push the boundaries of performance, cost efficiency, and scale for data ingestion workloads. Key benefits available today include:
- Massive Throughput: Achieve ingestion rates up to 10 GB/s per table.
- Low Latency: Data is typically ready to query in under 10 seconds from the point of ingestion.
- Cost Predictability: A new ingest-based pricing model provides more predictable, flat consumption costs for streaming workloads.
Learn more: https://lnkd.in/g_-Ri-Y4
-
With the new SSIS replatform path in SnowConvert AI, you can convert legacy SSIS packages into native Snowflake + dbt projects. Keep your control flow inline for easier debugging, no context switching, and consistent naming. Build faster with modern dbt practices, automated lineage, and built-in CI/CD. Run your data and orchestration logic side-by-side in Snowflake and focus on what matters, like shipping reliable pipelines, not wrestling with tool sprawl. Get started: https://lnkd.in/gQnZJv3n
-
Connect and onboard your data with any API. Getting real-time data from internal microservices or third-party vendors has traditionally meant complex pipelines, high latency, and extra infrastructure to manage. With Snowflake Openflow, that’s changing. Join us for a deep dive demo to see how Openflow, Snowflake’s managed data integration service, lets you onboard data directly from any internal or external API, no additional pipeline maintenance required.
📅 Date: November 20
⏰ Time: 10:00 AM PT
You’ll learn how to:
✅ Design and develop your data flow with key architecture decisions
✅ Securely connect to APIs using multiple authentication methods
✅ Process raw API responses for seamless Snowflake ingestion
✅ Implement robust error handling for reliable data delivery
Register: https://lnkd.in/g6DYC9np
-
At this year’s PyTorch conference, our AI Research and Engineering teams presented new work addressing 4 of the most pressing challenges in scalable AI:
❄️ Efficient large-scale training with DeepSpeed.ai and Arctic Training
❄️ Parallel orchestration of thousands of transformer models
❄️ Faster, lower-cost inference with Arctic Inference
❄️ Balanced multilingual performance with Arctic-Embed 2.0
As a Premier Member of the PyTorch Foundation, Snowflake continues to contribute to the advancement of the PyTorch ecosystem. These advancements make it easier for developers to train, deploy, and scale AI in production, all within the Snowflake AI Data Cloud.
Check out the full write-up + resources from our PyTorch sessions: https://lnkd.in/gwCJv8_D
-
Build, test, and deploy dbt Projects directly in Snowflake. This quickstart for dbt Projects on Snowflake walks you through how to run your entire dbt workflow inside the Snowflake AI Data Cloud. Use familiar tools like Workspaces and Git to version, edit, and orchestrate SQL transformations end-to-end.
You’ll learn how to:
• Create and run dbt Projects natively within Snowflake
• Use Workspaces as a file-based IDE for dbt development
• Pull projects from GitHub and manage CI/CD workflows
• Deploy, schedule, and monitor dbt jobs with built-in observability
Get hands-on and build a complete dbt Project using Tasty Bytes sample data, all within Snowflake: https://lnkd.in/gidkqXrX
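As a rough sketch of the deploy-and-schedule step, assuming the EXECUTE DBT PROJECT command that dbt Projects on Snowflake exposes; treat the command arguments and the fully qualified project name as assumptions to verify against the quickstart:

```python
# Hedged sketch: trigger a dbt Project run from Python via the Snowflake
# connector. EXECUTE DBT PROJECT and the project name are assumptions drawn
# from dbt Projects on Snowflake; confirm the exact syntax in the quickstart.
import snowflake.connector

conn = snowflake.connector.connect(account="<account>", user="<user>", password="<password>")
cur = conn.cursor()
# Run the project (roughly equivalent to `dbt run`); the project name
# below is a placeholder based on the Tasty Bytes sample.
cur.execute("EXECUTE DBT PROJECT tasty_bytes_db.dev.tasty_bytes_dbt")
print(cur.fetchone())
```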
-
Your scikit-learn and pandas workflows can now easily scale up and run on GPUs in Snowflake! We are excited to announce that Snowflake ML now comes preinstalled with NVIDIA’s cuML and cuDF libraries, delivering native GPU acceleration for popular data science tools like scikit-learn, pandas, UMAP, and HDBSCAN – no code rewrites needed. Snowflake customers can now dramatically speed up ML model training and iterative exploration on massive datasets for use cases like topic modeling and genomics, all within a governed, unified platform. Benchmark results show up to 200x faster processing on GPU vs. CPU. All the details: https://lnkd.in/g98xxN7N
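A minimal sketch of the zero-code-change pattern, assuming a GPU-backed Snowflake ML environment where cuDF and cuML are preinstalled; the dataset, feature columns, and model choice are illustrative:

```python
# Hedged sketch: enable RAPIDS acceleration, then run ordinary pandas and
# scikit-learn code unchanged. With cuDF/cuML preinstalled on a GPU runtime,
# supported operations execute on the GPU; otherwise the code runs on CPU.
import cudf.pandas
cudf.pandas.install()     # route pandas operations through cuDF where possible

import cuml.accel
cuml.accel.install()      # dispatch scikit-learn estimators to cuML where possible

import pandas as pd
from sklearn.cluster import KMeans

df = pd.read_parquet("transactions.parquet")       # placeholder dataset
model = KMeans(n_clusters=8, random_state=0)
model.fit(df[["amount", "num_items"]])             # placeholder feature columns
print(model.cluster_centers_)
```

The accelerator installs must run before the pandas and scikit-learn imports they patch, which is why they sit at the top of the script.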