Snowflake Developers

Software Development

Menlo Park, California 66,245 followers

Build Massive-Scale Data Apps Without Operational Burden #PoweredBySnowflake #SnowflakeBuild

About us

Snowflake delivers the AI Data Cloud — mobilize your data apps with near-unlimited scale and performance. #PoweredbySnowflake

Website
https://www.snowflake.com/en/developers/
Industry
Software Development
Company size
5,001-10,000 employees
Headquarters
Menlo Park, California
Founded
2012
Specialties
big data, sql, data cloud, cloud data platform, developers, ai data cloud, agentic ai, ai, and data engineering

Updates

  • What if your data scientists could determine your most impactful ML features with just natural language? In this demo, we use Cortex Code inside Snowflake Notebooks to:
    - Identify which features actually drive churn prediction
    - Flag correlated signals that can distort results
    - Surface fragile features that may hurt model reliability
    - Suggest next steps like feature pruning or model changes
    No manual analysis. No guesswork. Just fast, explainable insights directly where your data lives. See how agentic ML in Cortex Code is transforming ML workflows and accelerating production: https://lnkd.in/g_hsYA7J
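In the demo, Cortex Code runs this analysis from a natural-language prompt. Purely as an illustration of one check it automates (toy data and names, not Cortex Code itself), flagging correlated signals amounts to scanning feature pairs for high Pearson correlation:

```python
import numpy as np

# Toy illustration (not Cortex Code): flag highly correlated feature pairs,
# one of the checks the demo performs via a natural-language prompt.
rng = np.random.default_rng(0)
n = 500
tenure = rng.normal(24, 6, n)
tenure_months = tenure + rng.normal(0, 0.1, n)   # near-duplicate of tenure
support_calls = rng.poisson(2, n).astype(float)  # independent signal

features = {"tenure": tenure, "tenure_months": tenure_months,
            "support_calls": support_calls}

def correlated_pairs(feats, threshold=0.9):
    """Return feature pairs whose absolute Pearson correlation exceeds threshold."""
    names = list(feats)
    flagged = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(feats[a], feats[b])[0, 1]
            if abs(r) > threshold:
                flagged.append((a, b))
    return flagged

print(correlated_pairs(features))  # the near-duplicate tenure pair is flagged
```

A pair like this is exactly what you would prune before retraining, since near-duplicate features distort importance scores.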

  • This is your inside look at how Snowflake's own Product Data Science team uses Cortex Code in their daily work. Hosted by James Cha-Earley, the session will feature Tyler Richards and Zachary Blackwood walking through real workflows with real outputs (not demos!). Workloads like pipeline engineering, ad-hoc investigations, and recurring analyses: tasks that used to cost hours now happen in a handful of prompts. Join the livestream on Tuesday, March 31st to learn:
    👉 How Snowflake's data science team actually uses Cortex Code day-to-day
    👉 How multi-step workflows collapse into single prompts, with validation and documentation built in
    👉 Real lessons from months of internal use, plus live Q&A with the team!

    [LIVE] How Cortex Code Transforms Data Workflows at Snowflake


  • What does it actually look like when an engineer takes a Jira ticket through dbt, Snowflake, semantic views, and a React frontend in a single terminal session? Join me with Trent Foley and Jonathan Gard from evolv Consulting as they demo the workflow live and we dig into the real questions: where AI fits in analytics engineering today, what has to be true about your data architecture before any of it works, and why engineering discipline matters more with AI than without it.

    [LIVE] AI-Assisted Analytics Engineering: Jira Ticket to PR in 30 Minutes


  • Exciting news from Snowflake AI Research! The new DeepSpeed Ulysses implementation for Hugging Face (HF) developed by our team is now fully integrated with Hugging Face Accelerate, the Transformers Trainer, and the TRL SFT Trainer. DeepSpeed Ulysses enables efficient training with ultra-long sequences by leveraging aggregate GPU memory across multiple devices — all while keeping communication overhead minimal. If you're working with long-context models, this is a big unlock. Learn more about how to leverage DeepSpeed Ulysses: https://lnkd.in/gNHDFTDX
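The core trick in Ulysses is an all-to-all repartition around attention: before it, each rank holds a slice of the sequence for all heads; after it, each rank holds the full sequence for a slice of heads, so attention sees complete context. A toy, single-process sketch of that reshuffle (made-up shapes, not the DeepSpeed API):

```python
import numpy as np

# Toy, single-process illustration of the Ulysses all-to-all repartition.
# P "ranks", sequence length S, H attention heads, head dim D (all made up).
P, S, H, D = 4, 8, 8, 2
x = np.arange(S * H * D, dtype=np.float32).reshape(S, H, D)

seq_shards = np.split(x, P, axis=0)            # sequence-parallel layout

# all-to-all: rank p splits its shard over heads and sends piece q to rank q
send = [np.split(shard, P, axis=1) for shard in seq_shards]
head_shards = [np.concatenate([send[p][q] for p in range(P)], axis=0)
               for q in range(P)]              # head-parallel layout

# each rank now holds the FULL sequence for its slice of heads,
# so attention can run with complete context per head
assert head_shards[0].shape == (S, H // P, D)
assert np.array_equal(np.concatenate(head_shards, axis=1), x)
```

Because each rank only ever stores S/P tokens (or H/P heads), activation memory scales with the number of GPUs, which is what enables the ultra-long sequences the post describes.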

  • Apache Iceberg™ has conquered batch analytics. What’s next? At Iceberg Summit, Russell Spitzer breaks down what V3 unlocked. Think vendor-neutral support for semi-structured and geospatial data, plus more efficient deletes. But he also gets into what’s still missing for AI and streaming workloads. This keynote looks ahead to V4 proposals like One File Commits, Improved Column Statistics, and Columnar Metrics, and what they unlock for builders. Watch the keynote in person or virtually on April 8 at 9:00 AM PT to see where Iceberg goes next: https://lnkd.in/gdf32zj2

  • Accelerate the entire machine learning lifecycle with agentic ML from Cortex Code. You can plan, build, and run production-ready ML pipelines using natural language, directly in Snowflake. Focus on the business problems, not the pipelines and tooling. With Cortex Code, you can:
    - Build and deploy high-quality ML pipelines with built-in MLOps
    - Improve model performance and prediction accuracy quickly using natural language
    - Automate the tedious work so your data scientists can focus on higher-impact initiatives
    See what's new: https://lnkd.in/gfQgzsQQ

  • The latest and greatest for developers in Cortex Code:
    1️⃣ Cortex Code is now GA in Snowsight, bringing a persistent AI assistant directly into your workflows
    2️⃣ Native Windows support for Cortex Code CLI expands access across dev environments
    3️⃣ Agent Teams let you break down complex, multi-step work into parallel, coordinated tasks
    4️⃣ New agent skills for Streamlit, Openflow, cost intelligence, ML, and more standardize how you build on data
    See what's new: https://lnkd.in/gvvVtJ6x


    Want to build faster on your enterprise data—without the complexity? We've got you, with a number of major updates to Cortex Code. We're excited to have Cortex Code generally available in Snowsight for every Snowflake user, along with other updates including native Windows support for Cortex Code CLI. And, whether it's executing large multi-step projects or using specialized skills for cost optimization and agentic ML, we're making it easier to bring ideas to production faster. This is enterprise AI built for speed and governance. ❄️ Dive into the latest: https://lnkd.in/gRd6ADmC

  • The data landscape is undergoing a fundamental shift as organizations adopt open lakehouse architectures built on Apache Iceberg™. At the core of that shift is interoperability: you can use any engine to write to your Iceberg tables and query them from any engine. But even simple things like identifier casing can break that promise. A table created in one engine may not resolve in another. In this post, we break down how Snowflake solves this with CATALOG_CASE_SENSITIVITY in catalog-linked databases:
    - CASE_INSENSITIVE for Hadoop-style catalogs (Glue, Unity)
    - CASE_SENSITIVE for ANSI SQL environments (Polaris)
    Learn more: https://lnkd.in/gpKx2i4U
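The full semantics are in the linked post; as a toy illustration of why the setting matters (hypothetical names, not Snowflake's implementation), a case-insensitive catalog folds identifiers before lookup while a case-sensitive one compares them verbatim, so the same query can resolve in one mode and fail in the other:

```python
# Toy illustration (not Snowflake code) of why CATALOG_CASE_SENSITIVITY
# matters: the same identifier resolves or fails depending on the mode.
catalog = {"Orders": "iceberg://warehouse/Orders"}   # table written by engine A

def resolve(name, case_sensitive):
    """Look up a table name, optionally folding case before comparison."""
    if case_sensitive:
        return catalog.get(name)
    folded = {k.lower(): v for k, v in catalog.items()}
    return folded.get(name.lower())

# Engine B quotes the identifier in upper case:
assert resolve("ORDERS", case_sensitive=False) == "iceberg://warehouse/Orders"
assert resolve("ORDERS", case_sensitive=True) is None   # table "not found"
```

Matching the setting to the catalog's own convention (case-folding catalogs vs ANSI-style ones) is what keeps identifiers resolving consistently across engines.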

  • Get started with Snowflake DCM Projects. Snowflake DCM (Database Change Management) Projects let you define your Snowflake infrastructure as code using SQL-based definition files. In this quickstart, you'll learn:
    - How DCM Projects define Snowflake infrastructure as code
    - How to structure a DCM Project with a manifest and definition files
    - How to use Jinja templating to parameterize definitions across environments
    - How to plan (dry-run) and deploy changes using the Snowsight Workspaces UI
    - How to attach data quality expectations to objects using Data Metric Functions
    Get started: https://lnkd.in/g-4YQyi9
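The parameterization step works by rendering Jinja placeholders in the SQL definition files per environment. As a rough stand-in (a minimal renderer, not the DCM engine; see the quickstart for the real syntax), the idea looks like:

```python
import re

# Rough stand-in (not the DCM engine) for rendering a Jinja-templated
# SQL definition file per environment; real syntax is in the quickstart.
definition = "CREATE OR ALTER TABLE {{ env }}_db.sales.orders (id INT);"

def render(template, params):
    """Substitute {{ name }} placeholders, Jinja-style."""
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", lambda m: params[m.group(1)], template)

print(render(definition, {"env": "dev"}))
# CREATE OR ALTER TABLE dev_db.sales.orders (id INT);
```

One template thus yields dev, test, and prod variants of the same object, which is what makes the plan (dry-run) and deploy steps repeatable across environments.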

