
Conversation

ekzhu (Contributor) commented Mar 14, 2025

Use the latest version of llama-cpp-python to ensure `uv sync --all-extras` doesn't fail on Windows.

reference: #5942 (comment)
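
For context, the fix is a dependency version bump rather than a code change: the optional extra that pulls in llama-cpp needs a newer minimum `llama-cpp-python` so that `uv sync --all-extras` can resolve and build it on Windows. The exact pin and extra name are not shown in this thread, so the snippet below is only an illustrative sketch of what such an entry in `pyproject.toml` might look like; the version is a placeholder, not the one from this PR.

```toml
# Illustrative sketch only: the real pyproject.toml entry, extra name, and
# version pin from this PR are not shown in the thread. "X.Y.Z" is a placeholder.
[project.optional-dependencies]
llama-cpp = [
    "llama-cpp-python>=X.Y.Z",  # newer releases fix Windows build/install issues
]
```

Because `uv sync --all-extras` installs every optional dependency group, a single extra whose pin can't build on Windows is enough to make the whole command fail, which is why raising the pin unblocks it.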

ekzhu requested review from lokitoth and rysweet on Mar 14, 2025, 18:43
codecov bot commented Mar 14, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 75.77%. Comparing base (0276aac) to head (a120543).
Report is 1 commit behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #5948   +/-   ##
=======================================
  Coverage   75.76%   75.77%           
=======================================
  Files         191      191           
  Lines       13114    13114           
=======================================
+ Hits         9936     9937    +1     
+ Misses       3178     3177    -1     
| Flag | Coverage Δ |
|------|------------|
| unittests | 75.77% <ø> (+<0.01%) ⬆️ |

Flags with carried forward coverage won't be shown.

ekzhu merged commit 5f9e37d into main on Mar 14, 2025
57 checks passed
ekzhu deleted the ekzhu-upgrade-llama-cpp branch on March 14, 2025, 19:20
ekzhu added a commit that referenced this pull request Mar 14, 2025
Use the latest version of llama-cpp-python to ensure `uv sync --all-extras` doesn't fail on Windows.

reference: #5942 (comment)

Labels: None yet

3 participants