ggml update to 0.11.0, llama-cpp update to 9030 #51551

Merged
vicroms merged 13 commits into microsoft:master from miyanyan:ggml-0.10.2
May 11, 2026

Conversation

@miyanyan
Contributor

@miyanyan miyanyan commented May 5, 2026

  • Changes comply with the maintainer guide.
  • SHA512s are updated for each updated download.
  • The "supports" clause reflects platforms that may be fixed by this new version, or no changes were necessary.
  • Any fixed CI baseline and CI feature baseline entries are removed from that file, or no entries needed to be changed.
  • All patch files in the port are applied and succeed.
  • The version database is fixed by rerunning ./vcpkg x-add-version --all and committing the result.
  • Exactly one version is added in each modified versions file.

@miyanyan miyanyan changed the title [ggml] update to 0.10.2 ggml update to 0.11.0, llama-cpp update to 9030 May 5, 2026
@miyanyan
Contributor Author

miyanyan commented May 5, 2026

  • Switch vendored libs to system vcpkg ports:
    • cpp-httplib → vcpkg port (header-only, always needed by common lib)
    • nlohmann-json → vcpkg port (header-only, always needed by common lib)
    • stb → vcpkg port (header-only, needed by mtmd/tools)
    • miniaudio → vcpkg port (header-only, needed by mtmd/tools)
  • Keep vendored: sheredom/subprocess.h (no vcpkg port available)
  • Patch include paths (stb, miniaudio, cpp-httplib) and CMake target names (cpp-httplib → httplib::httplib)
  • Add new patch: unvendor.diff
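
The unvendoring steps above might look roughly like the following in the port's patch. This is a sketch, not the actual contents of unvendor.diff; the target names (`common`, `mtmd`) mirror upstream llama.cpp, and the `find_package`/`find_path` calls are the usage patterns the corresponding vcpkg ports document for these header-only libraries:

```cmake
# Before (vendored): upstream builds bundled copies, e.g.
#   add_subdirectory(vendor/cpp-httplib)

# After (unvendored): resolve the same libraries from vcpkg.
find_package(httplib CONFIG REQUIRED)            # cpp-httplib port
find_package(nlohmann_json CONFIG REQUIRED)      # nlohmann-json port
find_package(Stb REQUIRED)                       # stb port (no CONFIG mode)
find_path(MINIAUDIO_INCLUDE_DIRS "miniaudio.h")  # miniaudio port

target_link_libraries(common PRIVATE
    httplib::httplib
    nlohmann_json::nlohmann_json)
target_include_directories(mtmd PRIVATE
    ${Stb_INCLUDE_DIR}
    ${MINIAUDIO_INCLUDE_DIRS})
```

Since all four replacements are header-only, the switch changes include paths and link targets but not the built binaries.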

Comment on lines +9 to +11
+ find_package(httplib CONFIG REQUIRED)
add_subdirectory(common)
- add_subdirectory(vendor/cpp-httplib)
Contributor


     add_subdirectory(common)
-    add_subdirectory(vendor/cpp-httplib)
+    find_package(httplib CONFIG REQUIRED)
+    add_library(cpp-httplib ALIAS httplib::httplib)

This would avoid changing all the uses of cpp-httplib (and still export httplib::httplib in the CMake config).

... And we probably need a find_dependency(httplib CONFIG) in the CMake config file (unless it is only used in executables - check the export).
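
For context, if httplib did end up in the exported interface, the installed config file would have to resolve it before importing the targets that reference httplib::httplib. A hypothetical fragment for llama-config.cmake.in (as it turns out below, the linkage is private, so this is not actually needed):

```cmake
include(CMakeFindDependencyMacro)

# Resolve httplib before loading exported targets that link httplib::httplib,
# otherwise the imported-target file fails with an unknown-target error.
find_dependency(httplib CONFIG)

include("${CMAKE_CURRENT_LIST_DIR}/llama-targets.cmake")
```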

Contributor Author


cpp-httplib is only linked privately by llama-common:

target_link_libraries(llama-common PRIVATE cpp-httplib)

and llama-common is not exported as a CMake target in the installed package. The generated llama-config.cmake only creates/imports the llama target, whose interface links to ggml. So httplib is not part of the public CMake interface for consumers.

https://github.com/ggml-org/llama.cpp/blob/master/cmake/llama-config.cmake.in#L20

https://github.com/ggml-org/llama.cpp/blob/bbeb89d76c41bc250f16e4a6fefcc9b530d6e3f3/CMakeLists.txt#L250-L254
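
To illustrate why the PRIVATE link keeps httplib out of the public interface, here is a minimal sketch (not llama.cpp's actual build files; file and target names are illustrative):

```cmake
# llama-common uses httplib internally. PRIVATE means the usage
# requirement is NOT propagated to targets that link llama-common.
add_library(llama-common STATIC common.cpp)
target_link_libraries(llama-common PRIVATE httplib::httplib)

# Only the llama target is exported, and its interface carries ggml
# alone, so consumers never need find_dependency(httplib).
add_library(llama SHARED llama.cpp)
target_link_libraries(llama PRIVATE llama-common)
target_link_libraries(llama INTERFACE ggml)
```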

Comment thread ports/ggml/portfile.cmake
@BillyONeal
Member

Merged with master to pick up CUDA version change from #51210.

@BillyONeal BillyONeal marked this pull request as draft May 5, 2026 22:48
@BillyONeal
Member

Drafting due to legitimate build failures.

Comment thread ports/ggml/portfile.cmake
Comment on lines +13 to +14
fix-vulkan-spv-shadowing.diff
fix-vk-32bit.diff
Member


Note for reviewers: these patches have been turned into upstream PRs:

ggml-org/llama.cpp#22760
ggml-org/llama.cpp#22892

@vicroms vicroms merged commit 68fb1ad into microsoft:master May 11, 2026
16 checks passed
