[WIP] Handle meson based python-wheel #6454
base: master
Conversation
Note that this is not ready for testing yet... still early WIP; I simply ported my pending changes from my local branch into a new PR.
additional info:
Yup, on my radar, will add that indeed. Although I wonder if numpy 1.26.x might still work with older DSM considering the new meson I'm trying to build up.
That should be the case from
The per-dependency fully-generated meson cross-file should now fix that. I had encountered that same issue with cmake long ago, where the
Long story short, the meson integration had never received such an enhancement, as things were working just fine. But that's no longer true with python wheels: with the python virtual environment in the mix, meson gets totally confused. Thus the need for a fully functional meson cross-file defining all library and include paths properly; have a look at the current state. This is now mostly working with meson; still a few things to go through, but getting there.
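For illustration, a minimal sketch of what such a cross-file can look like (the toolchain triplet and staging/install paths below are placeholders, not the actual generated content):

```sh
# Hypothetical sketch of a per-dependency meson cross-file; the real file is
# generated by the framework with the actual toolchain and staging paths.
cat > meson_cross_file.ini <<'EOF'
[binaries]
c = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-gcc'
cpp = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-g++'
ar = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-ar'
strip = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-strip'
pkg-config = 'pkg-config'

[built-in options]
c_args = ['-I/path/to/staging/include']
c_link_args = ['-L/path/to/staging/lib', '-Wl,-rpath,/path/to/install/lib']

[host_machine]
system = 'linux'
cpu_family = 'aarch64'
cpu = 'aarch64'
endian = 'little'
EOF
```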
That's a currently known issue with numpy. We need to force setting the long bit accordingly for
EDIT: with regards to cython (which I just hit), there must be an issue with the PATH, although there may be a way to set the binaries in the meson native file, such as:
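For illustration, one way this could look in a meson native file (the crossenv paths below are assumptions, not the actual setup):

```sh
# Hypothetical: point meson at the build-side python and cython explicitly
# instead of whatever happens to be first in PATH.
cat > meson_native_file.ini <<'EOF'
[binaries]
python = '/path/to/crossenv/build/bin/python3'
cython = '/path/to/crossenv/build/bin/cython'
EOF
```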
Anyhow, as usual thanks for your feedback; work is slowly progressing on this. EDIT2: It turns out that I still need to empty the env when invoking the meson build, which is not yet the case. Although it does work for regular meson builds, it won't for python-based meson wheel builds. Next on my todo.
@hgy59 I now have a proof of concept that builds successfully for both aarch64 and x64 using the latest numpy 2.2.3. Although I'm struggling with armv7 and evansport... I'll check if I can make 1.26.4 work instead for the moment.
@th0ma7, was looking at the errors which remain:
And found the following, which may be useful if you haven't already considered it:
Hope they can assist...
I believe I now have something functional, but unmaintainable as-is.

The good

Using normal I can now successfully cross-compile for armv7, evansport and x64 for DSM-7.1.

The bad

There is a known bug in gcc<=10 with aarch64 that makes the compiler segfault. I tried pretty much every possible alternative of flags/disabling things in the code, but I wasn't able to work around it.

The ugly

As part of my crusade to make the meson-python build work, I ended up at one point reproducing the normal meson+ninja build. Surprisingly, this ended up allowing me to successfully build numpy for aarch64... What's then missing is the (re)packaging in wheel format, which happens to be the exact same process as "The good" as long as I re-use the exact same builddir (which I ended up figuring out tonight; a command-level sketch follows below). This really is ugly, but it does work. This last commit a2068f5 was not tested on the previously working x64, evansport and armv7. I'll let this rest for tonight. Good news is, we're probably much closer now... just need to tie up the remaining loose ends.
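For context, a command-level sketch of that "build with meson+ninja, then repackage the same builddir as a wheel" flow; the build-dir config setting is meson-python's, but the exact invocation used by the framework may differ:

```sh
# Step 1: plain meson + ninja build (the part that works on aarch64)
meson setup builddir --cross-file meson_cross_file.ini
ninja -C builddir

# Step 2: repackage as a wheel, pointing meson-python at the same builddir;
# --no-build-isolation keeps the already-installed meson/ninja/cython in scope.
python -m pip wheel . --no-build-isolation \
    --config-settings=build-dir=builddir \
    --wheel-dir dist/
```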
@th0ma7 the wheels created from python/*/Makefile are not yet added to The |
Thanks for catching this, will include it. I'm also looking at how to install numpy in the crossenv... I have an idea on how I could reuse the newly cross-compiled numpy wheel so it gets installed into the cross portion of the crossenv, where it can then be made available to other wheels that depend on it (a sketch of the idea follows below). Lastly, I'm also looking at adding flexibility to have a different vendor-managed meson (other than the numpy use case, where the source package provides its own modified meson.py) and skipping that meson+ninja part when no vendor-managed meson is provided (i.e. the default use case). All in all, it's taking shape but will require a few more spare cycles before reaching the finish line...
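A rough sketch of that crossenv idea, assuming crossenv's split cross/build pip wrappers (the wheel path is hypothetical):

```sh
# Install the freshly cross-compiled wheel into the *cross* side of the
# crossenv so that dependent wheels (scipy, ...) can resolve it at build time.
cross-pip install /path/to/wheelhouse/numpy-*.whl
# Build-side tooling stays on the build python:
build-pip install meson-python cython
```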
@th0ma7 another small issue popped up:
The original wheels in the index (pypi) are cross compiled (like |
@th0ma7 I have successfully built python311-wheels with added python/numpy and python/numpy_1.26 for aarch64-7.1 and armv7-7.1. It would be interesting to validate whether such wheels created with gcc 8.5 will run under DSM 6. I guess if the *.so files within the wheels do not reference GLIBC > 2.20 functions, it might work (see the check sketched below). My background: I am trying to build a final homeassistant package with support for DSM 6. This will be homeassistant 2024.3.3, which depends on numpy 1.26.0. This version is available in the index for x86_64 and aarch64 only, and I will have to build it at least for armv7 and evansport (i686). To support armv7 in homeassistant 2025.1.4, it will be
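As a quick, generic way to check which GLIBC symbol versions the shared objects actually reference (path is an assumption):

```sh
# List the versioned GLIBC symbols referenced by the wheel's extension modules;
# anything above GLIBC_2.20 would rule out older DSM 6 systems.
objdump -T /path/to/site-packages/numpy/core/*.so | grep -o 'GLIBC_[0-9.]*' | sort -uV
```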
Maybe this is similar to msgpack where it can fit in both?
That's a long shot! Not sure how I can help you though. I could reinstall my armv7 using a 6.2.4 image to try it out if that helps?
I got some pretty cool code locally that allows installing the cross-compiled numpy wheel into the crossenv to allow building scipy and others... But I faced one major, major, major problem: the gcc version. For @hgy59
All in all, this would require bumping our minimal version to DSM-7.2. EDIT: I'll sleep on it... and will probably upload my new code online to safeguard it just in case, even though it will fail to build.
Good news: I was able to create a workaround patch for aarch64... a few loose ends, but looking much better now.
@hgy59 and @mreid-tt It may look like this is stagnating, but after spending numerous hours on this I finally made a major leap forward which now allows using the default
This has definitely been taking way longer than anticipated, but I believe things will now start to shape up nicely 🤞
Disconnecting for tonight... but I have a strong feeling I'm inches away from a solution... And I think the remaining issue may be how I translated the LDFLAGS for meson (auto-generated under
@hgy59 feel free to pursue; a fresh brain on this would be appreciated :)
@th0ma7 I use a diyspk/nump-wheel package and include the numpy_test.py shown above. Added a service-setup.sh with
running on virtualdsm:
I can't find any binary that depends on libopenblas within the wheel, so I guess it is dynamically loaded and might have a specific search order. Running with an explicit library path:
The AVX optimization seems not to be supported in virtualdsm. The definition of LD_LIBRARY_PATH is not a problem for the HA package (it already has it). To fix this, the rpath must be fixed/adjusted in the .so files of the numpy wheel.
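For reference, a hedged way to inspect (and, if needed, rewrite) the runtime search path embedded in the wheel's shared objects (all paths below are assumptions):

```sh
# Show the RPATH/RUNPATH entries of the installed extension modules
readelf -d /path/to/site-packages/numpy/core/*.so | grep -E 'RPATH|RUNPATH'

# If missing or wrong, patchelf can rewrite it in place, e.g. pointing at the
# package's own lib directory (hypothetical path):
patchelf --set-rpath '/var/packages/numpy-wheel/target/lib' \
    /path/to/site-packages/numpy/core/_multiarray_umath.cpython-311-*.so
```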
and both have
@th0ma7 the above test works on DS-115j (armada370 - armv7) with DSM 7.1 when using LD_LIBRARY_PATH.
@hgy59 would you mind commenting out the cpu-dispatch and cpu-baseline definitions in numpy's baseline I added yesterday and retrying on your x86_64? And removing avx and re-testing? I'll have to read further on this to get a proper understanding of how to use that (away from my build system atm). Also, I'm almost certain that the rpath is not functional atm, not only for meson-python wheels but probably for all meson builds overall. And that issue is new with this PR. Once that is fixed, the LD_LIBRARY_PATH should not be needed.
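For context, cpu-baseline and cpu-dispatch are options of numpy's own meson build, so the test boils down to dropping the overrides and letting the defaults apply; explicitly they would be passed as something like this (invocation assumed, the framework may wire it differently):

```sh
# Example only: restrict the SIMD baseline and disable runtime dispatch when
# configuring numpy's meson build.
meson setup builddir --cross-file meson_cross_file.ini \
    -Dcpu-baseline=min -Dcpu-dispatch=none
```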
@th0ma7 my test even works on DS-218 (aarch64) with DSM 6.2.4 when patching
BTW the
Working on it locally... Update:
- most x64 archs are Atom-like
- adjust cpu-baseline in python/numpy* (make it apollolake compatible)
This is really nice! Good work! What would be even better is the ability to avoid that altogether... but I doubt we can. Now let's find the issue with rpath...
@th0ma7 my finding is that we have to set up meson with --prefix for a correct rpath, but I can't find where (or how) meson setup is called for our python-meson builds.
It is also documented for numpy at https://numpy.org/doc/stable/building/understanding_meson.html
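In other words, the fix likely amounts to something along these lines (the install prefix below is an assumption for a spksrc-style package):

```sh
# Without an explicit --prefix meson defaults to /usr/local, so any rpath it
# derives from the prefix ends up pointing at the wrong location on the NAS.
meson setup builddir \
    --cross-file meson_cross_file.ini \
    --prefix /var/packages/numpy-wheel/target \
    --libdir lib
```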
Aaahhh! Thanks for the pointer, I believe I know how to fix this later tonight!
I did some more tests but didn't succeed (added
The .so files in the builddir are binary-identical to those in the whl file. EDIT:
I may have something that works now. The
Still, I don't get the rpath depth 2, nor the equivalent to
As expected, the
Now, I did try going all-in with a systematic
For fun, have a look at
I recall struggling with this when adding cmake & meson long ago... Now I recall why a bit better.
BTW the build log of openblas is flooded with warnings when building numpy only in diyspk:
This comes from
It occurs while
A pragmatic solution would be to create the include folder in pre_compile, since modules with dependencies require it:

```make
.PHONY: openblas_pre_compile
openblas_pre_compile:
	install -d -m 755 $(STAGING_INSTALL_PREFIX)/include
```

A better solution would be to create this folder in the Makefile that defines the include path.
It now works without LD_LIBRARY_PATH 🎉.
Yes, finally! I'm starting to wonder if this isn't an issue specific to numpy's vendored meson... While functional, it still needs to be fixed somewhat though.
@hgy59 and @mreid-tt I haven't counted how many times I've built numpy (and it's way too many), but I'm now almost certain this is a bug in meson. I've documented my findings in mesonbuild/meson#14354 and believe this relates to a long-standing bug at mesonbuild/meson#6541. I do have code to work around that, enforcing to have
Slowly pursuing... 🐢 🐌
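For illustration only (not necessarily what the branch's workaround does), one generic way to enforce an rpath after the build is to post-process the shared objects before packaging:

```sh
# Force a known RUNPATH into every extension module of the build tree;
# the target lib path is an assumption.
find builddir -name '*.so' -exec \
    patchelf --set-rpath '/var/packages/numpy-wheel/target/lib' {} \;
```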
@th0ma7, I really appreciate all your effort in resolving this. From the long-standing bug you mentioned, it appears that many attempts have been made across various Meson versions to address this issue. I also noticed references from the university HPC community about different patches to Meson that alter RPATH handling, especially in environments with wrappers or cross compilation. Though some of this is a bit over my head, I’m curious how you’d describe our current situation. In your new ticket, you mention using
That made me wonder about our specific scenario with |
@mreid-tt My assumption is rather that the rpath is not being discarded but actually hidden under the rpath depth. When using
From what I gathered from our use-case, when setting
A bit more background is needed here: when using a cross-file with meson or cmake, you need to empty your shell environment of any other duplicate flags (i.e. mostly
I still have a few more tests to go through to confirm our final meson build environment state, but this is starting to stabilize with my latest commit e325eb1. Lastly, I still have to double-check its behavior outside of meson-python to ensure this isn't affecting our regular meson builds. Hope this helps?
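As a concrete illustration of that "empty the environment" rule (the variable list is indicative, not exhaustive):

```sh
# When the cross-file already carries the flags, the same flags must not leak
# in from the shell, otherwise meson mixes build-machine and target settings.
env -u CFLAGS -u CXXFLAGS -u CPPFLAGS -u LDFLAGS -u PKG_CONFIG_PATH \
    meson setup builddir --cross-file meson_cross_file.ini
```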
@th0ma7, thanks for sharing. To clarify my understanding about "rpath depth," Google AI describes it as:
Given this definition, what is the practical implication of specifying more directory levels than necessary (e.g., depth of 1+2) for our builds? Would this result in larger binaries, or merely a slight increase in build time? I'm trying to fully understand any potential impact of your proposed workaround. |
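To make the "depth" idea concrete (hypothetical output, not taken from this build): in general, extra rpath entries only add a few bytes to a binary's dynamic section and a couple of extra directory lookups when the library is loaded; build time and the compiled code itself are essentially unaffected.

```sh
# A $ORIGIN-relative rpath climbs a fixed number of directory levels from the
# .so's own location; "1+2" would mean entries one and two levels up, e.g.:
readelf -d _multiarray_umath.cpython-311-aarch64-linux-gnu.so | grep -i runpath
#   Library runpath: [$ORIGIN/..:$ORIGIN/../..]
```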
Hey @th0ma7, just checking — anything left to do on this PR before we merge? |
A bit of code cleanup and adding fortran to cmake for DSM 7 and above. This has been taking way more cycles than expected, and cycles are limited.
I'll do my best to get this merged over the coming days, hopefully within a week. Note that I did try to get pandas and scikit-learn to build but I wasn't able to. Other changes may have to follow for those in particular.
Description
Handle meson based python-wheel
Fixes:
Checklist
all-supported completed successfully
Type of change