First, I want to thank you for developing and maintaining the Intel Extension for PyTorch.
Unfortunately, I ran into a problem during compilation...
running build_ext
running build_clib
WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!
-- The C compiler identification is IntelLLVM 2025.0.4
-- The CXX compiler identification is IntelLLVM 2025.0.4
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /opt/intel/oneapi/compiler/2025.0/bin/icx - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /opt/intel/oneapi/compiler/2025.0/bin/icpx - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
CMake Warning (dev) at /home/anton/miniconda3/envs/pytorch-intel/share/cmake-3.31/Modules/FindPackageHandleStandardArgs.cmake:441 (message):
The package name passed to `find_package_handle_standard_args` (SYCL) does
not match the name of the calling package (SYCLToolkit). This can lead to
problems in calling code that expects `find_package` result variables
(e.g., `_FOUND`) to follow a certain pattern.
Call Stack (most recent call first):
/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/FindSYCLToolkit.cmake:125 (find_package_handle_standard_args)
/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/public/xpu.cmake:12 (find_package)
/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/Caffe2Config.cmake:101 (include)
/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
CMakeLists.txt:35 (find_package)
This warning is for project developers. Use -Wno-dev to suppress it.
-- Found SYCL: /opt/intel/oneapi/compiler/2025.0/include;/opt/intel/oneapi/compiler/2025.0/include/sycl (found version "20250004")
CMake Warning at /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:22 (message):
static library kineto_LIBRARY-NOTFOUND not found.
Call Stack (most recent call first):
/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:121 (append_torchlib_if_found)
CMakeLists.txt:35 (find_package)
-- Found Torch: /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/lib/libtorch.so (Required is at least version "2.6")
-- Found IntelSYCL: /opt/intel/oneapi/compiler/2025.0/include;/opt/intel/oneapi/compiler/2025.0/include/sycl (found version "202001")
25.05.32567
-- IntelSYCL found. Compiling with SYCL support
-- XeTLA: Found arch from list: XE_HPC
-- XeTLA: Found arch from list: XE_HPG
-- XeTLA: Found arch from list: XE_LPG
CMake Error at csrc/gpu/aten/operators/xetla/kernels/CMakeLists.txt:49 (message):
XeTLA: unknown AOT target: tgllp
-- Configuring incomplete, errors occurred!
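The error comes from the XeTLA kernel build: the three warnings just above show the architectures XeTLA recognizes (XE_HPC, XE_HPG, XE_LPG), and tgllp maps to none of them, so the configure step aborts. A minimal sketch of that validation, with a hypothetical target-to-arch table (the authoritative list lives in the CMakeLists.txt cited in the error message):

```python
# Illustrative only: the entries below are assumed spellings, not the
# real list from csrc/gpu/aten/operators/xetla/kernels/CMakeLists.txt.
XETLA_ARCH_FOR_TARGET = {
    "pvc": "XE_HPC",   # Data Center GPU Max (Ponte Vecchio)
    "dg2": "XE_HPG",   # Arc A-series (hypothetical entry)
    "mtl": "XE_LPG",   # Meteor Lake iGPU (hypothetical entry)
}

def xetla_arch(aot_target: str) -> str:
    """Map an AOT device target to a XeTLA arch, or fail like the build does."""
    try:
        return XETLA_ARCH_FOR_TARGET[aot_target]
    except KeyError:
        raise ValueError(f"XeTLA: unknown AOT target: {aot_target}") from None

for target in ("pvc", "tgllp"):
    try:
        print(target, "->", xetla_arch(target))
    except ValueError as err:
        print(err)
```

Any target outside the table produces exactly the "unknown AOT target" message seen above, regardless of whether the compiler's own AOT table (the one on the Intel website) accepts it.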
Traceback (most recent call last):
File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 1237, in <module>
setup(
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/__init__.py", line 108, in setup
return distutils.core.setup(**attrs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 184, in setup
return run_commands(dist)
^^^^^^^^^^^^^^^^^^
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 200, in run_commands
dist.run_commands()
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 970, in run_commands
self.run_command(cmd)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
super().run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
cmd_obj.run()
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/command/bdist_wheel.py", line 373, in run
self.run_command("build")
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
self.distribution.run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
super().run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
cmd_obj.run()
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/command/build.py", line 135, in run
self.run_command(cmd_name)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
self.distribution.run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
super().run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
cmd_obj.run()
File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 1203, in run
self.run_command("build_clib")
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
self.distribution.run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
super().run_command(command)
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
cmd_obj.run()
File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 803, in run
_gen_build_cfg_from_cmake(
File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 620, in _gen_build_cfg_from_cmake
check_call(
File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/subprocess.py", line 413, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch', '-DBUILD_MODULE_TYPE=GPU', '-DBUILD_STATIC_ONEMKL=OFF', '-DBUILD_WITH_CPU=OFF', '-DBUILD_WITH_XPU=ON', '-DCMAKE_BUILD_TYPE=Release', '-DCMAKE_CXX_COMPILER=icpx', '-DCMAKE_C_COMPILER=icx', '-DCMAKE_INSTALL_INCLUDEDIR=include', '-DCMAKE_INSTALL_LIBDIR=lib', '-DCMAKE_INSTALL_PREFIX=/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/build/Release/packages/intel_extension_for_pytorch', '-DCMAKE_PREFIX_PATH=/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake;/opt/intel/oneapi/pti/0.10/lib/cmake/pti;/opt/intel/oneapi/mkl/2025.0/lib/cmake;/opt/intel/oneapi/compiler/2025.0;/opt/intel/oneapi/tbb/2022.0/env/..;/opt/intel/oneapi/pti/0.10/lib/cmake/pti;/opt/intel/oneapi/mkl/2025.0/lib/cmake;/opt/intel/oneapi/compiler/2025.0', '-DCMAKE_PROJECT_VERSION=2.6.10', '-DIPEX_PROJ_NAME=intel_extension_for_pytorch', '-DLIBIPEX_GITREV=607135871', '-DLIBIPEX_VERSION=2.6.10+git6071358', '-DMKL_SYCL_THREADING=intel_thread', '-DPYTHON_BUILD_VERSION=3.12.0 | packaged by conda-forge | (main, Oct 3 2023, 08:43:22) [GCC 12.3.0]', '-DPYTHON_EXECUTABLE=/home/anton/miniconda3/envs/pytorch-intel/bin/python', '-DPYTHON_INCLUDE_DIR=/home/anton/miniconda3/envs/pytorch-intel/include/python3.12', '-DPYTHON_PLATFORM_INFO=Linux-6.12.10-76061203-generic-x86_64-with-glibc2.35', '-DPYTHON_VERSION=3.12.0', '-DUSE_AOT_DEVLIST=tgllp']' returned non-zero exit status 1.
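For what it's worth, the long Python traceback is just setup.py surfacing the CMake exit code: it runs cmake through subprocess.check_call, which raises CalledProcessError for any non-zero return status. A self-contained illustration (the failing child command here is a stand-in, not the real cmake invocation):

```python
import subprocess
import sys

# check_call raises CalledProcessError whenever the child exits non-zero,
# which is what the _gen_build_cfg_from_cmake step hits when the CMake
# configure stage fails. The child process below is a stand-in for cmake.
try:
    subprocess.check_call([sys.executable, "-c", "raise SystemExit(1)"])
except subprocess.CalledProcessError as err:
    print(f"Command {err.cmd} returned non-zero exit status {err.returncode}.")
```

So the traceback adds no information beyond the CMake error itself; the root cause is the "unknown AOT target" failure during configuration.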
I also want to point out that if I run the command as bash compile_bundle.sh /opt/intel/oneapi pytorch, the process runs to completion and the package gets installed, but as far as I understand this is not the best installation option. Or am I wrong?
I would be very glad if you could help solve my problem :)
Describe the issue
I followed the instructions and installed all the dependencies (https://pytorch-extension.intel.com/installation?platform=gpu&version=v2.6.10%2Bxpu&os=linux%2Fwsl2&package=source), then I ran the build script. As you can see, for the AOT parameter I chose tgllp. I made this choice according to the table shown on the website: https://www.intel.com/content/www/us/en/docs/dpcpp-cpp-compiler/developer-guide-reference/2025-0/ahead-of-time-compilation.html
Here is the information about my devices, installed packages, drivers, etc. Maybe I missed something?