
unknown AOT target #809

Open
ubercomrade opened this issue Apr 16, 2025 · 2 comments
Assignees: jingxu10
Labels: NotAnIssue, XPU/GPU (XPU/GPU specific issues)

Comments

@ubercomrade

Describe the issue

First, I want to thank you for developing and maintaining the Intel Extension for PyTorch.

Unfortunately, I ran into a problem during compilation...

running build_ext
running build_clib
WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!
-- The C compiler identification is IntelLLVM 2025.0.4
-- The CXX compiler identification is IntelLLVM 2025.0.4
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /opt/intel/oneapi/compiler/2025.0/bin/icx - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /opt/intel/oneapi/compiler/2025.0/bin/icpx - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
CMake Warning (dev) at /home/anton/miniconda3/envs/pytorch-intel/share/cmake-3.31/Modules/FindPackageHandleStandardArgs.cmake:441 (message):
  The package name passed to `find_package_handle_standard_args` (SYCL) does
  not match the name of the calling package (SYCLToolkit).  This can lead to
  problems in calling code that expects `find_package` result variables
  (e.g., `_FOUND`) to follow a certain pattern.
Call Stack (most recent call first):
  /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/FindSYCLToolkit.cmake:125 (find_package_handle_standard_args)
  /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/public/xpu.cmake:12 (find_package)
  /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Caffe2/Caffe2Config.cmake:101 (include)
  /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
  CMakeLists.txt:35 (find_package)
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Found SYCL: /opt/intel/oneapi/compiler/2025.0/include;/opt/intel/oneapi/compiler/2025.0/include/sycl (found version "20250004")
CMake Warning at /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:22 (message):
  static library kineto_LIBRARY-NOTFOUND not found.
Call Stack (most recent call first):
  /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:121 (append_torchlib_if_found)
  CMakeLists.txt:35 (find_package)


-- Found Torch: /home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/lib/libtorch.so (Required is at least version "2.6")
-- Found IntelSYCL: /opt/intel/oneapi/compiler/2025.0/include;/opt/intel/oneapi/compiler/2025.0/include/sycl (found version "202001")
25.05.32567
-- IntelSYCL found. Compiling with SYCL support
-- XeTLA: Found arch from list: XE_HPC
-- XeTLA: Found arch from list: XE_HPG
-- XeTLA: Found arch from list: XE_LPG
CMake Error at csrc/gpu/aten/operators/xetla/kernels/CMakeLists.txt:49 (message):
  XeTLA: unknown AOT target: tgllp


-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
  File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 1237, in <module>
    setup(
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/__init__.py", line 108, in setup
    return distutils.core.setup(**attrs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 184, in setup
    return run_commands(dist)
           ^^^^^^^^^^^^^^^^^^
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 200, in run_commands
    dist.run_commands()
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 970, in run_commands
    self.run_command(cmd)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
    super().run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/command/bdist_wheel.py", line 373, in run
    self.run_command("build")
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
    self.distribution.run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
    super().run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
    self.distribution.run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
    super().run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 1203, in run
    self.run_command("build_clib")
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
    self.distribution.run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/dist.py", line 945, in run_command
    super().run_command(command)
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 803, in run
    _gen_build_cfg_from_cmake(
  File "/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/setup.py", line 620, in _gen_build_cfg_from_cmake
    check_call(
  File "/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/subprocess.py", line 413, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch', '-DBUILD_MODULE_TYPE=GPU', '-DBUILD_STATIC_ONEMKL=OFF', '-DBUILD_WITH_CPU=OFF', '-DBUILD_WITH_XPU=ON', '-DCMAKE_BUILD_TYPE=Release', '-DCMAKE_CXX_COMPILER=icpx', '-DCMAKE_C_COMPILER=icx', '-DCMAKE_INSTALL_INCLUDEDIR=include', '-DCMAKE_INSTALL_LIBDIR=lib', '-DCMAKE_INSTALL_PREFIX=/home/anton/Tools/pytorch-intel/intel-extension-for-pytorch/build/Release/packages/intel_extension_for_pytorch', '-DCMAKE_PREFIX_PATH=/home/anton/miniconda3/envs/pytorch-intel/lib/python3.12/site-packages/torch/share/cmake;/opt/intel/oneapi/pti/0.10/lib/cmake/pti;/opt/intel/oneapi/mkl/2025.0/lib/cmake;/opt/intel/oneapi/compiler/2025.0;/opt/intel/oneapi/tbb/2022.0/env/..;/opt/intel/oneapi/pti/0.10/lib/cmake/pti;/opt/intel/oneapi/mkl/2025.0/lib/cmake;/opt/intel/oneapi/compiler/2025.0', '-DCMAKE_PROJECT_VERSION=2.6.10', '-DIPEX_PROJ_NAME=intel_extension_for_pytorch', '-DLIBIPEX_GITREV=607135871', '-DLIBIPEX_VERSION=2.6.10+git6071358', '-DMKL_SYCL_THREADING=intel_thread', '-DPYTHON_BUILD_VERSION=3.12.0 | packaged by conda-forge | (main, Oct  3 2023, 08:43:22) [GCC 12.3.0]', '-DPYTHON_EXECUTABLE=/home/anton/miniconda3/envs/pytorch-intel/bin/python', '-DPYTHON_INCLUDE_DIR=/home/anton/miniconda3/envs/pytorch-intel/include/python3.12', '-DPYTHON_PLATFORM_INFO=Linux-6.12.10-76061203-generic-x86_64-with-glibc2.35', '-DPYTHON_VERSION=3.12.0', '-DUSE_AOT_DEVLIST=tgllp']' returned non-zero exit status 1.

I followed the installation instructions and installed all the dependencies (https://pytorch-extension.intel.com/installation?platform=gpu&version=v2.6.10%2Bxpu&os=linux%2Fwsl2&package=source), then I ran the build script:

bash compile_bundle.sh /opt/intel/oneapi tgllp

As you can see, I chose tgllp for the AOT parameter. I made this choice according to the table on the Intel website: https://www.intel.com/content/www/us/en/docs/dpcpp-cpp-compiler/developer-guide-reference/2025-0/ahead-of-time-compilation.html

Here is the information about my devices

clinfo -l
Platform #0: Intel(R) OpenCL
 `-- Device #0: 11th Gen Intel(R) Core(TM) i7-11370H @ 3.30GHz
Platform #1: Intel(R) OpenCL Graphics
 `-- Device #0: Intel(R) Iris(R) Xe Graphics
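
For cross-checking, here is a minimal sketch of how the same iGPU can be queried from PyTorch's native XPU runtime (assuming the torch 2.6 XPU build is importable in the same environment; device index 0 is simply the first XPU device):

import torch

# Check whether the XPU runtime sees the GPU and how it identifies it.
print(torch.xpu.is_available())            # expected True if the driver/runtime stack is set up
print(torch.xpu.get_device_name(0))        # e.g. "Intel(R) Iris(R) Xe Graphics"
print(torch.xpu.get_device_properties(0))  # basic device properties (name, memory, etc.)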

Here are the installed packages, drivers, etc. Maybe I missed something?

intel-gpu-tools/jammy-updates,now 1.26-2ubuntu0.1 amd64 [installed]
intel-igc-cm/now 1.0.225.54083-1097~22.04 amd64 [installed,local]
intel-level-zero-gpu-raytracing/now 1.0.0-92~u22.04 amd64 [installed,local]
intel-media-va-driver/jammy-updates,now 22.3.1+dfsg1-1ubuntu2 amd64 [installed,automatic]
intel-microcode/jammy-security,jammy-updates,now 3.20250211.0ubuntu0.22.04.1 amd64 [installed,automatic]
intel-ocloc/now 25.05.32567.18-1099~22.04 amd64 [installed,local]
intel-oneapi-ccl-2021.14/all,now 2021.14.0-505 amd64 [installed,automatic]
intel-oneapi-ccl-devel-2021.14/all,now 2021.14.0-505 amd64 [installed,automatic]
intel-oneapi-ccl-devel/all,now 2021.14.0-505 amd64 [installed,upgradable to: 2021.15.0-397]
intel-oneapi-common-licensing-2025.0/all,now 2025.0.1-15 all [installed,automatic]
intel-oneapi-common-licensing-2025.1/all,now 2025.1.0-359 all [installed,automatic]
intel-oneapi-common-oneapi-vars-2025.0/all,now 2025.0.1-15 all [installed,automatic]
intel-oneapi-common-oneapi-vars-2025.1/all,now 2025.1.0-359 all [installed,automatic]
intel-oneapi-common-vars/all,now 2025.1.0-359 all [installed,automatic]
intel-oneapi-compiler-cpp-eclipse-cfg-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-compiler-dpcpp-cpp-common-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-compiler-dpcpp-cpp-runtime-2025.0/all,now 2025.0.4-1519 amd64 [installed,automatic]
intel-oneapi-compiler-dpcpp-eclipse-cfg-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-compiler-shared-2025.0/all,now 2025.0.4-1519 amd64 [installed,automatic]
intel-oneapi-compiler-shared-common-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-compiler-shared-runtime-2025.0/all,now 2025.0.4-1519 amd64 [installed,automatic]
intel-oneapi-dev-utilities-2025.0/all,now 2025.0.0-599 amd64 [installed,automatic]
intel-oneapi-dev-utilities-eclipse-cfg-2025.0/all,now 2025.0.0-599 all [installed,automatic]
intel-oneapi-dpcpp-cpp-2025.0/all,now 2025.0.4-1519 amd64 [installed]
intel-oneapi-dpcpp-debugger-2025.0/all,now 2025.0.0-663 amd64 [installed,automatic]
intel-oneapi-icc-eclipse-plugin-cpp-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-mkl-classic-devel-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-classic-include-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-cluster-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-cluster-devel-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-core-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-core-devel-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-devel-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-devel/all,now 2025.0.1-14 amd64 [installed,upgradable to: 2025.1.0-801]
intel-oneapi-mkl-sycl-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-blas-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-data-fitting-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-devel-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-dft-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-include-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-lapack-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-rng-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-sparse-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-stats-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mkl-sycl-vm-2025.0/all,now 2025.0.1-14 amd64 [installed,automatic]
intel-oneapi-mpi-2021.14/all,now 2021.14.2-7 amd64 [installed,automatic]
intel-oneapi-mpi-devel-2021.14/all,now 2021.14.2-7 amd64 [installed,automatic]
intel-oneapi-openmp-2025.0/all,now 2025.0.4-1519 amd64 [installed,automatic]
intel-oneapi-openmp-common-2025.0/all,now 2025.0.4-1519 all [installed,automatic]
intel-oneapi-tbb-2022.0/all,now 2022.0.0-402 amd64 [installed,automatic]
intel-oneapi-tbb-devel-2022.0/all,now 2022.0.0-402 amd64 [installed,automatic]
intel-oneapi-tcm-1.2/all,now 1.2.0-589 amd64 [installed,automatic]
intel-oneapi-umf-0.9/all,now 0.9.1-6 amd64 [installed,automatic]
intel-opencl-icd/now 25.05.32567.18-1099~22.04 amd64 [installed,local]
intel-pti-0.10/all,now 0.10.2-6 amd64 [installed,automatic]
intel-pti-dev-0.10/all,now 0.10.2-6 amd64 [installed,automatic]
intel-pti-dev/all,now 0.10.0-284 amd64 [installed,upgradable to: 0.11.0-304]
libdrm-intel1/jammy,now 2.4.120-1pop1~1706792268~22.04~bfb54ee amd64 [installed,automatic]
libdrm-intel1/jammy,now 2.4.120-1pop1~1706792268~22.04~bfb54ee i386 [installed,automatic]
libze-intel-gpu1/now 25.05.32567.18-1099~22.04 amd64 [installed,local]

I also want to point out that if I run the command as bash compile_bundle.sh /opt/intel/oneapi pytorch, the build runs to completion and the package gets installed, but as far as I understand this is not the best installation option. Or am I wrong?
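
(For reference, a minimal sanity check of such a build, assuming torch and intel_extension_for_pytorch were installed into the active environment; the matmul below is just an arbitrary kernel chosen to exercise the XPU device:)

import torch
import intel_extension_for_pytorch as ipex  # importing registers the XPU extensions

# Run one arbitrary kernel on the iGPU to confirm the build actually works at runtime;
# without an AOT target, the kernel is JIT-compiled on first use.
x = torch.randn(1024, 1024, device="xpu")
y = (x @ x).sum()
torch.xpu.synchronize()  # wait for the kernel to finish before reading the result
print(y.item())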

I would be very glad if you could help me solve this problem.

@jingxu10
Contributor

jingxu10 commented Apr 17, 2025

Seems like XeTLA doesn't support tgllp. I'll check internally.

@jingxu10 jingxu10 self-assigned this Apr 17, 2025
@jingxu10 jingxu10 added XPU/GPU XPU/GPU specific issues NotAnIssue labels Apr 17, 2025
@jingxu10
Contributor

jingxu10 commented Apr 17, 2025

Unfortunately, Tiger Lake is not supported.
