
Python Backend complains "triton_python_backend_utils" has no attribute "InferenceRequest" #4743

Open
Michael-Jing opened this issue Aug 5, 2022 · 13 comments
Labels
bug Something isn't working

Comments

@Michael-Jing
I'm using Python business logic scripting (BLS) with a conda-packed Python 3.8 environment. Both the 22.06 and 22.07 versions show the following error: "UNAVAILABLE: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'InferenceRequest'". It works fine on the third-party Docker image flyingmachine/tritonserver-w-ort-1.11.0.

@tanmayv25
Contributor

@Tabrizian ^^^

@Tabrizian
Member

Hi @Michael-Jing, sorry about the delay. Can you please share the structure of your model repository? Are you copying triton_python_backend_utils in your model directory? If yes, that is why you are observing this error.

@Michael-Jing
Author

Michael-Jing commented Aug 23, 2022

Hi, sorry for the late reply. I don't copy triton_python_backend_utils.
Part of the repository looks like this:
[screenshot: model repository directory structure]

and I use the following config

parameters: { key: "EXECUTION_ENV_PATH", value: {string_value: "$$TRITON_MODEL_DIRECTORY/../python38.tar.gz"} }

@tanmayv25
Contributor

tanmayv25 commented Sep 7, 2022

Opening a bug with the team to investigate further. Most likely it is a user issue.
@Michael-Jing Can you share your model repository and the exact steps to reproduce the issue? We don't think tritonserver-w-ort-1.11.0 should make any difference. Can you look carefully at the differences between the two?

@tanmayv25 tanmayv25 added the bug Something isn't working label Sep 7, 2022
@Michael-Jing
Author

Michael-Jing commented Sep 12, 2022

Hi, I found that the cause of the error is that I defined a function with the following signature in my worker.py:

import triton_python_backend_utils as pbu

def process(request: pbu.InferenceRequest):

After I removed the pbu.InferenceRequest type annotation from request, it works fine.
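A minimal sketch of why deferring annotation evaluation avoids the error: with `from __future__ import annotations` (PEP 563), annotations are stored as strings and never evaluated at function-definition time, so referencing an attribute that the backend has not yet injected does not raise. The `pbu` module below is a hypothetical stand-in for `triton_python_backend_utils` before the C++ stub populates it, not the real module.

```python
from __future__ import annotations

import types

# Hypothetical stand-in: an empty module whose attributes (like
# InferenceRequest) would only be injected later by the backend stub.
pbu = types.ModuleType("pbu_stub")

# Under PEP 563 the annotation is kept as the string "pbu.InferenceRequest"
# and is never evaluated here, so no AttributeError is raised even though
# pbu.InferenceRequest does not exist yet.
def process(request: pbu.InferenceRequest):
    return request

print(process.__annotations__["request"])  # 'pbu.InferenceRequest'
```

The same effect can be had by quoting the annotation manually (`request: "pbu.InferenceRequest"`) on Python versions where the future import is unavailable.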

@dyastremsky
Contributor

Thanks for updating us, Michael! Closing ticket.

@sfc-gh-zhwang

sfc-gh-zhwang commented Jul 20, 2023

Why would removing the type annotation help? Or maybe I should ask: why does using pbu.InferenceRequest cause an issue?

@david-waterworth

david-waterworth commented Aug 21, 2023

I find this confusing as well. All the examples use the type pb_utils.InferenceRequest, but it's not defined in the triton_python_backend_utils.py script as a Python type. It appears to be injected from the C extension module c_python_backend_utils by the stub: https://github.com/triton-inference-server/python_backend/blob/main/src/pb_stub.cc

@tobiasvitt

We are also facing issues with triton_python_backend_utils annotations.
Currently, we are using Triton server 22.05-py3 and don't face any issues with the following code inside our models:

import triton_python_backend_utils as tpbu
from typing import List

class TritonPythonModel:
    ...
    def execute(self, requests: List[List[tpbu.Tensor]]) -> List[List[tpbu.Tensor]]:
        ...

However, when we update the Triton server to 23.08-py3, we get the following error:

AttributeError: module 'triton_python_backend_utils' has no attribute 'Tensor'

When we remove the annotations, the model is loaded successfully.

Why is this the case?
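One way to keep the annotations without triggering attribute lookups at import time is to confine the import to type-checking and quote the annotations. This is a sketch of a possible workaround, not an officially documented fix; at runtime the quoted strings are never evaluated, so nothing touches `triton_python_backend_utils` until the backend has set it up.

```python
from typing import List, TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by static type checkers (mypy, pyright),
    # never at runtime inside the Triton stub.
    import triton_python_backend_utils as tpbu

class TritonPythonModel:
    # Quoted annotations are stored as strings and not resolved at
    # class-definition time, so no AttributeError can occur here.
    def execute(self, requests: "List[List[tpbu.Tensor]]") -> "List[List[tpbu.Tensor]]":
        return requests
```

Type checkers still see the full types, while the running model never evaluates them.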

@Fleyderer

Facing the same issue with pb_utils.Tensor and pb_utils.InferenceRequest, using Triton server 23.07-py3.

Okay, I'll remove the annotations...

@tobiasvitt

tobiasvitt commented Oct 31, 2023

@Tabrizian @tanmayv25, any update here? The ticket was closed, but it seems many people are hitting this. How can this class of bug be fixed without removing annotations? I see no reason why dropping support for annotations would be intended. For me this is still a bug.

@Tabrizian
Member

Sorry, we still haven't been able to get to this issue. We will update you as soon as there is any news.

@david-waterworth

david-waterworth commented Feb 28, 2024

The issue appears to be the timing of when the stub is set up, i.e.

https://github.com/triton-inference-server/python_backend/blob/ba616e26c256f11c41f7249c6a55220af8becee9/src/pb_stub.cc#L442

If I use properties/methods of the Python stub at the top level of my module, i.e.

import triton_python_backend_utils as pb_utils
logger = pb_utils.Logger

this fails with

error: creating server: Invalid argument - load failed for model 'encoder': version 1 is at UNAVAILABLE state: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'Logger'

But the following works fine

import triton_python_backend_utils as pb_utils
from typing import Dict

class TritonPythonModel:
    def initialize(self, args: Dict[str, str]) -> None:
        logger = pb_utils.Logger

And I can use the c_python_backend_utils versions at the module level (including in type annotations).

So the script appears to be imported first, then the stub is set up, then the model is initialized. Ideally the order would be: set up the stub, import the script, then initialize the model.

I've upgraded to 23.12 and still seeing this.
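The deferred-access pattern described above can be sketched with a hypothetical stand-in module (the real attribute injection happens in the backend's C++ stub; `backend` here is just a plain module object used to illustrate the timing):

```python
import types

# Hypothetical stand-in: attributes such as Logger are only injected
# after the model file has been imported, mirroring the stub timing.
backend = types.ModuleType("backend_stub")

class Model:
    def initialize(self):
        # Deferred lookup: touching the attribute inside initialize()
        # instead of at module import time avoids the AttributeError.
        self.logger = getattr(backend, "Logger", None)

# Simulate the backend finishing its setup after the import:
backend.Logger = object()

m = Model()
m.initialize()
print(m.logger is backend.Logger)  # True
```

Moving every `pb_utils` lookup into `initialize()` or `execute()` is therefore a reliable workaround until the import ordering changes.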
