I am facing an issue when uploading files larger than 70 MB. If the file is smaller than 50 MB, it always uploads successfully.
But when it is larger than that, the upload fails more than 50% of the time.
I am trying the snippet below:
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient
import os
client_secret = ''
container_name = ''
account_url = 'https://sharedstorage.blob.core.windows.net/'
tenant_id = ''
client_id = ''
credential = ClientSecretCredential(tenant_id, client_id, client_secret)
blob_service_client = BlobServiceClient(account_url=account_url, credential=credential)
container_client = blob_service_client.get_container_client(container_name)
filename = "/work/out/person/Group.A_009.zip"
with open(filename, "rb") as fl:
    data = fl.read()
    container_client.upload_blob(name=os.path.basename(filename), data=data, overwrite=True)
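For context, I have also been experimenting with a chunked variant that streams the file instead of reading it fully into memory. This is only a sketch, not a confirmed fix: the `read_in_chunks` helper is my own, while `max_single_put_size` / `max_block_size` are the documented `BlobServiceClient` transfer-option keyword arguments.

```python
import io

def read_in_chunks(fobj, chunk_size=4 * 1024 * 1024):
    """Yield the file in fixed-size chunks so it is never fully in memory."""
    while True:
        chunk = fobj.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Upload variant (same credentials/clients as above); an assumption, not verified:
# blob_service_client = BlobServiceClient(
#     account_url=account_url, credential=credential,
#     max_single_put_size=4 * 1024 * 1024,  # force block upload above this size
#     max_block_size=4 * 1024 * 1024,       # size of each staged block
# )
# with open(filename, "rb") as fl:
#     container_client.upload_blob(
#         name=os.path.basename(filename), data=fl,
#         overwrite=True, max_concurrency=1,
#     )

# Quick sanity check of the chunker on an in-memory buffer
buf = io.BytesIO(b"x" * 10)
chunks = list(read_in_chunks(buf, chunk_size=4))
print([len(c) for c in chunks])  # -> [4, 4, 2]
```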
Logs
# When the file size is 106MB
>>> filename = "/work/out/person/Group.A_009.zip"
>>>
>>> with open(filename, "rb") as fl:
... data = fl.read()
... container_client.upload_blob(name=os.path.basename(filename), data=data, overwrite=True)
...
Traceback (most recent call last):
File "<console>", line 3, in <module>
File "/usr/local/lib/python3.8/site-packages/azure/core/tracing/decorator.py", line 94, in wrapper_use_tracer
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_container_client.py", line 1125, in upload_blob
blob.upload_blob(
File "/usr/local/lib/python3.8/site-packages/azure/core/tracing/decorator.py", line 94, in wrapper_use_tracer
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_blob_client.py", line 775, in upload_blob
return upload_block_blob(**options)
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_upload_helpers.py", line 178, in upload_block_blob
return client.commit_block_list(
File "/usr/local/lib/python3.8/site-packages/azure/core/tracing/decorator.py", line 94, in wrapper_use_tracer
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_generated/operations/_block_blob_operations.py", line 1555, in commit_block_list
_request = build_commit_block_list_request(
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_generated/operations/_block_blob_operations.py", line 599, in build_commit_block_list_request
return HttpRequest(method="PUT", url=_url, params=_params, headers=_headers, content=content, **kwargs)
File "/usr/local/lib/python3.8/site-packages/azure/core/rest/_rest_py3.py", line 114, in __init__
default_headers = self._set_body(
File "/usr/local/lib/python3.8/site-packages/azure/core/rest/_rest_py3.py", line 150, in _set_body
default_headers, self._data = set_content_body(content)
File "/usr/local/lib/python3.8/site-packages/azure/core/rest/_helpers.py", line 148, in set_content_body
raise TypeError(
TypeError: Unexpected type for 'content': '<class 'xml.etree.ElementTree.Element'>'. We expect 'content' to either be str, bytes, a open file-like object or an iterable/asynciterable.
# When the file size is around 50MB
>>> file_path = filename = '/work/out/person/Sing.A_3.zip'
>>> with open(filename, "rb") as fl:
... data = fl.read()
... container_client.upload_blob(name=os.path.basename(filename), data=data, overwrite=True)
...
<azure.storage.blob._blob_client.BlobClient object at 0x70135e1466d0>
Extra (if it helps)
Whenever an upload of a file larger than 50 MB fails, the error below consistently appears at the same time:
>>> blobs = container_client.list_blobs()
>>> for blob in blobs:
... print(blob.name)
...
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/usr/local/lib/python3.8/site-packages/azure/core/paging.py", line 123, in __next__
return next(self._page_iterator)
File "/usr/local/lib/python3.8/site-packages/azure/core/paging.py", line 83, in __next__
self.continuation_token, self._current_page = self._extract_data(self._response)
File "/usr/local/lib/python3.8/site-packages/azure/storage/blob/_list_blobs_helper.py", line 109, in _extract_data_cb
self.current_page = [self._build_item(item) for item in self._response.segment.blob_items]
AttributeError: 'NoneType' object has no attribute 'blob_items'
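While debugging the listing failure, I wrap the call in a small retry loop to check whether the `AttributeError` is transient. The `retry` helper below is entirely my own debug utility, not SDK behaviour, and is only a sketch:

```python
import time

def retry(fn, attempts=3, delay=0.0, exceptions=(Exception,)):
    """Call fn(), retrying on the given exceptions; re-raise after the last attempt."""
    last = None
    for _ in range(attempts):
        try:
            return fn()
        except exceptions as exc:
            last = exc
            time.sleep(delay)
    raise last

# Usage against the container (an assumption, not verified to work around the issue):
# names = retry(lambda: [b.name for b in container_client.list_blobs()],
#               exceptions=(AttributeError,))

# Sanity check with a function that fails twice, then succeeds
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise AttributeError("transient")
    return "ok"

print(retry(flaky, attempts=5, exceptions=(AttributeError,)))  # -> ok
```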
I face the same issue when trying the adlfs library; I am adding a snippet of that too.
Note: the azcopy command always works from the terminal, even with the large file sizes.
adlfs==2024.4.1
fsspec==2024.3.1
azure-storage-blob==12.20.0
Any help will be highly appreciated.