Commit 15d647d

add new docstring example to 'jobs.items' class for list_iter()
1 parent 62cab36 commit 15d647d

1 file changed: +14 -0 lines changed

scrapinghub/client/items.py

Lines changed: 14 additions & 0 deletions
@@ -37,6 +37,20 @@ class Items(_DownloadableProxyMixin, _ItemsResourceProxy):
             'size': 100000,
         }]
 
+    - retrieve items via a generator of lists. This is most useful in cases
+      where the job has a huge amount of items and it needs to be broken down
+      into chunks when consumed. This example shows a job with 3 items::
+
+        >>> gen = job.items.list_iter(chunksize=2)
+        >>> next(gen)
+        [{'name': 'Item #1'}, {'name': 'Item #2'}]
+        >>> next(gen)
+        [{'name': 'Item #3'}]
+        >>> next(gen)
+        Traceback (most recent call last):
+          File "<stdin>", line 1, in <module>
+        StopIteration
+
     - retrieve 1 item with multiple filters::
 
         >>> filters = [("size", ">", [30000]), ("size", "<", [40000])]
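Outside the docstring, list_iter() is typically driven by a plain for loop, one chunk per iteration. A minimal usage sketch, assuming a valid API key and an existing job key (the key values, the chunk size, and the printed field are placeholders; ScrapinghubClient and get_job are the standard python-scrapinghub entry points):

    from scrapinghub import ScrapinghubClient

    # Hypothetical API key and job key, shown only for illustration.
    client = ScrapinghubClient('APIKEY')
    job = client.get_job('123/1/2')

    # list_iter() yields lists of items rather than individual items, so a
    # job with a huge number of items can be consumed chunk by chunk
    # without holding everything in memory at once.
    for chunk in job.items.list_iter(chunksize=1000):
        for item in chunk:
            print(item.get('name'))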
