'Split and Stitch' Encoding #9
A lot of water under the bridge since this issue was opened! Adding split and stitch encoding (a.k.a. chunking) is going to be a huge refactor of the way job objects are handled. The job objects should be refactored anyway, so adding chunking is a good reason to do it. My list above is a little outdated now:

"Chunking" the jobs
This doesn't need to be a task. We can chunk quickly and easily in the queuer, and we can even chunk only jobs that are over a certain duration. Since Celery's object primitives allow for some pretty flexible nesting, we can check the duration of the source media and, if it's over a certain duration, split it into a group of tasks with a callback (i.e. a "chord"). Then we can wrap unchunked plain-Jane job tasks and chunked job chords alike in a group.
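A minimal sketch of what that queuer-side logic could look like. The thresholds, task names (`encode_whole`, `encode_segment`, `concat_segments`), and job shape are all assumptions for illustration, not this project's actual API; only the segment-planning helper is concrete here, with the Celery wiring shown as commented pseudocode.

```python
# Hypothetical chunking sketch: only jobs over a threshold get split.
CHUNK_THRESHOLD_SECS = 300  # assumed cutoff for chunking
SEGMENT_SECS = 60           # assumed target segment length


def plan_segments(duration_secs, segment_secs=SEGMENT_SECS):
    """Return (start, length) windows that cover the whole duration."""
    segments = []
    start = 0.0
    while start < duration_secs:
        segments.append((start, min(segment_secs, duration_secs - start)))
        start += segment_secs
    return segments


# With Celery, each job then becomes either a plain task or a chord,
# and everything is wrapped in a single group (names are assumptions):
#
#   from celery import group, chord
#
#   def build_signature(job):
#       if job["duration_secs"] <= CHUNK_THRESHOLD_SECS:
#           return encode_whole.s(job)                    # plain-Jane task
#       header = [encode_segment.s(job, start, length)
#                 for start, length in plan_segments(job["duration_secs"])]
#       return chord(header, concat_segments.s(job))      # callback joins
#
#   group(build_signature(j) for j in jobs).apply_async()
```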
Our chunked job callbacks can be Celery tasks themselves, doing the necessary ffmpeg join and cleanup. We can add a custom Celery task state for concatenating and removing temporary files, so this shows up in queuer-side progress too.

Pickling for postencode
PyRemoteObjs are in-memory references. There is no way for media pool items to survive the round-trip between queuer and worker, except for using an ID to reference the in-memory objects on the queuer. Maybe now with Resolve 18 it would be possible to use the new API. Either way, we're keeping it simple: link if the same project is open, and leave it to the user to reiterate the timelines manually when the project is next open.
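The ID-referencing idea could be sketched like this. Everything here is hypothetical (class and method names are invented): the queuer registers each live `MediaPoolItem` under an opaque ID, only the ID travels with the job, and the worker hands the ID back for the queuer to resolve, which only works while the same project is still open in the queuer process.

```python
import uuid


class MediaPoolRegistry:
    """Queuer-side map from opaque IDs to live PyRemoteObj references.

    The IDs are plain strings, so they survive serialization into the
    job payload; the PyRemoteObjs themselves never leave this process.
    """

    def __init__(self):
        self._items = {}

    def register(self, media_pool_item):
        item_id = str(uuid.uuid4())
        self._items[item_id] = media_pool_item
        return item_id  # safe to send to a worker

    def resolve(self, item_id):
        # Only valid while the same project is open in this process;
        # a stale ID after a project switch raises KeyError.
        return self._items[item_id]
```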
/cib
Branch feature/issue-9--Split-and-Stitch-Encoding created!
I've been messing around with running separate FFmpeg processes on segments of the same video file.
There are a bunch of benefits here:
I've got this working reliably locally, with no performance gains obviously, since I'm running all the FFmpeg processes on the same machine.
To get this working here we'll need a few things:
- A new task to parse the original job into segments
- Job must be pickled, not sent as JSON. We need to transport Resolve's PyRemoteObj MediaPoolItem as task results
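The segment-parsing step above could work by seeking on the input, so each window can be encoded by an independent FFmpeg process, potentially on different workers. This is a sketch with assumed codec flags and paths, not the project's actual command builder:

```python
def segment_cmd(src, start, length, out):
    """Build an ffmpeg command that encodes one window of the source.

    Placing -ss before -i seeks on the input, which is fast and is
    frame-accurate here because the segment is re-encoded rather than
    stream-copied.
    """
    return [
        "ffmpeg", "-y",
        "-ss", str(start), "-t", str(length),
        "-i", src,
        "-c:v", "libx264", "-c:a", "aac",  # assumed output codecs
        out,
    ]


# Each command runs in its own worker task, e.g.:
#   subprocess.run(segment_cmd("in.mov", 0, 60, "seg_000.mp4"), check=True)
```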