
Move cron scheduling inside application #347

Merged: 5 commits into main, Feb 12, 2024
Conversation

@m90 (Member) commented Feb 6, 2024

Closes #268

* Move cron scheduling inside application

* Make envvar a fallback and check for errors

* Panic significantly less

* propagate error out of runBackup

* Add structured logging

* Fix error propagation to exit

* Enable the new scheduler by default

* Review fixes

* Added docs and better error propagation
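
As a rough illustration of what "moving cron scheduling inside the application" can look like in Go, a dedicated cron library such as robfig/cron/v3 lets the long-running process own its own schedule. This is only a sketch; the package choice, the runBackup helper, and the BACKUP_CRON_EXPRESSION variable are assumptions for illustration, not taken from this PR's diff:

```go
package main

import (
	"log"
	"os"

	"github.com/robfig/cron/v3"
)

// runBackup is a placeholder for the actual backup routine.
func runBackup() error {
	// ... perform the backup work here ...
	return nil
}

func main() {
	// Treat the env var as a fallback/default and validate it up front.
	schedule := os.Getenv("BACKUP_CRON_EXPRESSION")
	if schedule == "" {
		schedule = "@daily"
	}

	c := cron.New()
	if _, err := c.AddFunc(schedule, func() {
		if err := runBackup(); err != nil {
			// Propagate and log errors instead of panicking.
			log.Printf("backup failed: %v", err)
		}
	}); err != nil {
		log.Fatalf("invalid cron expression %q: %v", schedule, err)
	}

	c.Start()
	select {} // keep the long-running process alive
}
```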
@m90 (Member, Author) commented Feb 9, 2024

@pixxon Just to give an update on the status of this: I did some minor tweaking with regard to error handling and logging, and I think this is now ready to go from a code perspective.

Since this now introduces a long-running process, I'd still like to confirm it doesn't leak memory somehow, so I'll let the current state of this branch (v2.37.0-alpha.3) run on all of my setups and check memory usage over the course of a week. If nothing looks suspicious, I'd say let's ship it.

@pixxon (Contributor) commented Feb 9, 2024

Sounds great!

I am not familiar with Go, but maybe there are some tools to help with finding memory leaks. (In the C++ world, there are some great static analyzers that help, or even better, memory sanitizers that check for addresses that were not freed properly.)

@m90 (Member, Author) commented Feb 9, 2024

The only way to really leak memory in Go is to have goroutines running that never return, meaning nothing in their scope can ever be garbage collected. I have definitely managed to do this before, even with static analysis looking for such problems. I don't think it's likely here, but I'd feel safer having seen it run for a bit.

One simple thing we could do is print the number of goroutines from time to time to see whether it stays stable, but I'll defer that until it looks like there really is a leak.
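
For reference, that kind of spot check is only a few lines in Go. A minimal sketch, not code from this PR:

```go
package main

import (
	"log"
	"runtime"
	"time"
)

func main() {
	// Periodically log the goroutine count: a number that keeps growing
	// across scheduled runs hints at goroutines that never return.
	go func() {
		for range time.Tick(time.Minute) {
			log.Printf("goroutines: %d", runtime.NumGoroutine())
		}
	}()

	// ... start the scheduler and block here in the real application ...
	select {}
}
```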

@m90 (Member, Author) commented Feb 9, 2024

I got nerd sniped and fired up pprof, which indeed let me find a leak: the Docker client leaks a persistent HTTP connection when you don't call Close on the client object.

After doing this, the number of goroutines seems stable when I run a per-minute schedule locally for ~15 minutes or so.

See 68ba6ce for the fix.

I'll push this as v2.37.0-alpha.4 and deploy it.
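
In essence, the fix amounts to closing the client once it is no longer needed. A simplified sketch of that idea, not the literal diff from 68ba6ce:

```go
package main

import (
	"log"

	"github.com/docker/docker/client"
)

func main() {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		log.Fatalf("creating docker client: %v", err)
	}
	// Without this, each run that creates a client keeps a persistent HTTP
	// connection (and its goroutine) alive, which pprof shows as a slowly
	// growing goroutine count.
	defer cli.Close()

	// ... use cli ...
}
```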

@m90 (Member, Author) commented Feb 9, 2024

Seems the advice on closing it is right there in the docs: https://pkg.go.dev/github.com/docker/docker/client#section-readme …

m90 added 4 commits February 11, 2024 21:39
* Hoist control for exiting script a level up

* Do not accidentally nil out errors

* Log when running schedule

* Remove duplicate log line

* Warn on cron schedule that will never run
m90 marked this pull request as ready for review on February 12, 2024 11:53
@m90 (Member, Author) commented Feb 12, 2024

I added some additional profiling options and had this running for a few days. So far, it doesn't look like anything is leaking to me, so I'll go ahead and merge and release this now.
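
For context, the standard way to expose profiling data in a long-running Go process is the net/http/pprof package. Whether this PR gates it behind an option like the env var below is an assumption; this is only a generic sketch:

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
	"os"
)

func main() {
	// Hypothetical flag name: only serve profiling data when explicitly enabled.
	if os.Getenv("ENABLE_PPROF") == "true" {
		go func() {
			log.Println(http.ListenAndServe("localhost:6060", nil))
		}()
	}

	// ... rest of the application ...
	select {}
}
```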

m90 merged commit 4c74313 into main on Feb 12, 2024 (2 checks passed)
m90 deleted the next branch on February 12, 2024 15:04
Merging this pull request may close: Consider using a dedicated cron package