
Cache persistence #33

Open
ldasilva-net opened this issue Feb 20, 2015 · 19 comments

@ldasilva-net

I use memoize in a Node web service application.

  1. When I shut down the service, can I recover the cache that was created before the shutdown?
  2. The same scenario applies when I move my service to another server.
    Can I save the cache to disk and then, when the service starts, assign it back to the memoized object?
    Thanks
@medikoo
Owner

medikoo commented Feb 20, 2015

@ldasilva-net The cache is not exposed in any way, and there's no way to access it as a whole. I could probably introduce a function that returns a copy of the cache on request, so that copy could later be injected back via some external means.
Still, that can only work for normalizers which produce deterministic ids for the arguments; internally that is guaranteed only in primitive mode. Normal mode generates cache ids by incrementing a counter, so such a cache cannot be reliably reused.
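
A minimal sketch (hypothetical, not memoizee's API) of why only deterministic argument ids make a dumped cache reusable: the key below is derived purely from the argument values, so it stays valid across restarts, whereas an incremented counter id would not.

```js
// Hypothetical sketch, not memoizee internals: deterministic ids make a dump reusable.
function memoizePrimitive(fn) {
  const cache = Object.create(null);
  const memoized = (...args) => {
    const id = args.join('\u0001');                 // key built only from argument values
    if (!(id in cache)) cache[id] = fn(...args);
    return cache[id];
  };
  memoized.dump = () => JSON.stringify(cache);                       // export for persistence
  memoized.load = (json) => Object.assign(cache, JSON.parse(json));  // re-inject a saved dump
  return memoized;
}
// A counter-based id ("normal" mode) would differ between runs,
// so a saved cache could not be matched back to the same calls.
```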

@medikoo medikoo self-assigned this Feb 20, 2015
@ldasilva-net
Author

@medikoo ok, well... great! Thanks a lot for considering this.

@ldasilva-net
Author

Just thinking... I don't know the internal workings of your lib, but I presume the object keeps the whole cache in memory. Maybe you could consider some disk-backed storage for long-lived caches, where the object uses the disk to store and read cached entries. For example, in my scenario I have information that rarely changes, so my idea is that most requests stay cached... granted, you would need a really large number of entries to fill up memory; I'm just thinking out loud.
Thanks

@medikoo
Owner

medikoo commented Feb 20, 2015

@ldasilva-net It will never become an internal part of memoizee, as memoizee is about doing one thing, and persistence is a rare, custom case that can easily be handled externally.

I'm also not really convinced at this point about introducing a cache import/export option; as I pointed out, it can work reliably only in some configurations, so the functionality would be partially broken from the start, and that opens the door to other issues.

Can you explain your use case in more detail? What exactly do you memoize that you'd like to store persistently? Do you really need it, or is it just a nice-to-have feature?

@ldasilva-net
Author

@medikoo Let me explain the scenario: I have an API built in Node with Express. Its methods hit the database to run queries and aggregations that are in some cases heavy (in processing time), then respond with a JSON of the data (so I'm currently using primitive mode). All of these methods are cached, and it works very, very well.
Many of these requests are repeated a lot; most users of the API request the same things, so it's very important for me not to reprocess queries I've already processed. And because the database data changes only 2 or 3 times a year, saving the cache is very useful. Imagine when I need to modify or stop the web service, or move it to another server... I lose the whole cache. If the cache lived in a flat file, this would be solved.
Thanks!
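
A rough sketch of the setup described above (the report query, route, and port are stand-ins; only the memoizee options `primitive` and `promise` come from the library):

```js
// Sketch of memoizing a heavy query behind an Express route; runReport is a placeholder.
const express = require('express');
const memoize = require('memoizee');

const app = express();

// Stand-in for a heavy database query/aggregation keyed by year.
async function runReport(year) {
  return { year, rows: [] };
}

// primitive: true -> cache keys are built from the (string) arguments themselves,
// promise: true   -> async results are memoized via memoizee's promise mode.
const cachedReport = memoize(runReport, { primitive: true, promise: true });

app.get('/report/:year', async (req, res) => {
  res.json(await cachedReport(req.params.year));
});

app.listen(3000);
```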

@medikoo medikoo added this to the v1 milestone Feb 25, 2015
@jameswomack

I have the same use case as @LiberD

@medikoo
Owner

medikoo commented Jul 11, 2016

I plan a bigger overhaul of this package this year; I'll make sure this use case is addressed then.

@xaiki

xaiki commented Dec 9, 2016

Hey @medikoo!

First of all: thanks for a great package!
Any update on this?

@medikoo
Owner

medikoo commented Dec 10, 2016

@xaiki Not yet, and not very soon. Earliest in the first quarter of next year.

@epayet
Contributor

epayet commented Feb 7, 2017

Hi @medikoo

First of all, great work on this module :)

I have a use case very close to what was explained here. What I want to do is: when maxAge is reached and the original function needs to be called again, if that call fails, respond with what was in the cache before (even if it is stale).

For that, I need read access to the cache so that I can respond with something even if the original function call fails.

What do you think? Would an import/export solution solve my issue? In my case, it's just export.
Is there currently a way to get access to the cache? I'm happy to help with this.

Many thanks.
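
A hypothetical "stale-on-error" wrapper (not a memoizee feature) illustrating the fallback described above; it keeps the last successful result aside and serves it when a refresh fails:

```js
// Sketch only: fn could itself be a memoized function with maxAge.
function staleOnError(fn) {
  const last = new Map();                        // key -> last successful value
  return async (...args) => {
    const key = JSON.stringify(args);
    try {
      const value = await fn(...args);           // fresh (possibly memoized) call
      last.set(key, value);
      return value;
    } catch (err) {
      if (last.has(key)) return last.get(key);   // serve the stale value instead
      throw err;                                 // nothing cached yet -> propagate
    }
  };
}
```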

@medikoo
Owner

medikoo commented Feb 8, 2017

@epayet Your case seems very close to the one reported at #51. I plan to work on addressing that in March; I'll make sure the same handling is possible for maxAge when used with no prefetch option.

@epayet
Contributor

epayet commented Feb 8, 2017

Thanks @medikoo !
It looks like the case described in #51 is exactly what I need :)

@breathe

breathe commented Jan 14, 2018

I'd like to use the cache import/export functionality in my test-suite (jest).

For my use, I will monkey-patch the HTTP client library (axios) used in my test suite to optionally use a memoizee'd version. Then, when executing the test suite, I can choose to run service-level tests (by running the suite without loading the memoizee cache or injecting the custom axios implementation) or faster-executing, unit-test-like tests (by running the suite with the loaded memoizee cache and the memoizee'd axios).
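
A rough sketch of that monkey patch (the environment flag and wiring are assumptions; persisting the cache between runs would still need the requested import/export):

```js
// Hypothetical jest setup file: optionally swap axios.get for a memoized version.
const axios = require('axios');
const memoize = require('memoizee');

if (process.env.USE_MEMOIZED_HTTP === '1') {
  const realGet = axios.get.bind(axios);
  // Memoize by URL only, so repeated identical requests hit the in-memory cache.
  axios.get = memoize((url) => realGet(url), { primitive: true, promise: true });
}
```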

@JeffSpies

@medikoo, great library! Has there been any progress on this (or #76)? Is this still on the 1.0 roadmap?

@medikoo
Owner

medikoo commented Sep 4, 2018

Has there been any progress on this (or #76)?

Unfortunately not yet

Is this still on the 1.0 roadmap?

Yes, it is. 1.0 is specced out at #73 and will be implemented at a convenient moment. Still, at this point the delivery date is uncertain.

@JeffSpies

You definitely have the best js memoization library, and while I understand that it wasn't the original purpose, I do think persistence would be an impactful feature (e.g., every http request library has multiple caching/VCR libraries). If you had a moment to document your desired approach, there may be folks (myself included) who could help.

@jameskhamil

Guys, it's been four years lol, maybe it's time to use a different library

Here's one: https://github.com/borisdiakur/memoize-fs
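
Rough usage of memoize-fs based on its README; the exact API and import style depend on the installed version, so treat this as a sketch only. Cached results are written under `cachePath` and survive process restarts.

```js
// Sketch: disk-backed memoization via memoize-fs (verify against the current README).
const memoizeFs = require('memoize-fs');

const memoizer = memoizeFs({ cachePath: './cache' });

async function main() {
  const slow = async (n) => n * 2;              // stand-in for an expensive call
  const memoized = await memoizer.fn(slow);     // disk-backed memoized version
  console.log(await memoized(21));              // computed once, then read from ./cache
}

main();
```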

@medikoo medikoo changed the title from "Where memoize save the cache?" to "Cache persistence" on May 27, 2022
@drone1

drone1 commented May 27, 2022

Guys, it's been four years lol, maybe it's time to use a different library

Here's one: https://github.com/borisdiakur/memoize-fs

Caching to disk is not a scalable solution IMHO

@vincerubinetti

vincerubinetti commented Jun 6, 2024

Caching to disk is not a scalable solution IMHO

Even if this statement is true (and I doubt it's wholesale true), not everything needs to be scalable.

Needing a disk cache + function memoization is not at all an uncommon or unreasonable need. This library in Python has a TTL disk cache with a @cache.memoize function decorator.

Related issue on another js memoize library: sindresorhus/memoize#86

Guys, it's been four years lol, maybe it's time to use a different library
Here's one: https://github.com/borisdiakur/memoize-fs

Thanks for this, I'm going to try it out. Had a hard time finding alternatives.

Edit: Unfortunately, its maxAge (time-to-live) option doesn't work across runs, only in memory. As such, none of cacache, memoize, memoize-fs, memoizee, etc. met all my requirements, so I had to make my own simple solution. See the previous link if you're curious.
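
For reference, a minimal sketch of that kind of disk-backed memoize with a TTL that survives restarts (illustrative only, not the solution linked above):

```js
// Sketch: results are written as timestamped JSON files and reused until maxAge expires.
const fs = require('fs');
const path = require('path');
const crypto = require('crypto');

function diskMemoize(fn, { dir = '.cache', maxAge = 24 * 60 * 60 * 1000 } = {}) {
  fs.mkdirSync(dir, { recursive: true });
  return async (...args) => {
    const key = crypto.createHash('sha1').update(JSON.stringify(args)).digest('hex');
    const file = path.join(dir, key + '.json');
    if (fs.existsSync(file)) {
      const { time, value } = JSON.parse(fs.readFileSync(file, 'utf8'));
      if (Date.now() - time < maxAge) return value;   // still fresh on disk
    }
    const value = await fn(...args);
    fs.writeFileSync(file, JSON.stringify({ time: Date.now(), value }));
    return value;
  };
}
```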
