There's often a need to fit a single model to several datasets where only, e.g., the normalisation changes.

At the moment, the model is evaluated once for each dataset, but there's much room for improvement here: we need only evaluate the model once, and can then apply a different normalisation for each dataset. The caveat is that if the normalisation is used inside a convolution model, its effect is potentially non-linear; in principle, however, we have all the information needed to detect that at compile time.
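The evaluate-once idea can be sketched as follows. This is a minimal illustration in Python, not the package's actual API: `fit_objective`, `model`, and `norms` are hypothetical names, and a real implementation would live inside the fitting machinery rather than in a user-facing function.

```python
import numpy as np

def fit_objective(model, domain, datasets, norms):
    """Sum of squared residuals over datasets sharing one domain.

    The model is evaluated a single time; each dataset only scales
    the cached output by its own normalisation factor.
    """
    base = model(domain)  # evaluated once, reused for every dataset
    total = 0.0
    for data, k in zip(datasets, norms):
        residual = data - k * base
        total += np.sum(residual ** 2)
    return total

# Toy usage: a power-law model with three datasets that differ
# only in their normalisation.
domain = np.linspace(1.0, 10.0, 100)
model = lambda E: E ** -1.7
true_norms = [1.0, 2.5, 0.3]
datasets = [k * model(domain) for k in true_norms]

# With the true normalisations the residual is exactly zero.
print(fit_objective(model, domain, datasets, true_norms))  # → 0.0
```

Note this sketch only covers multiplicative normalisations applied to the model output; as flagged above, a normalisation used inside a convolution model would not factor out this way.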
For now, possible approaches are:

- Introduce a new data wrapper that detects whether the domains are all the same (or overlapping), and then use the transformer function to effectively copy the model `n` times and apply the normalisations.
- Abuse, or implement an alternative to, `AutoCache`.
- Concatenate all the datasets into a new dataset wrapper that implements the dataset API, so the model is evaluated only once (but then the normalisations can't be adjusted individually).
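The concatenation approach in the last item could be sketched like this. Again a hypothetical illustration in Python: the attribute names `domain` and `values` stand in for whatever the real dataset API exposes.

```python
import numpy as np
from types import SimpleNamespace

class ConcatenatedDataset:
    """Joins several datasets into one, so a model evaluated over
    `self.domain` is evaluated only once for all of them."""

    def __init__(self, datasets):
        self._lengths = [len(d.domain) for d in datasets]
        self.domain = np.concatenate([d.domain for d in datasets])
        self.values = np.concatenate([d.values for d in datasets])

    def split(self, output):
        """Split a model output back into per-dataset chunks."""
        edges = np.cumsum(self._lengths)[:-1]
        return np.split(output, edges)

# Toy usage with two stand-in datasets.
d1 = SimpleNamespace(domain=np.array([1.0, 2.0]),
                     values=np.array([0.5, 0.25]))
d2 = SimpleNamespace(domain=np.array([1.0, 2.0, 3.0]),
                     values=np.array([1.0, 0.5, 0.3]))
combined = ConcatenatedDataset([d1, d2])

print(len(combined.domain))                    # → 5
print([len(c) for c in combined.split(combined.values)])  # → [2, 3]
```

This makes the single evaluation trivial, but, as noted above, the per-dataset normalisations are then baked into the joined data and can no longer be varied independently during the fit.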