  1. Is it possible to use a larger bunch_size at the validation step?

  2. Why does SDAE training stop after the first layer with an error about incorrect matrix dimensions?

Is it possible to use a larger bunch_size at the validation step?

Yes, it is. A bunch_size field can be given in the table received by the train_dataset and validate_dataset methods of trainable.supervised_trainer objects:

trainer:train_dataset{
  input_dataset  = in_ds,
  output_dataset = out_ds,
  shuffle        = random_object,
  bunch_size     = 32, -- TRAINING BUNCH SIZE
}
trainer:validate_dataset{
  input_dataset  = in_ds,
  output_dataset = out_ds,
  bunch_size     = 1024, -- VALIDATION BUNCH SIZE
}
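
For reference, a minimal epoch loop combining both calls could look like the sketch below. The variable names are hypothetical, and it assumes both methods return the mean loss over the given dataset (this may depend on the APRIL-ANN version):

for epoch = 1, 100 do
  -- small bunches while training, since gradient updates are computed per bunch
  local tr_loss = trainer:train_dataset{
    input_dataset  = in_ds,
    output_dataset = out_ds,
    shuffle        = random_object,
    bunch_size     = 32,
  }
  -- larger bunches at validation: no gradients, so memory is the only limit
  local va_loss = trainer:validate_dataset{
    input_dataset  = val_in_ds,
    output_dataset = val_out_ds,
    bunch_size     = 1024,
  }
  print(string.format("epoch %4d  train %.6f  validation %.6f",
                      epoch, tr_loss, va_loss))
end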
Why does SDAE training stop after the first layer with an error about incorrect matrix dimensions?

This is a common mistake: you probably forgot to use the parameter received by the noise_pipeline functions. See this example:

INPUT_DATASET = whatever...
 ...
    noise_pipeline = { function(GIVEN_DS)
                         return dataset.salt_noise{
                                  ds=INPUT_DATASET, ....
                                }
                       end }
 ...

This example produces the error because INPUT_DATASET is used inside the function defined in the noise_pipeline table, so the variable is captured as a closure upvalue of the function. However, the SDAE procedure expects you to use the given argument (GIVEN_DS in the example), which has been prepared to contain the data after training the first auto-encoder. So, the code must look like this:

 ...
    noise_pipeline = { function(GIVEN_DS)
                         return dataset.salt_noise{
                                  ds=GIVEN_DS, ....
                                }
                       end }
 ...
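
The reason the closure breaks is that, during greedy layer-wise training, the SDAE procedure calls each noise_pipeline function with a different dataset: the data prepared after training the previous auto-encoder, whose pattern size changes from layer to layer. A closure over INPUT_DATASET keeps feeding the original-sized patterns to deeper layers, which is exactly the matrix dimension mismatch reported in the error. The following plain-Lua sketch (hypothetical names, not the APRIL-ANN API) shows the difference between capturing an upvalue and using the parameter:

local INPUT_DATASET = { patternSize = 256 }
local noise_fns = {
  -- correct: works on whatever dataset the caller passes in
  function(GIVEN_DS) return GIVEN_DS.patternSize end,
  -- wrong: captures INPUT_DATASET as an upvalue and ignores the argument
  function(GIVEN_DS) return INPUT_DATASET.patternSize end,
}
local second_layer_ds = { patternSize = 128 } -- output of the first layer
print(noise_fns[1](second_layer_ds)) -- 128, matches the second layer
print(noise_fns[2](second_layer_ds)) -- 256, dimension mismatch at the second layer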