Regularization for Deep Learning (Part 1: Data Pre-Processing)

Definition: According to Goodfellow et al. (2016), "Regularization is any modification we make to a learning algorithm that is intended to reduce its generalization error but not its training error."

Regularization techniques:

- Data pre-processing: dataset augmentation.
- Constraints: L¹/L² regularization; early stopping.
- Additional layers: dropout; max-norm regularization; batch normalization.
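Since this part focuses on data pre-processing, here is a minimal sketch of dataset augmentation using NumPy. The specific transforms (a random horizontal flip and additive Gaussian noise) and the noise scale are illustrative assumptions, not taken from the source; the key idea is that each transform is label-preserving, so the augmented image can reuse the original label.

```python
import numpy as np

def augment(image, rng):
    """Apply simple label-preserving transforms to one (H, W) image array.

    The flip probability and noise scale below are illustrative choices,
    not prescribed values.
    """
    # Randomly flip the image horizontally with probability 0.5.
    if rng.random() < 0.5:
        image = image[:, ::-1]
    # Add small Gaussian noise to simulate sensor variation.
    image = image + rng.normal(0.0, 0.01, size=image.shape)
    return image

rng = np.random.default_rng(0)
img = np.linspace(0.0, 1.0, 16).reshape(4, 4)
augmented = augment(img, rng)
```

In practice, augmentation is applied on the fly during training, so each epoch sees a slightly different version of every example, which enlarges the effective dataset without collecting new data.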
