
05 | Tricks


Regularization

Regularization constrains the size of the model's parameters.

How to choose the regularization strength: tune it on a validation set.
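The idea above can be sketched as a simple L2 penalty added to the loss; the function name and values here are an illustrative assumption, not from the notes:

```python
import numpy as np

# L2 regularization: add lam * ||w||^2 to the data loss, which
# penalizes large weights and so constrains the parameter sizes.
def l2_penalized_loss(w, data_loss, lam=0.01):
    """Total loss = data loss + lam * sum of squared weights."""
    return data_loss + lam * np.sum(w ** 2)

w = np.array([3.0, -4.0])  # ||w||^2 = 9 + 16 = 25
print(l2_penalized_loss(w, data_loss=1.0, lam=0.1))  # 1.0 + 0.1 * 25 = 3.5
```

The hyper-parameter `lam` is the regularization strength that would be tuned on the validation set.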

Dropout

Randomly drop units (along with their connections) during training:

- Each unit is retained with a fixed probability p, independently of the other units.
- The hyper-parameter p needs to be chosen (tuned); often between 20% and 50% of the units are dropped.
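The mechanism above can be sketched as inverted dropout, a common implementation choice (assumed here, not stated in the notes): surviving units are rescaled by 1/p during training so that no rescaling is needed at test time.

```python
import numpy as np

def dropout(x, p=0.8, training=True, rng=None):
    """Inverted dropout: keep each unit with probability p during training."""
    if not training:
        return x                        # dropout is disabled at test time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) < p      # each unit kept independently
    return x * mask / p                 # rescale to preserve the expected value

x = np.ones(10)
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
# surviving units become 2.0 (1 / p), dropped units become 0.0
```

With p = 0.5, about half the units are dropped each forward pass, which falls in the 20%–50% range mentioned above.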

Early-stopping
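Early stopping can be sketched as monitoring validation loss and halting once it stops improving; the `patience` criterion and the example loss curve below are illustrative assumptions, not from the notes:

```python
def train_with_early_stopping(val_losses, patience=3):
    """Stop when validation loss has not improved for `patience` epochs.

    Returns (epoch at which training stopped, best validation loss seen).
    """
    best, wait = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, wait = loss, 0        # improvement: reset the counter
        else:
            wait += 1
            if wait >= patience:
                return epoch, best      # patience exhausted: stop here
    return len(val_losses) - 1, best

# validation loss improves for three epochs, then plateaus
print(train_with_early_stopping([1.0, 0.8, 0.7, 0.71, 0.72, 0.73, 0.74]))
# → (5, 0.7)
```

In practice one would also restore the model weights from the best epoch, not just record its loss.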

Batch normalization layers

GPU