—practically also useful for other NNet/models
Hsuan-Tien Lin (NTU CSIE) Machine Learning Techniques 16/24
Deep Learning Denoising Autoencoder
Dealing with Noise
• direct possibility: data cleaning/pruning, remember? :-)
• a wild possibility: adding noise to data?
• idea: a robust autoencoder should not only let g(x) ≈ x, but also allow g(x̃) ≈ x even when x̃ is slightly different from x
• denoising autoencoder: run the basic autoencoder with data {(x̃_1, y_1 = x_1), (x̃_2, y_2 = x_2), . . . , (x̃_N, y_N = x_N)}, where x̃_n = x_n + artificial noise
—often used instead of the basic autoencoder in deep learning
• useful for data/image processing: g(x̃) is a denoised version of x̃
• effect: ‘constrain/regularize’ g towards being noise-tolerant

denoising autoencoder: artificial noise/hint as regularization!
—practically also useful for other NNet/models
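The recipe on the slide — feed the corrupted x̃_n as input but fit the clean x_n as target — can be sketched with a tiny one-hidden-layer autoencoder in plain NumPy. All sizes, names, and hyperparameters below are illustrative assumptions, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy clean data with low-dimensional structure: x_n = z_n A, z_n in R^3
N, d, d_hidden = 200, 8, 3
Z = rng.standard_normal((N, d_hidden))
A = rng.standard_normal((d_hidden, d))
X = Z @ A                                          # clean x_n
X_noisy = X + 0.1 * rng.standard_normal((N, d))    # x~_n = x_n + artificial noise

# weights of g: tanh encoder W1, linear decoder W2
W1 = 0.1 * rng.standard_normal((d, d_hidden))
W2 = 0.1 * rng.standard_normal((d_hidden, d))

def g(X_in):
    """The autoencoder g: tanh encoding, linear decoding."""
    H = np.tanh(X_in @ W1)
    return H, H @ W2

lr, losses = 0.05, []
for epoch in range(1000):
    H, X_rec = g(X_noisy)
    err = X_rec - X            # target is the CLEAN x_n, input the noisy x~_n
    losses.append(float(np.mean(err ** 2)))
    # backprop of the squared reconstruction error through both layers
    grad_W2 = H.T @ err / N
    grad_H = (err @ W2.T) * (1.0 - H ** 2)
    grad_W1 = X_noisy.T @ grad_H / N
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
```

After training, g(x̃) maps a corrupted input back towards its clean version; the only change from a basic autoencoder is that the input and the target differ by the injected noise, which is exactly the "noise as hint/regularization" view.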
Deep Learning Denoising Autoencoder
Fun Time
Which of the following cannot be viewed as a regularization technique?
1 hint the model with artificially-generated noisy data
2 stop gradient descent early
3 add a weight-elimination regularizer
4 all of the above are regularization techniques
Reference Answer: 4
1 is our new friend for regularization, while 2 and 3 are old friends.
Hsuan-Tien Lin (NTU CSIE) Machine Learning Techniques 17/24
Deep Learning Principal Component Analysis