
Deep Learning Denoising Autoencoder

Dealing with Noise

direct possibility: data cleaning/pruning, remember? :-)

a wild possibility: adding noise to data?

idea: robust autoencoder should not only let g(x) ≈ x but also allow g(x̃) ≈ x even when x̃ slightly different from x

denoising autoencoder: run basic autoencoder with data {(x̃_1, y_1 = x_1), (x̃_2, y_2 = x_2), . . . , (x̃_N, y_N = x_N)}, where x̃_n = x_n + artificial noise

—often used instead of basic autoencoder in deep learning

useful for data/image processing: g(x̃) a denoised version of x̃

effect: ‘constrain/regularize’ g towards noise-tolerant denoising

artificial noise/hint as regularization!

—practically also useful for other NNet/models


Hsuan-Tien Lin (NTU CSIE) Machine Learning Techniques 16/24
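A minimal sketch (not from the lecture) of the construction on the slide above: build noisy/clean pairs (x̃_n, y_n = x_n) with x̃_n = x_n + artificial noise, then run a basic d–d̃–d autoencoder on those pairs. The toy data, noise level sigma, layer sizes, learning rate, and the plain-NumPy gradient-descent loop are all illustrative assumptions, not the course's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "clean" data: N points in d dimensions lying on a 3-dimensional subspace
N, d, d_tilde = 200, 8, 3
Z = rng.normal(size=(N, d_tilde))
A = rng.normal(size=(d_tilde, d))
X = Z @ A                                        # the clean x_n

# denoising-autoencoder data: noisy inputs, clean targets
sigma = 0.3                                      # artificial noise level (assumed)
X_noisy = X + sigma * rng.normal(size=X.shape)   # x~_n = x_n + artificial noise
Y = X                                            # y_n = x_n

# basic d - d~ - d autoencoder: tanh encoder, linear decoder, squared-error loss
W1 = rng.normal(scale=0.1, size=(d, d_tilde)); b1 = np.zeros(d_tilde)
W2 = rng.normal(scale=0.1, size=(d_tilde, d)); b2 = np.zeros(d)
eta = 0.05                                       # learning rate (assumed)

for epoch in range(2000):
    H = np.tanh(X_noisy @ W1 + b1)   # encode the noisy input
    G = H @ W2 + b2                  # decode: g(x~)
    err = G - Y                      # want g(x~) ≈ x, the clean target

    # plain gradient descent on the average squared error
    dW2 = H.T @ err / N
    db2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)
    dW1 = X_noisy.T @ dH / N
    db1 = dH.mean(axis=0)
    W1 -= eta * dW1; b1 -= eta * db1
    W2 -= eta * dW2; b2 -= eta * db2

# g(x~) acts as a denoised version of the noisy input
denoised = np.tanh(X_noisy @ W1 + b1) @ W2 + b2
print("noisy input MSE vs clean x:   ", np.mean((X_noisy - X) ** 2))
print("denoised g(x~) MSE vs clean x:", np.mean((denoised - X) ** 2))
```

The only change from a basic autoencoder is the input side: the encoder sees x̃_n while the squared-error target stays the clean x_n, which is exactly the "artificial noise/hint as regularization" effect the slide points to, and the same noise-injection trick can be applied to inputs of other NNet/models.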


Deep Learning Denoising Autoencoder

Fun Time

Which of the following cannot be viewed as a regularization technique?

1 hint the model with artificially-generated noisy data

2 stop gradient descent early

3 add a weight elimination regularizer

4 all the above are regularization techniques

Reference Answer: 4

1 is our new friend for regularization, while 2 and 3 are old friends.

Hsuan-Tien Lin (NTU CSIE) Machine Learning Techniques 17/24


Deep Learning Principal Component Analysis

Linear Autoencoder Hypothesis

nonlinear autoencoder: sophisticated

linear autoencoder: simple

linear: more efficient? less overfitting?
