—practically also useful for other NNet/models

Hsuan-Tien Lin (NTU CSIE) Machine Learning Techniques 16/24

Deep Learning Denoising Autoencoder

### Dealing with Noise

- direct possibility: **data cleaning/pruning**, remember? :-)
- a **wild** possibility: **adding noise** to data?
- idea: a **robust** autoencoder should not only let **g(x) ≈ x**, but also allow **g(x̃) ≈ x** even when **x̃** is slightly different from **x**
- **denoising autoencoder**: run the basic autoencoder with data {(**x̃**₁, **y**₁ = **x**₁), (**x̃**₂, **y**₂ = **x**₂), . . . , (**x̃**ₙ, **y**ₙ = **x**ₙ)}, where **x̃**ₙ = **x**ₙ + **artificial noise** (often used **instead of the basic autoencoder** in deep learning)
- useful for data/image processing: **g(x̃)** is a **denoised** version of **x̃**
- effect: 'constrain/regularize' **g** towards being **noise-tolerant**

**artificial noise/hint** as **regularization!**
(practically also useful for other NNet/models)
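As a purely illustrative sketch of this recipe, the numpy snippet below corrupts toy inputs with Gaussian noise and trains a one-hidden-layer autoencoder whose inputs are the noisy **x̃**ₙ but whose targets are the clean **x**ₙ; the data sizes, noise level, and learning rate are assumptions for the toy example, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N points in d dimensions lying on a rank-3 subspace,
# so a 3-unit bottleneck can in principle represent them well.
N, d, d_hid = 200, 8, 3
X = rng.normal(size=(N, d_hid)) @ rng.normal(size=(d_hid, d))

# Denoising setup from the slide: inputs are corrupted copies
# x~_n = x_n + artificial noise, while the targets stay the CLEAN x_n.
X_noisy = X + 0.1 * rng.normal(size=X.shape)

# One-hidden-layer autoencoder g(x) = tanh(x W1) W2, trained by plain
# batch gradient descent on the squared reconstruction error.
W1 = 0.1 * rng.normal(size=(d, d_hid))
W2 = 0.1 * rng.normal(size=(d_hid, d))
lr, losses = 0.02, []
for _ in range(2000):
    H = np.tanh(X_noisy @ W1)            # encode the NOISY input
    err = H @ W2 - X                     # decode, compare with CLEAN x_n
    losses.append(float(np.mean(err ** 2)))
    dW2 = H.T @ err / N                  # backprop: output layer
    dH = (err @ W2.T) * (1.0 - H ** 2)   # backprop through tanh
    dW1 = X_noisy.T @ dH / N             # backprop: hidden layer
    W1 -= lr * dW1
    W2 -= lr * dW2

print(f"reconstruction loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The falling reconstruction error against the clean targets is exactly the g(x̃) ≈ x criterion above; swapping `X_noisy` for `X` on the input side would recover the basic autoencoder.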


### Fun Time

Which of the following cannot be viewed as a regularization technique?

1. hint the model with artificially-generated noisy data
2. stop gradient descent early
3. add a weight-elimination regularizer
4. all of the above are regularization techniques

### Reference Answer: 4

1 is our new friend for regularization, while 2 and 3 are old friends.


Deep Learning Principal Component Analysis