Transfer Learning with MATLAB

Transfer learning is a term that deep learning practitioners encounter often. For those who are not yet familiar with it, the following explanation may be useful.

As the name suggests, transfer learning means reusing what a model has already learned and applying it to a new domain or case. The benefit is that we do not have to train from scratch. The simplest approach is to freeze the layers of a pretrained model and replace its final layer, which is sized for the old number of classes, with a new layer matching the classes of the new task.
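The freeze-and-replace step described above might look like the following MATLAB sketch. The choice of network (GoogLeNet), the class count, and the layer names `'loss3-classifier'` and `'output'` are assumptions for illustration; other pretrained networks use different layer names.

```matlab
% Hypothetical sketch: adapt a pretrained network to a new set of classes.
net = googlenet;                        % pretrained model (Deep Learning Toolbox)
lgraph = layerGraph(net);
numClasses = 5;                         % assumed number of new classes

% Replace the old classification head with layers sized for the new classes.
newFC = fullyConnectedLayer(numClasses, 'Name', 'new_fc');
lgraph = replaceLayer(lgraph, 'loss3-classifier', newFC);
lgraph = replaceLayer(lgraph, 'output', classificationLayer('Name', 'new_output'));

% "Freeze" earlier layers by zeroing their learning-rate factors, so only
% the new head is updated during training. (For a DAG network, the modified
% layer array must then be reassembled with its original connections.)
layers = lgraph.Layers;
for i = 1:numel(layers) - 3
    if isprop(layers(i), 'WeightLearnRateFactor')
        layers(i).WeightLearnRateFactor = 0;
        layers(i).BiasLearnRateFactor = 0;
    end
end
```

With the head replaced and the earlier weights frozen, the network can be retrained on the new data with `trainNetwork`, and only the new layers learn.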

The following video uses MATLAB R2021a to demonstrate how transfer learning works, along with techniques for freezing layers and augmenting data with rotation and translation. The benefit of augmentation is that it increases the amount of training data by applying slight changes to the original training images.
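In MATLAB, the rotation and translation augmentation mentioned above can be sketched with `imageDataAugmenter` and `augmentedImageDatastore`. The ranges, the input size, and the `imdsTrain` datastore are assumptions for illustration.

```matlab
% Hypothetical sketch: training images from an assumed folder layout,
% where each subfolder name is used as the class label.
imdsTrain = imageDatastore('trainingData', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');

% Apply small random rotations (degrees) and translations (pixels)
% to generate slightly varied copies of the training images on the fly.
augmenter = imageDataAugmenter( ...
    'RandRotation',     [-15 15], ...   % assumed rotation range
    'RandXTranslation', [-10 10], ...   % assumed horizontal shift
    'RandYTranslation', [-10 10]);      % assumed vertical shift

inputSize = [224 224 3];                % assumed network input size
augimdsTrain = augmentedImageDatastore(inputSize, imdsTrain, ...
    'DataAugmentation', augmenter);
```

The augmented datastore can then be passed to `trainNetwork` in place of the original datastore, so each epoch sees slightly different versions of the same images.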
