The big picture of the transformer.
Used Times New Roman because I'd just spent the whole week making figures for 3 papers. Force of habit, lol. By the time I noticed, the video was already done. Oh well. Whatever. Be glad there's a video at all, lol (will anyone even read this description? lol)
Supporting videos:
Positional encoding: https://youtu.be/pNc1X6JBBNk
Word embedding: https://youtu.be/AycnaUbxbOk
Residual connection: https://youtu.be/8xxqBORMsFE
Sources:
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł. and Polosukhin, I., 2017. Attention is all you need. Advances in neural information processing systems, 30.
https://www.youtube.com/watch?v=z1xs9jdZnuY&t=1s
https://towardsdatascience.com/an-intuitive-explanation-of-self-attention-4f72709638e1
https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a
https://ai.stackexchange.com/questions/23889/what-is-the-purpose-of-decoder-mask-triangular-mask-in-transformer
Content, script, voice: Marcella Astrid
Editing, subtitles, extra humor: Diaz Jubairy (https://www.linkedin.com/in/diazjubairy/)
Links for anyone who'd like to donate:
saweria.co/marcellaastrid
buymeacoffee.com/marcellaastrid
#Transformer #DeepLearning #AnakAI