nlp12

ELMO 1. Intro Even for the same word, "read" has a present-tense and a past-tense reading -> predicting from the left context alone cannot tell them apart, so ELMo also uses the tokens that come after the word to decide, e.g., that this "read" is being used in the past tense — that is ELMo's role. 2. Overall architecture Pick out the representation corresponding to "read". The forward and backward parts are trained together. Here, the word-embedding layer, LSTM layer 1, LSTM layer 2, etc. are each handled pairwise: the corresponding forward and backward embeddings/LSTM states are concatenated. Each layer is then multiplied by a suitable weight (the lower layers are said to give more syntax-oriented vectors, the higher layers more context-oriented ones). Taking the weighted sum then yields a single vector → the ELMo value added to the embedding for "read".. 2023. 7. 6.
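The layer-wise weighted sum described above can be sketched as follows. This is a toy illustration, not the actual ELMo implementation: the per-direction size, the three layer vectors, and the softmax weights are all made-up values standing in for the biLM's learned states.

```python
import numpy as np

# Hypothetical ELMo-style combination for one token ("read"): one vector per
# biLM layer (embedding, LSTM layer 1, LSTM layer 2), each the concatenation
# of a forward half and a backward half. All names/dimensions are toy values.
rng = np.random.default_rng(0)
dim = 4  # per-direction hidden size (illustrative)

# forward/backward states for the token at each of the 3 layers -> size 2*dim
layers = [np.concatenate([rng.normal(size=dim), rng.normal(size=dim)])
          for _ in range(3)]

# task-specific scalar weights s_j (softmax-normalized) and a scale gamma,
# which the paper learns per downstream task
s = np.array([0.2, 0.5, 0.3])
s = np.exp(s) / np.exp(s).sum()
gamma = 1.0

# ELMo vector = gamma * sum_j s_j * h_j  (one vector per token)
elmo_vec = gamma * sum(w * h for w, h in zip(s, layers))
print(elmo_vec.shape)  # (8,)
```

The weights let each downstream task decide how much syntax (lower layers) versus context (upper layers) it wants in the final vector.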
XLNet: Generalized Autoregressive Pretraining for Language Understanding 💡 [Paper Review] XLNet: Generalized Autoregressive Pretraining for Language Understanding Zhilin Yang, Zihang Dai, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le https://arxiv.org/abs/1906.08237 1. Introduction Unsupervised representation learning uses large-scale corpora to pre… https://jeonsworld.github.io/NLP/xlnet.. 2023. 7. 5.
Seq2Seq 1. Intro 1) DNNs (Deep Neural Networks) work very well on problems like speech recognition and object recognition, but they have the drawback of only applying to problems whose input and output can be encoded as fixed-dimensional vectors. 2) Sequential problems Problems like speech recognition and machine translation are expressed as sequences of unknown length. A representative example: question answering must map a question to its answer sequence. Since a DNN needs the input/output dimensions to be known and fixed, existing methods struggle with such problems. ex) '나는 너를 정말 사랑해' ⇒ 'I love you so much' ex) forcing the output to match the source sentence's word count would produce the awkward sentence 'I love you very'.. 2023. 7. 5.
Bert 1. Intro Earlier models such as GPT all predicted while moving in a single direction (left→right). But because they could only read unidirectionally when predicting, they could only attend to previous tokens — a fatal limitation for tasks like next-sentence prediction or filling in a blank in a sentence. So BERT sets out to use a bidirectional model. 2. Overall architecture A language model that boosts performance on a specific task via pre-trained embeddings: it pre-trains (pre-training) on unlabeled data and then fine-tunes on labeled data. It uses only the encoder [of the Transformer].. 2023. 7. 5.
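The bidirectional pre-training mentioned above rests on BERT's masked-LM input corruption (the 15% / 80-10-10 rule from the paper). Below is a minimal sketch of just that corruption step; the vocabulary, sentence, and seed are made up for illustration, and a real implementation would operate on WordPiece token IDs.

```python
import random

# Toy vocabulary for the 10% "replace with a random token" branch (illustrative)
vocab = ["the", "cat", "sat", "on", "mat", "[MASK]"]

def mask_tokens(tokens, mask_prob=0.15, seed=42):
    """Corrupt a token list the way BERT's masked LM does:
    each token is chosen as a prediction target with prob. mask_prob; a chosen
    token is replaced by [MASK] 80% of the time, by a random token 10% of the
    time, and kept unchanged 10% of the time."""
    rng = random.Random(seed)
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            targets.append(tok)  # the model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted.append("[MASK]")           # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: random token
            else:
                corrupted.append(tok)                # 10%: keep unchanged
        else:
            targets.append(None)  # not a prediction target
            corrupted.append(tok)
    return corrupted, targets

corrupted, targets = mask_tokens(["the", "cat", "sat", "on", "the", "mat"])
print(corrupted)
```

Because the target token is hidden from the input, the model is free to attend to both the left and the right context when predicting it — which is exactly what makes the training bidirectional.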
VIT [AN IMAGE IS WORTH 16X16 WORDS: TRANSFORMERS FOR IMAGE RECOGNITION AT SCALE] ๐Ÿ’ก 0. Abstract While the Transformer architecture has become the de-facto standard for natural language processing tasks, its applications to computer vision remain limited. In vision, attention is either applied in conjunction with convolutional networks, or used to replace certain components of convolutional networks while keeping their overall structure in place. We show that this reliance on .. 2023. 7. 5.
GPT-1 1. Intro Unlabeled text data is abundant, while labeled data is scarce. This makes it hard for a model to learn a given task well. The idea that emerged: first train on unsupervised data, then re-train on data that has labels. 2. Overall architecture Consists of unsupervised pre-training + supervised fine-tuning. 3. Unsupervised pre-training Unlabeled, unsupervised data goes in as input; word embedding is applied along with positional encoding, then the decoder's masked self.. 2023. 7. 5.
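The two-stage recipe in the GPT-1 entry above can be sketched with a toy model. This is only an illustration of the idea, assuming a single linear map in place of the Transformer decoder stack: the same shared body first feeds a language-model head (pre-training), then is reused under a new task head (fine-tuning). All shapes and names are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, vocab_size, n_classes = 8, 20, 2

# Shared representation (stand-in for the Transformer decoder stack)
body = rng.normal(size=(d_model, d_model))
# Stage 1 head: next-token prediction over the vocabulary (unsupervised)
lm_head = rng.normal(size=(d_model, vocab_size))
# Stage 2 head: task-specific classifier attached for fine-tuning (supervised)
clf_head = rng.normal(size=(d_model, n_classes))

x = rng.normal(size=(1, d_model))  # one token's embedding (toy input)
h = np.tanh(x @ body)              # shared hidden state, reused by both stages

next_token_logits = h @ lm_head    # pre-training objective
class_logits = h @ clf_head        # fine-tuning objective on the same body
print(next_token_logits.shape, class_logits.shape)
```

The point of the sketch: the expensive part (the body) is trained once on abundant unlabeled text, and only a small head plus a light fine-tuning pass is needed per labeled task.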