[Review] GPT - Work in Progress
OpenAI's masterpiece, and arguably the hottest model family around. Let's review GPT (Generative Pre-Training) in version order, following the four papers below; a minimal sketch of the training objective they all share comes right after the list.
- Improving Language Understanding by Generative Pre-Training (GPT-1)
- Language Models are Unsupervised Multitask Learners (GPT-2)
- Language Models are Few-Shot Learners (GPT-3)
- GPT-4 Technical Report (GPT-4)
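Before going version by version, it helps to fix the one idea all four papers build on: a decoder-only Transformer trained by generative pre-training, i.e., predicting the next token of raw text; the versions mainly scale up parameters, data, and context length. The sketch below is a minimal toy illustration of that objective in PyTorch, not code from any of the papers; the model size, vocabulary, and random batch are hypothetical placeholder values.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy hyperparameters (hypothetical placeholders; the real models are far larger).
vocab_size, d_model, n_heads, n_layers, seq_len = 1000, 64, 4, 2, 32

class TinyGPT(nn.Module):
    """A minimal decoder-only language model in the spirit of GPT:
    token + position embeddings, causally masked self-attention blocks,
    and a linear head projecting back to vocabulary logits."""
    def __init__(self):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(seq_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, ids):
        t = ids.size(1)
        x = self.tok(ids) + self.pos(torch.arange(t, device=ids.device))
        # Causal mask: position i may attend only to positions <= i.
        mask = nn.Transformer.generate_square_subsequent_mask(t)
        return self.head(self.blocks(x, mask=mask))

model = TinyGPT()
ids = torch.randint(0, vocab_size, (8, seq_len))  # a random toy batch of token ids

# Generative pre-training objective: the target at position t is token t+1,
# so we feed ids[:, :-1] and predict ids[:, 1:] with cross-entropy.
logits = model(ids[:, :-1])
loss = F.cross_entropy(logits.reshape(-1, vocab_size), ids[:, 1:].reshape(-1))
loss.backward()
print(f"next-token loss: {loss.item():.3f}")  # roughly log(vocab_size) at init
```

Everything version-specific sits on top of exactly this pre-training loss: GPT-1 adds supervised fine-tuning per task, GPT-2 relies on zero-shot prompting, and GPT-3 on few-shot in-context learning.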
1. GPT-1
2. GPT-2
3. GPT-3
4. GPT-4
References
Websites
[Website source 1]
Papers
[1] Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving language understanding by generative pre-training.
[2] Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., & Sutskever, I. (2019). Language models are unsupervised multitask learners. OpenAI Blog, 1(8), 9.
[3] Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J. D., Dhariwal, P., … & Amodei, D. (2020). Language models are few-shot learners. Advances in Neural Information Processing Systems, 33, 1877-1901.
[4] OpenAI. (2023). GPT-4 technical report. https://cdn.openai.com/papers/gpt-4.pdf