Llama 3 (model summary) (paper)
- parameter sizes: 8B and 70B
- context length: 8K tokens
- pretraining data: 15T+ tokens from publicly available sources
- fine-tuning data: publicly available instruction datasets, as well as over 10M human-annotated examples
- loading a lightweight (quantized) model from Python: ollama (see the sketch below)
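
A minimal sketch of that last point, assuming the `ollama` Python package is installed, a local Ollama server is running, and the model has already been pulled with `ollama pull llama3` (the model tag and prompt here are illustrative):

```python
# pip install ollama  — assumes a local Ollama server with "llama3" pulled
import ollama

# Send a single chat turn to the locally served Llama 3 model.
response = ollama.chat(
    model="llama3",  # 8B tag by default; "llama3:70b" selects the larger variant
    messages=[
        {"role": "user", "content": "Summarize Llama 3 in one sentence."},
    ],
)

# The generated reply text is under message.content in the response.
print(response["message"]["content"])
```

Ollama serves the model behind a local HTTP API, so the Python client only sends requests; the quantized weights stay loaded in the Ollama server process rather than in the Python interpreter.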