
Alejandro Paredes La Torre, Barbara Flores, Diego Rodriguez


Knowledge Distillation for Large Language Models

arXiv:2603.13765v1

Abstract: We propose a resource-efficient framework for compressing large language models through knowledge distillation, combined with guided chain-of-thought reinforcement learning. Using …
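The abstract does not spell out the paper's loss, but standard knowledge distillation trains the student to match the teacher's temperature-softened output distribution via a KL divergence scaled by T². As an illustrative sketch only (not the authors' framework, and using hypothetical function names), with NumPy:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over the last axis, with max-subtraction
    # for numerical stability.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(np.mean(kl) * temperature ** 2)
```

When the student's logits equal the teacher's, the loss is zero; any mismatch yields a positive penalty that pulls the student toward the teacher's full output distribution rather than hard labels.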
