Distillation
Less is More: Task-aware Layer-wise Distillation for Language Model Compression • Paper 2210.01351 • Published Oct 4, 2022
Reasoning models
Less is More: Recursive Reasoning with Tiny Networks • Paper 2510.04871 • Published Oct 6, 2025