IMAGE SUPER-RESOLUTION METHOD BASED ON KNOWLEDGE DISTILLATION COMPRESSION MODEL AND DEVICE THEREOF

    Publication No.: US20240233077A1

    Publication Date: 2024-07-11

    Application No.: US18405750

    Filing Date: 2024-01-05

    IPC Classes: G06T3/4053 G06T5/60 G06T5/73

    Abstract: An image super-resolution method based on a knowledge distillation compression model, and a device thereof, are disclosed. A small student network model is cascaded into a high-performance teacher network to better complete knowledge distillation, so that the performance of the student network gradually approaches that of the teacher network, thereby compressing the super-resolution network. The distillation strategy of the present disclosure not only avoids manually designing feature transformations to align different networks, but also greatly reduces the optimization difficulty of the student network. To alleviate inefficient distillation caused by the representation gap between teacher and student, the present disclosure treats the similarity relationships between the teacher's layers as knowledge, so that the student learns the teacher's similarity relationships in its own feature space rather than directly imitating the teacher's complex features. The present disclosure significantly reduces the parameter count and computational cost of the super-resolution network model, lowers the difficulty of deploying it on resource-constrained devices, and has strong practical application value.
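The similarity-relationship distillation described in the abstract can be illustrated with a minimal numpy sketch. This is an illustrative interpretation, not the patent's actual implementation: it assumes each network exposes one flattened feature vector per layer, builds a pairwise inter-layer cosine-similarity matrix for teacher and student separately, and penalizes the difference between the two matrices. All function names here are hypothetical.

```python
import numpy as np

def similarity_matrix(features):
    """Pairwise cosine similarity between per-layer feature vectors.

    features: (num_layers, dim) array, one flattened feature vector per layer.
    Returns a (num_layers, num_layers) similarity matrix.
    """
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    normed = features / np.maximum(norms, 1e-12)  # guard against zero vectors
    return normed @ normed.T

def similarity_distillation_loss(teacher_feats, student_feats):
    """Mean squared error between teacher and student similarity matrices.

    The student matches the teacher's inter-layer similarity structure in
    its own feature space instead of imitating raw teacher features, so the
    teacher and student feature dimensions need not match.
    """
    t_sim = similarity_matrix(teacher_feats)
    s_sim = similarity_matrix(student_feats)
    return float(np.mean((t_sim - s_sim) ** 2))

# Toy check: identical layer structure gives zero loss.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((4, 64))   # 4 layers, 64-dim teacher features
student = rng.standard_normal((4, 16))   # 4 layers, 16-dim student features
print(similarity_distillation_loss(teacher, teacher))  # → 0.0
print(similarity_distillation_loss(teacher, student) > 0.0)  # → True
```

Because the loss compares (num_layers × num_layers) similarity matrices rather than raw activations, no hand-designed feature-alignment transform between the differently sized networks is needed, which matches the abstract's stated motivation.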