MobileFaceSwap: A Lightweight Framework for Video Face Swapping
Zhiliang Xu, Zhibin Hong, Changxing Ding, Zhen Zhu, Junyu Han, Jingtuo Liu, Errui Ding
[AAAI-22] Main Track
Abstract:
Advanced face swapping methods have achieved appealing results. However, most of these methods involve a large number of parameters and heavy computation, which makes it challenging to apply them in real-time applications or deploy them on edge devices such as mobile phones. In this work, we propose a lightweight Identity-aware Dynamic Network (IDN) for subject-agnostic face swapping, which dynamically adjusts the model parameters according to the identity information. In particular, we design an efficient Identity Injection Module (IIM) by introducing two dynamic neural network techniques: weights prediction and weights modulation. Once the IDN is updated, it can be applied to swap faces in any target image or video. The presented IDN contains only 0.50M parameters and requires 0.33G FLOPs per frame, making it capable of real-time video face swapping on mobile phones. In addition, we introduce a knowledge distillation-based method for stable training and employ a loss reweighting module to obtain better synthesized results. Finally, our method achieves results comparable to the teacher models and other state-of-the-art methods.
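To make the two dynamic-network techniques named in the abstract more concrete, the sketch below shows one plausible way an identity embedding could predict convolution kernels (weights prediction) and rescale a shared base kernel (weights modulation). This is a minimal illustration, not the authors' implementation; all module names, layer sizes, and the assumed 512-d identity embedding are hypothetical.

```python
# Hypothetical sketch of identity-conditioned dynamic convolutions.
# Not the MobileFaceSwap code; dimensions and structure are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightPredictionConv(nn.Module):
    """Predicts a depthwise conv kernel directly from an identity embedding."""

    def __init__(self, channels, kernel_size=3, id_dim=512):
        super().__init__()
        self.channels = channels
        self.kernel_size = kernel_size
        # Small predictor: identity embedding -> one k x k kernel per channel.
        self.predictor = nn.Linear(id_dim, channels * kernel_size * kernel_size)

    def forward(self, x, id_emb):
        b, c, h, w = x.shape
        k = self.kernel_size
        # One set of predicted kernels per sample in the batch.
        weight = self.predictor(id_emb).view(b * c, 1, k, k)
        # Grouped conv applies each sample's predicted kernels to its own features.
        x = x.reshape(1, b * c, h, w)
        out = F.conv2d(x, weight, padding=k // 2, groups=b * c)
        return out.view(b, c, h, w)


class WeightModulationConv(nn.Module):
    """Rescales a shared base conv weight with identity-conditioned factors."""

    def __init__(self, in_ch, out_ch, kernel_size=3, id_dim=512):
        super().__init__()
        self.base_weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.02)
        self.to_scale = nn.Linear(id_dim, in_ch)  # per-input-channel modulation
        self.padding = kernel_size // 2

    def forward(self, x, id_emb):
        b = x.shape[0]
        scale = self.to_scale(id_emb).view(b, 1, -1, 1, 1)   # (B, 1, in_ch, 1, 1)
        weight = self.base_weight.unsqueeze(0) * scale        # per-sample modulated weights
        out_ch, in_ch, k, _ = self.base_weight.shape
        x = x.reshape(1, b * in_ch, *x.shape[2:])
        out = F.conv2d(x, weight.reshape(b * out_ch, in_ch, k, k),
                       padding=self.padding, groups=b)
        return out.view(b, out_ch, *out.shape[2:])


if __name__ == "__main__":
    # Toy usage: a 512-d identity embedding conditions both dynamic convs.
    x = torch.randn(2, 32, 64, 64)
    id_emb = torch.randn(2, 512)
    print(WeightPredictionConv(32)(x, id_emb).shape)      # torch.Size([2, 32, 64, 64])
    print(WeightModulationConv(32, 32)(x, id_emb).shape)  # torch.Size([2, 32, 64, 64])
```

The key design idea in both variants is that the identity network needs to run only once per source face; afterwards the conditioned generator can process every target frame without recomputing identity features, which is what makes per-frame cost low.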
Sessions where this paper appears
- Poster Session 4 (Red 1)
- Poster Session 7 (Red 1)