Personalized Game Character Design via Fast Style Transfer API Integration
DOI: https://doi.org/10.54097/wga0qc20

Keywords: Neural Style Transfer, Game Character Customization, Real-Time Rendering, Adaptive Instance Normalization, Computational Creativity

Abstract
As player expectations for personalized visual identities continue to rise, traditional game character design workflows struggle to efficiently support diverse artistic customization. Manual character styling is time-consuming, costly, and difficult to scale in real-time interactive environments. To address this challenge, this study proposes a fast neural style transfer (FST) framework for real-time game character customization. The proposed system integrates a feed-forward convolutional neural network with perceptual loss and Adaptive Instance Normalization (AdaIN) to enable efficient and flexible artistic style transfer for both 2D characters and 3D texture maps. The framework supports arbitrary and interpolated styles while preserving character structure and visual consistency. To evaluate system performance, we conduct quantitative experiments using SSIM, FID, and inference latency, alongside user-centered evaluations measuring usability and aesthetic satisfaction. Experimental results demonstrate that the proposed method achieves real-time performance with an average inference latency of 43.2 ms, while maintaining high visual fidelity and strong content preservation. User studies further indicate high usability and personalization satisfaction. Overall, this work demonstrates the feasibility of integrating fast neural style transfer into interactive game pipelines and provides practical insights for AI-assisted character customization systems.
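The core stylization operation named in the abstract, Adaptive Instance Normalization (AdaIN, Huang & Belongie, 2017), replaces the per-channel mean and standard deviation of a content feature map with those of a style feature map. The sketch below is a minimal NumPy illustration of that operation and of one plausible way to realize the "interpolated styles" capability as a convex combination of AdaIN outputs; it is not the authors' implementation, and the `adain_interpolate` weighting scheme is an assumption.

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization over (C, H, W) feature maps.

    Normalizes each content channel, then rescales and shifts it to
    match the style channel's standard deviation and mean.
    """
    c_mu = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mu = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    normalized = (content - c_mu) / (c_std + eps)  # eps avoids division by zero
    return s_std * normalized + s_mu

def adain_interpolate(content, styles, weights, eps=1e-5):
    """Assumed style-interpolation scheme: weighted sum of AdaIN outputs."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the combination is convex
    return sum(wi * adain(content, s, eps) for wi, s in zip(w, styles))

# Toy check: after AdaIN, the content's channel statistics match the style's.
rng = np.random.default_rng(0)
c = rng.normal(0.0, 1.0, (3, 8, 8))   # stand-in for a content feature map
s = rng.normal(2.0, 0.5, (3, 8, 8))   # stand-in for a style feature map
out = adain(c, s)
```

In a feed-forward pipeline such as the one described, this operation sits between a fixed encoder and a learned decoder, which is what allows arbitrary styles at inference time without retraining per style.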
License
Copyright (c) 2026 Frontiers in Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

