Advanced Textile Technology ›› 2024, Vol. 32 ›› Issue (8): 117-126.


Clothing pattern style transfer based on edge enhancement and association loss

  

  1. School of Computer Science, Xi'an Polytechnic University, Xi'an 710600
  • Online: 2024-08-10    Published: 2024-09-02

Clothing image style transfer based on edge enhancement and association loss

  

  1. School of Computer Science, Xi'an Polytechnic University, Xi'an 710600

Abstract: With the continuous upgrading of image processing and deep data mining technologies and their ever-wider application in people's daily work and life, research on images has deepened considerably. Clothing design is an important field of image application, and the pattern style of a garment affects customer satisfaction to a certain extent. Clothing pattern style transfer can replace the style of a garment according to individual needs; it is a product of the development of image processing and caters to the public's growing spiritual needs. Style transfer is mainly based on deep learning algorithms: the edges, colors, and textures of a style image are identified and transferred onto an edited content image, so that the generated image carries the basic texture features of the style image. Research on clothing patterns focuses mainly on changing styles; integrating many different styles into the corresponding clothing patterns makes clothing styles more diverse and meets people's needs more quickly. Traditional clothing pattern style transfer is relatively monotonous, mostly limited to simple texture features, and the generated images are not ideal; improving the clarity, quality, and style and color contrast of transferred images across different styles remains difficult. Clothing style transfer generally uses convolutional neural networks as the basic feature-recognition algorithm: features are extracted from the original image and recorded in feature maps, with different features corresponding to different feature maps, and the style image and content image are then aligned through feature computation to complete the transfer. However, most existing style transfer algorithms are designed for general image style transfer, and applying them directly to clothing style transfer gives unsatisfactory results.
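To make the feature-map formulation above concrete, the following is a minimal sketch of the generic CNN-based content and style losses commonly used in style transfer. It is not the paper's EnAdaIN model; the VGG-19 backbone and the specific layer indices are illustrative assumptions.

```python
# Generic CNN-based style-transfer losses (illustrative sketch, not EnAdaIN).
import torch
import torch.nn.functional as F
from torchvision.models import vgg19

vgg = vgg19(weights="IMAGENET1K_V1").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {1, 6, 11, 20, 29}   # assumed relu1_1 ... relu5_1 indices
CONTENT_LAYER = 22                  # assumed relu4_2 index

def extract_features(x):
    """Run an image through VGG and collect the feature maps of interest."""
    feats, content = {}, None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            feats[i] = x
        if i == CONTENT_LAYER:
            content = x
    return feats, content

def gram(f):
    """Channel-wise Gram matrix: captures texture/style statistics."""
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def transfer_losses(generated, content_img, style_img):
    """Content loss aligns deep features; style loss aligns Gram matrices."""
    g_style, g_content = extract_features(generated)
    _, c_content = extract_features(content_img)
    s_style, _ = extract_features(style_img)
    content_loss = F.mse_loss(g_content, c_content)
    style_loss = sum(F.mse_loss(gram(g_style[i]), gram(s_style[i]))
                     for i in STYLE_LAYERS)
    return content_loss, style_loss
```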
This paper proposes a clothing pattern style transfer method based on edge enhancement and association loss, termed EnAdaIN. First, the original edge features of the image are extracted with the Kirsch operator, and Mask R-CNN is used to semantically segment the clothing image. The content image and style image are then fed into the improved EnAdaIN model, which is built on a spatial association loss; after the style-transferred pattern is obtained, it is fused with the extracted edge features and the semantically segmented style image, and the pattern style of the clothing is finally transferred. The spatial association loss, which combines the content loss and the style loss, further improves the feature similarity and detail rendering of the generated images. Experiments show that, compared with other models, the proposed model improves the peak signal-to-noise ratio by more than 0.95 percentage points, the structural similarity by more than 2.43 percentage points, and the transfer efficiency by more than 3.53 percentage points; the generated images carry richer color information and more distinct features, further improving image contrast and quality.
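For context, the AdaIN algorithm that EnAdaIN improves upon transfers style by re-normalizing the encoder features of the content image so that their channel-wise mean and standard deviation match those of the style image's features; a decoder then maps the aligned features back to an image. Below is a minimal sketch of that standard AdaIN step only; the paper's edge-enhancement module and spatial association loss are not reproduced here, and the function name is illustrative.

```python
# Standard adaptive instance normalization (AdaIN) step (illustrative sketch).
import torch

def adain(content_feat: torch.Tensor, style_feat: torch.Tensor, eps: float = 1e-5):
    """content_feat, style_feat: (B, C, H, W) feature maps from a shared encoder."""
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    # Normalize away the content statistics, then re-scale/shift with style statistics.
    return s_std * (content_feat - c_mean) / c_std + s_mean
```

In a typical AdaIN pipeline, this operation is applied to features from a shared VGG encoder, and the decoder is trained with a content loss on the AdaIN output plus a style loss on feature statistics.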

Key words: AdaIN, association loss, style transfer, clothing pattern transfer

Abstract: Clothing pattern style transfer can adjust the style of a garment according to individual needs and satisfies the public's ever-growing spiritual needs. Traditional clothing pattern style transfer mostly uses simple textures, its content is relatively monotonous, and the resulting images are not ideal. To address these problems, a clothing pattern style transfer method based on edge enhancement and association loss (EnAdaIN) is proposed. First, the original edge features of the image are extracted with the Kirsch operator, and the Mask R-CNN deep learning method is used to semantically segment the clothing image. Then, the AdaIN algorithm is improved to construct the EnAdaIN method based on spatial association loss, and the style-transferred image is output. EnAdaIN realizes style transfer of clothing images by fusing the edge features with the semantically segmented style image, and further improves the feature similarity of the images by incorporating the spatial association loss. Experiments show that, compared with other methods, the peak signal-to-noise ratio of the method improves by more than 0.95%, the structural similarity by more than 2.43%, and the transfer efficiency by more than 3.53%; the generated images have richer colors and more distinct features, further improving image contrast and quality.
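As a concrete reference for the edge-extraction step, the Kirsch operator mentioned above convolves the image with eight compass kernels and keeps the strongest per-pixel response as the edge map. The sketch below shows the operator in isolation; it is not the paper's edge-enhancement module, and the helper name is illustrative.

```python
# Kirsch compass-operator edge extraction (illustrative sketch).
import numpy as np
from scipy.ndimage import convolve

def kirsch_edges(gray: np.ndarray) -> np.ndarray:
    """gray: 2-D grayscale image; returns the maximum compass response per pixel."""
    north = np.array([[5, 5, 5],
                      [-3, 0, -3],
                      [-3, -3, -3]], dtype=np.float32)
    northwest = np.array([[5, 5, -3],
                          [5, 0, -3],
                          [-3, -3, -3]], dtype=np.float32)
    # Rotating each base kernel by 90 degrees yields all eight compass directions.
    kernels = [np.rot90(north, k) for k in range(4)]
    kernels += [np.rot90(northwest, k) for k in range(4)]
    gray = gray.astype(np.float32)
    responses = [convolve(gray, k) for k in kernels]
    return np.max(np.stack(responses), axis=0)
```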

Key words: AdaIN, association loss, style transfer, clothing pattern transfer

CLC Number: