Wenyan Wu | Modeling | Best Researcher Award
Dr. Wenyan Wu at Guangdong University of Technology | China
Dr. Wenyan Wu is an emerging researcher whose work sits at the intersection of artificial intelligence, multimodal learning, and intelligent systems, with applications in emotion recognition, sentiment analysis, and human-computer interaction. Since creating her ORCID record in August 2022, Dr. Wu has contributed to advancing cross-modal data analysis, integrating deep learning frameworks with cognitive and affective computing techniques.

Her recent publication, "Modality-Enhanced Multimodal Integrated Fusion Attention Model for Sentiment Analysis" (Applied Sciences, 2025), introduces an attention-based fusion approach that improves sentiment analysis accuracy by capturing inter-modal dependencies across text, audio, and visual cues. In "Collaborative Analysis of Learners' Emotional States Based on Cross-Modal Higher-Order Reasoning" (Applied Sciences, 2024), Dr. Wu explores emotion-aware learning environments, presenting reasoning mechanisms for identifying and analyzing learners' affective states to enhance adaptive education systems. Her work on "Mask-Wearing Detection in Complex Environments Based on Improved YOLOv7" (Applied Sciences, 2024) demonstrates her interdisciplinary range, combining computer vision and deep neural networks to address real-world safety monitoring challenges. Earlier, her foundational study, "A Novel Method for Cross-Modal Collaborative Analysis and Evaluation in the Intelligence Era" (Applied Sciences, 2022), laid the groundwork for this later research by proposing an integrated model for data collaboration across modalities in intelligent environments.

Dr. Wu's scholarly output reflects strong analytical and technical acumen, with an emphasis on multimodal integration, attention mechanisms, and deep learning optimization. Her contributions advance theoretical understanding while providing practical frameworks for emotionally intelligent, context-aware AI systems, bridging the gap between computational models and human-centered design in modern intelligent applications.
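To give a sense of what attention-based multimodal fusion means in practice, the sketch below shows a generic cross-modal attention block in PyTorch: pre-extracted text, audio, and visual features are fused by letting the text stream attend to the other modalities before classification. The class name, dimensions, and pooling choices are illustrative assumptions for exposition only and do not reproduce the architecture of the published Applied Sciences model.

```python
# Illustrative sketch only (not Dr. Wu's published model): a minimal
# cross-modal attention fusion block, assuming pre-extracted text, audio,
# and visual feature sequences that share a common hidden size.
import torch
import torch.nn as nn


class CrossModalAttentionFusion(nn.Module):
    """Fuses text, audio, and visual features with multi-head attention,
    letting the text stream attend to the other modalities before pooling."""

    def __init__(self, hidden_dim: int = 128, num_heads: int = 4, num_classes: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, text: torch.Tensor, audio: torch.Tensor, visual: torch.Tensor) -> torch.Tensor:
        # text/audio/visual: (batch, seq_len, hidden_dim) pre-extracted features
        context = torch.cat([audio, visual], dim=1)       # non-text modalities as keys/values
        fused, _ = self.attn(query=text, key=context, value=context)
        fused = self.norm(text + fused)                   # residual connection over the text stream
        pooled = fused.mean(dim=1)                        # average over time steps
        return self.classifier(pooled)                    # sentiment logits


if __name__ == "__main__":
    model = CrossModalAttentionFusion()
    t = torch.randn(2, 20, 128)   # dummy text features
    a = torch.randn(2, 50, 128)   # dummy audio features
    v = torch.randn(2, 30, 128)   # dummy visual features
    print(model(t, a, v).shape)   # torch.Size([2, 3])
```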
Profile: ORCID
Featured Publications
Zhang, Z., Wu, W., Yuan, T., & Feng, G. (2025). Modality-enhanced multimodal integrated fusion attention model for sentiment analysis. Applied Sciences, 15(19), 10825.
Wu, W., Zhao, J., Shen, X., & Feng, G. (2024). Collaborative analysis of learners' emotional states based on cross-modal higher-order reasoning. Applied Sciences, 14(13), 5513.
Feng, G., Yang, Q., Tang, C., Liu, Y., Wu, X., & Wu, W. (2024). Mask-wearing detection in complex environments based on improved YOLOv7. Applied Sciences, 14(9), 3606.
Wu, W., Hu, Q., Feng, G., & He, Y. (2022). A novel method for cross-modal collaborative analysis and evaluation in the intelligence era. Applied Sciences, 13(1), 163.