Dr. Ming Meng | Digital Human | Best Researcher Award
Communication University of China | China
AUTHOR PROFILE
EARLY ACADEMIC PURSUITS
Dr. Ming Meng embarked on his academic journey with a solid foundation in computer science and related fields. He earned his Ph.D. from Beihang University, where he was supervised by Professor Zhou Zhong at the National Key Laboratory of Virtual Reality. During his doctoral studies, he focused on advancements in virtual reality and computer vision, laying the groundwork for his future research in digital media and intelligent technologies.
PROFESSIONAL ENDEAVORS
Dr. Meng’s professional career spans significant roles in both research and academia. He served as a Postdoctoral Fellow at the School of Data Science and Intelligent Media, Communication University of China, from September 2017 to June 2022, where he contributed to research at the National Radio and Television Administration Laboratory for Intelligent Media Microservice Technology Research and Application. He has also been actively involved in international conferences and collaborations, including contributions to the Asian Conference on Computer Vision (ACCV) and the IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
CONTRIBUTIONS AND RESEARCH FOCUS
Dr. Meng’s research centers on digital human technologies and intelligent media. His notable contributions include:
- Development of methods for geometric-driven indoor structure recovery and distortion-aware room layout estimation from omnidirectional and fisheye images.
- Advancements in 3D stitching and augmented virtual environments.
- Creation of innovative object detection and viewpoint quality evaluation techniques for panoramic images.
This focus on digital human technologies has produced several influential publications and patents, demonstrating his expertise in digital media and virtual reality applications.
IMPACT AND INFLUENCE
Dr. Meng’s work has significantly impacted the fields of digital human and intelligent media, influencing both academic research and practical applications. His research on 3D room reconstruction and video fusion systems has contributed to advancements in augmented reality and virtual reality technologies. His patents, including methods for automatic layout recovery and scene structure depth estimation, underscore his role in pioneering new techniques in digital media.
ACADEMIC CITATIONS
Dr. Meng’s research is widely cited in the areas of digital human technologies and intelligent media. His papers, including those published in Neural Computing and Applications and Science China Information Sciences, highlight his contributions to advanced virtual reality and augmented reality systems. His work on distortion-aware re-projection fusion networks and model-guided 3D stitching is recognized for improving object detection and immersive environments.
LEGACY AND FUTURE CONTRIBUTIONS
Dr. Meng’s legacy in digital human and intelligent media research will be marked by his contributions to virtual reality and augmented reality technologies. His future work is expected to develop further innovative solutions in digital media, enhancing applications for virtual environments and intelligent systems across a range of domains.
DIGITAL HUMAN
Dr. Ming Meng’s research extensively explores digital human technologies, focusing on advancements in virtual reality, augmented reality, and intelligent media. His work on scene structure depth estimation and automatic layout recovery from omnidirectional images reflects his commitment to advancing digital human applications. His patents and publications highlight his contributions to the field, emphasizing the development of innovative solutions for digital media and virtual environments.
NOTABLE PUBLICATIONS
- Citations: 3 | Year: 2024
- Citations: 3 | Year: 2021
- Citations: 5 | Year: 2019
- Citations: 10 | Year: 2018