Seminar by Expert from the University of Wollongong, Australia

Posted: 2013-12-06

Speaker: Lei Wang, University of Wollongong, Australia

Title: Multiple Kernel Learning by Incorporating Data Radius Information

Time: 2:30-3:30 pm, December 11

Venue: Room 109, Mingli Building (S2)

CV: Lei Wang received the B.Eng and M.Eng degrees from Southeast University, China, in 1996 and 1999, respectively, and the PhD degree from the School of EEE, Nanyang Technological University, Singapore, in 2004. He is now a Senior Lecturer in the Faculty of Engineering and Information Sciences at the University of Wollongong. Lei Wang was awarded an Australian Postdoctoral Fellowship by the Australian Research Council in 2007 and the Early Career Researcher Award by the Australian Academy of Science in 2009. His research interests include machine learning, pattern recognition, and computer vision. He has published more than 80 peer-reviewed papers, including work in highly regarded journals and conferences such as IEEE TPAMI, IEEE TNN, CVPR, ICCV, and ECCV. He served as Area Chair of the Pacific-Rim Symposium on Image and Video Technology in 2010, 2011, and 2013, has been a Technical Program Committee member of 20+ international conferences and workshops, and is a regular reviewer for 20+ international journals. He is a Senior Member of the IEEE.

Content:

Incorporating the scattering radius of data in a kernel-induced feature space has been demonstrated as a promising way to improve multiple kernel learning (MKL). Nevertheless, directly incorporating the radius into MKL not only incurs significant computational overhead but can also adversely affect kernel learning performance, owing to the notorious sensitivity of this radius to outliers. This talk presents our approach to improving this situation. Inspired by the intrinsic relationship between the scattering radius of data and the trace of the total scattering matrix of data, we propose to incorporate the latter into MKL instead. Our approach effectively preserves the merits of incorporating the scattering radius, and the resulting optimization can be solved efficiently. It also offers several advantages: 1) it is more robust in the presence of noise or outliers; 2) it is computationally more efficient, avoiding the quadratic optimization needed to compute the radius; and 3) it is readily solvable by existing off-the-shelf MKL packages. Comprehensive experimental results demonstrate the effectiveness and efficiency of our approach. In addition, this talk will introduce our recent work on adaptively learning optimal neighborhood kernels and on a feature selection approach via global kernel preservation.
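The key computational point above can be illustrated with a minimal sketch (not the speaker's implementation): in the feature space induced by a kernel matrix K, the trace of the total scattering matrix reduces to tr(S_t) = tr(K) - (1/n) 1^T K 1, a closed-form O(n^2) expression, whereas the radius requires solving a quadratic program (the minimum enclosing ball). The kernel weights `mu` and the two base kernels below are hypothetical, chosen only for illustration.

```python
import numpy as np

def total_scatter_trace(K):
    """Trace of the total scattering matrix in the kernel-induced feature
    space, computed in closed form from the n x n kernel matrix K:
    tr(S_t) = tr(K) - (1/n) * 1^T K 1  (data centred at the feature mean).
    No quadratic program is needed, unlike the radius of the enclosing ball.
    """
    n = K.shape[0]
    return np.trace(K) - K.sum() / n

# Hypothetical example: a convex combination of two base kernels, as in MKL.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K_linear = X @ X.T                                               # linear kernel
sq_dists = np.square(X[:, None, :] - X[None, :, :]).sum(axis=-1)
K_rbf = np.exp(-0.5 * sq_dists)                                  # RBF kernel
mu = np.array([0.6, 0.4])                                        # kernel weights
K = mu[0] * K_linear + mu[1] * K_rbf
print(total_scatter_trace(K))
```

Two sanity checks show the formula behaves as scatter should: an identity kernel (orthonormal points) gives tr(S_t) = n - 1, and an all-ones kernel (all points identical in feature space) gives exactly 0.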