Kernel Fusion and Feature Selection in Machine Learning

V. Mottl, O. Krasotkina, O. Seredin (Russia), and I. Muchnik (USA)


Keywords: kernel fusion, feature selection, regression estimation, pattern recognition.


In machine learning, when the kernel-based approach is used for estimating dependences in a set of entities, in particular for solving problems of pattern recognition or regression estimation, it is tacitly assumed that only one kernel function is defined on the set of entities. At the same time, it is typical in practice that there are several viewpoints on the numerical pairwise comparison of entities. In this work, we systematically exploit the fact that any kernel function on a set of entities of arbitrary kind embeds that set into a linear space in which it plays the role of the inner product. To fuse the kernels heuristically suggested by different experts into a single learning technique, we propose to consider the Cartesian product of the respective linear spaces, each supplied with a specific kernel as its inner product. The main requirement placed upon the fusion principle is the avoidance of discrete selection in eliminating redundant kernels, with the purpose of achieving acceptable computational complexity. A real-valued feature on the given set of entities actually defines a simplest kernel; therefore, the proposed kernel fusion principle is, at the same time, a principle of minimizing the dimensionality of the feature space in feature-based machine learning.
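The central construction can be illustrated numerically: since inner products add under a Cartesian product of linear spaces, fusing kernels amounts to combining kernel matrices with continuous nonnegative weights (rather than 0/1 selection), and the fused matrix remains a valid kernel usable, e.g., in kernel ridge regression. The following sketch is an illustration under these assumptions, not the authors' algorithm; the kernel choices, weight values, and ridge parameter are all hypothetical.

```python
import numpy as np

def linear_kernel(X, Y):
    # Ordinary inner product in the original feature space.
    return X @ Y.T

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel: inner product in an implicit embedding space.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fused_kernel(X, Y, kernels, weights):
    # Cartesian product of the kernel-induced spaces: the fused inner
    # product is a weighted sum of the individual kernels.
    return sum(w * k(X, Y) for k, w in zip(kernels, weights))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=30)

kernels = [linear_kernel, rbf_kernel]
weights = [0.7, 0.3]  # continuous relevance weights, not a discrete 0/1 selection

K = fused_kernel(X, X, kernels, weights)

# Kernel ridge regression with the fused kernel (hypothetical ridge parameter).
alpha = np.linalg.solve(K + 1e-2 * np.eye(len(X)), y)
y_hat = K @ alpha
```

Shrinking a kernel's weight toward zero smoothly removes its contribution, which is the continuous alternative to discrete kernel (or feature) selection that the abstract emphasizes.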
