Cost-sensitive Information Maximization

R. Kamimura (Japan)


Key Words: mutual information maximization, competitive learning, winner-take-all, cost


In this paper, we propose a new information-theoretic method for competitive learning. The method is called cost-sensitive information maximization, because information is increased by controlling the associated cost. Information is defined over normalized competitive unit outputs, and the corresponding cost is the average distance between input patterns and connection weights. We applied the method to a simple artificial problem and to a political data analysis. Experimental results confirmed that cost minimization can be used to increase information in some cases. When cost minimization is not powerful enough to increase information, information maximization as well as entropy maximization are needed to reinforce the increase in information. One of the main findings of this paper is that cost minimization, like conventional competitive learning, focuses on imitating input patterns, while information maximization aims to extract distinctive features by which input patterns can be explicitly separated.
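The quantities named in the abstract can be sketched concretely. The following is a minimal illustration, not the paper's implementation: it assumes Gaussian-style competitive unit outputs normalized into probabilities (the paper's exact output form is not given here), and the function name, the `beta` sharpness parameter, and the output-weighted cost are all assumptions of this sketch.

```python
import numpy as np

def competitive_info_and_cost(X, W, beta=2.0):
    """Sketch of information and cost for competitive units.

    X: (S, d) array of input patterns; W: (M, d) connection weights.
    Outputs are assumed to decay with input-weight distance and are
    normalized to probabilities p(j|s) (an assumption of this sketch).
    """
    # Squared Euclidean distance between every input and every weight.
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)   # (S, M)
    # Normalized competitive unit outputs p(j|s).
    act = np.exp(-beta * d2)
    p_j_given_s = act / act.sum(axis=1, keepdims=True)
    # Marginal firing rates p(j), averaged over all patterns.
    p_j = p_j_given_s.mean(axis=0)
    # Information: entropy of marginals minus mean conditional entropy.
    eps = 1e-12
    h_marginal = -(p_j * np.log(p_j + eps)).sum()
    h_cond = -(p_j_given_s * np.log(p_j_given_s + eps)).sum(axis=1).mean()
    info = h_marginal - h_cond
    # Cost: average input-weight distance, weighted by unit outputs.
    cost = (p_j_given_s * np.sqrt(d2)).sum(axis=1).mean()
    return info, cost
```

Under this formulation, information is maximal (log of the number of units) when each input pattern fires exactly one unit and all units fire equally often overall, which matches the abstract's description of extracting distinctive, separating features.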
