# Meeting Details

Title: Weighted entropy

Seminar: Probability and Financial Mathematics Seminar

Speaker: Yuri Suhov, Penn State / University of Cambridge, UK

Abstract: The entropy $h(X)=-\sum_i p_i\log\,p_i$ measures the expected amount of information/uncertainty associated with a random variable $X$ taking values $i$ with probabilities $p_i$. The weighted entropy, $h^{\rm w}_\phi(X)$, is defined as $-\sum_i\phi(i)p_i\log\,p_i$, where $\phi(i)\geq 0$ is a weight function representing the `utilities' of the different values $i$ that we want to take into account. As in the case of standard entropy, one can introduce conditional and relative weighted entropies; weighted differential entropies can also be defined. In this talk, I will discuss recent progress in the study of weighted entropies and their possible uses in various areas.
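As a small numerical illustration of the two definitions in the abstract, the sketch below computes $h(X)$ and $h^{\rm w}_\phi(X)$ for a toy distribution. The particular distribution and weight function are arbitrary choices for demonstration, not taken from the talk:

```python
import math

def entropy(p):
    """Shannon entropy h(X) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def weighted_entropy(p, phi):
    """Weighted entropy h^w_phi(X) = -sum_i phi(i) p_i log p_i."""
    return -sum(w * pi * math.log(pi) for w, pi in zip(phi, p) if pi > 0)

# Hypothetical example: a three-valued distribution with a weight
# function phi(i) >= 0 that emphasises the rarest outcome.
p = [0.5, 0.3, 0.2]
phi = [1.0, 1.0, 2.0]

print(entropy(p))
print(weighted_entropy(p, phi))
```

With $\phi(i)\equiv 1$ the weighted entropy reduces to the standard entropy, which is a quick sanity check for any implementation.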