MEASURING SURPRISE WITH RELATIVE ENTROPY

Authors

  • Sevda GÜRSAKAL Uludağ Üniversitesi
  • Dilek MURAT Uludağ Üniversitesi
  • Necmi GÜRSAKAL Uludağ Üniversitesi

DOI:

https://doi.org/10.17740/eas.soc.2015-V2-02

Keywords:

Relative entropy, Kullback-Leibler divergence, Bayesian statistics, R

Abstract

In recent times, the concept of entropy has been used in many fields, such as neural networks, pattern recognition, and goodness-of-fit tests in statistics. Surprise attracts human attention, and attracting human attention is important in areas ranging from marketing and advertising to education. In a Bayesian context, this paper explains how to measure 'surprise' using the concept of relative entropy and discusses the various implications of this measurement.
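As a rough illustration of the idea described in the abstract (not code from the paper itself), Bayesian surprise can be computed as the relative entropy, or Kullback-Leibler divergence, of the posterior from the prior: D_KL(posterior || prior) = sum of posterior * log(posterior / prior) over the parameter grid. The R sketch below assumes a hypothetical example with a discretized uniform prior over a success probability and a binomial observation of 8 successes in 10 trials.

# Minimal sketch (illustrative assumptions, not the authors' code):
# Bayesian surprise as the KL divergence between posterior and prior.

kl_divergence <- function(p, q) {
  # KL(p || q) for discrete distributions on the same support;
  # terms where p == 0 contribute zero by convention.
  stopifnot(length(p) == length(q))
  idx <- p > 0
  sum(p[idx] * log(p[idx] / q[idx]))
}

# Discretize the parameter space (success probability theta).
theta <- seq(0.001, 0.999, length.out = 999)

# Prior: uniform over the grid, normalized.
prior <- rep(1 / length(theta), length(theta))

# Hypothetical data: 8 successes in 10 trials.
likelihood <- dbinom(8, size = 10, prob = theta)

# Posterior via Bayes' rule, normalized over the grid.
posterior <- prior * likelihood / sum(prior * likelihood)

# Surprise in nats: relative entropy of the posterior from the prior.
surprise <- kl_divergence(posterior, prior)
surprise

Under this setup, a larger value of `surprise` indicates that the data shifted beliefs further from the prior, which is the sense in which relative entropy quantifies how surprising an observation is.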

Published

2015-03-15

Issue

Section

Articles