Shannon's Information Measures and Markov Structures

Monday, May 22, 2017

2:00 pm - 3:00 pm
Gross Hall 304B

Presenter

Raymond W. Yeung, Institute of Network Coding, The Chinese University of Hong Kong

In the 1990s, the theory of the I-Measure was developed as a full-fledged set-theoretic interpretation of Shannon's information measures. In this talk, we first give an overview of this theory. We then discuss a set of tools based on the I-Measure that are particularly suited to studying a special Markov structure called full conditional mutual independence (FCMI), which turns out to be a building block for Markov random fields. One application of these tools is to show that the I-Measure of a Markov chain (a special case of a Markov random field) exhibits a very simple structure and is always nonnegative. In the last part of the talk, we discuss some recent results along this line: (i) a characterization of the Markov structure of a subfield of a Markov random field; (ii) the Markov chain being the only Markov random field whose I-Measure is always nonnegative.
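As background for the abstract, the following is a brief sketch of the standard set-theoretic correspondence underlying the I-Measure (in the usual notation, where $\tilde{X}$ denotes the set variable associated with the random variable $X$ and $\mu^*$ is the I-Measure; the notation is not defined in this announcement):

\begin{align*}
H(X)           &= \mu^*(\tilde{X}) \\
H(X, Y)        &= \mu^*(\tilde{X} \cup \tilde{Y}) \\
H(X \mid Y)    &= \mu^*(\tilde{X} - \tilde{Y}) \\
I(X; Y)        &= \mu^*(\tilde{X} \cap \tilde{Y}) \\
I(X; Y \mid Z) &= \mu^*(\tilde{X} \cap \tilde{Y} - \tilde{Z})
\end{align*}

For example, the Markov chain $X \to Y \to Z$ is exactly the condition $I(X; Z \mid Y) = 0$, i.e., $\mu^*(\tilde{X} \cap \tilde{Z} - \tilde{Y}) = 0$, one instance of the simple structure referred to in the abstract.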

Contact

Peterson, Kathy
613-7829
kathy.peterson@duke.edu