Dr. Marc Finzi will discuss his work on computationally bounded information theory (“epiplexity”) at this week’s regular research meeting, Monday February 9th at 3 pm ET (at the regular Zoom link). Details below.
Bio:
Marc Finzi is a Research Scientist at OpenAI. He received his Ph.D. in Computer Science from New York University for work on inductive biases in deep learning. His research focuses on designing and analyzing learning systems that better exploit structure in data, including his recent work on epiplexity.
Title:
“From Entropy to Epiplexity: Rethinking Information for Computationally Bounded Intelligence”
Abstract:
Can new and useful information be created by merely applying deterministic transformations to existing data? Theorems in classical Shannon and algorithmic information theory, such as the data processing inequality, appear to say no, creating confusion around topics like synthetic data. In this talk I discuss epiplexity, a definition of structural information that hinges on the limited computation available to the observer. Along the way, we touch on topics that strain conventional information-theoretic explanations, including cellular automata, pseudorandom number generation, and emergent phenomena.
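The tension the abstract describes can be made concrete with a cellular automaton. The sketch below (illustrative only; it does not compute epiplexity, and the parameter choices are my own) runs Wolfram's Rule 30 from a one-bit seed. The entire trajectory is determined by a few bytes of program plus seed, so in the Kolmogorov sense no information is created; yet to a computationally bounded observer the center column looks statistically random.

```python
# Rule 30 cellular automaton: a deterministic transformation whose output
# looks random to a bounded observer, despite a tiny description length.
# Illustrative sketch only -- "epiplexity" itself is not computed here.

def rule30_step(cells):
    """One synchronous update of Rule 30 on a circular row of 0/1 cells.

    Rule 30: new cell = left XOR (center OR right).
    """
    n = len(cells)
    return [
        cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
        for i in range(n)
    ]

def rule30_center_column(width=521, steps=256):
    """Center-column bits of Rule 30 from a single-1 seed.

    width > 2 * steps, so the pattern never wraps around and the
    column matches the infinite-lattice evolution.
    """
    row = [0] * width
    row[width // 2] = 1  # minimal seed: a single 1 bit
    bits = []
    for _ in range(steps):
        bits.append(row[width // 2])
        row = rule30_step(row)
    return bits

bits = rule30_center_column()
# The seed and rule fit in a few bytes, yet the column is roughly
# balanced between 0s and 1s, like a pseudorandom bitstream.
print(len(bits), sum(bits))
```

This is the classic construction behind Rule 30's use as a pseudorandom generator: the data processing inequality says the column carries no new information about the seed, while any observer who cannot afford to invert the dynamics experiences it as fresh randomness.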