Wojciech Szpankowski - Purdue University
What is Information?
Jan 24, 2007
Abstract
Information permeates every corner of our lives and shapes our universe. Understanding and harnessing information holds the potential for
significant advances. The breadth and depth of underlying concepts of
the science of information transcend traditional disciplinary boundaries
of scientific and commercial endeavors. Information can be manifested
in various forms: business information is measured in dollars;
chemical information is contained in shapes of molecules;
biological information stored and processed in our cells prolongs life.
So what is information? In this talk we first attempt to identify the
most important features of information and define it in the broadest
possible sense. We subsequently turn to the notion and theory of information
introduced by Claude Shannon in 1948 that served as the backbone for
digital communication. We go on to bridge Shannon information with
Boltzmann's entropy, Maxwell's demon, Landauer's principle and
Bennett's reversible computations. We point out, however,
that while Shannon created a successful and beautiful theory
of information for communication, the widespread application of information
theory to economics, biology, the life sciences, and complex networks still
awaits us. We shall discuss some examples that have recently cropped up in
biology, chemistry, computer science, and quantum physics. We conclude
with a list of challenges for future research.
We hope to put forward some educated questions, rather than answers,
to the issues and tools that lie before researchers interested in information.
About the Speaker
Before coming to Purdue, Wojciech Szpankowski was an assistant professor at the Technical University of Gdansk, and in 1984 he was an assistant professor at McGill University, Montreal. During 1992-93, he was professeur invité at INRIA, Rocquencourt, France. His research interests cover analysis of algorithms, data compression, information theory, analytic combinatorics, random structures, networking, stability problems in distributed systems, modeling of computer systems and computer communication networks, queueing theory, and operations research. His recent work is devoted to the probabilistic analysis of algorithms on words, analytic information theory, and designing efficient multimedia data compression schemes based on approximate pattern matching.
He is a recipient of the Humboldt Fellowship. He has been a guest editor for special issues in IEEE Transactions on Automatic Control, Theoretical Computer Science, Random Structures & Algorithms, and Algorithmica. Currently, he is editing a special issue on "Analysis of Algorithms" in Algorithmica. He serves on the editorial boards of Theoretical Computer Science, Discrete Mathematics and Theoretical Computer Science, and the book series Advances in the Theory of Computation and Computational Mathematics.