When : Monday, October 22, 2007 - 17:00
Speaker : Charles H. Bennett
Affiliation : IBM Research, Yorktown Heights
Where : Aula Magna 'A. Lepschy'
Abstract :
The quantum principles of superposition and entanglement have led to a fundamental recasting of the theory of information and computation, and are especially helpful in understanding the nature of privacy. The most private information, exemplified by the path taken by a particle in the infamous two-slit experiment, exists only conditionally and temporarily: after the experiment is over, even God has forgotten what 'happened'. Less private are classical secrets, facts known only to a few, or information, like the lost poems of Sappho, that once was public but has been lost over time. Finally, there is information that has been replicated and propagated so widely as to be infeasible to conceal and unlikely to be forgotten. Modern information technology has caused an explosion of such information, with the beneficial side effect of making it harder for tyrants to rewrite the history of their misdeeds; it is tempting to hope that all macroscopic information is permanent, making such cover-ups impossible in principle. However, by comparing entropy flows into and out of the Earth with estimates of the planet's storage capacity, we conclude that most macroscopic information (for example, the pattern of sand grains on an ancient beach) is impermanent, becoming irrecoverable in principle from evidence on Earth though still recorded in the Universe. Depending on the diligence and forgetfulness of his enemies, former US labor leader Jimmy Hoffa, thought to have been murdered in 1975, may by now have acquired this ambiguous epistemological status.
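As a rough, quantitative illustration of the entropy-flow argument above, the following back-of-envelope Python sketch compares the entropy the Earth exports over its lifetime with a generous bound on what the planet could store. The constants and the ten-bits-per-atom storage bound are our own illustrative assumptions, not figures from the talk.

    import math

    k_B   = 1.381e-23         # J/K, Boltzmann constant
    P     = 1.2e17            # W, solar power absorbed by the Earth (approx.)
    T_in  = 5778.0            # K, effective temperature of sunlight
    T_out = 255.0             # K, Earth's effective radiating temperature
    age   = 4.5e9 * 3.15e7    # s, approximate age of the Earth

    # Net entropy export rate, converted from J/K per second to bits per second.
    bits_per_s = P * (1.0 / T_out - 1.0 / T_in) / (k_B * math.log(2))
    total_bits = bits_per_s * age     # roughly 10^55 bits over Earth's history

    storage_bound = 1e50 * 10         # ~10^50 atoms, a generous 10 bits each
    print(f"entropy exported: ~{total_bits:.1e} bits")
    print(f"storage bound:    ~{storage_bound:.1e} bits")
    # The exported entropy dwarfs any plausible storage capacity, so most
    # macroscopic records must eventually become irrecoverable from
    # Earth-bound evidence, as the abstract argues.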
BIOGRAPHY of CHARLES H. BENNETT (July 2005)

Charles H. Bennett was born in 1943, the son of music teachers Anne Wolfe Bennett and Boyd Bennett. He graduated from Croton-Harmon High School in 1960 and from Brandeis University, majoring in chemistry, in 1964. He received his PhD from Harvard in 1970 for molecular dynamics studies (computer simulation of molecular motion) under David Turnbull and Berni Alder. For the next two years he continued this research under the late Aneesur Rahman at Argonne Laboratory.

Since coming to IBM Research in 1972, he has worked on various aspects of the relation between physics and information. In 1973, building on the work of IBM's Rolf Landauer, he showed that general-purpose computation can be performed by a logically and thermodynamically reversible apparatus, which can operate with arbitrarily little energy dissipation per step because it avoids throwing away information about past logical states; and in 1982 he proposed a reinterpretation of Maxwell's demon, attributing its inability to break the second law to the thermodynamic cost of destroying, rather than acquiring, information. In collaboration with Gilles Brassard of the University of Montreal he developed a practical system of quantum cryptography, allowing secure communication between parties who share no secret information initially, based on the uncertainty principle instead of the usual computational assumptions such as the difficulty of factoring; with the help of John Smolin, he built a working demonstration of it in 1989. Other research interests include algorithmic information theory, in which the concepts of information and randomness are developed in terms of the input/output relation of universal computers, and the analogous use of universal computers to define the intrinsic complexity, or 'logical depth', of a physical state as the time required by a universal computer to simulate the evolution of the state from a random initial state.

In 1983-85, as visiting professor of computer science at Boston University, he taught courses on cryptography and the physics of computation. In 1993 Bennett and Brassard, in collaboration with Claude Crepeau, Richard Jozsa, Asher Peres, and William Wootters, discovered 'quantum teleportation', an effect in which the complete information in an unknown quantum state is decomposed into purely classical information and purely non-classical Einstein-Podolsky-Rosen (EPR) correlations, sent through two separate channels, and later reassembled in a new location to produce an exact replica of the original quantum state that was destroyed in the sending process. In 1995-97, working with Smolin, Wootters, IBM's David DiVincenzo, and other collaborators, he helped found the quantitative theory of entanglement and introduced several techniques for faithful transmission of classical and quantum information through noisy channels, part of the larger and recently very active field of quantum information and computation theory. Recently he has worked on the capacities of quantum channels and interactions to simulate one another, and on the tradeoffs among communications resources. He is an IBM Fellow, a Fellow of the American Physical Society, and a member of the National Academy of Sciences. He is married, with three grown children. His wife, Theodora M. Bennett, recently retired from directing a housing mobility program in Yonkers. His main hobbies are photography and music.
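The quantum cryptography system mentioned in the biography is the Bennett-Brassard protocol of 1984 (BB84). As a toy illustration only, here is a purely classical simulation of its key-sifting step; the variable names are ours, and the eavesdropping-detection phase (publicly comparing a random sample of the sifted key) is omitted.

    import random

    n = 32
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice('+x') for _ in range(n)]  # rectilinear or diagonal
    bob_bases   = [random.choice('+x') for _ in range(n)]

    # A measurement in the wrong basis yields a uniformly random outcome.
    bob_bits = [bit if a_basis == b_basis else random.randint(0, 1)
                for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: keep only the positions where the two bases happened to match.
    sifted_key = [bit for bit, a_basis, b_basis
                  in zip(alice_bits, alice_bases, bob_bases)
                  if a_basis == b_basis]
    print(f"kept {len(sifted_key)} of {n} bits:", sifted_key)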
When : Thursday, October 18, 2007 - 17:00
Speaker : Prof. Panganamala R. Kumar, Franklin Woeltge Professor of Electrical and Computer Engineering
Affiliation : Univ. of Illinois
Where : Aula Magna 'A. Lepschy'
Abstract :
We address the issue of organizing principles for three different types of emerging systems: wireless networks, sensor networks, and networked control.

In wireless networks, there is no a priori notion of links: transmitting nodes simply radiate energy, and receiving nodes hear a superposition of all such transmissions. The wireless medium therefore offers many more possibilities for communicating information than just relaying packets from node to node. We address the question of what the architecture of wireless networks should be, as well as determining fundamental limits on their information-carrying capacity.

Sensor networks consist of nodes equipped with sensors monitoring their environment, as well as computational and wireless communication capabilities. Besides merely transmitting information, nodes can also combine, discard, or process information. Thus the entire network constitutes a combined computational and communication system. We address the issue of how information should be processed within such networks.

Finally, we turn to the problem of networked control, where nodes can act on their environment as well as sense it. We propose an abstraction of virtual collocation for enabling the proliferation of such systems, and provide an overview of efforts in the Convergence Lab at the University of Illinois.

(Joint work with G. Baliga, V. Borkar, A. Giridhar, S. Graham, P. Gupta, K. Plarre, C. Robinson, H-J. Schuetz, R. Solis, L-L. Xie.)
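To make the idea of in-network processing concrete, here is a minimal Python sketch of our own (not code from the talk): instead of forwarding every raw reading to the sink, each node combines its children's partial aggregates with its own reading, so each link carries a constant amount of data.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SensorNode:
        reading: float                     # this node's own measurement
        children: List['SensorNode'] = field(default_factory=list)

        def aggregate(self):
            """Return (sum, count) for the subtree rooted here: O(1) data per
            link, versus forwarding every raw reading toward the sink."""
            total, count = self.reading, 1
            for child in self.children:
                child_total, child_count = child.aggregate()
                total += child_total
                count += child_count
            return total, count

    # Example: a sink with two relays, each collecting from leaf sensors.
    sink = SensorNode(20.0, [SensorNode(21.5, [SensorNode(19.0)]),
                             SensorNode(22.0, [SensorNode(18.5), SensorNode(20.5)])])
    total, n = sink.aggregate()
    print(f"network mean reading: {total / n:.2f}")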
When : Thursday, June 21, 2007 - 17:00
Speaker : Keshav Pingali
Affiliation : University of Texas, Austin
Where : Aula Magna A. Lepschy, DEI
Abstract :
The advent of multicore processors has shifted the burden of improving program execution speed from chip manufacturers to software developers. Experience has shown, however, that parallel programming is very difficult; therefore, there is an urgent need for new ideas in programming languages, compilers, and runtime systems.

In this talk, we describe the Galois system, an object-based system that uses optimistic parallelization to speed up complex 'irregular' applications (i.e., applications that manipulate large, pointer-based data structures such as graphs). Two such real-world applications are described in this talk: a Delaunay mesh refinement algorithm and a graphics application that performs agglomerative clustering. Between them, these two applications manipulate several of the more important irregular data structures, including lists, graphs, priority queues, and trees.

There are three main aspects to the Galois system: (1) a small number of syntactic constructs for packaging optimistic parallelism as iteration over ordered and unordered sets, (2) assertions about methods in class libraries, and (3) a runtime scheme for detecting and recovering from potentially unsafe accesses to shared memory made by an optimistic computation.

We show that Delaunay mesh generation and agglomerative clustering can be parallelized in a straightforward way using the Galois approach, and we present experimental results on two different multiprocessors to show that this approach is practical.

This is joint work with Milind Kulkarni, Kavita Bala, Paul Chew, Ganesh Ramanarayan, and Bruce Walter.

Biography:
Keshav Pingali is a professor in the Computer Science department at the University of Texas, Austin, where he holds the W.A. 'Tex' Moncrief Chair of Grid and Distributed Computing. He received the B.Tech. degree in Electrical Engineering from IIT, Kanpur, India in 1978, and the S.M., E.E., and Sc.D. degrees from MIT in 1986. He was on the faculty of the Department of Computer Science at Cornell University from 1986 to 2006, where he held the India Chair of Computing.

Pingali's research has focused on programming languages and compiler technology for program understanding, restructuring, and optimization. His group is known for its contributions to memory-hierarchy optimization; some of these have been patented. Algorithms and tools developed by his projects are used in many commercial products, such as Intel's IA-64 compiler, SGI's MIPSPro compiler, and HP's PA-RISC compiler. In his current research, he is investigating optimistic parallelization techniques for multicore processors, and language-based fault tolerance.

Among other awards, Pingali has won the President's Gold Medal at I.I.T. Kanpur (1978), an IBM Faculty Development Award (1986-88), an NSF Presidential Young Investigator Award (1989-94), the Ip-Lee Teaching Award of the College of Engineering at Cornell (1997), and the Russell Teaching Award of the College of Arts and Sciences at Cornell (1998). In 2000, he was a visiting professor at I.I.T., Kanpur, where he held the Rama Rao Chaired Professorship.
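The abstract's combination of iteration over unordered sets with runtime conflict detection and recovery can be approximated by the following toy Python sketch of our own devising (the actual Galois system is a far more sophisticated object-based runtime): threads speculatively pull items from an unordered worklist, try-lock the neighborhood an operation may touch, and abort and retry on conflict.

    import threading
    from collections import deque

    class Node:
        """Graph node guarded by a lock; an operation must own every node it touches."""
        def __init__(self, nid):
            self.id = nid
            self.lock = threading.Lock()

    def for_each_unordered(worklist, neighborhood, apply_op, num_threads=4):
        """Process items in any order; an operation commits only if it can
        try-lock its whole neighborhood, otherwise it aborts and is retried."""
        wl_lock = threading.Lock()
        inflight = [0]                          # running ops that may add work

        def worker():
            while True:
                item = None
                with wl_lock:
                    if worklist:
                        item = worklist.popleft()
                        inflight[0] += 1
                    elif inflight[0] == 0:
                        return                  # no queued or in-flight work left
                if item is None:
                    continue                    # spin: running ops may add work
                acquired, conflict = [], False
                for n in neighborhood(item):    # nodes this operation may touch
                    if n.lock.acquire(blocking=False):
                        acquired.append(n)
                    else:
                        conflict = True         # another thread owns part of it
                        break
                if conflict:
                    with wl_lock:
                        worklist.append(item)   # abort: retry the item later
                else:
                    new_work = apply_op(item) or []   # commit; may create work
                    with wl_lock:
                        worklist.extend(new_work)
                for n in acquired:
                    n.lock.release()
                with wl_lock:
                    inflight[0] -= 1

        threads = [threading.Thread(target=worker) for _ in range(num_threads)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

In a Delaunay-refinement-shaped client, the worklist (e.g., a deque) would hold the 'bad' triangles, neighborhood(t) would return the locked nodes of t's cavity, and apply_op(t) would retriangulate the cavity and return any newly created bad triangles.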
When : Wednesday, April 18, 2007 - 16:30
Speaker : Prof. Anthony Acampora
Affiliation : Department of Electrical and Computer Engineering, University of California, San Diego
Where : Aula Magna 'A. Lepschy'
Abstract :
As a research discipline, the field of telecommunications is at least 50 years old. In this talk, we shall briefly review some of the major milestone achievements of these past 50 years, and the impact that these have had on the commercial practice of telecommunications. Included will be such subjects as information/communications theory, digital transmission and switching, signaling systems, intelligent networks, optical communications, satellite systems, the Internet, and cellular radio networks. After this brief review, several topics of current research interest will be introduced and discussed. These include cooperation in wireless networks, wireless peer-to-peer networks, and broadband access networks. For each, we shall present a detailed description of the issues, proposed approaches for investigating and resolving these issues, and sample results reported to date. Also included will be a detailed discussion of techniques for bounding the performance of such networks, along with a comparison of the performance of practical solutions against these best-case bounds.
About the speaker
Anthony Acampora joined the Jacobs School of Engineering at the University of California, San Diego in 1995, and was director of the Center for Wireless Communications until 1999. For seven years he was on the Columbia University faculty, where he directed the Center for Telecommunications Research. For 20 years he was a research scientist and manager at AT&T Bell Labs. He holds 30 patents, is an IEEE Fellow, and is the author of the textbook An Introduction to Broadband Networks. He received his Ph.D. in electrical engineering in 1973 from Brooklyn Polytechnic Institute.
When : Tuesday, March 13, 2007 - 16:30
Speaker : Prof. J.K. Aggarwal
Affiliation : Cullen Professor, Department of Electrical and Computer Engineering, University of Texas at Austin
Where : Aula Magna A. Lepschy
Abstract :
The development of computer vision systems able to detect humans and to recognize their activities is a broad effort, with applications in areas including virtual reality, smart monitoring and surveillance systems, motion analysis in sports, medicine and choreography, and vision-based user interfaces. The understanding of human activity is a diverse and complex subject that includes tracking and modeling human activity and representing video events at the semantic level. Its scope ranges from understanding the actions of an isolated person to understanding the actions and interactions of a crowd, or the interaction of objects such as pieces of luggage or cars with persons.

At The University of Texas at Austin, we are pursuing a number of projects on human motion. Professor Aggarwal will present his research on modeling and recognition of human actions and interactions, and human and object interactions. The work includes the study of interactions at the gross level as well as at the detailed level. The two levels present different problems in terms of observation and analysis. At the gross level we model persons as blobs, and at the detailed level we conceptualize human actions in terms of an operational triplet 'agent-motion-target', similar to 'verb argument structure' in linguistics. We consider atomic actions, composite actions and interactions, and continued and recursive activities. In addition, we consider the interactions between a person and an object, including climbing a fence. The issues considered in these problems will illustrate the richness and the difficulty associated with understanding human motion. Application of the above research to monitoring and surveillance will be discussed, together with actual examples.
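As a small illustration of the 'agent-motion-target' representation described above, here is a sketch in Python; the class names and labels are our own illustrative choices, not the actual system from the talk.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Triplet:
        agent: str               # who acts, e.g. 'person_1'
        motion: str              # verb-like motion label, e.g. 'approach'
        target: Optional[str]    # object or second agent acted upon, if any

    @dataclass
    class CompositeAction:
        name: str
        parts: List[Triplet]     # ordered sequence of atomic actions

    # A person-object interaction such as climbing a fence decomposes into
    # an ordered sequence of atomic agent-motion-target triplets.
    climb_fence = CompositeAction(
        name='climb_fence',
        parts=[Triplet('person_1', 'approach', 'fence'),
               Triplet('person_1', 'ascend', 'fence'),
               Triplet('person_1', 'descend', 'fence')])
    print([f"{t.agent}-{t.motion}-{t.target}" for t in climb_fence.parts])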