
Colloquium

What Should the FCC Do About Net Neutrality?

Date and Time
Thursday, November 20, 2008 - 4:30pm to 6:00pm
Location
Woolworth Center 202
Type
Colloquium
Speaker
Phil Weiser
Host
Edward Felten
“Net Neutrality” means that Internet users get treated equally, no matter which web sites or services they use. Does this rule matter? How can government enforce it without hurting innovation? Phil Weiser, a law professor and veteran of the Department of Justice, will offer an answer.

Abstract: Institutional Design, Network Neutrality, and FCC Reform

The Federal Communications Commission’s management of the Comcast affair and its ultimate decision in that case sanctioning Comcast’s behavior underscore several critical points about the future direction of Internet policy in the U.S. First, without some form of effective oversight, broadband providers are unlikely to refrain from strategic behavior or from innocently intended conduct with unfortunate results. Second, the era of the “unregulation of the Internet” is over. And, third, the FCC has failed to develop a thoughtful institutional strategy for making the necessary judgments and implementing an effective oversight regime to govern the behavior of broadband providers.

To address the above points, the FCC should develop a new institutional strategy for confronting network neutrality issues like the one raised in the Comcast case and others likely to arise in the future. In particular, the FCC should authorize a private body to play a self-regulatory role and thereby enlist the traditional Internet ethos of collaboration as a policymaking tool. Such a self-regulatory effort, however, cannot succeed without effective oversight and involvement from the FCC, meaning, among other things, that its own adjudicative processes should be designed and managed in a far more effective manner than employed in the Comcast case. Notably, the FCC should develop processes more like a true adjudicative model, where it can find facts through evidence submitted under oath and subject to cross-examination, and use the deterrent value of such processes to encourage honest and good faith collaboration among the relevant stakeholders as well as compliance with the relevant standards of conduct.

While there is a role for some front-end guidance and definition of the relevant policy goals, the FCC cannot simply announce a set of rules or principles and declare victory. Indeed, as the Comcast case underscores, broadband providers will, for any number of reasons, test the bounds of principles like “reasonable network management” and the FCC will thus need to develop an institutional response for addressing such issues. To be sure, putting such a model into practice presents a number of challenges, but there are valuable guideposts on the road and enlightened broadband providers, applications developers, and end users alike recognize that it is in their collective interest to avoid overly intrusive or unduly politicized regulatory processes that have historically not governed the Internet. Over time, moreover, the model suggested herein may be suitable for a number of telecommunications policy issues that have yet to generate a robust institutional response from the FCC, including managing property rights in spectrum, overseeing interoperability between IP-IP communications, and ensuring interconnection between Internet backbone providers.

Biographical Information:

Professor Phil Weiser is a professor of law and telecommunications at the University of Colorado. At CU, he has worked to establish a national center of excellence in telecommunications and technology law, founding the Journal on Telecommunications & High Technology Law and the Silicon Flatirons Center for Law, Technology, and Entrepreneurship, as well as writing and teaching in the areas of telecommunications and information policy. Over the last ten years, Weiser has co-authored two books (Digital Crossroads: American Telecommunications Policy in the Internet Age (MIT Press 2005) and Telecommunications Law and Policy (Carolina Academic Press 2006)), written numerous articles, and testified before both houses of Congress.

Prior to joining the CU faculty, Professor Weiser served as senior counsel to the Assistant Attorney General in charge of the Antitrust Division at the United States Department of Justice, advising him primarily on telecommunications matters. Before his appointment at the Justice Department, Weiser served as a law clerk to Justices Byron R. White and Ruth Bader Ginsburg at the United States Supreme Court and to Judge David Ebel at the Tenth Circuit Court of Appeals. Weiser graduated with high honors from both the New York University School of Law and Swarthmore College.

Professor Weiser is also a past LAPA Fellow.

Sponsored by CITP, LAPA, and Microsoft.

Contact: Laura Cummings-Abdo, Program Assistant, Center for Information Technology Policy, Princeton University, Sherrerd Hall, Room 303, Princeton, NJ 08544. Tel: 609-258-9658. http://citp.princeton.edu/

GPU Computing

Date and Time
Thursday, October 9, 2008 - 4:30pm to 6:00pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Ian Buck, from NVIDIA
Host
Adam Finkelstein
Many researchers have observed that general-purpose computing with programmable graphics hardware (GPUs) shows promise for solving many of the world's compute-intensive problems, many orders of magnitude faster than conventional CPUs. The challenge has been working within the constraints of a graphics programming environment to leverage this huge performance potential. GPU computing with CUDA is a new approach to computing in which hundreds of on-chip processor cores simultaneously communicate and cooperate to solve complex computing problems, transforming the GPU into a massively parallel processor. The NVIDIA C compiler for the GPU provides a complete development environment that gives developers the tools they need to solve new problems in computation-intensive applications such as product design, data analysis, technical computing, and game physics. In this talk, I will provide a brief history of computing with GPUs and describe how CUDA can solve compute-intensive problems. Looking forward, the shift to heterogeneous computing is exciting new territory for languages, operating systems, compilers, and architectures, and I will share some of NVIDIA's vision of that future.
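
For readers unfamiliar with the CUDA model described above, the following is a minimal, self-contained sketch of the basic pattern: a kernel function executed by many lightweight GPU threads, launched from ordinary C/C++ host code. It is illustrative background only, not material from the talk, and error checking is omitted for brevity.

// Illustrative CUDA sketch: each GPU thread adds one pair of array elements.
#include <stdio.h>
#include <cuda_runtime.h>

// Kernel: runs on the GPU, one thread per output element.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host buffers.
    float *ha = (float*)malloc(bytes), *hb = (float*)malloc(bytes), *hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers, plus copies from host memory to GPU memory.
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes); cudaMalloc((void**)&db, bytes); cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256, blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);  // implicitly waits for the kernel
    printf("c[0] = %f\n", hc[0]);                        // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Compiled with nvcc, each of the n additions maps onto its own GPU thread; the same data-parallel pattern underlies the application areas listed above.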

Bio:

Ian Buck completed his Ph.D. at the Stanford Graphics Lab in 2004. His thesis, titled "Stream Computing on Graphics Hardware," researched programming models and computing strategies for using graphics hardware as a general-purpose computing platform. His work included developing the "Brook" software toolchain for abstracting the GPU as a general-purpose streaming coprocessor. Ian received his B.S.E. from Princeton and is a proud former member of the Princeton CS Graphics Group. He is currently Software Director for GPU Computing at NVIDIA.

Games With a Purpose

Date and Time
Wednesday, October 8, 2008 - 4:30pm to 6:00pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Luis von Ahn
Host
Edward Felten
http://citp.princeton.edu/events/lectures/luis-von-ahn/

Sensing for Autonomous Driving: Some Lessons from the DARPA Urban Challenge Race

Date and Time
Friday, September 26, 2008 - 1:00pm to 2:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Prof. Dan Huttenlocher, from Cornell University
Host
Fei-Fei Li
Team Cornell's Skynet is one of six vehicles that successfully completed the 2007 DARPA Urban Challenge, with over 55 miles of fully autonomous driving in an urban environment. The competition included many scenarios such as staying in a lane, merging into traffic, passing other vehicles, obeying queueing order at stop signs, parking, and robot-robot interaction. Skynet was designed to drive "human-like," with smooth, predictable behaviors, even in the presence of a vast array of uncertainties. In this talk I will describe the vehicle design with a focus on the systems for perception and planning, and will present some results from the semi-finals and the final race. In particular I will discuss the pose estimation system, which uses visual input to improve the estimate of the vehicle's location, and the object tracking system, which can simultaneously track dozens of objects while accurately estimating their speed and heading. I will also discuss some of the limitations of such systems, which played a role in the fender bender between Skynet and MIT's Talos robot.

Speaker: Dan Huttenlocher is the John P. and Rilla Neafsey Professor of Computing, Information Science and Business at Cornell University, where he holds a joint appointment in the Computer Science Department and the Johnson Graduate School of Management. His current research interests are in computer vision, social and information networks, geometric algorithms, and autonomous driving. He has been recognized for his research and teaching contributions, including being named an NSF Presidential Young Investigator, New York State Professor of the Year, and a Fellow of the ACM. In addition to academic posts, he has been chief technical officer of Intelligent Markets, a provider of advanced trading systems on Wall Street, and spent more than ten years at Xerox PARC directing work that led to the ISO JBIG2 image-compression standard.

Building a Strong Foundation for the Future Internet

Date and Time
Wednesday, May 14, 2008 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Jennifer Rexford, from Princeton University
Host
Jennifer Rexford
The Internet is unquestionably a tremendous success---a research experiment that truly escaped from the lab. However, the Internet faces many technical challenges that, while deeply rooted in early design decisions, have grown even more complex as the network has evolved into a world-wide commercial infrastructure.

In this talk, we argue that perhaps the most important goal for a future Internet is the ability to define, model, and analyze it precisely, so we can make stronger statements about its basic properties. Using Internet routing as a driving example, we discuss the research challenges in designing protocols that are simultaneously programmable (so they are flexible and can evolve over time) and perform well in a competitive economic environment (where different parts of the system are controlled by parties with different, sometimes conflicting, objectives).

We believe that answering these fundamental questions presents a wonderful opportunity for theoretical research in computer science, electrical engineering, economics, and mathematics.

Learning Structured Bayesian Networks: Combining Abstraction Hierarchies and Tree-Structured Conditional Probability Tables

Date and Time
Wednesday, May 7, 2008 - 1:00pm to 2:30pm
Location
Computer Science 418B
Type
Colloquium
Speaker
Marie desJardins, from University of Maryland
Host
Jennifer Rexford
In this talk, I will describe our research on incorporating background knowledge in the form of feature hierarchies during Bayesian network learning. Feature hierarchies enable the learning system to aggregate categorical variables in meaningful ways, thus providing an appropriate "discretization" for a categorical variable. In addition, by choosing the appropriate level of abstraction for the parent of a node, we also support compact representations for the local probability models in the network. We combine this notion of selecting an appropriate abstraction with context-specific independence representations, which capture local independence relationships among the random variables in the Bayesian network. Capturing this local structure is important because it reduces the number of parameters required to represent the distribution. This can lead to more robust parameter estimation and structure selection, more efficient inference algorithms, and more interpretable models.
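
As a back-of-the-envelope illustration of why this matters (numbers chosen for illustration, not taken from the work described here): a full conditional probability table for a node with $k$ parents, each taking $v$ values, has $v^{k}$ rows, so a binary node with five binary parents already needs $2^{5} = 32$ rows (one free parameter per row). A tree-structured CPT that distinguishes only, say, four parent contexts needs just four parameters, and abstracting a parent to a coarser level of its value hierarchy reduces $v$ directly.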

I will describe our primary contribution, the Tree-Abstraction-Based Search (TABS) algorithm, which learns a data distribution by inducing the graph structure and parameters of a Bayesian network from training data. TABS combines tree structure and attribute-value hierarchies to compactly represent conditional probability tables. In order to construct the attribute-value hierarchies, we investigate two data-driven techniques: a global clustering method, which uses all of the training data to build the attribute-value hierarchies, and can be performed as a preprocessing step; and a local clustering method, which uses only the local network structure to learn attribute-value hierarchies. Empirical results in several benchmark domains show that (1) combining tree structure and attribute-value hierarchies improves the accuracy of generalization, while providing a significant reduction in the number of parameters in the learned networks, and (2) data-derived hierarchies perform as well or better than expert-provided hierarchies.

BIOGRAPHY Dr. Marie desJardins is an associate professor in the Department of Computer Science and Electrical Engineering at the University of Maryland, Baltimore County. Her research is in artificial intelligence, focusing on the areas of machine learning, multi-agent systems, planning, interactive AI techniques, information management, reasoning with uncertainty, and decision theory.

Dr. desJardins can be contacted at the Dept. of Computer Science and Electrical Engineering, University of Maryland Baltimore County, 1000 Hilltop Circle, Baltimore MD 21250, mariedj@cs.umbc.edu, (410) 455-3967.

How Are Mobile Phones Changing Families?

Date and Time
Wednesday, April 30, 2008 - 4:30pm to 6:00pm
Location
McCosh Hall 46
Type
Colloquium
Speaker
Jim Katz
Host
Edward Felten

The Truth About Quantum Computers

Date and Time
Wednesday, April 23, 2008 - 4:30pm to 6:00pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Umesh Vazirani, from UC Berkeley
Host
Sanjeev Arora
Popular articles on quantum computers sometimes portray them as mythical devices that speed up any computation by an exponential factor. The truth is much more subtle, and over the last decade and a half we have learned a great deal about the circumstances under which Nature allows us access to its exponential computational resources. This has a bearing on our understanding of the foundations of both computer science and quantum physics. In this talk I will try to outline some of these issues. No background in quantum physics will be assumed.
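
As general background on why the truth is subtle (standard results, not necessarily the examples used in the talk): Grover's algorithm finds a marked item among $N$ unsorted items using $O(\sqrt{N})$ quantum queries, versus $\Theta(N)$ classically, a quadratic rather than exponential speedup; Shor's algorithm, by contrast, factors an $n$-bit integer in time polynomial in $n$, whereas the best known classical factoring algorithms take super-polynomial time.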

Recent Directions in Nonparametric Bayesian Machine Learning

Date and Time
Thursday, March 27, 2008 - 4:15pm to 5:45pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Zoubin Ghahramani, from Carnegie Mellon University
Host
David Blei
Machine learning is an interdisciplinary field which seeks to develop both the mathematical foundations and practical applications of systems that learn, reason and act. Machine learning draws from many fields, ranging from Computer Science and Engineering to Psychology, Neuroscience, and Statistics. Because uncertainty, data, and inference play a fundamental role in the design of systems that learn, statistical methods have recently emerged as one of the key components of the field of machine learning.

In particular, Bayesian methods, based on the work of Reverend Thomas Bayes in the 1700s, describe how probabilities can be used to represent the degrees of belief of a rational agent. Bayesian methods work best when they are applied to models that are flexible enough to capture the complexity of real-world data. Recent work on non-parametric Bayesian methods provides this flexibility.
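
For concreteness (standard background rather than material specific to this talk): given data $D$ and model parameters $\theta$, Bayes' rule updates a prior belief $p(\theta)$ into a posterior

$$p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}, \qquad p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta.$$

Nonparametric methods keep this recipe but let the effective dimensionality of $\theta$ grow with the amount of data, which is the flexibility referred to above.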

I will touch upon key developments in the field, including Gaussian processes, Dirichlet processes, and the Indian buffet process (IBP). Focusing on the IBP, I will describe how this can be used in a number of applications such as collaborative filtering, bioinformatics, cognitive modelling, independent components analysis, and causal discovery. Finally, I will outline the main challenges in the field: how to develop new models, new fast inference algorithms, and compelling applications.

Software Transactions: A Programming-Languages Perspective

Date and Time
Wednesday, March 26, 2008 - 4:15pm to 5:45pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Dan Grossman, from University of Washington
Host
David Walker
With multicore processors bringing parallel computing to the masses, there is an urgent need to make concurrent programming easier. Software transactions hold great promise for simplifying shared-memory concurrency, and they have received enormous attention from the research community in the last couple of years. This talk will provide an overview of work done at the University of Washington to help bring transactions to the next generation of programming languages. No prior knowledge of software transactions will be necessary.

Our work complements research done on hardware and software algorithms for implementing transactions by considering essential issues regarding how transactions affect language semantics and language implementation. Our motivation takes the novel view that transactions can improve language support for concurrency much like garbage collection can improve language support for memory management. Our language design and language semantics work has considered the pitfalls of so-called "weak isolation" and how to avoid them, interaction with other language features like native calls and exceptions, and the implications for shared-memory consistency models. Our implementation work includes techniques for the special case of a uniprocessor, a whole-program static optimization that uses pointer information to remove unnecessary read- and write-barriers while providing "strong isolation", and ongoing work for allowing parallelism within transactions.
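
To make the terminology concrete, the sketch below shows what read- and write-barriers are in a typical software transactional memory implementation: each shared-memory access inside an atomic block is routed through the runtime's bookkeeping calls. This is a generic, hypothetical illustration; the stm_read/stm_write stubs are invented names, not the API of the University of Washington system.

// Hypothetical sketch of STM barrier instrumentation (stub bookkeeping routines).
#include <cstdio>

// Stubs standing in for a real STM runtime's read-set/write-set logging.
int  stm_read(const int *addr)       { /* would log addr in the read set  */ return *addr; }
void stm_write(int *addr, int value) { /* would log addr in the write set */ *addr = value; }

int balance = 100;   // shared state

// What the programmer conceptually writes:
//   atomic { balance = balance + amount; }
// What a barrier-inserting compiler conceptually produces:
void deposit_instrumented(int amount) {
    int tmp = stm_read(&balance);        // read barrier
    stm_write(&balance, tmp + amount);   // write barrier
}

int main() {
    deposit_instrumented(50);
    std::printf("balance = %d\n", balance);   // prints 150
    return 0;
}

The pointer-based optimization mentioned above removes such barriers wherever static analysis proves they are unnecessary, while still providing strong isolation.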

Dan Grossman is an Assistant Professor in the Department of Computer Science & Engineering at the University of Washington. His research in the design and implementation of programming languages is aimed at improving software quality.
