Preface

This volume, updated in May 2013, includes 45 opinion columns written for The Daily Princetonian, the student newspaper at Princeton University, from 2006 to 2013. I'm grateful to the Opinion editors each year, who offered advice and their own opinions, and accepted with good grace my unwillingness to change my deathless prose to match their editorial requirements. I'm also deeply indebted to my friend and fellow columnist Joshua Katz, Professor of Classics, for his unfailing support.
Princeton, NJ, May 2013
Table of Contents
Preface
Advise and consent (10/02/06)
Can you hear me now? (11/06/06)
The sounds of music (12/04/06)
Too precious to waste on the young? (02/05/07)
What do they teach in professor school? (03/05/07)
Lost and found (04/02/07)
The Sons of Martha (05/07/07)
Sometimes the old ways are best (09/17/07)
By the dawn's early light (10/15/07)
Other princes (11/19/07)
Making the grade (1/21/08)
Where the books are (2/18/08)
The bleeding edge (3/24/08)
By the numbers (4/21/08)
Native guide (9/11/08)
Leading indicators (10/20/08)
What would Orwell do? (11/24/08)
The ratings game (1/7/09)
Washington crossings (2/16/09)
Gladly learn, and gladly teach (3/23/09)
What makes smart phones smart? (4/20/09)
How was your summer? (9/21/09)
Home alone (10/19/09)
A sense of where you are (11/23/09)
Millions, billions, zillions (2/22/10)
Action figures (3/29/10)
Friends and family (4/26/10)
Call me Ned (9/27/10)
We're number one! (10/25/10)
Sic transit (11/30/10)
Anti social (1/31/11)
Enter to grow in wisdom (3/1/11)
Read any good books lately? (4/4/11)
Spring is sprung (5/2/11)
Powerless (10/3/11)
Give me your undivided attention (11/7/11)
Who goes there? (12/5/11)
So you want to write a book (3/12/12)
Preview of coming attractions (4/16/12)
I don't belong here! (9/24/12)
Watchbirds (10/22/12)
D construction (11/26/12)
The very model of a modern major major (2/3/13)
Somewhere in time (3/4/13)
Hack Princeton (4/19/13)
Advise and consent (10/02/06)

Shopping period is over, schedule conflicts are resolved (though not
always in ideal ways) and everyone is settled into their courses.
Academic advising is on the back burner until late November.
By now I've advised my new freshmen, caught up with most of my
sophomores, and talked about courses, independent work, grad schools and
jobs with a lot of juniors and seniors. Whether good or bad, much
advice has been dispensed, and I sometimes wonder what effect it has.
I started academic advising in the fall of 2001, late on a Monday
afternoon. My qualifications for the job were that I had been on the
faculty for one year and had taken a three-hour course that morning.
Adviser training was a blur of course numbers and prerequisites and
premed requirements and AP credits and distributions, all knotted
together by intricate paths through departments and certificate programs
that I had never even heard of, let alone known intimately.
"Unprepared" doesn't begin to capture how woefully ignorant
I was of what my innocent new freshmen would need to know and of how to
steer them safely through the maze. Nor was I alone; most new advisers
come to the position with a similar lack of expertise.
In the event, I muddled through, rescued dozens of times by real
experts like John Hodgson, the Dean of Forbes, his counterparts from the
other colleges and my more experienced faculty colleagues.
In retrospect, it appears that no one was permanently damaged. The
last of that group graduated last June. A handful took five years
instead of four, but they all eventually made it, apparently in good
condition, and they have gone on to a variety of successes, including
four or five first-rate grad schools, a med school or two and some truly
interesting jobs. I can't claim any agency for this, of course, but I
do take a great deal of pleasure from it. I've also made some wonderful
friends, both among the students who were so patient with my learning
curve and among the residential college staff and faculty advisers.
I've gotten better at advising over the years — the weird rules
about rescinding a pass/D/fail have finally been internalized, and I can
tell you how to get a summer course approved — but much remains
just as fuzzy as it was then. The difference is that I don't worry
about it as much, not because I care less about my advisees but because
I now know where to get help and I've seen how the system bends ever so
flexibly when it should.
I've also figured out what the job is, at least to some degree.
Harry Truman once said, "I have found that the best way to give
advice to your children is to find out what they want, and then advise
them to do it." Weighing in from an earlier time, Samuel Johnson
wrote, "Young people are readier to talk than to attend, and good
counsel is only thrown away upon those who are full of their own
perfections." In short, the adviser's role is to listen to your
story, get you to say clearly what most appeals to you, agree that it's
a fine idea, then sign the mysterious SCORE form.
All this presupposes, however, that the advisee has thought pretty
hard about what he or she wants to do, and that's the place where things
sometimes break down. "Do you know any good courses?" I've
been asked that very question (though not recently), and of course the
answer is "Yes, lots of them," but it doesn't really narrow
the options.
By hearsay and even a bit of firsthand experience, advisers will know
of some really good courses that might fill a gap or painlessly kill off
a distribution requirement, but no adviser should be telling you what to
do if you haven't figured out at least the basics for yourself. To get
anything out of this exercise, you have to put something into it.
Showing up for an advising appointment without even a copy of Course
Offerings isn't the best way to plan for the next semester or for your
ultimate escape with degree in hand.
So talk to friends and family, dig through online opinions and set
out a tentative plan. If you've done your homework, the rest is easy
— your adviser will be able to advise and happy to consent. Keep
it in mind for when you meet again at the end of November.
Can you hear me now? (11/06/06)

I'm certainly not an uber-geek, but I spend a lot of time with
computers and I teach a course about how computers and communications
affect society, so in theory I should be an expert on things
technological. But every once in a while there are hints of cracks in
the facade. This year they've converged on a pervasive technology where
I've clearly fallen way behind.
In July there was a story of a guy cheating on his wife and trading
hot text messages with his girlfriend via cell phone. He decided to
upgrade and sold the old one (the phone, not the girlfriend) on eBay,
where the buyer, who was doing a study about what information remains on
secondhand cell phones, discovered the messages. One imagines that the
guy might be a bit worried that his wife will find out.
The story was a great example for the first lecture of my class,
illustrating how technology can have utterly unexpected consequences.
For me it was intriguing on several fronts (leaving aside how satisfying
text-message sex might be): why would the messages be stored on the
phone, and how many could it hold anyway?
My class right now is in Peyton 145, a decent-enough lecture room
compared to most, in an almost unknown building just east of the
tortured metal of the new science library. One thing that Peyton 145
has going for or against it, depending on your point of view, is that
there's cell phone coverage in the room. This is a new experience
— my previous classes have mostly been in the basement of Friend,
which is still mercifully beyond the reach of Verizon, T-Mobile and the
rest.
At the beginning of one lecture, someone found a cell phone on the
floor, presumably dropped during the previous class. I left it on the
table at the front of the room, figuring that its owner would eventually
retrace his or her steps. Silly me — there's a much more obvious
solution if one is tech savvy. Twenty minutes into the hour it rang,
and when I answered, a female voice said "Do you have my
phone?"
I worked for many years for AT&T, the company that invented the
cell phone and that, famously, believed the consulting firm that advised
them in 1983 that there would never be much of a market for a portable
phone. AT&T subsequently went through grim times, including being
bought out, but the consulting firm continues to thrive. Draw your own
conclusions, though it does suggest there's no need to be right if
you're convincing.
At AT&T I even worked for a time on software for figuring out the
best places to put cell phone equipment, so perhaps it's a kind of rough
justice that my own cell phones have never worked very well. The campus
is full of students permanently on the phone. The highways are
dangerously crowded with idiots blathering at 75 miles an hour. In the
city, schizophrenics talking to themselves are vastly outnumbered by
hands-free cell phone users, acting just as oddly but providing cover
for the truly disturbed. My phone, however, didn't work at home, in my
office or anywhere else. The final straw came last month when I spent
three days in midtown New York and was unable to make a single call.
Thus it was that I found myself exploring the cell phone stores that
have sprung up like Starbucks on every corner and in every mall.
Salespeople were always polite but visibly astonished at the antiquity
of my phone (bought in 2000, shortly after the end of the Jurassic
period) and my usage pattern (rarely more than one minute per month),
and it was hard to find common ground. I just want to make occasional
calls. What would I do with a "basic" phone that only has
Bluetooth, a video camera, voice dialing, downloadable ringtones, an
Internet browser and calling plans that begin at 700 minutes a month?
In the end, I bought a new phone that appears to have most of the
above, though I have yet to plow through the 120-page instruction manual
to be sure. It lets me make phone calls, but it often doesn't work in
my office and barely in my home. It takes pictures but refuses to send
them anywhere. And it costs more. So I am now less ignorant of this
brave new world but no more happy in it. I have no desire to emulate
the multitude whose phones have been surgically attached to their ears
so they can talk 24/7, but it sure would be nice to have a phone that
works.
The sounds of music (12/04/06)

"Here will we sit, and let the sounds of music / Creep in our ears."

The iPod has been an enormous success: 60 million have been sold, and
even a short walk around campus suggests that thousands of those sales
have been to Princeton students, for whom the distinctive white earbuds
are the path through which the sounds of music creep.
In mid-October, Newsweek interviewed Steve Jobs, founder and CEO of
Apple, for the fifth anniversary of the iPod. Mr. Jobs was asked
whether he was worried about Microsoft's about-to-be-released iPod
competitor, the Zune. His response also mentioned ears, though without
Shakespeare's imagery or language. "In
a word, no. I've seen the demonstrations on the Internet about how you
can find another person using a Zune and give them a song they can play
three times. It takes forever. By the time you've gone through all
that, the girl's got up and left! You're much better off to take one of
your earbuds out and put it in her ear. Then you're connected with
about two feet of headphone cable."
I'm not au courant enough with today's social customs to know whether
this "stick it in your ear" approach would be deemed sweet or
romantic or just gross, but Mr. Jobs does know something about
marketing, so I would put my money on his conclusions, if not his
technique. A bit of quantitative reasoning shows that 60 million iPods
at perhaps $200 each is $12 billion. That revenue would be a modest
drop in any Microsoft bucket, but it's not shabby either.
At the heart of the rivalry between the iPod and the Zune is a
life-and-death struggle to make money by selling something —
digital music — that can be copied in unlimited amounts and
distributed everywhere in the blink of an eye, and all for free. When
Napster appeared in 1999 (how long ago that seems now), it showed that
there was a huge demand for music, especially if the price was right,
and Napster's price of zero was hard to beat.
The recording industry decided that they were losing money because
people were downloading music for free instead of paying high prices for
CD's with a dozen songs, only one or two of which were good. The RIAA
sued Napster out of business and then began to sue potential customers,
a marketing strategy that continues to this day. Apple came up with a
better idea — offer songs online for a flat 99 cents and sell
great portable gadgets for playing them. Apple has resisted pressures
to change the pricing structure and their digital rights management has
been pretty reasonable, so their business is booming.
By contrast, the Zune appears not to be as well thought out. I can
send you a song wirelessly (the Zune's big selling point) but the song
can only be played three times, and, no matter what, it disappears into
the ether or the bit bucket after three days. Music that you might have
bought earlier through Microsoft's "Plays For Sure" program
won't play at all on the Zune. It doesn't work with Windows Media
Player, nor yet with Vista, the brand-new version of Windows.
Microsoft often gets things right in the end, but I suspect that the
iPod will hang on to its market share for a while yet.
Let's do a bit more QR. Apple has made $1.5 billion selling songs
through iTunes; that's 1.5 billion songs or about 25 songs per iPod.
But a typical iPod can hold thousands of songs, so most iPods contain
little if any music purchased from Apple, and I'd be willing to bet that
most of the songs still come from sharing of one kind or another, in
spite of the RIAA's efforts.
Meanwhile, my personal search for good music involves a weekly visit
to the Princeton Record Exchange. After some initial binge buying, it
became clear that discipline was called for, so now I won't buy any CD
that costs more than $2. That doesn't limit me much, and my collection
has exploded in the past five years. Two dollars a CD is way below 99
cents a song, and I don't lease the bits — I own them, in an open
format that will never go away. With low prices and no DRM, I buy all
kinds of music on speculation, and pass it on to someone else if it
doesn't work out. One final QR exercise: I could rip my 600 CD's into
MP3 and load them all onto a single iPod with room to spare. Maybe it's
time to buy one after all.
Too precious to waste on the young? (02/05/07)

"Come, ho! and wake Diana with a hymn."

"What am I going to take?" is the burning question for everyone
during shopping period. The good news is that there is an endless
number of interesting courses, from AAS to WWS; the bad news, of course,
is that they're all scheduled at exactly the same time. Whoever said
that "Time is nature's way of making sure that everything doesn't
happen at once" had never seen the pileups on MW and TTh at 11 a.m.
There's another group who worried about this question a few weeks
ago, people that we all see but don't really notice. In the back of
almost every medium to large class is a collection of students who don't
fit the 18-to-22-year old demographic of the typical Princeton
undergrad; indeed, many of them look more like grandparents (or, as my
wife observes, like me). These are the community auditors.
I had never heard of community auditors until five or six years ago,
when someone asked if a couple of them could attend my class. The
ground rules were simple: auditors had to sit at the back and say
nothing (just like regular undergrads). In fact, it's a bit more
complicated than that — they have to stand in line, way before the
semester starts, to sign up for the one or two courses they want, hoping
that the auditor quota won't be reached. They also pay a not
inconsequential fee of $120 per course.
Ever since, I have had a few auditors in each of my classes. They
are readily identifiable by age, though not all are "senior
citizens," but they are also distinguished by a degree of regular
attendance and attentiveness not always achieved by the younger
generation. They are, to a man or woman, remarkably nice people,
accomplished, eager to learn and great fun to talk to. Some have become
good friends, an unexpected but wonderful benefit.
When I first encountered community auditors, it occurred to me that I
could audit courses too. In theory, there might be time to audit one
course each semester, though in practice it usually only works in the
spring. It's been pretty easy to disguise myself as a community
auditor; indeed, I was even carded once in POL 380 by a
clipboard-carrying lady who was weeding out interlopers.
("Faculty," I whispered, and she left me alone.) By now I've
audited half a dozen courses, not only filling in some of the giant
holes in my formal education, but also seeing how gifted teachers do
their thing so well.
One gets a different perspective on the educational process by
sitting in the back instead of standing at the front. It's easy to
estimate what fraction of students cut classes, by counting empty seats
over the semester or by noticing which familiar faces show up only
infrequently. One can see what's happening on the laptops whose screens
are hidden from the instructor. (There's some diligent note-taking, to
be sure, but also a lot of equally diligent emailing, chatting,
game-playing and random surfing.) One can monitor the student diet, or
at least portable components like bagels and coffee. And one even gets
hints of social life by observing the whispered conversations, the
continual checking of cell phones and the occasional semiprivate public
displays of affection. "Multitasking" is the most positive
spin one could put on these lecture-hall activities, and not
surprisingly, those who sit further toward the back of the room seem to
pay less attention to the front and more to the distractions.
Sleeping in class, however, is more easily seen from the front. Some
narcolepsy is caused by boring lectures and lecturers, but it's also
true that most students are in the middle of four years of major sleep
deprivation, and class is the only time to catch up. So I'm pretty
sympathetic to those who doze off — I've certainly followed suit
in some late afternoon seminars.
But what about those who just never show up? It has often surprised
me how frequently people cut classes. An individual Princeton course
retails for $4,125 (a startling figure that you can verify on the
Continuing Education website). It's not a meaningful computation, of
course, but consider: if a course has 24 lectures, that's about $170
each, which is more than pocket change. It would seem like a Good Thing
to go to every lecture and pay attention at least some of the time.
Failing that, is it better to surf and sleep in the back of McCosh 10 or
in the comfort of one's own room? Good question. Maybe we should ask
the community auditors. I think they have it figured out.
What do they teach in professor school? (03/05/07)

This is the time of year when our department begins to interview
faculty candidates. Every week we see one or two bright young things
(relatively speaking — they are usually in their late 20s or even
mid-30s) who want to join the academic rat race. They spend a day or
two talking to faculty members for half an hour each, they have lunch
with the chair, they go to dinner with the few people whose waistlines
and stomachs can stand yet another dinner at some downtown restaurant
and they give a talk late in the afternoon in a darkened room where
everyone from grad students to senior faculty can fall asleep together.
From this process, we pick the best and the brightest to educate the
youth of America.
It's clear that most of these worthy people have had a lot of advice
on how to give a formal academic presentation. Their PowerPoint slides
are beautifully organized, the background designs are works of art,
bullet points fly in from all directions, the animations jump and spin
and the sequence follows an immutable formula: the problem, the prior
work, the contributions of the thesis, the future work, the
acknowledgments. It's incredibly soporific, and I confess without much
embarrassment that I often doze off.
But when I wake up, I wonder — these fine young people have
been taught how to give a great job talk, but what have they been taught
about giving two or three lectures every week to real students? What do
they know about the reality of being a professor? If they had gone to
Prof. School, what might they have learned?
One thing they should have learned already is that PowerPoint is a
two-edged sword. In the right hands, it can be persuasive and
effective, but in the wrong hands (that is, almost everyone who uses
it), it provides form without content, five minutes worth of talking
points to spread over an hour. If we could somehow convert PowerPoint
slides into pills, insomnia, like smallpox, would be eradicated from the
earth.
The big problem with PowerPoint is that it almost mandates tightly
scripted presentations that are little more than a reading of the
slides. Whether one reads the text verbatim or paraphrases it in real
time, the effect is the same: the listeners know exactly what's coming
up, they can read it faster than you can, and they tune out, perhaps not
to return until several slides later. Nor is there room for spontaneity
or changes of plan. Some of the very best moments in my classes have
come when the lecture veers completely off what I had planned —
someone asks a question from left field, or a demo goes awry, or a
chance visit to a web page leads to something unpredicted but
instructive. (Naturally, some of the very worst moments have arisen in
much the same way, but we'll save those for another column.)
In practice, one learns the tricks of any trade through hard
experience. One thing our young aspirants will have to learn quickly:
Never ever say anything in the last 10 minutes of a class that even
hints that the ordeal might be over. Don't say "Next
time...," even if all you had in mind was, "Next time the Red
Sox win the Series..." Suffering students hear only the promise of
release, and they start to load notebooks and cellphones and iPods and
laptops into their backpacks; one can hardly hear the "..."
over the din.
It's even worse to run over time, of course — most people can
only endure so much anyway, and they get justifiably restless when the
allotted time is up. They have other classes and important obligations
like lunch or practice, and it's unfair to make them late. But what if
there's some topic that you just have to cover so the course doesn't
fall hopelessly behind the syllabus? The natural tendency is to
realize, about five minutes from the end, that there's still 15 minutes
of material, and this requires speaking at three times the normal speed
to fit it all in. Naturally, no one is paying attention, so no one will
understand or even remember what you said, and you'll just have to do it
again next time anyway.
Would we do better if there were formal courses? My wife, who used
to be a sixth grade teacher, got her first position on an emergency
basis, replacing on short notice an incumbent who had become ill. She
was forced, after the fact, to take education courses, not because she
was a poor teacher (quite the contrary), but rather to ratify her
otherwise irregular position. To this day, she swears that the only
thing she remembers is that women were not supposed to wear red dresses,
lest they inflame the students (in sixth grade, mind you). I don't even
have the advantage of an education course. Like my colleagues, I've
learned only through on-the-job training. So I know enough not to wear
a red dress, and, finally, I only say "finally" when it's
really over.
Lost and found (04/02/07)

I found a cell phone on the Street at the beginning of spring break.
A night of exposure to the bitter cold had not dimmed its brave little
backlight, and there was no car traffic, so it hadn't been crushed
either. I took it home to warm up but couldn't see a way to tell who
owned it except by calling "Susie" or "Mom" or
perhaps "Dad's cell," and that seemed too much like an
invasion of someone's privacy. The phone beeped a couple of times
during breakfast, I think because "Susie" was trying to leave
a message, but there was still nothing to identify the owner.
So I carried it to Public Safety down by Baker Rink, where this was
evidently such a common occurrence that they had a well-established
protocol for dealing with it. "No problem," said the lady who
took it off my hands. "It happens all the time." By now the
phone and its owner have long since been reunited, but it surely would
have been a major nuisance if they had not — quite apart from the
expense of having to buy a new one, there must have been a hundred
numbers stored in the phone, and who knows what other precious
information.
By contrast, prox cards are at the low end of electronic
sophistication, but they too must be a nuisance and expensive to
replace. In spite of that, lost ones are readily spotted; I've found
half a dozen in the past few years. There's usually no problem
identifying the owner, though one card was so badly beaten up that it
had clearly been lost for months. That one was autopsied so my class
could see for themselves why the badge office says "don't punch
holes in your prox."
Not every found object is as interesting or valuable as a phone or as
easy to associate with its owner as a prox, but some detritus is both
useful and anonymous. My personal best was a crumpled $20 bill, looking
like any other piece of random litter outside Dillon Gym, but more often
one sees only pennies, which are hardly worth picking up these days. I
found a 128 MB flash drive once, not one byte of which stored anything
at all, let alone the owner's identity. It seemed like a lot of memory
at the time, but in light of today's 4 GB capacities, it's more like
finding a penny.
There's something especially useful that is sometimes lost on the
Street and other parts of campus: unopened beer cans! There's no
shortage of empties, of course, but it's hard to tell which ones are
really pristine — in poor light, from a distance and at a walking
pace, it takes a sharper eye than mine to distinguish full from empty
from some leaky intermediate state. Furthermore, local tastes seem to
run to corn-flavored soda water like Busch Natural Light, so there's
little incentive to do a careful investigation.
Back on the high-tech front, someone found an iPod Nano in my class a
few weeks ago and left it with me to track down whomever it belonged to.
Email to class members drew no response, so I played with it for a
while, looking for its owner's name. In theory and in common belief,
iPods are easy and natural to use, but it took me more than a few
minutes to figure out how to make the click wheel do its thing; online
reviews suggest that I'm not the first person to think that the Nano
isn't as user-friendly as its bigger siblings. Of course I was
discharging its tiny battery while searching, so it had to be turned off
before it ran down completely. That required consulting an expert
(i.e., the nearest person under 25), who showed me how to hold down the
bottom button long enough. Eventually I did discover a screen that said
"Joe's iPod," so Joe got his expensive toy back.
Surprisingly, the accompanying cable that had gone missing at the same
time was still lying on the classroom floor two days later; it clearly
hadn't been worth picking up, any more than pennies are.
Today's trend is to combine functions into ever smaller and more
powerful electronic devices, like Apple's forthcoming iPhone, which will
combine the functions of an iPod and a cell phone, and which could
certainly be a prox as well. That means we can pack more in —
music and movies and messages and phone numbers almost without limit.
But it also means that as gadgets get smaller and smarter, we're more
vulnerable to losing something big when we lose something small.
Meanwhile, if you want to misplace some money or decent beer, how about
dropping it outside my office so it's easy to find.
The Sons of Martha (05/07/07)

During the great Nor'easter of April 2007 (exactly three weeks ago!),
there was a flurry of Public Safety announcements to the effect that
"nonessential personnel" could stay home rather than face
multi-hour commutes over flooded roads.
Naturally one wonders, "Am I essential or could they do without
me?" That was never spelled out for faculty, and cynics could argue
it either way, but at least indirectly one of the messages appeared to
answer the question: "The academic schedule is operating as
normal." Unlike students, who clearly need not be present for
classes, faculty must be.
But who are the real "essential personnel"? In various
ways, we all are — the place wouldn't be the same without us
— but let me put in a special plug for a group that is often
pretty much invisible and whose contributions are easily overlooked.
Think for a moment about the building services people who keep things
running, often very early in the morning, probably for modest pay and
zero recognition, if indeed they are even noticed as we go about our
business. I think of Pearl Holloway, for example, who keeps the
Computer Science Building shipshape. I'm an early person, and I often
arrive at my office by 7:30 or 8 a.m. But I have never gotten there
before Pearl has been around to empty the trash and generally straighten
up the mess we left the day before. When I walk across campus at 6 a.m.
to get a paper at the Wa, I often run into grounds crews, like Frank and
Marco, who are already on the job, shoveling and sanding the sidewalks
in the winter or repairing the ravages of the weather or cleaning up
leaves and other debris so the campus is always neat. Indeed, there's a
small army of essentials: wonderfully friendly and helpful people who
labor in the background so we can have an easy and comfortable life
here.
So what does this have to do with the Sons of Martha?
In 1922, Rudyard Kipling was commissioned to create a ceremony for
graduating Canadian engineering students. This secret "Ritual of
the Calling of an Engineer," which I went through in 1964, is held
in early May. The only public manifestation is that Canadian engineers
wear, on the little finger of the working hand, an Iron Ring symbolizing
the engineering profession. Legend has it that the original iron rings
were made from steel salvaged after the collapse of the Quebec Bridge in
1907, an engineering failure of such magnitude that it has not been
forgotten to this day. Modern iron rings are stainless steel; mine,
which predates that era, rusted for months until my finger came to an
understanding with it. I often identify Canadian engineers by their
rings, and I still have the one that my father received in 1931 and wore
until his death.
In 1907, Kipling wrote a poem called "The Sons of Martha,"
which he used as part of the iron ring ceremony. His inspiration came
from Luke 10:38-42. Jesus visited Mary and Martha, the sisters of
Lazarus, at their home. Mary sat at the feet of Jesus to hear him
speak; Martha, worried about providing for her eminent guest, complained
to Jesus that Mary was not helping. Jesus chided her gently, and, in
Kipling's poem, her descendants forever after are consigned to working
in the background to help everyone else, the sons of Mary.
Kipling's poem speaks mostly of heavy machines and those who operate
mechanical systems (and of course he wrote before women engineers), but
the spirit of the poem applies far beyond that. Translated into modern
terms, most of us pay no attention to those who work long and hard
behind the scenes with little recognition, let alone thanks. Think
about them the next time that someone nearly invisible keeps the
machinery working for you. Where would we be without today's sons and
daughters of Martha?
Sometimes the old ways are best (09/17/07)

Last week I went to check out the room where I'm teaching this fall.
It's the same one as last year, so its quirks are mostly familiar. The
tangle of wires at the front is the same, but I got pretty good at
dodging them last time, a skill that should come back quickly. The
spotlights that shine directly on the bottom half of the screen seem to
be aimed a bit higher than last time, so projecting while using the
board might be harder. The room lighting level remains so dim that the
back rows disappear into the gloom, which makes them great for napping,
but less so for pedagogy. The laptop projector and my new Mac came to
only a fuzzy agreement on what to display, so I'll probably wind up
using my ancient PC.
Last year I was on a panel to discuss technology in education. The
purpose was presumably to look forward to new and better gadgetry for
teaching, but I was not the right person to lead the charge: in spite of
being in a high-tech field, I'm basically a reluctant late follower for
most aspects of technology, and especially in the classroom. I've never
tried clickers. I don't blog or chat. My first life is busy enough
that I couldn't cope with a second one. I use Blackboard only under
duress: it's the only way to get pictures of the people in my classes.
(Note to students: please update your high school yearbook shots —
you don't look like that now.)
I do like to use a variety of media in class. Foils on an overhead
projector are usually best because I can change the sequence, slide them
up and down for better visibility, play peekaboo and write on them.
PowerPoint is deadly, but a laptop is good for pictures, demos,
experiments or anything dynamic. And the board remains best for
answering questions, elaborating on some topic or making rough sketches.
(As anyone who has been in my class will attest, rough is as good as it
gets.)
As I thought about the panel, it became clear that what we really
need is not new technology, but simple old-fashioned technology that
works. For example, I'd like a whiteboard that's well lit, though I
could live with a decent blackboard, which is preferred by colleagues
who are even more backward-looking. I'd like an overhead projector that
shows a bright sharp image that can be seen from anywhere in the room;
the field of optics has been around for over 300 years, and by now we
ought to be able to make a projector that will focus the entire image,
rather than forcing me to choose which part will be too blurry to read.
I'd like a bright sharp laptop projector, and it would be wonderful if
both projectors could be used at the same time.
There should be light on the board so my scribbles can be seen, but
no light on the screen so the images aren't washed out. I want the
students in bright light too so I can see them — keeping students
in the dark literally as well as figuratively is a bad plan, since it
puts them to sleep.
The hard part, of course, is that I want all of these simultaneously,
so I don't have to keep fiddling, especially not moving the screen up
and down. It would also be great if there were no noise from fans, air
conditioners and projectors, so we can hear each other.
There's another old technology that would be a big help: complete
information about who's there. Though I can never aspire to the level
attained by my colleague Joshua Katz, who knows everyone by the end of
the first week, I do want to know who my students are. As a minimum,
I'd like name (including preferred name, in case Jonathan Pippington
Squeak III prefers "Pip" to "Jon"), netid so I can
send spam, picture (a current one would be nice), year, and perhaps
major and/or residential college. It would also be helpful to know
gender, since that can't always be determined by name alone.
I can get some of this from Blackboard, though only by scraping the
screen. Some comes from the registrar, but only in PDF. Peoplesoft
reveals a bit more, but only when I submit grades. And I find out
gender reliably only when I meet someone in person. It's not rocket
science to provide all of these from a single source, in a useful form
(not paper!) and continually updated, but we're not there yet.
I'm no Luddite (after all, Ned Ludd went around breaking machines
intentionally, and I only do that accidentally when I trip over the
wires), but on balance, I don't need a lot of classroom technology. I'd
be almost content if I could see my students, and they could see a
screen and the board at the same time.
By the dawn's early light (10/15/07)

Most mornings I walk to the Wa to get a paper. The campus is usually
quiet and deserted, although during big party weekends or Reunions there
are sometimes groups of boisterous guys on the Street, barely able to
stand upright but full of bonhomie if they know me. ("Professor!
Want to go to PJ's?") If the weather is warm there are occasionally
bodies sleeping on the grass, not always frosh preparing for Outdoor
Action. As I walk past Walker and Patton Halls I might hear alarm
clocks for days on end, with their owners gone or perhaps just totally
wiped out. And a while back I encountered a forlorn young woman in
heels and a long black dress, just outside 1901 at 6 a.m. She
recognized me (and I her — she had been in my class a couple of
years earlier) and asked me to prox her in. But my prox only works at
the computer science building, so the best I could offer was to call
Public Safety, an offer that for some reason she refused.
Much of the morning action involves other species. Long ago, before
Whitman College was even a gleam in someone's eye, let alone a giant
construction project and now home to 500 students, there were the Dillon
tennis courts. As I walked through them one morning, reading the
headlines in The New York Times, I sensed something nearby. I looked up
and there was a red-tailed hawk on the fence, not even 10 feet away.
Let me tell you, this was one serious bird — you would not want to
do hand-to-hand combat with him. I stopped long enough to get a good
look, but did not try to get closer. I've seen other red-tails and
goshawks on campus, but this one was the closest, and the most
impressive. Sadly, Whitman has destroyed his habitat and we won't
likely see him again.
Squirrels (which are sometimes hawk food) probably outnumber students
and are great fun to watch; their ability to run and jump and chase each
other puts even the most talented varsity athletes to shame. I was on
McCosh Walk one morning when, with a thump, a squirrel landed on the
sidewalk right in front of me. Even though he had fallen from 30 feet
up, he just shook himself and ran back up the tree. If I fell out of a
tree, it's for sure I wouldn't climb back up for another try, but for
the squirrel it seemed to be all in a day's work.
Morning sounds are interesting too. I once was at a luncheon with a
famous naturalist who told us that mockingbirds can only handle about
five different songs before they repeat themselves. Tell that to the
ones that sing outside Cloister and Cap in the morning; I lose count
when they get to a dozen different calls, not even including the
distinctive "watch out" warning that they use to tell their
buddies that one of the Street's resident cats is nearby.
We had a family of baby deer living on our property some years ago,
our own personal Bambi triplets. It was a gift of beauty — the
three tiny deer spent much of their time just outside our windows. When
I set off to the Wa at 6 a.m., they would often be sleeping on the lawn
and sometimes I could get within 15 or 20 feet of them before they
decided to leave.
Deer are a big problem in Princeton, the embodiment of
"attractive nuisance." They eat bushes and flowers, and we
gave serious thought to physical violence after they ate every single
flower on the porch one night, but they are lovely to watch, so we made
no effort to chase them off.
The next year our triplets returned as teenagers, this time sporting
numbered tags in their ears and transponders around their necks so the
authorities could keep track of their movements. We never did give them
names, but their numbers made them seem like part of the family —
"Little 82 was on the lawn this morning," or "I haven't
seen 53 for a couple of days."
They came back again a third year, by now middle-aged matrons who
spent their afternoons ruminating under a tree, but the numbered tags
left no doubt that they were ours. Sadly, we haven't seen them this
year. My wife fears the worst — the deer police rounded them up
— while I choose to believe that they went south for the winter
and spent the summer in New England. Someone must know where they are
if the transponders still work. And we live in hope that they too will
come back for Reunions.
Other princes (11/19/07)

At a flea market many years ago, I paid 50 cents for a badly worn
paperback copy of Niccolo Machiavelli's "The Prince," a book
that I had often heard of but never managed to read. Some previous
owner of the book had thoughtfully highlighted all the good bits with a
heavy yellow marker, so there was no need to read the whole thing,
though I have now done so several times over the intervening decades.
Most of Machiavelli's advice to his own Prince, Lorenzo de' Medici
(grandson of Lorenzo the Magnificent), is timeless wisdom, because
details change but people
don't. Consider this from Chapter 18: "Therefore it is unnecessary
for a prince to have all the good qualities I have enumerated, but it is
very necessary to appear to have them." Or, as expressed in more
modern idiom by Samuel Goldwyn, "Once you can fake sincerity,
you've got it made."
A few years back there was a small industry of business books based
on "The Prince," with titles like "Management and
Machiavelli" or "The Mafia Manager: A Guide to the Corporate
Machiavelli." I have no idea whether any of these were really more
helpful to their readers than the original might have been, though one
is naturally skeptical of fads. But making another pass through my copy
started me wondering whether there might be a market for some
Machiavelli aimed at faculty or university administrators. I'm open to
suggestions for titles, and even for content, since there's clearly some
potential.
For instance, I'm a member of the Committee on the Course of Study, a
small group of faculty, students and administrators that deals with
routine curricular updates like adding and deleting courses, and
sometimes takes on larger tasks like the proposed changes to the
academic calendar that occupied much of the committee's time for more
than a year. Ultimately our carefully constructed proposals met with so
little approval that the topic has been tabled for a while (certainly
until I have rotated off the committee, which is long enough for my
purposes).
This committee experience reminded me of another of Machiavelli's
insights: "There is nothing more difficult to take in hand, more
perilous to conduct, or more uncertain in its success, than to take the
lead in the introduction of a new order of things." This surely
applies with particular force to university administration —
four-year residential colleges and grade deflation are merely two of the
more recent in a long series of "new orders" that have been
difficult indeed for those who have tried to lead them.
I bought my copy of "The Prince" long before my current
association with Princeton, so the subtitle on its cover,
"Introduction by Christian Gauss," meant nothing to me. For
those of you whose knowledge is similarly skimpy, perhaps even those who
live in Gauss Hall, "A Princeton Companion" helpfully explains
that Gauss was one of Woodrow Wilson's original preceptors, an
exceptionally popular faculty member, chair of the department of modern
languages and for over 20 years Dean of the College, a position in which
he became "perhaps the best known college dean in America."
Gauss completed his introduction to this edition of "The
Prince" in 1951. According to the "Companion" article,
"One autumn day in his seventy-fourth year he went to New York to
deliver the manuscript of his introduction to a new edition of
Machiavelli's The Prince [...]. That evening while he was waiting in
the Pennsylvania station for the train to take him back to Princeton,
his heart failed and he fell dead." Though one hopes to live well
beyond 74, if one is not so fortunate there are surely worse ways to go
than peacefully after a long, productive and much appreciated life.
"The Prince" itself is not a long book, only 127 pages in
this small paperback, so Gauss's 25 pages of introduction were a
significant contribution. Gauss, writing only a few years after Hitler
and Mussolini and with Stalin very much in power, ascribed the rise in
popularity of "The Prince" to the emergence of new types of
states and the clashes between them. I wonder how he would have changed
his introduction if he were writing today, with one ill-managed great
power and a host of scarily random smaller players.
I also wonder what Niccolo himself would do today to get his message
across. Would he work behind the scenes in Washington or some other
capital? Would he write learned articles in obscure academic journals?
Would he be a cable TV pundit or have a blog? And what would he think
of today's Princes? "A prudent man should always follow in the
path trodden by great men and imitate those who are most
excellent." My bet, sadly, is that he would have found
discouragingly few to be "great" or "most
excellent." But that leaves a golden opportunity as you rise in
your chosen field, for we are sorely in need of some most excellent
Princes.
Making the grade (1/21/08)

I'm writing this column as therapy, a break from grading the final
exams for my course. Samuel Johnson once described a second marriage as
the triumph of hope over experience, and I see a sort of parallel with
teaching and then grading. Every semester begins with a burst of new
energy and a brand new group of some of the smartest and nicest people
one could ever hope to meet. I spend the whole semester getting to know
them, while talking about interesting and important topics. This is the
"hope" part, and it's nothing short of wonderful.
Then comes the final exam and the "experience" part: every
year I discover again that in spite of my best efforts, and certainly
the best efforts of the students, not everyone learned everything that I
had hoped to teach them. This year, pretty much everyone mastered the
difference between a bit and a byte, and most proved able to make the
distinction when it mattered; that's an improvement over some previous
outings. A discouragingly large number still don't grasp binary
numbers, and certainly not how to do simple arithmetic on them. Indeed,
it seems that some don't even know how to do arithmetic on decimal
numbers, at least under time pressure. I'd feel more superior about
this if it weren't for the fact that when I checked my own arithmetic,
nothing more complicated than adding up points for various questions, I
found errors in two or three of the first five.
One hopes that students will hang on every word of a lecture and
remember it clearly for the exam. But experience shows that recall is
less than perfect: some students appear not to have been in class at
all, and some of those who did attend were present only in body,
devoting their classroom time to sleeping or surfing or socializing. I
did get full attention in a few lectures, however, notably the one where
we examined MPAA and RIAA "cease and desist" letters sent to
OIT about students who were uploading allegedly copyrighted music and
movies. The letters are interesting in themselves, and they are based
on technical topics from the class, like IP addresses, peer-to-peer
networking and digital signatures. In one of those golden teaching
moments, a student said "Suppose, just hypothetically, that a
student was caught uploading an MP3. How did they know?" Suddenly
the whole class was on high alert, and we had a great discussion about
this purely "hypothetical" situation. I haven't gone back yet
to see if I can correlate attendance with how well people answered the
final exam question about this topic, but I'll bet that those who were
present did fine.
Why should one have to know any of this technical stuff? I think it
matters if one is to be an educated citizen. For instance, surveillance
equipment is getting cheaper all the time. How cheap? One of this
year's questions asked a peripherally related question: how much disk
capacity would you need to store everything you've heard in your whole
life? It's not much, about 10 terabytes (work it out — get 5
points on my exam), which even today would only cost a few thousand
dollars, and which will likely cost less than a hundred dollars by the
time today's freshmen have graduated, thanks to exponentially falling
costs. Video is at most 10 times that.
Energy costs for Internet servers are growing at about one percent a
month. When will today's gigawatts for servers become terawatts?
(Another five points.) Parents can now buy systems that will monitor
where their teenagers drive and display the results on a map via the
Internet. How might this work? What is the main concern about the
impending purchase of DoubleClick by Google? What's the technical issue
behind the current European Union vs. Microsoft fracas and how does it
relate to the ongoing battle in the United States that began nearly 15
years ago?
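The gigawatts-to-terawatts question is just compound growth, and takes two lines to answer; the one-percent-a-month rate from the paragraph above is the only input:

```python
import math

# At 1% growth per month, how many months until server power
# consumption grows by a factor of 1000 (gigawatts to terawatts)?
months = math.log(1000) / math.log(1.01)
print(f"{months:.0f} months, about {months / 12:.0f} years")
# prints: 694 months, about 58 years
```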
One of my questions came from the current presidential campaign.
When John McCain made the now obligatory stop at Google in May, he was
asked one of their standard interview questions: how would he sort a
million 32-bit integers in only two megabytes of RAM? Great amusement
from the audience, of course, and Eric Schmidt '76 let him off the hook
quickly. But wouldn't it be striking if he could have answered the
question or perhaps the variant of it that I subsequently asked on my
exam? When Barack Obama was asked the same question in the same setting
in November, he gave a witty and technically correct answer. (There's
obviously some alert geek on Obama's campaign staff.) I'm delighted to
report that the majority of my kids got it right too. It's clear that
they'll be ready in a few years when they're starting to run the world,
and I for one am looking forward to that.
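For readers who want the five points: one standard answer to the Google question is an external merge sort. A million 32-bit integers is about four megabytes, which won't fit in two megabytes of RAM, so you sort memory-sized chunks, spill each sorted run to disk, and merge the runs. The sketch below is my own illustration, not McCain's, Obama's, or Google's answer; the chunk size and the text temp files are arbitrary choices:

```python
import heapq
import random
import tempfile

def external_sort(numbers, chunk_size=250_000):
    """Sort more data than fits in memory: sort chunks, spill each
    sorted run to a temp file, then k-way merge the runs."""
    run_files = []
    for start in range(0, len(numbers), chunk_size):
        run = sorted(numbers[start:start + chunk_size])
        f = tempfile.TemporaryFile(mode="w+")
        f.writelines(f"{n}\n" for n in run)
        f.seek(0)
        run_files.append(f)
    # heapq.merge holds only one value per run in memory at a time.
    runs = [(int(line) for line in f) for f in run_files]
    yield from heapq.merge(*runs)

data = [random.getrandbits(32) for _ in range(1_000_000)]
result = list(external_sort(data))
assert result == sorted(data)
```

Collecting the merged output into a Python list at the end defeats the memory limit, of course; in the real setting the merged stream would be written straight back to disk.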
Where the books are (2/18/08)

“A university is just a group of buildings gathered around a library.” — Shelby Foote
The first problem set for my class last September asked some
questions about prox cards. One required students to speculate about
why checking out a book at Firestone uses neither the wave of the prox
that unlocks doors nor the swipe of the magnetic strip that pays for
food. Most people figured out the likely answer (which is left as an
exercise for the interested reader), but I was surprised that some
confessed that they had never been in the library and thus didn’t
know what happens when a book is checked out.
I visit Firestone four or five times a week, which is hardly unusual
for some disciplines, but perhaps outside the norm in technical fields
like computer science, for which the claim is sometimes made that
“it’s all online anyway.” Many of my visits take me no
further than the Dixon collection, a marvelous resource for which one
need not even climb the stairs — a room full of new books and
students racked out on couches. Five minutes of browsing is usually
enough to find two or three promising books to carry home. I’m
biased towards history, popular science and detective stories, but some
of the best finds have been pure serendipity — an interesting
title or topic that I would never have seen if the book were not right
there in front of me.
Sometimes one of these accidental discoveries is so compelling that I
want to read whatever else the author wrote. Or maybe something reminds
me of an author whom I enjoyed before. Either way, that’s where
one needs the rest of the library, since that’s where they keep
the rest of the books.
Thus one of my most frequent downstairs destinations has become
B-1-P, which holds Firestone’s remarkable collection of detective
stories and murder mysteries. This corner of the stacks presumably has
limited value for scholarship, but if you’re looking for escape
literature, it’s hard to beat. I’ve found pretty near all
of the Spenser stories I missed in my previous life. (That’s not
the Faerie Queene guy, in case you hadn’t guessed, but Robert
Parker’s flippant Boston private eye.) Several authors write
detective stories set in Rome at the time of Julius and Augustus; they
may not be historically accurate, but they’re good enough for me.
And I’ve recently become intrigued by a Nora Roberts series of
police procedurals set in 2058. I found some of these in the online
catalog, but mostly it’s been pure shelving serendipity — an
interesting title or cover, or an author who appears often enough to
suggest publishing success.
I had a real reason for digging in the stacks last fall, trying to
track down a couple of words coined by the science fiction writer Robert
Heinlein that have become part of popular culture. The most widely
known is “grok,” from “Stranger in a Strange
Land” (1961), which the OED defines as “To understand
intuitively or by empathy; to establish rapport with. To empathize
or communicate sympathetically (with); and to experience
enjoyment.” It shows up today in names like Grokster, the
file-sharing program that certainly did permit millions of users to
experience enjoyment. Understandably, the purveyors of movies and music
were most unsympathetic to the communication that Grokster enabled, and
their lawsuit eventually led to a Supreme Court decision that shut down
Grokster permanently.
The other memorable Heinlein coinage is TANSTAAFL (“There
ain’t no such thing as a free lunch”), from “The Moon
is a Harsh Mistress” (1966). Wikipedia identified the source, but
I’ve learned to be a bit cautious about the accuracy of what one
finds there, so down I went, and spent half an hour in the gloom of
B-1-D to make sure I had the references right. (Apropos of Wikipedian
reliability, another class assignment asked students to edit an article
that they thought needed fixing; the intent was to remind them
subliminally of how an online encyclopedia determines its version of
truth.)
Lest you think that all trips to the library are just for fun and
games, I do read occasional serious stuff, but the path to it is often
much the same: a browse through Dixon or perhaps a review suggests a
topic or an author, and then it’s time to hit the stacks. For
example, a few months ago I came across a new book by my colleague Tony
Grafton, with the catchy title “What Was History?” It turned
out to be (sorry, Tony) too heavy going for bedtime reading, but of
course it’s not the only book he wrote. So back downstairs, this
time to C-10-N, to find “Bring Out Your Dead.” From the
title, one might almost think it misfiled; it goes very nicely with
Parker’s “Death in Paradise” and Roberts’
“Divided in Death,” the cop novel that I picked up on my
return trip through B-1-P.
The bleeding edge (3/24/08)

"There's no other major item most of us own that is as confusing,
unpredictable and unreliable as our personal computers." -
Walt Mossberg, The Wall Street Journal
Every year, a handful of students ask for advice about what they
should buy to replace ailing computers. The best I can offer are
generalities: know how much you can spend and what you plan to use it
for; after that, it's a bunch of tradeoffs among important alternatives.
It should be thin and light but with a big screen and long battery life.
It has to be cheap but have lots of features. And so on. The only
thing you can be sure of is that you won't be happy once you get it.
Recently I found myself on the other side of the question when a dear
member of my family, who shall remain nameless here, spilled a glass of
milk on the keyboard of the main laptop that I use at home. This proved
to be almost the coup de grace - most keys did nothing, random keys
became permanently stuck, and the touchpad didn't work at all. The
machine is too old to repair but it holds so much useful data and
programs that I couldn't just toss it. I managed to get it limping
along with outboard support systems - a keyboard, a mouse and an
external drive, rather like a heart-lung bypass machine - but now it's
making ominous groaning noises, which I hope come from the fan and not
the internal disk. Either way, it has to be replaced.
I wasted a month surfing sites like amazon.com and cruising big-box
stores like Best Buy, trying to find the right balance among utterly
unsatisfactory compromises. In spite of the many advantages of a Mac,
they cost too much and it really seemed necessary to get a Windows PC
for compatibility with everything I already have. But I kept getting
stuck on one crucial point: new computers only come with Vista,
Microsoft's latest operating system.
My first encounter with Vista last fall was traumatic: I discovered
only a day or two before the first lab in my course that all the public
cluster machines had been surgically altered to run Vista, and that the
computers offered to incoming freshmen came with Vista as well. My
carefully crafted lab instructions were just plain wrong, because Vista
was different from XP in so many little ways.
With a lot of work, I managed to find a path through the maze of
differences, but it left me with a bad impression. My enthusiasm was
not increased by endless stories about incompatibilities and lack of
drivers, so severe that even senior Microsoft people were taken by
surprise when they tried to eat their own dog food by using Vista. (A
recent class action lawsuit alleges that Microsoft misled consumers
about Vista's hardware requirements and compatibility; the email
exchanges among company executives that form part of the case make
interesting reading.)
Microsoft will eventually work through the technical issues and in a
year or two Vista will be fine, but I needed a new machine right away.
XP is about to be discontinued, and most computer manufacturers sell
only Vista machines. Fortunately there are still a handful on the
market that come with XP; indeed, the number may be growing, at least
while Microsoft smoothes out some of Vista's rough edges. So I bought
one of these trailing edge computers, thus confirming my standing as a
technology "late follower."
The good news is that Moore's Law continues to operate: every year or
two technology advances enough that we can get roughly twice as much
computing power for the same price. For a modest sum, I bought a
machine that's more powerful than any other I own, except maybe the
MacBook Pro that cost four times as much. It runs the old familiar XP.
It only took me a couple of days to install the programs that I need,
while carefully excising the unwanted trial versions of software that
Mossberg calls "craplets," though I am still trying to defeat
whatever turns the fan on and off every minute.
The bad news is that I now have yet another computer to look after.
A quick count reveals 8 working laptops scattered between home and
office; my office looks vaguely like a higher-tech version of one of
those places with rusting cars in the back yard. It seems a shame to
just discard perfectly good computers, and I have grand plans to use
them for experiments with operating systems and networks, but somehow
there's never enough time. So I predict that sometime next year, I'll
have to add laptop number nine. Will it run Vista or Mac OS X or none
of the above? Yes, though I don't know which. Will it still be
"confusing, unpredictable and unreliable"? That you can be
sure of.
By the numbers (4/21/08)

Last weekend my wife observed, quite correctly, that our CD
collection has gotten totally out of control. It has long since
overflowed the shelves and there's no organization either, with odd
combinations like Simon and Garfunkel between Hildegard von Bingen and
two versions of Bach's B Minor Mass. We could buy another set of
shelves to match the first, but the company that made them went bankrupt
recently, an early victim of recession. As befits my position as the
family tech person, I offered a technical solution: put the entire
collection on a single iPod. There would be enough room for whatever we
might add for the rest of our lives, and it wouldn't take any space at
all. I don't think my wife was convinced, but nevertheless I went to
the online Apple store to see what might be available.
Under "refurbished iPods," I discovered a range of options
from a one GB Shuffle with "up to 12 hours of music playback"
to a 160 GB Classic with "up to 40 hours." This looked
strange. If a one GB device gives me 12 hours, presumably a 160 GB
device would give me about 160 times as much, or nearly 2,000 hours. I
own a Shuffle and it does hold roughly what Apple says it does. So how
did Apple come up with that number for the big iPod? The answer, of
course, is that they're talking about playing time, not memory capacity.
I totally misinterpreted because I was fixated on getting enough
gigabytes and not thinking about time at all.
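The misreading is easy to reconstruct from the two listings:

```python
# If "hours" scaled with capacity, as I first assumed, the listings
# would imply this (figures from Apple's refurbished-iPod page):
shuffle_gb, shuffle_hours = 1, 12
classic_gb = 160
implied_hours = classic_gb / shuffle_gb * shuffle_hours
print(implied_hours)  # prints 1920.0, nearly 2,000 hours of music
# Apple's "up to 40 hours" for the Classic is battery life per
# charge, which is playing time, a different quantity altogether.
```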
We are surrounded by so many numbers from so many sources that we
don't even notice most of them, let alone think about them critically.
It's only when something flagrant appears, or when we're forced into
careful study, that we pay much attention.
The "Corrections" column in The New York Times is mostly
devoted to repairing names and quotations, but it often has numeric
gaffes to report as well. Sometimes the issue is analogous to my error
of using the wrong units. For instance, on March 13: "Americans used 3.395 billion
barrels of gasoline [in 2007], not 3.395 billion gallons." A barrel
of oil is 42 gallons, so that's significant. On April 10 we learned that Mexico's daily oil production last
year dropped "to about 3.1 million (not billion) barrels a
day." Such errors, off by a factor of a thousand, are surprisingly
common, perhaps because no one has much intuitive feel for large
numbers, and words like "billion" and "trillion"
have come to mean just "big" and "really big."
Greekoid prefixes like giga and tera, though familiar from our techno
gadgets, are just as metaphorical to most people, and further-out ones
like peta are even worse. The Times printed another correction on March 25, noting that a
petaflop is "a thousand trillion instructions per second, not a
million trillion."
It's often said that if we all used scientific notation, these
problems would go away; after all, no one would confuse 10 to the 15th
power (10 with a superscript 15, or sometimes 10^15) and 10 to the 18th,
and we wouldn't need compounds like "million trillion." Having
covered this topic in class for some years now, I'm skeptical. And we
would still have typographical problems: a story in the Times' Sunday magazine last December
talked about 10 to the 50th chess positions, which is a truly huge
number, but it appeared as 1050 in the online version.
At least numbers like "a petaflop" are round. One of my
pet peeves is numbers that are speciously precise, since they almost
always indicate that whoever produced the number wasn't thinking about
it at all. For instance, on March 16, a Times story about mega-yachts, favorite toys of
the super-rich, quoted the editor of a yachting magazine as saying
"When a yacht is over 328 feet, it's so big that you lose the
intimacy."
I move in a rather different social circle, so I don't have much
experience with this kind of intimacy, or loss thereof. The thing that
caught my eye was the length beyond which a yacht ceases to be intimate.
Size clearly matters, but where does the oddly precise value 328 come
from?
Once you've seen a few of these, it's pretty obvious: another kind of
number numbness. Three hundred twenty-eight feet is 100 meters. It's
common journalistic practice to convert metric units into more familiar
English units, but the conversion is often done blindly - whip out the
calculator and write down whatever it says. The result is meaningless
precision: the original ballpark figure of "a hundred meters"
has become a pseudo-scientific "fact" with three
"significant" figures.
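Where 328 comes from, and what honest rounding would look like, takes only a few lines; the conversion factor is the standard 3.28084 feet per meter:

```python
# "A hundred meters" converted blindly, then rounded two ways.
FEET_PER_METER = 3.28084
feet = 100 * FEET_PER_METER
print(round(feet))       # prints 328: speciously precise
print(round(feet, -2))   # prints 300.0: honest one-figure rounding
```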
This kind of numeric cluelessness is likely to be with us for a long
time, though I'm trying to stamp it out locally. Meanwhile, if I had a
billion dollars, I wouldn't spend any of it on a yacht that was a
hundred meters long, but I would definitely buy my wife an iPod with 160
gigabytes.
Princeton is a different place in late summer. Students and faculty are
for the most part enjoying their last few weeks of freedom before the
treadmill starts again, so one might naively assume that the town is
even quieter than it is earlier in the summer when sports camps and
institutes for the (presumably not athletically) gifted fill the campus
with kids who barely qualify as teenagers. But Princeton has become a
tourist attraction. The Orange Key groups seem more numerous and
bigger; it's not unusual to see three of them running at the same time.
And it was a bit of a surprise to watch five tour buses unloading on
Prospect Avenue one Sunday morning. What are all these people doing,
and why are they doing it when there's no one around but other tourists?
I suspect that most of them are coming because of the Princeton name
and aren't really sure what to look at. If they rely on the freebie
maps handed out in town, they're not going to see much; those maps have
only the loosest connection with reality - Nassau Hall but none of the
surrounding buildings, the library but not the chapel, the art museum
but not Murray Dodge, Prospect and Frist but no Robertson or fountain.
From time to time, however, there are visitors who have more on their
agenda than the tour and the art museum. On one desperately hot and
muggy Sunday in mid-August, I was standing on Nassau Street waiting for
the light to change so I could walk down University Place, when an
elderly woman asked if I lived in Princeton. Yes, I replied, and she
said, "Could you tell me where the statue of John Nash is?"
This startled me, since I had seen Prof. Nash walking across campus
only two days earlier, looking quite well, and it's not the custom in
this part of the world to erect statues of living people. For a brief
disoriented moment all I could think of was the Seward Johnson sculpture
beside the Palmer Square kiosk. Then it dawned on me - she was really
asking about another famous Princeton personage, Albert Einstein. There
is a bust of Einstein on the walkway that leads toward the monument in
front of the Princeton Borough offices. It's not a big statue like the
imposing one on Constitution Avenue in Washington, but it's definitely
him. ("Newspaper Reader," another Seward Johnson piece, is on
that same walkway if you ever want to check out off-campus sculptures.)
I aimed her in the right direction and headed on down University
Place, where only a few moments later a young woman who looked like any
Princeton student stopped me and said, "Are you a professor
here?" Some of us apparently look the part; in any case, there
didn't seem to be any harm in admitting it, and she began a series of
questions. "What department are you in?" When I replied
"Computer Science," it was clear that she had hoped for
something better; with a politely disappointed look, she asked,
"Where's the train station?" I pointed her toward the Dinky,
but after a few steps she turned back and asked how she could get an
internship. That's pretty random no matter what, and I told her that it
would probably be tough if she weren't a student here. Disappointed
again, she turned toward the train, but paused once more and said, out
of the blue, "Did you know Einstein?"
So much for my self-image as the still-youthful professor. Sadly, I
did not know Einstein; I was in elementary school when he died in April
1955, and I didn't even live in the same country. But as it turns out,
my wife did know Einstein, sort of. She grew up in Princeton, where her
father was on the faculty in the English department, and when she was
about 5 years old, he took her to meet the great man. She remembers
this clearly; he was incredibly kind, they talked about cats, and she
got his autograph.
Occasionally I've been stopped by intrepid tourists, far off the
beaten tracks of the campus and the tour groups, asking where Einstein
lived. That I do know; in fact, one of my favorite long walks takes me
down Mercer Street, past the house where Einstein lived for many years
(including when my wife visited him). As I pass it, I wonder - what was
it like to have Einstein as a neighbor? And I also sometimes wonder if
anyone among us will prove so famous and influential that 50 years from
now people will make pilgrimages to the places where he or she lived.
There are a lot of amazing people here, so as you settle in for another
year, keep your eyes open for the politician or athlete or scientist or
literary giant who will never be forgotten. Who knows - if you play
your cards right, it could even be you.
If all goes according to plan, you're reading this at nearly the maximum
stress point of the semester: the beginning of midterm week.
Fortunately, good time management and study skills will let you take
this in stride, though your less organized and less diligent friends are
probably in a total panic. There were warning signs that midterms and
their accompanying stress were coming. One of the first indicators in
my class is that people start to miss assignment deadlines. The reasons
vary, but old standbys like "I forgot" and variants of
"the dog ate my homework" are popular: "My computer
crashed, and I lost the whole problem set." There seems to be a
strong correlation between disk failures and midterms; maybe disks
stress out too, in a display of empathy with their owners.
People start to fall ill because there are always evil bugs going
around, and living in close quarters is a great way to exchange them.
I've been auditing SOC 250: The Western Way of War, a fascinating course
taught by professor Miguel Centeno. Some days the sound of 100 students
coughing drowns him out entirely, and I wonder why I'm voluntarily
sitting in the middle of a germ warfare zone. Stress must weaken the
immune system, since it's certain that the number of requests for
extensions and/or mercy because "I've been sick" starts to
rise in the week or two before midterms.
And then there's attendance at lectures. My deadest day of the year
is the Wednesday before Thanksgiving, of course, but the second lecture
of midterm week is not far behind. Everyone is desperately cramming for
exams or trying to catch up with papers due or maybe just hoping for a
few hours of sleep. It's more efficient and certainly far more
comfortable to sleep in one's room than in lecture, so it's no surprise
that half the class fails to show.
I do expect close to a full house today, however, since that's when I
hand out the midterm exam. People are faced with an unpleasant choice:
drag their tired and sick bodies to class to pick it up, or make a
special trip to beyond the edge of the known universe (my office in the
Computer Science Building). Most decide that it's less effort to come
to class because it's closer and there's still a chance for a decent nap
at the back of the room.
The perpetual construction on campus also added a bit of stress all
around. I was originally supposed to teach in Peyton 145, but some
schedule slipped (the dog ate the blueprints?), and the room wasn't
ready in September. This left me in a bit of a bind. Officialdom
offered a room with 98 seats (I counted them); a bit of quantitative
reasoning suggested that this would probably not hold the 130 students
and 10 community auditors who had signed up. Furthermore, the web page
that gives information about classroom sizes was broken and stayed
broken until weeks after the dust had settled, so there was no way to
explore alternatives short of wandering the campus.
After some tense last-minute negotiations, I wound up in McDonnell
A02, a huge room that seats more than 300. The big drawback of a huge
room is that students sit way in the back, where it's better for sleep
and socializing. The instructor can't see them in the gloom; there's no
personal contact and little chance to get to know anyone past the first
few rows. In the first lecture I tried to get people to move forward
but was pretty much ignored, and over time more and more students wound
up in the very back.
Last week we moved back to Peyton. We just fit, and enough people
skip each lecture that the regulars have room to spread out. Life is
mostly better: I can see everyone, though the lighting is still poor,
and the controls are terribly awkward. I can use overheads instead of
PowerPoint, which is liberating. There's a full-size table where I can
spread out my junk. Best of all, I'm getting to know some of the
back-row population. In the long run, I hope that's good for them; it
certainly is for me. In the short run, though, loss of anonymity and
interference with regular sleep must surely add to the stress of midterm
week. Hang in, everyone - with luck, you'll live through it.
George Orwell delivered the final manuscript of "Nineteen
Eighty-Four. A novel" almost exactly 60 years ago. The book has
had a strong influence on language through words like newspeak,
doublethink, Big Brother and of course the adjective
"Orwellian," but fortunately Orwell's dystopia didn't
materialize in the real year 1984. One of the technological ideas in
"1984" was pervasive surveillance and monitoring through the
"telescreen," a two-way communication device that was very
difficult to hide from. In Orwell's novel, surveillance was a
government activity. That's still true today, especially since Sept.
11, 2001, with ubiquitous cameras and greatly increased monitoring of
communications like e-mail, sometimes within the law and sometimes
arguably well outside. Orwell could have written a fine new edition
exploring how governments might use today's technology.
I've been struck, however, by a different kind of surveillance that
was not part of Orwell's worldview at all, at least as I remember the
book. (I had planned to re-read "1984," but the three or four
shelves of Orwelliana in the depths of Firestone hold but a single copy,
in Polish.)
The surveillance I have in mind is not governmental but commercial.
The march of technology has given us ever smaller and cheaper gadgets,
especially computers and cell phones, and pervasive communication
systems, notably the internet and wireless. As an almost accidental
byproduct of this progress, we have voluntarily given up an amazing
amount of our personal privacy, to a degree that Orwell might well have
found incredible.
In my class I sometimes ask whether people would willingly carry a
device that can track their every movement and report exactly where they
are at every moment. Of course no one would ever do that, but in fact
everyone does, since every student carries a cell phone that is never
turned off. Older phones know where you are only to within a few
hundred meters, but newer phones with GPS have you pinpointed within a
few meters.
So phone companies know where your phone is. Would they reveal that
information? As I write this, we have just learned that Verizon
employees have been checking out President-elect Barack Obama's cell
phone records. Clearly this was unauthorized, but it's not hard to
imagine ways in which your physical location could be used commercially,
for example to send location-dependent advertising to your phone. Would
you be willing to let the phone company use your location in return for
lower rates or a sexier phone? Experience suggests that most people
would be quite happy with such a trade - privacy is good but it is often
given away or sold off quite cheaply.
On the internet, students are astonishingly willing to broadcast the
most intimate details of their lives on myspace.com and facebook.com,
though this pendulum may be swinging back as it becomes clear that more
than just your friends are watching: prospective employers check out
candidates, as do college admissions offices.
Facebook and similar sites have an enormous amount of data about
relationships among people, though their attempts to make a profit from
it have met with mixed results. There was real pushback a year ago when
Facebook exposed purchases made by members on third-party sites; this
was deemed going too far. On the other hand, when Facebook added a
"news" mechanism a couple of years back, a surprising number
of people didn't mind having their changes of relationships and other
facts broadcast far and wide without explicit consent - privacy
given away again.
Most websites use cookies to track repeat visitors; companies like
DoubleClick, recently acquired by Google, sell this information to
advertisers. We were talking in class last week about how cookies work.
I was faced with the usual wall of open laptops (in some ways a sore
subject, to which we may return some day), but for once they provided a
teachable moment. I asked everyone to pause in their chatting, surfing,
twittering, mailing and similarly crucial activities and count the
cookies on their computers. "How many do you see?" I asked.
The first answer, quite representative, was a shocked "I can't
count them!" That led to a discussion of whether the benign uses of
cookies outweigh their privacy-invading role of monitoring what sites
you visit. Most people seemed a bit taken aback at all of this, and
I'll bet that a fair number of cookies were subsequently deleted. This
is one place where you can recapture some privacy at no cost - if you
stop accepting cookies from third parties (the advertising companies),
the web keeps right on working.
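For the curious, the mechanics are simple enough to show in a few lines. This is an illustration of mine using Python's standard http.cookies module; the cookie names and domains are invented.

```python
from http.cookies import SimpleCookie

# A cookie is just a name=value pair that a server asks the browser to
# store and send back on later visits. Two hypothetical examples: a
# benign session cookie from the site itself, and a third-party tracking
# cookie of the DoubleClick variety, set for an advertiser's domain.
jar = SimpleCookie()
jar.load('session=abc123; Path=/; HttpOnly')
jar.load('tracker=visitor-98765; Domain=.ads.example; Path=/')

for name, morsel in jar.items():
    who = morsel['domain'] or '(the site you visited)'
    print(f"{name} = {morsel.value}, set for {who}")
```

A browser that refuses third-party cookies simply never stores the second kind, which is why the web keeps working without them.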
Scott McNealy, at the time CEO of Sun Microsystems, once said
"You have zero privacy anyway. Get over it." Sadly, it's
pretty close to true these days. The remarkable thing is that we seem
to have given it away, and continue to do so, for pretty much nothing at
all in return. Orwell could have written a book about it.
Every November and December I spend quite a bit of time writing letters
of recommendation for students past and present. This activity peaks in
the late fall because the deadline for most graduate school applications
is the end of December. Given the perilous state of the economy and the
sometimes bleak job prospects faced by current seniors, grad school
looks like a good way to ride out some of the storm while improving
one's mind and credentials. It normally takes me nearly a day to
write a letter that captures what I know of a person and describes the
work he or she has done. Fortunately, it's often possible to use the
same letter for every place that someone is applying to, so the effort
can be amortized over half a dozen schools.
Some clerical parts of the process have gotten better over the years.
I once wrote, addressed and mailed 33 [sic] real physical letters for a
friend who was trying to get a teaching job. (He was a great teacher,
but times were tough, and he wound up programming at a software
company.) Today, physical letters are actively discouraged, and writers
are supposed to visit a website, fill in some checkboxes and upload a
letter. Commercial operations like Embark and ApplyYourself provide
this service for universities; Princeton's graduate school uses Embark.
For the first few years, these systems were terribly clumsy and
unreliable, but the kinks have gradually been worked out, and mostly
things work fine. Some universities and fellowship operations roll
their own application systems; those too have gotten better, though they
still have bugs. One first-rate university on the West Coast wrote its
own system. I have a good friend there, and every year I can pull his
chain by pointing out some new glitch or rookie mistake in the user
interface, even though none of it is his fault at all.
Perhaps surprisingly, the hard part of recommendations is not the
letter itself, but that every single university or fellowship wants
something a little bit different from all the others. Some insist on
telephone numbers with separators between the digits, while others
refuse them. Some require Word documents while others insist on PDF. A
few set length limits, so my carefully crafted 800 words have to be
chopped and mangled to fit an arbitrary limit like 600.
The thing that irritates me the most, however, is that almost without
exception, each place wants the applicant rated against his or her peers
on a bunch of dimensions. A typical form lists eight or 10 desirable
attributes like creativity, ability in written expression, oral
expression, personal character, research ability, teaching ability, and
the like. It asks me to assess the applicant on each, against other
students, on some non-linear scale like "best ever," "top
1 percent," "top 5 percent," "top 15 percent,"
down to "bottom 50 percent."
This is ridiculous, to put it mildly. Occasionally there is someone
who is clearly the academically strongest this year, but even so, it's
impossible to slot that person into "top 5 percent" on
"written expression." What could it possibly mean to say that
Jack is in the top 25 percent in "integrity"? If Jill is in
the top 10 percent in creativity, and Jack is only in the top 20
percent, does that mean that she'll do better than he will? It's not at
all clear that these fine distinctions on unquantifiable characteristics
have much to do with likely success as a grad student, where
perseverance and the ability to sustain or at least feign interest in
one's topic for years are more important.
So the natural tendency is to place everyone in one of the top couple
of positions, on the theory that anyone merely in the top quartile is
doomed. This leads to a Lake Wobegon effect, where all the children are
above average. I suspect that the attempt by universities to get a
fine-grained and multi-dimensional ranking of their applicants is
destined to fail, because most recommenders can't or won't make the
distinctions that are asked for.
In a way, this is the same situation that we see every August when US
News and World Report publishes its ratings of colleges and universities.
For about eight years in a row, Princeton was number 1 (naturally) but
slipped to number 2 last year (clearly some kind of error at US News).
Everyone knows that such rankings are meaningless: the criteria can't be
quantified, the data is necessarily flaky, and it's all combined with
arbitrary weights. It's the same with grad school applications, and
pretty much anything else that asks us to rank-order people. There's no
way that anyone can put Jack ahead of Jill or vice versa with the
precision that's expected in these forms. Let's stick with careful
written assessments. People can't be reduced to a single number or even
a dozen numbers, and we shouldn't even try.
There are only two groups who routinely ignore the traffic lights on
Washington Road: drivers and pedestrians.
Almost every day I cross Washington Road several times. In the early
morning when there’s hardly anyone around, drivers rocket by at 50
or 60 miles per hour and feel completely free to ignore the lights,
especially if they’re coming up the hill and making a right turn
onto Prospect Avenue. One steps into the street at one’s peril:
Eternal vigilance is the price of survival.
Later in the day, say when classes change at 11 a.m. or noon,
there’s a surging tide crossing in both directions in front of the
Robertson swimming hole. I’m struck, so to speak, by the
devil-may-care attitude of students, who ignore cars, traffic lights and
the laws of physics as they cross the street, totally wrapped up in
their cellphone conversations and oblivious to the cars and trucks
speeding by in both directions. Only a fortunate coincidence — a
few drivers are not talking on their own phones — prevents a
higher death toll. Young people think themselves immortal (age will
correct this misapprehension eventually), and fortunately most drivers
are alert enough that the illusion is preserved; everyone so far has
escaped unscathed. (Someday I must look into whether
“scathed” is a word that could be worked into a column, but
that’s for another time.)
Boulders and fences are an attempt to eliminate jaywalking, but
casual observation quickly reveals that they don’t accomplish
much. The flashing lights between Fine and the Lewis Thomas Lab are a
great idea, but only if drivers pay attention. Sometimes they do. The
free-fire zone at the end of William Street across from the library and
the chapel is total anarchy, with no rules of engagement at all;
it’s sufficiently dicey that I normally come up Shapiro Walk
instead of William so I can use the traffic light.
Perhaps one could think of the local crossings as tryouts for the big
leagues. I spent half of last summer commuting to New York. Part of
the trip was a mile-long walk from Penn Station down 8th Avenue early in
the morning and back up in the late afternoon. In New York, pedestrians
and drivers are in fierce competition, certainly for high stakes if not
literally to the death (usually, but there are unfortunate exceptions).
An aggressive pedestrian can save several seconds by trying to beat out
a speeding cab or truck. An aggressive driver can save an entire cycle
of a traffic light by 60-mile-an-hour intimidation. With the stakes so
high, it’s clear why the competition is so intense. Even though
my schedule didn’t matter a bit, I joined the game as a matter of
personal honor. It became obvious after a few near-scathes, however,
that I didn’t have the requisite skill to participate, and I went
back to dutifully waiting, just like any other rube, until the lights
were in my favor. And even then I looked both ways before crossing. In
hindsight, this was probably prudent: a recent article in the New York
Times observed that pedestrians over 65 are much more likely to be
killed than more spry and alert younger folk; clearly something happens
suddenly when one hits that magic birthday, at least in New York.
Back in Princeton, crossing Nassau Street is a variant of the
Washington Road game. There are fewer traffic lights and they have long
cycles. As compensation, there are several places where in theory a
pedestrian can simply walk across the street and the ever-courteous and
alert drivers will wait patiently until he or she is safely on the other
side. But as Yogi Berra might have said, “In theory there’s
no difference between theory and practice; in practice there is.”
In theory, pedestrians have the right of way and need merely step into
the street to exercise it; in practice, one runs the same risks as on
Washington or 8th Avenue, though typically there’s so much traffic
on Nassau Street that the closing velocity is lower and the odds of
surviving are correspondingly higher.
I suspect this low-level warfare will go on for many years, though in
theory (there’s theory again) the planned pedestrian bridge across
Washington Road will eliminate one of the points of impact. In practice
it would not be a complete surprise if the bridge were ignored, since it
will always be easier to cross at street level than to climb up and back
down. The only solution is likely to come in some not-too-distant
future when the entire campus is covered with buildings —
we’re well on our way — and the roads have all gone
underground, in our own local version of Boston’s Big Dig.
Let’s hope for the day when Washington Road becomes a pedestrian
mall. Meanwhile, look both ways before crossing the road.
More than a dozen years ago, I taught for one semester at Harvard. A
good friend was taking her sabbatical and asked me to take over her
course. This was well before I came to Princeton, and, though I had
done occasional adjunct teaching, I had never spent real time at a
university. Thus I blithely agreed to what turned out to be six months
of total immersion.
CS 50 is Harvard’s general introduction to computer science,
loosely equivalent to COS 126 here, except that it was much larger and
had a much broader spectrum of students: everything from hotshot
freshmen who had learned programming in utero to terrified seniors still
trying to satisfy their QR requirement. So one big problem was to find
a middle ground among very different levels of experience and interest.
The other problem, which I discovered only after it was much too late to
chicken out, was sheer size: the initial enrollment was 457, a number
still burned into my brain.
I have never worked so hard in my life. I had 31 teaching fellows
(as Harvard styles its teaching assistants), all but one of whom were
undergrads. As one might imagine, this raised some interesting issues,
like ensuring that teaching fellows didn’t wind up grading their
significant others’ work.
But in the end it all worked out — thanks to hard work by
everyone — and I had the time of my life. The experience taught
me that I could handle absolutely any teaching challenge, from making up
exams and getting them graded to printing 10,000 pages of lecture notes
overnight to dealing with discipline cases and the inevitable personnel
issues (cf. significant others, above).
Thus when Princeton became an option several years later, the
academic life was not a leap into the unknown but a chance to resume
something that had been enormous fun. In fact, COS 109: Computers in
Our World, the course I’ve been teaching here each fall, began as
an attempt to do a better job for less technical students than had been
possible in CS 50.
Last week I went north for spring break, to Cambridge instead of
Cancun. I gave a guest lecture in a programming course and spent a day
hanging out with friends from Harvard’s computer science
department. Some of them have gone temporarily over to the dark side by
taking on serious administrative responsibilities. It’s clear
that the financial meltdown of the past six months has made their lives
far more complicated than they had expected when they signed up, and
they’re very busy trying to keep things running smoothly. The
same is true here, of course, though I wonder if Harvard’s
problems are somewhat more severe overall. Fortunately, this is way
above my pay grade; I’m glad to have good people in charge and
lucky not to be directly involved.
But the high point of the trip was meeting a young man who had been
in my class in the fall of 1996. Given the class size, it was not
surprising that I had no memory of him. I’m enough of a digital
packrat, however, that I still had his grades: he stood eighth out of
nearly 400 survivors, which is a mark of exceptional talent. He had
taken the course as a sophomore and almost on a whim, since he had
planned to be a government major. I can’t claim credit for
anything beyond somehow accidentally releasing his inner geek, but he
enjoyed the course enough that he switched to computer science. He went
on to get his Ph.D. in CS as well, and for the past couple of years has
been teaching CS 50! This kind of full-circle experience has surely
happened to some of my colleagues here, but never before to me, so it
was especially rewarding.
Harvard routinely records large lecture classes — somewhere
there is a dusty box of my VHS tapes — and today lectures are
freely available on the web. My young friend is a dynamite teacher, and
his first lecture alone has provided me with three or four ideas for COS
109 next fall. We spent an hour trading stories of the class and how
much fun it was despite the workload. In fact, the sheer enjoyment of
teaching seemed to be at the center of everyone’s conversation.
The regular folks are wrapped up in current courses and full of ideas
for new ones. The deanly types are always looking for some way to get
back into the classroom. I don’t suppose anyone would volunteer
to teach for free to help out in tough times, but there was definitely a
sense of wonder that one can get paid to have such a good time.
Harvard is a different place from Princeton, but that pervasive
desire to teach and to do it well is one of the things that links the
two and indeed all good schools. Chaucer’s clerk is alive and
well, gladly learning and gladly teaching.
It’s hard to sit through a meeting or go to a lecture these days
without seeing someone lovingly stroking an iPhone, perhaps researching
some vitally important topic or (much more likely) reading e-mail or
playing a game. The iPhone was always a sexy toy, but it’s become
the geeky tool of choice, even for people who are far from
geeky. iPhone owners always have a new toy to show off, some cheap or
even free program they got from Apple’s App Store. Some are
pointless but amusing, like the one that pours a glass of beer as you
tilt the phone. Some are fun games. Some are useful, like the ones
that help navigate from here to anywhere. And some are simply
remarkable.
In my class a few weeks ago, part of the lecture was to be about
Django, a software system that makes it comparatively easy to build web
applications. Django is named (for no reason known to me) after Django
Reinhardt, the gypsy jazz guitarist of the 1930s and '40s. Thus my
musical selection of the day was some of Reinhardt’s music. As
the CD was playing, one of the students in the class pulled out her
iPhone and 30 seconds later showed me that Shazam, a free iPhone
program, had found the album I was playing, complete with artwork, and
Amazon was offering to sell it to me.
Arthur Clarke’s famous Third Law says, “Any sufficiently
advanced technology is indistinguishable from magic.” Shazam is
not magic: the various technological pieces are straightforward, though
the specific algorithm that can so efficiently identify one tune out of
zillions surely qualifies as advanced technology. Some deprecate the
whole thing with comments like “It’s not really very good at
classical music.” True enough, but viewed more generously,
it’s nothing short of astonishing.
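Those "straightforward pieces" can be caricatured in a few lines. The toy below is my own sketch, emphatically not Shazam's algorithm: it hashes pairs of spectral peaks and identifies a tune by voting; the peak lists and the track name are made up for illustration.

```python
from collections import Counter

def fingerprints(peaks):
    """peaks: a list of (time, frequency) spectral peaks for a recording.
    Pair each peak with the next few and record (freq1, freq2, time gap)."""
    prints = []
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1:i + 4]:
            prints.append((f1, f2, t2 - t1))
    return prints

# A pretend database mapping fingerprints to one known track.
db = {}
for fp in fingerprints([(0, 440), (1, 660), (2, 550), (3, 440)]):
    db[fp] = "Minor Swing"

# A short excerpt of the same tune still shares many fingerprints,
# so a simple vote identifies it despite the missing ending.
excerpt = [(0, 440), (1, 660), (2, 550)]
votes = Counter(db[fp] for fp in fingerprints(excerpt) if fp in db)
print(votes.most_common(1))  # [('Minor Swing', 3)]
```

The real systems add robust peak detection and databases of millions of tracks, but the match-by-voting idea is the same.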
A couple of weeks earlier, a friend from Google aimed his Android
phone at the bar code on the back of a book in my office and a few
seconds later showed me the web page where Amazon (again) offered to let
me read pages in the book and of course to buy a copy. We can see where
this is going. Pictures of landmarks? Of items in a store? Of famous
people? Of ordinary people like you and me? No problem: camera phones,
universal connectivity and powerful central computing services working
off huge databases are going to make such apparent miracles routine.
What does it take to create such systems? Phone applications sit
atop a huge library of supporting software, of course; no one starts
from the bare hardware, so we really are standing on the shoulders of,
if not giants, at least a myriad of helpful programmers who have gone
before. But the other thing is that the companies that sell the phones
— Apple, Google (indirectly) and others — have provided
enough entrée to their systems that anyone with some programming
expertise and a bright idea can create something new that will make the
phone even more appealing. This open environment unleashes the
creativity of thousands of people. Some ideas are brighter than others,
but no matter what, there are far more good ideas than Apple or Google
could have produced on their own. A thousand flowers can bloom, and
some of them will be beautiful.
I’m not sure whether opening up the programming interface was
entirely public-spirited on Apple’s part, but with the threat of
Android, an open platform with Google muscle behind it, things did open
up, and today it’s quite possible to build interesting software
fairly quickly for iPhones and Androids. Google came a bit late to this
party and is definitely playing catch-up. The first Android phone seems
clunky next to the iPhone, though it has a real keyboard, not a
simulated one, and it has a few tricks that the iPhone hasn’t yet
managed, like a neat integration with Google’s Street View, so you
can look around a location by turning the phone or yourself.
There were two phone projects in my class last spring, but it was
early days, the development systems were rough, and writing code was
hard going. In spite of that, the projects were very good, and one was
so successful that a couple of the students who built it are selling it
on the App Store. This year the development systems from Apple and
Google are more polished and stable. I have half a dozen projects in
the class and two JPs devoted to phone systems. The majority are
iPhone-based, but a couple are using Android, with phones donated by
Google. For sure there will be some winners this time, too.
I kid my students about endowing a chair when they become rich and
famous, but the real reward is watching enthusiastic and creative people
convert dumb devices into smart ones. That really is indistinguishable
from magic.
Very nice, thank you. Like many of you, I had both a vacation and a
job. For part of the summer, my wife and I rented a house overlooking
the upper Delaware River, one of those places that describe themselves
as “away from it all.” Indeed it was. In the week we were
there, we never saw another person on the road leading to the house.
The nearest sign of civilization (defined as grocery store and gas
station) was more than five miles away and in a different state. There
was no cell phone service within that radius either. We did have
satellite TV, from which I learned that Dish TV has just as many truly
lousy channels as Comcast does in Princeton, and just as few good ones.
Fortunately, the combination of Dish plus old-fashioned telephone
provided Internet access, albeit with painfully low bandwidth, barely
enough to manage e-mail but definitely not enough for Netflix. People
who say that cell phones and the Internet are going to replace other
services have clearly never spent any time in the boonies. You
might reasonably ask why vacation didn’t last longer. To those
not in the academic game, summer must seem idyllic: three months to
travel or lie on the beach or just veg out. Of course the reality is
not quite like that. First, faculty don’t get paid for those
three months. More important, no one can just stop their professional
activities for more than a modest period — life here may not
literally be publish or perish, but one has to keep up. So the summer
offers a change of pace and perhaps different kinds of things, but
it’s not for totally goofing off. One works, just differently.
My version of “working differently” was to spend a while
at Google in New York, writing code to explore some especially grimy
data with the hope of finding ways to make it better. This was a
typical summer intern gig: exceptional people, great fun, lots to learn
and almost sure to have no effect whatsoever, positive or negative, on
the company’s bottom line. I shared a Dilbert-like cube with
three other programmers whose combined ages barely exceeded my own. One
of them, poor guy, was a Princeton undergrad who had been in my class a
year earlier. I’m sure he would have preferred to escape the
all-encompassing Princeton bubble, but he took enforced togetherness
with great good grace, and it was a pleasure to have a friend sharing
the space.
For me, programming at Google was a way to keep up with real-world
software development. For students, summer internships offer more,
because they are a remarkably effective way to decide what kind of
“permanent” job might appeal. (The quotes around
“permanent” acknowledge the realities of today’s
perilous economy.) When I was an undergrad long ago, I spent a summer
working in an office for the Ontario highways department. The people
were nice, and I got to see some early computers in action, which helped
steer me toward a lifetime interest, but the job itself was so
indescribably boring that it convinced me that I would never again work
for a government agency. By contrast, as a grad student, I had a
wonderful summer at MIT, using the first time-sharing operating system
to help build the next version. I made friends that I still see from
time to time, and the job provided contacts and experience that led to
fantastic jobs at Bell Labs for the next two summers. Those internships
told me that I had found employment paradise, and I went there
permanently (well, for 30 years) right after finishing grad school.
Not every internship works out well, but even a bad experience
teaches us something about environments and about ourselves. In effect,
a summer job is an interview that lasts two or three months, with each
party assessing the other for long enough that decisions are likely to
be sound. No hasty marriages; better to live together for a while
first.
But now, vacations and internships are finished, and summer, much too
short, is over. In spite of that, I’m glad that September is here
again. One of the appealing parts of the academic life is the big
annual cycle. In June, we all head off for vacations and internships.
Every fall there’s a fresh burst of energy as everyone comes back
to campus. We greet old friends and make new ones. The freshman class
is eager and enthusiastic. (That will diminish around midterm week and
be noticeably reduced by sophomore year.) People who live and work in
the real world don’t have this wonderful experience — their
treadmills run at a pretty steady pace throughout the year, and there
isn’t a lot to distinguish one part from another. We’re
lucky to get an annual rejuvenation, so enjoy it while you can. Welcome
back! How was your summer?
There’s a new phrase in the Princeton lexicon:
“self-isolation,” the state of living alone to avoid
infecting everyone else with swine flu. A variant form is also common,
in e-mail that begins “I’m self-isolating...”,
followed by an apology for skipping class or missing an appointment, and
often accompanied by a plea for an extension on an assignment. My first
lecture in COS 109: Computers in Our World generally includes advice on
how to do well in the course. One suggestion is simply to come to
class, since the material covered there could be intrinsically
interesting or useful, and sometimes, just coincidentally, might appear
on an exam. This exhortation is ignored most years, with attendance
eventually stabilizing at the University-wide norm of about 75 percent
for biggish classes. This year’s first lecture also included
some house rules, the first of which was “Please don’t come
to class if you’re sick.” This apparently contradictory
request seems to have caught on, judging by the amount of e-mail from
students coming down with, in the middle of, or not quite recovered from
swine flu. For my part, though I would love to have everyone in class
every day, I’m grateful indeed that people are staying away when
they’ve got a communicable disease.
I haven’t kept accurate records, but in the first three weeks
there were at least a dozen self-isolators in my class of 110, so one
might estimate that 10 percent of undergrads have already had swine flu.
What will happen next? I certainly don’t know, but the World
Health Organization (WHO) issued a press release late in August that
gave some possibilities. WHO said that “there would be a period
of further global spread of the virus, and most countries could see
swine flu cases double every three to four days for several months until
peak transmission is reached.” (This comes from Wikipedia’s
version of a widely disseminated news story.)
Since I’m teaching a QR course, the numbers in this story
presented an opportunity for a couple of questions in a problem set.
“Suppose there are 1,000 people with swine flu today. If the
doubling period is four days and peak transmission is reached after two
months, how many people will be infected? (This is the optimistic
scenario.) Now suppose that the doubling period is three days and peak
transmission is reached only after three months, a pessimistic scenario.
How many people will be infected?”
Truly interested readers are invited to pause here and work this out
for themselves; the uninterested can read right on for the answers.
Two months is 60 days, so in the optimistic case doubling every four
days means that there are 15 doublings. If you were ever in COS 109,
you will certainly remember that two to the 15th power is greater than
32,000, so the hypothetical 1,000 original cases have become 32 million,
roughly one-tenth of the population of the United States. Things are
not so good in the pessimistic case, however. Doubling every three days
for 90 days is 30 doublings, and two to the 30th power is more than a
billion, so our original 1,000 victims have become well over one
trillion. Given that the population of the world is less than seven
billion, everyone would have gotten flu well before the end of three
months. There are several useful quantitative lessons here, including
the approximations that link powers of two (like two to the 30th) to
powers of 10 (like a billion), and the important fact that no
exponential growth can go on forever.
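The two scenarios above can be checked with a few lines of Python, a sketch using the problem set's hypothetical figures (1,000 starting cases, doubling periods of four and three days, peaks at 60 and 90 days):

```python
# Hypothetical starting point from the problem set: 1,000 cases today.
initial = 1_000

# Optimistic: doubling every 4 days until a peak at 2 months (60 days).
optimistic = initial * 2 ** (60 // 4)    # 15 doublings

# Pessimistic: doubling every 3 days until a peak at 3 months (90 days).
pessimistic = initial * 2 ** (90 // 3)   # 30 doublings

print(f"optimistic:  {optimistic:,}")    # about 32 million
print(f"pessimistic: {pessimistic:,}")   # over a trillion
```

Note how the two answers differ by a factor of 2^15: each extra doubling is another factor of two, which is the whole point about exponential growth.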
Of course these are simple-minded models, but diseases do spread
exponentially (for a while) if they are contagious enough and if
people’s behavior encourages the spread. Hence the request to
isolate yourself — you’re much less likely to pass it on.
Whether you’re one of the unlucky 32 million or everyone gets
it, it’s no fun to be sick when far from home, and even less so if
the flu combines with painful complications like strep throat. If you
have nearby family or supportive roommates to bring sympathy and food,
they can offer comfort, but it’s still a grim time. I got the
regular seasonal flu about 10 years ago, presumably from a student, and
I thought I would die; indeed for a couple of the worst days, death
seemed like a fine option. Every year since, it’s been a flu shot
for me at the earliest possible moment, and so far I’ve managed
to avoid getting sick again. I’m old enough to probably
have a few swine flu antibodies from a now-forgotten childhood illness.
But just in case, please continue to stay home and enjoy your
self-isolation as best you can. Once you’re truly well again,
come back to class. It will be really good to see you, and who knows
— some of the material might be interesting, useful, and maybe
even on an exam.
I was trekking across campus on a recent Sunday afternoon. It was a
beautiful day, and the campus was dotted with tourists and students
enjoying some sunlight and the last of the leaves still on the
trees. A couple of about my age was standing in front of the chapel,
holding a campus map but looking very lost. As I walked past, the man
said, “Excuse me...” in a heavy accent, perhaps Russian. I
asked if I could help, and they said, almost in unison, “Where are
we?” “In front of the chapel” was true but not
helpful, so I looked at their map. I have good map skills, but even so
it was pretty confusing, because they were holding it upside down, so
all the building names were upside down, too, even the ones they had
carefully highlighted in yellow.
After turning it right side up, the better to read the building
names, I pointed out the chapel (now recognizable), and told them where
they were. But what they really wanted to know was how to get to Fine
Hall. Conjecture: one or the other (or both) was a famous
mathematician, and there was a gathering at Fine? In any case, it was
now easy to guide them — turn right on Washington Road and look
for the really tall building. I watched to be sure that they made the
turn, and no doubt they got there safely.
Princeton does a fine job with its online map, the one that can be
neatly printed on a single piece of paper. I’ve seen these in all
kinds of hands over the years, and I’ve given out my fair share to
random visitors as well; I even used to carry an extra one since it was
so useful. On the other hand, the University does less well in
identifying buildings when you’re standing in front of them.
You’ll look in vain for a name on the computer science building
where I hang out. The E-Quad says, “School of Engineering and
Applied Science,” but when did anyone ever call it that? Fields,
Forbes, Foulke, Frist, Friend, Frick — it sounds like an excerpt
from Dr. Seuss. Don’t confuse Fisher Hall, home of the economists,
with Fisher Hall at Whitman. If you’re sick, go to McCosh, not
McCosh.
In retrospect, I wonder if my tourists really wanted the old Fine,
which is where the mathematicians used to hang out. Perhaps I should
have sent them to Jones. That’s the current name for the building
attached to Frist Campus Center that was Fine, when the current Frist
was Palmer Labs (not to be confused with Palmer House, the
university’s up-scale bed and breakfast). Palmer Labs used to be
home to the physics department before it departed for Jadwin Hall, not
to be confused with Jadwin Gymnasium, which is even further away.
Two statues overlook the north doorway of Palmer (oops, Frist). The
one on the left is Benjamin Franklin, readily identified by his
distinctive coat, which was apparently deemed quite fashionable when he
was the American ambassador in Paris from 1776 to 1785. The other
statue is Joseph Henry, who is much less well known, though it’s
his name on Joseph Henry House, just a stone’s throw from Nassau
Hall, and perhaps Henry Hall, next door to Foulke Hall. Henry, who
taught here from 1832 to 1848, was an exceptionally accomplished
scientist of the time, second only to Franklin; he discovered much about
how electricity and magnetism work and even lent his name to the unit of
inductance, the henry. Surely immortality is having your name used for
a fundamental unit of measurement and spelled in lower case, like volt
or watt.
Back to the statues. In the 1930s, there were two scientists on
campus whose wives were pregnant and due to deliver at almost the same
time. The men (whom I shall call Smith and Jones) agreed that the
parents of the first child to arrive could choose whichever of the two
famous names they preferred, and the other parents would take the
leftover. I heard this story from Joseph Henry “Smith,” whom
I knew well, but I know nothing about Benjamin Franklin
“Jones.” One also wonders whether the wives concurred with
this decision process, and of course what might have happened if the
newborns were female, though those are stories for another time.
I’ve wandered around a fair number of university campuses over
the years. Some of them carefully label each building, so there’s
no doubt about where you are, but others, like Princeton, seem to prefer
a sort of security by obscurity: if you don’t know what a building
is named or what it houses, perhaps you don’t need to know. So
always carry a map, and be sure you know which way is up.
Big numbers are everywhere these days, what with lost jobs
(millions), bailouts (billions), and the budget and its deficits
(trillions). I think most of us don’t grasp these numbers at all,
to the point where words like million, billion and trillion have become
synonyms for “big,” “really big” and
“really really big.” Technology has its own set of
“big” words as well: mega, giga and tera are part of
everyday speech, while further-out ones like peta and exa now appear in
public with some regularity.
Since most of us have no intuition about big numbers and probably
don’t know the data that they are based on anyway, we’re at
the mercy of whoever provides them. Here are a couple of recent
examples.
There was much buzz before Christmas about Amazon’s Kindle and
other e-readers as potential gifts, along with speculation about a
tablet device from Apple. (The iPad was announced in late January, but
won’t ship until March, so that’s a topic for another time.)
On Dec. 9, The Wall Street Journal said that the Nook e-book reader
from Barnes & Noble has two gigabytes of memory, “enough to
hold about 1,500 digital books.” On Dec. 10, The New York Times
said that a zettabyte (10^21 bytes) “is equivalent to 100 billion
copies of all the books in the Library of Congress.”
By good luck, I was right then in the early stages of inventing
questions for the final exam in COS 109, so this confluence of
technological numbers was a gift from the gods. On the exam, I asked,
“Supposing that these two statements are correct, compute roughly
how many books are in the Library of Congress.” This required
straightforward arithmetic, albeit with big numbers, not something that
most people are good at. The brain often refuses to cooperate when
there are too many zeroes. Writing them all out might help, but
it’s easy to slip up. Scientific notation like 10^21 is better,
but units like “zetta,” completely unknown outside a tiny
population, convey nothing at all to most people.
Since intuition is of no help here, let’s do some careful
arithmetic. Taking the Journal at its word, 2 GB for 1,500 books means
that a single book is somewhat over a million bytes. Taking the Times
at its word, a hundred billion copies is 10^11; dividing 10^21 by 10^11
implies that there are about 10^10 bytes in a single copy of all the
books. If each book is 10^6 bytes, then the Library of Congress must
hold about 10,000 books.
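The same arithmetic in code form, a sketch that takes both newspapers at their word (2 GB per 1,500 books, and one zettabyte for 100 billion copies of the library):

```python
bytes_per_book = 2 * 10 ** 9 / 1500    # Journal: 2 GB holds ~1,500 books
bytes_per_copy = 10 ** 21 / 10 ** 11   # Times: a zettabyte / 100 billion copies

books_in_library = bytes_per_copy / bytes_per_book
print(round(books_in_library))         # 7500
```

Rounding a book up to an even million bytes, as in the text, gives the same order of magnitude: about 10^4 books.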
Is this a reasonable estimate? One useful alternative to blind
guessing is a kind of numeric triage, which led to the second part of the
exam question: “Does your computed number seem much too high, much
too low, or about right, and why do you say so?” Of course if one
didn’t do the arithmetic correctly, all bets are off. A fair
number of people found themselves in that situation, and thus had to
rationalize faulty values from hundreds to bazillions.
Those who did the arithmetic right were better off, but some still
had trouble assessing plausibility. Apparently even small big numbers
are hard to visualize, for a surprising number thought that 10,000 books
was reasonable for a big library: “I would guess that even
Firestone holds over 10,000 books” was a not-atypical response.
That’s not reasonable, of course — even I have close to 500
books in my office, and I’ll bet that many humanities colleagues
have thousands.
Let’s look at another example. At Christmas, my wife gave me
“Googled: The End of the World As We Know It,” by Ken
Auletta. It’s an interesting history and assessment of the most
successful technology company of the past decade, though there were
places where the fact-checking was a bit spotty. For instance, it
claims that Google’s CEO, Eric Schmidt ’76, graduated from
Princeton in 1979. But in exam-creation mode, what caught my eye was
the very last sentence, which says that Google stores “two dozen
or so tetabits (about twenty-four quadrillion bits) of data.”
Another gift! There is no such thing as a tetabit; if quadrillions
is correct, then the word should have been petabits. So I asked,
“How many gigabytes does Google store?” This required
converting petabits to gigabits, then dividing by 8 bits per byte
to get 3 million gigabytes. But “tetabit” is also only
one letter away from another valid unit, terabit, so the second half of
the question asked, “If tetabits really should have been terabits,
how many gigabytes would there be?” I’ll leave that as an
easy exercise.
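For the first half of the question, the petabit-to-gigabyte conversion fits in a couple of lines; the terabit variant stays an exercise, as above:

```python
PETA, GIGA, BITS_PER_BYTE = 10 ** 15, 10 ** 9, 8

# "Two dozen or so" petabits, converted to bytes and then to gigabytes.
gigabytes = 24 * PETA // BITS_PER_BYTE // GIGA
print(f"{gigabytes:,}")   # 3,000,000 — that is, 3 million gigabytes
```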
What are we to do when the country’s premier newspapers and
highly qualified authors of important books can’t get the numbers
or the units right? Most numbers just go right by; we don’t have
the time or background to pay much attention, and we act on intuition
and gut feelings, however faulty. Could we do better? Eternal
vigilance is a partial answer. Informed skepticism, a little knowledge
and some grade-school arithmetic will also help, but only if we use
them.
Last fall I read a most enjoyable book by Paul Collins, called
“The Book of William” and subtitled “How
Shakespeare’s First Folio Conquered the World.” The First
Folio was published in 1623, the first collection of all of
Shakespeare’s plays. The material was pulled together after his
death by John Heminge and Henry Condell, who had known him well and
acted in some of his plays. Collins tells a fascinating story of the
initial collection of plays after Shakespeare died in 1616, the
modifications in later folios, the disastrously bad edition by Alexander
Pope in 1725 and the Folio’s steadily increasing value as a
historical document and financial asset. He trekked around the world to
visit as many of the surviving First Folios as he could, and wrote
engagingly of the people and places he saw. If one is a serious
bibliophile, this must be well-trodden ground, but it was all new to me,
and very well done.
Having read Collins’ book, I really wanted to see a First Folio
for myself. The Folger Library in Washington, D.C., has a lot of them,
but it turns out that there’s no need to make a trip: Firestone
has a copy in its Rare Books collection, and thanks to a current exhibit
in the Milberg Gallery, you can just walk in and there it is.
Why is it on display? For the past several months, Firestone has
featured an intriguing exhibit called “The Author’s
Portrait: O, could he but have drawne his Wit.” There are over a
hundred significant rare books and other artifacts that include
portraits of their authors; the collection includes works by, and often
more or less contemporaneous pictures of, people like John Milton,
Harriet Beecher Stowe and George Bernard Shaw.
The First Folio is there because of its frontispiece, the famous
engraving of Shakespeare. You’d recognize it for sure. I did
because it’s the one in the Complete Works that I bought for a
couple of dollars at a used book sale years ago, long before I knew
anything about the portrait or indeed any of the history of how
Shakespeare’s plays were collected.
It was also sometime last fall, perhaps with The Book of William in
mind, that I happened to notice something unusual in the window of the
toy store on Palmer Square, just down the street from Teresa’s.
Just what every kid wants — a Shakespeare Action Figure with
Removable Quill Pen and Book! Unfortunately, this improbable
“action” figure appeared to be pretty badly done. It was
only five or six inches tall, made of hard vinyl and rather crude in
execution. The tiny removable book and quill pen would be lost
immediately by even the most careful child. I looked at it several
times over the next couple of months but could never bring myself to
spend money for something so poorly made, though it might have made a
tongue-in-cheek present for a friend in the English department.
Eventually it disappeared, whether sold or shipped back one can only
speculate, and the opportunity seemed to be gone.
But thanks to the miracle of Google, I was able to find the company
that sells it. Indeed, its online catalog has an army of action
figures, from Alexander the Great to Annie Oakley, from Jane Austen to
Oscar Wilde, from Franklin to Einstein, from Jesus to Freud. Some of
these famous people also have removable parts, and the Deluxe Jesus
comes with five (vinyl) loaves, two small fishes and a jug for
converting water to wine. (I am not making this up.)
Einstein and Beethoven apparently have no removable parts. The
latter omission is a bit surprising, since Beethoven is often pictured
as writing with a quill pen, and there’s not much difference
between a tiny plastic book and a tiny plastic music manuscript. None
of the figures are particularly accurate representations of their
originals, at least as I remember the images; for instance, neither Einstein
nor Beethoven has nearly as much vinyl hair as they seemed to have had
in life.
Here’s a question: who would be our campus action figures,
Princeton stars that combine great distinction with real
distinctiveness? Cornel West GS ’80 comes immediately to mind; no
one else here is so instantly recognizable. Paul Muldoon would surely
qualify. Joyce Carol Oates would be a wonderful addition to the group.
My fellow ‘Prince’ columnist, Tony Grafton, is a natural.
Of course President Tilghman belongs, and if she were to carry the mace
that signifies her office, that would be a really impressive removable
part. So give it some thought. It’s likely that suggestions
would be welcomed by the company, which must be looking for ways to keep
their list fresh, and I’ll bet that there would be a decent local
market for our very own Princeton action figures.
Princeton has often seemed to me like a sort of big extended family.
Like all families, some of the members are a bit odd and a few
don’t get along, but for the most part we have enough in common
that we generally enjoy each other’s company. This family
feeling is particularly obvious at Reunions, when tens of thousands of
alumni greet each other like long-lost relatives, but for me there was
another reminder early this year. The Office of Information
Technology’s directory service finds names, numbers and e-mail
addresses for students and faculty, but it turns out that there is a lot
more there than you might realize. Every so often, I run a program that
extracts information from the database and converts it into a format
that’s more convenient for my own weird searches. This spring, I
discovered that what used to be about 25,000 items had blossomed into
nearly 100,000. Most of the new entries are no more than a name or a
netID, apparently for people long since departed, since they are listed
only as “department unknown.”
Some 1,700 entries, however, reveal a family connection. A listing
like “Jonathan Huffington Post III ’77 P09 P13” says
that Huffy is a parent of a current freshman and a son or daughter who
graduated last year. There are on average 140 such entries for each of
2010 through 2013, which suggests that well over a hundred students each
year are “legacies,” the standard term for the children of
alumni. I haven’t checked with officialdom for accurate numbers,
and the OIT data is certainly not complete, but this is pretty
consistent with other sources. (There seems to be no analogous
“G13” notation, but surely there are grandparents as well,
and in fact it would be a nice programming exercise to try to find them
mechanically from the OIT data.)
This all reminded me of another aspect of family: siblings in
classes. I was pleasantly surprised in my first year here to encounter
two sisters, one in the fall and the other next spring. (Hi, Grace and
Joyce!) I’ve kept a casual record ever since, and found that
it’s not at all unusual to have a brother or sister of a former
student in one of my classes. Over the past 11 years there have been 36
that I know of, including two semesters with twins in the same class.
It’s only moderately embarrassing that I never could reliably
distinguish between Adam and Matt, or Jessica and Tiffany, but others
would certainly have had the same problem.
I’ve also run into a fair number of children of friends, even
if the parents didn’t go to Princeton. If your name is Smith,
you’re anonymous unless someone tells me, but if you have a
distinctive name, it’s not too hard to make the family connection.
Surprisingly, a modest handful of these parents have said that they
learned to program from one of my books, which is incredibly rewarding,
even though it does remind me all too clearly of the passage of time.
Someday this comment will come from a grandparent, which might well be a
sign that it’s time to move on to “department unknown”
status.
I’ve also had well over a dozen students whose parents are
Princeton faculty or staff. Again, if the name is distinctive,
it’s not too hard to infer a connection, but usually there’s
no clue. For example, last fall I was on a committee with a colleague
whose family name is fairly common — not Smith, but well within
the Census Bureau’s top 100. During one of the endless meetings,
he casually mentioned that his daughter had graduated a couple of years
earlier. Something clicked — indeed she had been in my class, but
(and this seems to be an inviolable rule) she had not given the
slightest hint that her father taught here. Not that it would make any
difference, but in most cases, I learn of this kind of family connection
only well after the fact; indeed there were two other faculty children
in that same class.
Lots of schools must have a similar kind of friends-and-family
experience, but it seems especially strong at Princeton. The continuity
over years and generations adds to the sense of community and enriches
the experience for all of us. Certainly it’s a pleasure to
discover that a student that I have come to know today has a connection
that goes back 25 or 30 years and perhaps even further. When you go to
Reunions next month, watch for recent grads marching with their parents
and their parents’ friends and everyone’s young children,
all having a wonderful time. As Tolstoy said in “Anna
Karenina,” “Happy families are all alike.” The
Princeton family has a lot of very different members, but it seems
remarkably happy, and it’s great to belong to it.
The term “Luddite,” often aimed today at people who complain
about new technology, comes from the (perhaps fictional) Ned Ludd. Ludd
was said to have destroyed textile machinery to protest its negative
effect on his livelihood and way of life during the early days of the
industrial revolution in England. Or, as urbandictionary.com explains,
“A group led by Mr. Luddite [sic] durring [sic] the industrial
revolution who beleived [sic] machines would cause workers [sic] wages
to be decreased and ended up burning a number of factories in
protest.” While looking for more accurate information about Mr.
Ludd, I came upon an essay by Thomas Pynchon, published in The New York
Times Book Review on Oct. 28, 1984. Under the title “Is it
O.K. to be a Luddite?” Pynchon wrote, “Machines have
already become so user-friendly that even the most unreconstructed of
Luddites can be charmed into laying down the old sledgehammer and
stroking a few keys instead.”
Well, maybe. But that was then and this is now. Although I’m
hardly a Luddite, recent painful experiences lead me to take issue with
the phrase “so user-friendly.” For instance:
I used to have excellent map skills, and in fact probably still do,
but today large-scale detailed paper maps can be hard to find. Google
Maps and its competitors are helpful if one has immediate and speedy
access to the Internet on a big screen. But on the road in a car,
that’s not the case, so I’ve been an early adopter and (in
unfamiliar territory) a regular user of a GPS. The first one of these,
bought for my wife, proved invaluable when off the beaten path; it was
comforting to know that we could always find our way back to
civilization — that is, the nearest interstate. That first GPS
cost more than $500, in part because it was bundled with a PDA (personal
digital assistant), a now-vanished technology that we never used anyway.
Our second GPS was bought for driving to Yellowstone a couple of years
ago, a trip with frequent diversions into small towns in the middle of
nowhere as my wife dug into her family history. The new device, made by
the same company, cost less than half what the original did and it had a
bigger and brighter screen, but it was markedly less user-friendly, to
the point where I seriously considered acquiring a small sledgehammer.
It no longer told us what road we were on, just the name of some
upcoming cross street. It no longer showed a track of where we had been
to help us avoid traveling in circles. It replaced the easily
identifiable car icon by one in a shape and color that made it
indistinguishable from the background. It removed any hint of a scale
from the display, so there was no way to tell whether the next tiny
village was a mile away or a hundred. It didn’t let us pan around
the screen to see context. And it had a mind of its own about
interchanges, maniacally zooming in whenever we came near a complicated
intersection, so we could tell only that we were somewhere in a maze of
ramps.
Gearing up for a trip late this summer, I bought a newer version of
the same GPS, at a third of the price of the previous one. This model
does tell us what road we’re on, and the car icon can now often be
distinguished from the background. It also has a large repertoire of
recorded street names, so it can say (in the voice of a robot female
drill sergeant) “in point one miles turn right on Prospect
Avenue.” Unfortunately, its tiny little processor can’t keep
up with this task, so the message is more like “in [snap crackle
pop] turn [long silent pause] on [static].” There’s still no
scale or panning, and it’s even more aggressive about zooming in
on interchanges.
There’s an art and even some science to creating good user
interfaces. One of the simplest rules is to enlist potential users as
victims and get their frank opinions before the design is frozen.
Our world is full of gadgets and systems like my GPS that have focused
on elaborate “features,” apparently at the expense of this
basic step. It’s hard to believe that such flaws would not have
been remarked on by anyone who tried to use the device in real life. In
recent weeks I’ve struggled with cell phones, cable TV remotes,
computer networks, e-mail and online calendars. An expert would charge
a huge consulting fee to explain how their user interfaces are the
antithesis of friendly. I’d happily spell it out for free; just
ask.
Every August, US News & World Report publishes a popular issue that
ranks U.S. colleges and universities. If memory serves, Princeton has
been ranked first in “national universities” for the past 10
or 11 years, with two exceptions. Several years ago we slipped to
second place for a year, clearly because of some technical glitch at the
magazine. This year Princeton was again second, this time perhaps
because I am on sabbatical, along with several distinguished colleagues
whose presence adds so much to the environment. Each year the
University is appropriately restrained when it graciously acknowledges
the results, pointing out that Princeton is happy to be highly ranked,
but that there are plenty of good schools and prospective students
should not base their choices solely on rankings. Of course everyone is
quietly pleased to have Princeton’s greatness recognized.
There are many such rankings, from a veritable cottage industry of
suppliers. One of these is the National Research Council, a highly
respected nonprofit institution that has provided “science,
technology and health policy advice” to the federal government
ever since the first of its constituents, the National Academy of
Sciences, was established by Abraham Lincoln in 1863. Along with
advising, from time to time the NRC ranks graduate programs at U.S.
universities. The last study was published in 1995, so it was more than
a bit dusty when the NRC undertook an update some years ago. The new
results were due to appear in 2005, but for whatever reason were delayed
until the end of September, just a month ago.
Princeton did very well in this exercise, with many departments in
the top handful in their respective cohorts. Computer science, the
department that I know best, went from sixth in 1993 to second.
Wonderful! We’re No. 2!
Sober second thought, however, makes one a bit cautious. Princeton
computer science is a fine department, but is it better than, say,
Massachusetts Institute of Technology, which came in third? MIT was
second in 1995, after Stanford and followed by University of California,
Berkeley; Carnegie Mellon University; and Cornell — a quite
reasonable result. Other computer science departments were jerked
around wildly; for instance, computer science at the University of
Washington went from ninth to 38th, a most unlikely outcome.
I asked friends in other departments about the rankings. One
colleague here calls the results “completely nuts.” In his
discipline, the top school has no faculty in several major areas of the
field, while the three schools that are Princeton’s main
competitors for graduate students are well down the list. Naturally
those who move up are happy; another school congratulates one of its
departments on rising from 81st to “in the top 12,” a
dramatic leap that might be taken with a grain or two of salt.
How do such wild results happen? In the 1980s, statisticians at Bell
Laboratories studied the data from the 1985 “Places Rated
Almanac,” which ranked 329 American cities on how desirable they
were as places to live. (This book is still published every couple of
years.) My colleagues at Bell Labs tried to assess the data
objectively. To summarize a lot of first-rate statistical analysis and
exposition in a few sentences, what they showed was that if one combines
flaky data with arbitrary weights, it’s possible to come up with
pretty much any order you like. They were able, by juggling the weights
on the nine attributes of the original data, to move any one of 134
cities to first position, and (separately) to move any one of 150 cities
to the bottom. Depending on the weights, 59 cities could rank either
first or last!
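The Bell Labs finding is easy to reproduce in miniature. The sketch below uses three invented schools and three invented criteria (none of this is real data from the study or from US News); it shows only the mechanism: with the same underlying scores, shifting the weights in a weighted sum can push any of the three schools into first place.

```python
# Hypothetical illustration: identical data, different weights, different winner.
# Schools, criteria and scores are all invented for this sketch.
schools = {
    "A": {"giving": 0.90, "sat": 0.70, "research": 0.80},
    "B": {"giving": 0.50, "sat": 0.95, "research": 0.90},
    "C": {"giving": 0.60, "sat": 0.85, "research": 0.95},
}

def rank(weights):
    """Order the schools by a weighted sum of their criterion scores."""
    def score(name):
        return sum(w * schools[name][c] for c, w in weights.items())
    return sorted(schools, key=score, reverse=True)

# Each weighting crowns a different school.
print(rank({"giving": 1.0, "sat": 0.1, "research": 0.1}))  # A first
print(rank({"giving": 0.1, "sat": 1.0, "research": 0.1}))  # B first
print(rank({"giving": 0.1, "sat": 0.1, "research": 1.0}))  # C first
```

With only three schools and three criteria, every school can already be made the winner; with 329 cities and nine attributes, the room for manipulation is correspondingly enormous.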
To illustrate the problem in a local setting, suppose that US News
rated universities only on alumni giving rate, which today is just one
of their criteria. Princeton is miles ahead on this measure and would
always rank first. If instead the single criterion were SAT score,
we’d be down in the list, well behind MIT and California Institute
of Technology. If the criterion were ZIP code, which is nonsensical but
objective, Princeton would be near the bottom (though still above
Harvard and Yale!), and no school outside the northwest could compete
with the University of Washington’s 98195.
To its credit, the NRC makes its data and methodology freely
available, so you can create your own lists with any criteria you like.
I often ask students in COS 109: Computers in Our World to explore the
malleability of rankings. With factors and weights loosely based on US
News data that ranks Princeton first, their task is to adjust the
weights to push Princeton down as far as possible, while simultaneously
raising Harvard up as much as they can.
The exercise is meant to demonstrate convincingly that such rankings
don’t mean much. Most things in life are far too complicated to
rank in linear order — almost everything is a messy combination of
properties, and those properties are differently important to different
audiences. The next time you hear that Princeton is number one, be
grateful for the compliment, but remain a little skeptical.
I’ve been doing a lot of driving lately, and it has given me
plenty of time to listen to music and ruminate on the important
questions of life, like what determines the price of gas.
Prices have fluctuated a lot over the past few years, though the
long-term trend is mostly up. For example, on opposite corners of an
intersection in North Jersey there are two gas stations that I used to
frequent, now both abandoned but still sporting signs advertising the
price of gas when they went out of business. One says 87.9 cents for
regular, but it’s been closed for well over a decade, and dollar-a-gallon
gas is surely gone forever. On the other corner, the last price
was 363.9 cents. That station must have died during the spike in prices
three years ago, which I remember vividly because I drove to Yellowstone
and back and never saw gas below $4 a gallon. Today prices are just
under $3, but I’ll bet there will soon come a time when 363.9 is
in range again.
Whiling away the miles from Boston to Princeton a few weeks ago, I
also saw a remarkable spread of prices even on the same day. The
cheapest, 252.9, was in New Jersey, which always seems to offer a
bargain, but I had been forced to refuel in western Connecticut at
309.9, which irritated me no end. When I arrived in New Jersey, with a
nearly full tank of expensive gas, the price at the Garden State Parkway
service center was 257.9, only a nickel above the lowest price anywhere.
This was a bit odd, since the service centers don’t gouge but
they’re not normally cheap either, a fair trade for convenience.
Someone is leaving money on the table by not setting the price higher
than any random corner station.
It was only a month or two ago that Governor Christie backed out of
New Jersey’s commitment to a new railroad tunnel under the Hudson
River, citing the cost (about $3 billion for New Jersey if I recall).
I’m certainly not an expert on such matters, but it seems that he
could have done better on this decision.
Let’s do some quantitative reasoning. As a rough estimate,
there are 5 million vehicles in New Jersey (one for every two people,
give or take). At 10,000 miles a year and 20 miles per gallon,
that’s about 500 gallons per vehicle per year. If the gasoline
tax were raised by 20 cents a gallon, which would bring prices more or
less into line with surrounding states, a typical driver would pay an
extra $100 for gas over the year, and the state would collect an extra
$500 million in taxes. In a few years, that would cover New
Jersey’s share of the tunnel’s cost. Of course there are
downsides — gas taxes are regressive, falling disproportionately on the
less affluent — and my estimate might be too optimistic, but
it’s worth considering.
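The estimate above can be checked in a few lines; the sketch below simply restates the column’s own round numbers, which are rough assumptions rather than real data.

```python
# Back-of-the-envelope check of the gas-tax estimate.
# All figures are the column's rough assumptions, not official statistics.
vehicles = 5_000_000        # roughly one vehicle per two NJ residents
miles_per_year = 10_000     # typical annual mileage assumed
miles_per_gallon = 20       # assumed fleet average
tax_increase = 0.20         # extra tax, dollars per gallon

gallons_per_vehicle = miles_per_year / miles_per_gallon   # 500 gallons a year
extra_per_driver = gallons_per_vehicle * tax_increase     # $100 a year
extra_revenue = vehicles * extra_per_driver               # $500 million a year

print(f"${extra_per_driver:.0f} per driver, "
      f"${extra_revenue / 1e6:.0f} million for the state per year")
```

At that rate, six years or so of the extra revenue would cover a $3 billion share of the tunnel.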
I commuted to New York on NJ Transit for much of the summer. The
train trip is wearing but a lot less tiring than driving would have
been, and always much cheaper since NJ Transit gives a generous discount
to senior citizens. The existing tunnel under the Hudson, with only
two tracks, operates at capacity all the time, and the slightest glitch
causes delays. I spent hours sitting in morning trains waiting to get
to Penn Station, and a similar amount in the afternoon waiting to leave
the city. (Twice I also sat for hours in the middle of nowhere when
trains were held up by what NJ Transit euphemistically calls
“trespasser incidents,” in which some unfortunate soul
decides to end it all by walking in front of a high-speed train and the
entire Northeast Corridor is shut down while the authorities
investigate.)
On return trips from New York, I often chatted with a distinguished
NJ Transit engineer who was designing electrical systems for the new
tunnel. He was remarkably instructive on technical matters, and it was
reassuring to see the high level of competence that went into the
planning. It’s a shame that his work for the past five or 10
years will be for nothing.
On the other hand, it’s good to see some local sanity: it
sounds like plans to replace the Dinky by a bus are not moving very
quickly, if at all. Anyone who takes the Dinky regularly knows how
convenient and reliable it is and how unlikely it is that a bus would
provide service as good.
I drive between Princeton and Boston rather than taking the train
because it costs half as much and is a lot faster. But for the long
run, our public transport choices as a society often seem wrong-headed.
If we each optimize for ourselves at the expense of the whole,
eventually it will catch up with us, as we run out of gas and our
infrastructure collapses around us.
In a single day recently I got five separate invitations to join
someone’s “professional network” at LinkedIn. Two
were from people I had never heard of, one was from a colleague from a
previous life, and two were from former students that I remembered
fondly. It’s always a pleasure to hear from friends, but they may
not realize that invitations are being sent in their name automatically.
In any case, I don’t use LinkedIn, so the messages are ignored and
eventually LinkedIn gets tired of resending them. Facebook has gotten
some pushback for the same kind of outreach — last week the
company agreed not to send unsolicited invitations to non-members in
Germany. This is not the only example where users have been upset.
Last month a new feature made it easy to unwittingly hand out
one’s own address and cell phone number to advertisers. That
bothered enough people that it was rescinded three days later. Since I
don’t use Facebook either, I don’t have to worry about the
effect of continually changing privacy policies and occasional bugs,
like the one that let a hacker post to Mark Zuckerberg’s fan page
last week.
I also received unsolicited mail from Twitter last month. Some
misguided soul wanted to follow me; all I had to do was create an
account and start tweeting. That’s unlikely to happen. I once
heard someone characterized as “a person who puts the twit in
Twitter,” and would rather not belong to that group. The aphorism
“Better to remain silent and be thought a fool than to speak out
and remove all doubt” is often attributed to Abraham Lincoln;
whoever its author, he was perhaps paraphrasing Proverbs 17:28,
“Even a fool, when he holdeth his peace, is counted wise: and he
that shutteth his lips is esteemed a man of understanding.” My
lips are sealed, at least on Twitter.
Whenever an invitation arrives it means that someone, however
well-intentioned, has handed out information about me, vouched for its
authenticity, and given the recipient carte blanche to use it. For
example, LinkedIn’s privacy policy says “By providing
email addresses or other information of non-Users to LinkedIn, you
represent that you have authority to do so. All information that you
enter or upload about your contacts will be covered by the User
Agreement and this Privacy Policy and will enable us to provide
customized services such as suggesting people to connect with on
LinkedIn.” That seems to say that if your contact list includes
me, you’ve told LinkedIn that it’s OK to spam me.
Social networking sites are useful, but their business model is to
sell detailed information about their users. As a result, our personal
data spreads far and wide, our privacy is eroded and our risks increase.
In mid-December I received e-mail from my friend Charlie, sent to a
small private mailing list. It said, “Apologies for having to
reach out to you like this, but I made a quick trip to the UK and had my
bag stolen from me with my passport and credit cards in it. The embassy
has cooperated by issuing a temporary passport, I just have to pay for a
ticket and settle hotel bills.” It went on to ask for a loan, and
gave a UK phone number.
It was possible. Charlie is a frequent traveler and could have made
a trip; I hadn’t seen him for a couple of weeks. I called the
number in England, but it didn’t ring, which was odd. I replied
to the mail and received a personal response: “Hi Brian. I am in
a bit of trouble, please lend me $1,800. Western union will be OK, let
me know if you need my details for the transfer. I will pay back once I
return to the States.”
I asked for instructions but I also asked a question that only the
real Charlie could possibly have answered. Back came precise
instructions for sending money by Western Union but there was no answer
to my question. I was way beyond suspicious at this point, and a quick
Google search revealed that this was a phishing attack that had been
going on for months; the perps hadn’t even bothered to change the
script or the phone number. We learned later that Charlie’s Gmail
account had been compromised, providing the bad guys with lots of
potential targets, but the scam could equally well have been based on
information gleaned from a social network.
As Zuckerberg said on his blog last May, “When you share more,
the world becomes more open and connected.” The flip side is that
as more and more of our personal information becomes available online,
we are ever more vulnerable. Real friends are beyond price, and e-mail
and social networks make it wonderfully easy for us to keep in touch,
but not every “friend” is really a friend, and relying on
the kindness of online strangers is most unwise.
A sabbatical offers a year without teaching or other responsibilities
(though with a commensurate reduction in pay). It’s a chance to
renew oneself, learn new things and in my case try to write a
book. The sabbatical is working out well. I’ve been spending
much of my time at Harvard, where the computer science department and
Berkman Center for Internet and Society have both been welcoming and
supportive beyond measure, another reason why the time is so enjoyable.
I have a comfortable office among old and new friends, less than a
10-minute walk from home. There’s free coffee, a new computer and
lots of interesting things to do. I would be happy to have a sabbatical
every year, though Princeton is unlikely to support this, and Harvard
might tire of my presence.
Harvard and Princeton are quite different places, of course, but
there are times when their similarities seem unusually close. For
example, both schools are major tourist attractions. One of the most
popular local sites is the statue of John Harvard, which stands (or in
his case sits) in Harvard Yard outside University Hall. John Harvard
neither founded the university nor attended it, as the tour
guides are fond of pointing out, but when he died in 1638 after barely a
year in the colonies, his will left nearly £800 and his library of
400 books to the new college just starting up in Cambridge, and this was
enough for naming rights.
Tradition has it that if one rubs the statue’s left foot, it
will bring good luck. Most of the statue is dark brown, but the left
foot always gleams golden and bright, and at almost any time of day,
some tourist is rubbing it while someone else is taking a picture.
People who cross the Yard regularly try to walk behind the shooters so
as not to interfere, though sometimes the crowds are so thick that
it’s impossible.
John Witherspoon, who was born almost a hundred years after John
Harvard died, has his statue in Firestone Plaza. The two Johns both
have longish hair and wear similar clothing — styles evidently
didn’t change rapidly at the time, at least as imagined by
sculptors who lived 200 years later — and both have a big book in
hand. But Witherspoon is standing, resting his book on the outstretched
wings of a tough-looking raptor, and there’s no sign that anyone
has ever tried to rub his foot; somehow one gets the feeling that he
would not have approved.
Like Harvard, Witherspoon was a minister, and he left his library of
300 books to Princeton. Unlike Harvard, who died at 30, Witherspoon had
a long, rich and influential life. He was a popular teacher whose
students included Aaron Burr, James Madison and dozens of senators and
congressmen. He was the sixth president of Princeton (then known as the
College of New Jersey), a congressman and a signer of the Declaration of
Independence. With a little better luck and timing, or the services of
a good public relations firm, we might all be at Witherspoon University
today, but his name lives on only in a dorm, a nearby middle school and
the street that leads from FitzRandolph Gate to Small World Coffee. By
contrast, John Harvard is immortal.
Another similarity strikes me on my usual walk across campus. On the
south side of Harvard Yard, a small gate leads in from Massachusetts
Avenue. The Dexter Gate, which was built in 1901, leads through a dark
passageway under a dorm into almost an alley at the rear of Widener
Library. But it has two of the most compelling inscriptions in the
whole place.
Over the entrance is written “Enter to grow in wisdom.”
The words come from Charles William Eliot, Harvard’s president
from 1869 to 1909. Eliot was an educational reformer, generally
credited with modernizing Harvard, including innovations such as
standardized entrance exams, an elective system and written
examinations. (The Harvard Crimson noted in September last year that
less than a quarter of undergraduate courses now have written final
exams, so Eliot’s influence in that sphere seems to be waning; I
would not be surprised to find a similar trend at Princeton.)
But back to the inscription. It would be hard to improve upon the
sentiment of Eliot’s words. What better goal for those who come
to a great university than to grow in wisdom?
The inscription on the other side of Dexter Gate, hard to see as one
leaves campus for Massachusetts Avenue and the bustle of Cambridge, is
another Eliot quote: “Depart to serve better thy country and thy
kind.” The words are different, but in spirit they seem almost the
same as Princeton’s own “in the nation’s service, and
in the service of all nations.” Surely, words to live by, in both
directions and at both places. It’s so easy to take these
wonderful universities for granted; entering or leaving campus we might
occasionally pause to think about why we’re here.
My wife and I spent our sabbatical in a lovely house on a quiet street
in Cambridge, a short walk from Harvard Square and the university. It
belonged to a couple of professors who were spending their sabbaticals
in the real Cambridge in the fall and then in Singapore in the
spring. One of the many charms of their house was that it was crammed
with books. Some of those reflected busy professional lives; she had a
lot of books in Chinese and his dealt with environmental issues around
the world. But many fell into the category of books I should have read
but hadn’t — serious fiction whose titles I knew well though
not their contents, scholarly biographies of important people, deep
studies of philosophy and religion, big books on politics and economics.
Indeed, a whole year of full-time reading would not have been enough to
get through even a tenth of them.
I did read a dozen heavy tomes that appealed in one way or another,
like Jared Diamond’s “Collapse,” which I had started
when it first came out but never finished. There was a John Keegan
history of World War I that I had somehow missed (be glad you
didn’t live in Europe in 1914) and I re-read Barbara
Tuchman’s “A Distant Mirror” (be glad you didn’t
live in the 14th century). But after a while I began to yearn for books
that weren’t in the “should have read” category. I
wanted lightweight popular history or complete fluff like detective
stories. The “light” part was important figuratively, since
my brain needed a break, but also literally, since it’s just too
much effort to hold up a 700-page book while reading in bed; lightweight
is good.
Maybe this is where the Kindle and its kin shine. Smaller
e-book readers don’t weigh as much as even one real book but they
store lots of them, and online book sellers make it seductively easy to
buy more — one click and you’ve got it, no matter when or
where you are.
Are e-books the future of reading? In Amazon’s latest earnings
report, CEO Jeff Bezos ’86 says “Kindle books have now
overtaken paperback books as the most popular format on
Amazon.com.” Barnes & Noble said much the same for Nook books,
though one should be careful to read the fine print — that applied
only to online sales, not those in brick and mortar stores.
I don’t understand the economics of e-books. Since their
production and distribution costs are close to zero, one would expect
them to be quite a bit cheaper than their physical equivalents, but they
are often comparably expensive — immediate gratification has a
price. For instance, when a new cop novel by one of my favorite
authors was published last fall, the Kindle version was only 10 percent
less than the physical hardback. The paperback version just
appeared; it’s five bucks cheaper than the Kindle edition! This
all seems weird, given that when you “buy” an e-book, you
don’t even own the bits. Like most digital media, the book is
“licensed, not sold,” and you only have a transient right to
make limited use of it. You can’t sell your e-copy to someone
else nor can you donate it to a local charity for their annual book
sale. And as we saw with the Orwell copyright fuss in 2009, Amazon can
even unsell the book.
I don’t own a Kindle, so when we returned home, I headed over
to Firestone to find some lightweight entertainment. To my delight,
sitting on the shelf in the Dixon collection was the very same hardback
cop novel by the favorite author that I had seen on Amazon. No
waiting, no fourteen dollars, just bring it back within a month.
As I was checking out my handful of books, the young woman behind the
counter commented on my choices. I said that they were only for
enjoyment (I probably said “trash” to convey that they were
certainly not “learned”). She said, “I’m really
looking forward to reading books that I want to read.” From this,
you could safely infer that she was a senior and had finally had enough
of books that other people told her she had to read.
Naturally I asked the standard and probably irritating question
— is your thesis done yet? “Not even close,” she
replied, which is definitely the standard answer, especially at the
beginning of April. Hang in, dear seniors. In a couple of weeks, your
theses will be done. In two months, you’ll be free. You’ll
miss Princeton more than you could imagine, but you will finally be able
to read any book you want and purely for your own pleasure. With a bit
of luck, you’ll be near an outstanding library that lets you
wander freely through its collection, wherever curiosity and serendipity
lead. It’s a great experience and a lot more fun than any e-book.
Enjoy.
At last! As is often the case in Princeton, we went from frost alerts
and furnace weather to “turn on the air conditioner” in
about a day. Spring is a lovely time here, with flowers everywhere and
delicate shades of new green on trees and bushes. Maybe this was the
image that F. Scott Fitzgerald had in mind when he described Princeton
as “rising, a green Phoenix, out of the ugliest country in the
world.” By now, the flowering trees have mostly done their
thing: the magnolias by the Scudder Plaza swimming hole have dropped
their petals into a slippery mess, but they were great while they
lasted. The ethereal archway of flowering pear trees along Witherspoon
Street is now just fresh leaves. The cherry tree in our backyard
exploded into blossoms in a single day. According to my wife’s
records, this happened about 10 days later than it did last year,
further evidence that spring was a bit overdue.
One downside of all this burgeoning plant life is that the lawn has
started to grow like crazy as well, along with a healthy crop of
dandelions, and everything has to be mowed frequently; if one lets it
slide for a couple of days too long, there’s a danger that small
children will get lost in it. Of course if we continue to get rain all
the time, that will make the grass grow even faster while preventing me
from mowing it often enough.
Spring is an active time on the wildlife front as well. We
haven’t had deer on our property for several years now; we
sometimes wonder whether that’s the natural cycle of life, or the
unhappy result of hostile action by the deer police. But there are
plenty of birds and of course uncountable squirrels to entertain the
family cat.
Canada geese, which are pretty much permanent residents of the area,
appear to be getting ready for new family members; it’s quite
common to see a pair with one standing guard while the other sits
quietly on the ground in the hatching position. Soon there will be a
bunch of new geese, which are very cute when they’re young;
unfortunately they rapidly become adults with the same unpleasant habits
as their parents. When I was a kid growing up in Canada, one normally
saw Canada geese only as they flew south for the winter and back north
in the spring. They were mysterious and magical — amazing flying
machines. The latter is still true, but as year-round residents and
pests, they are now anything but magical.
One good place to watch wildlife is the pair of paths that run along
Lake Carnegie, between the lake and Faculty Road. (I will defer
discussion of wildlife on the Street for another time.) Not far east of
Washington Road, and easily seen from the paths, there’s a small
island in the lake. A group of Canada geese hangs out there, making a
great racket almost all the time. I can’t see what they’re
excited about, though perhaps it’s some kind of turf thing with
the other big birds in the same area. Those look like cormorants,
though I’d be happy to have a bird expert correct me. They spend
most of their time standing on dead logs, gazing silently out at the
lake. I’ve seen an occasional heron in the area as well, usually
gliding along a few inches above the lake surface, and I think that I
spotted one of the local eagles a couple of weeks back, though it was so
high up that I couldn’t be sure.
To me, turtles are the most interesting wildlife in the lake area.
If the weather is warm and sunny, turtles like to bask on the
half-submerged logs. I’ve seen as many as 30 on a single log,
packed side by side as close together as they can get. Many of them are
quite substantial, a foot or more in diameter, and a few are noticeably
bigger. Again, I’m no expert, but the majority appear to be
snapping turtles. Not that I propose to move in for a close-up
inspection; snappers really do bite, and they don’t seem to have
pleasant personalities either. On the other hand, these turtles are
skittish, and it’s hard to sneak up on them even to take a
picture. The slightest sound or movement leads to a mass dive into the
lake, leaving the logs bare — it’s turtles all the way down.
Fall in Princeton is nice, but spring is the best. The weather is
getting warmer, the world is young again and everyone can look forward
to the end of classes and exams. But it’s too short. Reunions
and graduation will come and go in a flash and then it will be hot and
humid summer.
Hurricane Irene arrived in Princeton on Saturday, Aug. 27, after a huge
media buildup, where every channel seemed to be all Irene all the time.
We had battened down our hatches and laid in enough food and drink to
last for four or five days, so all was well on that front. As it turned
out, at least in this area, the winds were not as strong as had been
predicted, and we personally got lucky — no fallen trees, no
flooding, no power failure, just a lot of small-scale debris on the
lawn. On Sunday afternoon, when things had settled down and the rain
had temporarily stopped, I walked downtown to survey the damage. It
didn’t look too bad, though there was a large tree lying across
William Street near 185 Nassau. As I walked back along Olden Street
towards Prospect, however, the big oak at the corner by the Fields
Center seemed to be leaning a lot more than normal. It had started to
pull up the ground around it, so I skirted it carefully; only a minute
later there was a loud crack and a thump as it went down.
When I got home shortly afterwards, my wife told me that the power
had just gone off, which didn’t seem like a coincidence. After a
while, I called PSEG, where a robo-person walked me through a
voice-recognition dialog (“Please press or say two”) and
assured me that although no problems had been reported in my area,
someone would call back with a resolution by Sept. 4, a full week away!
That was the first real inkling of the extent of the problems.
In retrospect, we were lucky indeed. Our power came back two days
later, which seems to have been about the norm in Princeton. Many
people in the Northeast were not nearly so fortunate; their power stayed
out for days, and there was disastrous flooding in many parts of New
Jersey. We never did hear back from PSEG, which clearly had bigger
problems to deal with.
All this is a reminder of how dependent we are on infrastructure
without which we can’t operate at all, let alone enjoy the
services that we take for granted. For example, our house has a gas
stove with an electric igniter. If my wife had not laid in a large
supply of matches, we would not have been able to cook. Luckily I had
bought half a dozen cheap LED flashlights last year, so we could even
cook in the dark, and they’re better for reading than candles are.
Fortunately our gas supply was unaffected and we had clean running water
throughout (and thanks to a gas water heater, it was hot); many people
did not. Our furnace burns gas, but the motor that drives the fan that
forces the hot air needs electricity and of course so does the
specialized computer that controls the whole thing, so it’s just
as well that the power failed when the temperature was in the 70s
instead of the teens. Of course there was no air conditioning.
Our cordless phones were all useless, but we have one old-fashioned
phone that doesn’t need house current; it operates during power
failures thanks to Verizon’s backup batteries, which are probably
good for a week. Cell phones might have lasted a few days if used
sparingly. Computers could soldier on with batteries for a few hours,
but since the wireless router in the basement needs electricity, we were
disconnected from the Internet. Naturally the TV and the cable box were
dead, so we had no news.
Locally, things came back to normal quickly, but next time could be
different; after all, this was just two days of heavy rain and
moderately high winds, and, unlike an earthquake, it came with plenty of
warning. The more our lifestyle depends on robust and complex
infrastructure, the more vulnerable we are when major disruptions come
along. The longer we go without problems, the harder it is to cope when
troubles do arise, or even to remember how to do things. Where are the
flashlights and matches? How do I silence a powerless smoke detector
with a weak battery that’s beeping in the middle of the night?
(Answer: swear a lot, rip it off the ceiling and put it outside.) How
long will food last in the refrigerator? How do we open the garage door
when the opener doesn’t work?
We depend on an intricate web of electricity, phones, gas, water,
roads, sewer systems and of course an army of people who work long and
hard when things get rough. The longer we shortchange and cut corners
on maintenance and upgrades and lay off experienced workers to make
financial numbers look good, the more likely it is that there
won’t be any resilience when it’s needed. Those who say
that there is no role for government and that market forces will take
care of everything might want to think a little more deeply about what
happens when things go seriously wrong.
As I was wandering the campus on a beautiful Sunday a couple of weeks
ago, I saw an unusual number of couples walking hand in hand where one
of the pair was simultaneously talking on a cell phone. Strangest of
all was a couple who were both talking on their phones. It’s
possible that they were talking to each other, but somehow I doubt it.
It does seem a shame to waste an idyllic stroll with one’s
significant other by talking to someone else on the phone. It’s
not at all uncommon to see parents talking on the phone while pushing a
stroller. Presumably being ignored by one’s parents builds a
child’s character and independence, but again it seems like a
missed opportunity for a bit of togetherness. Maybe it’s just
payback when teenagers don’t want to be seen with their parents, a
phenomenon you can observe in family groups where a bored and disdainful
teen chats on the phone while the elders point out the campus sights.
And I suppose it’s some kind of delayed compensation when those
same parents helicopter in to monitor their kids after they arrive at
Princeton.
But cell phones do seem to encourage us to do too many things at the
same time. Last spring I saw a guy riding a bike, steering and holding
his dog’s leash with one hand while talking on his phone with the
other. The dog seemed quite happy; from the canine perspective, this
experience might really be quality time. One wonders, however, what
would happen if the dog decided to chase a squirrel, or the bike ran
into one of the giant potholes that had opened up over the winter, or a
cell-phoning parent and stroller suddenly came out of nowhere.
Professors Sam Wang and Alan Gelperin of the molecular biology
department gave a very interesting talk in April about NEU 101:
Neuroscience and Everyday Life. One of their topics was experimental
results that show unequivocally that multi-tasking is not as efficient
as doing tasks sequentially — overall performance suffers when we
try to do two things at the same time. In lectures the problem is the
distracting effect of texting, tweeting and checking Facebook while the
person at the front of the room is trying to explain something.
There’s little doubt that people who play with their phones and
computers while sporadically tuning in to the lecture do not learn as
well as those who focus. Is this clear-cut educational penalty going to
cut down on the use of laptops in the classroom? My bet is not, based
on discouraging conversations with a fair number of my colleagues and my
own unsuccessful attempts at moral suasion.
But perhaps that’s not the important battle. About the same
time, another study showed, again unequivocally, that driving while
distracted is a major cause of auto accidents; cell-phone use,
especially texting, is implicated in perhaps a quarter to a third of all
fatalities. This is totally believable; one doesn’t have to drive
far on a typical interstate to see someone weaving between lanes or
drifting onto the shoulder or slowing down dangerously because
they’re looking down at a phone. Of course there are plenty of
other driving distractions, like eating, reading, shaving (not me),
putting on makeup (ditto), fiddling with the GPS and holding animated
conversations with passengers. But texting seems to be the equivalent
of being well over the legal limit for alcohol.
A recent article on the growing use of glitzy graphical interfaces in
cars suggests that this is going to get worse before it gets better. In
ancient cars like mine, the controls are buttons and knobs of various
sizes and shapes, which can be manipulated by feel. But the trend in
new cars is toward interfaces that require looking at a touch screen to
select items from a menu, just like a phone or a computer screen. You
can’t operate such interfaces without taking your eyes off the
road for significant periods of time. Here’s a bit of
quantitative reasoning: at 75 miles per hour, a car travels 110 feet in
one second, so if you poke around on a screen for three seconds trying
to adjust the radio, you’ve covered the length of a football field
without seeing what’s around you.
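That back-of-the-envelope calculation is easy to check; a quick sketch in Python, purely for illustration:

```python
# 75 miles per hour, converted to feet per second:
# 5,280 feet per mile, 3,600 seconds per hour.
feet_per_second = 75 * 5280 / 3600
print(feet_per_second)           # 110.0

# Three seconds spent poking at a touch screen:
distance = 3 * feet_per_second
print(distance)                  # 330.0 feet, a bit more than the
                                 # 300-foot length of a football field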
It’s hard to imagine that people would do something so
potentially disastrous, but they do, just as they drink and drive.
Another thing that Sam described in his talk is that the brain
doesn’t reach its full maturity until one is well into one’s
twenties. This means that teenagers, and even some of you, dear
readers, do not assess risks as well as you will in a few years. So in
the meantime, let me encourage you in the obvious: don’t drive
while distracted by any of these things. You might even put aside your
laptop and your phone to spend a bit of quality time with the people
right around you.
The November 2011 issue of The Atlantic has an article by James Fallows
called “Hacked!”, which describes the ordeal of
Fallows’ wife after her Gmail account was compromised, causing her
to lose (at least for a while) essentially all of her online life,
including six years of mail, photos, and personal documents. Her
correspondents were told that she had been mugged in Madrid, but the
rest of the story line reminded me of an incident of my own a year ago:
an urgent email from a friend, sent to a very small mailing list, asking
for a short-term loan. In both cases, someone had obtained the
victim’s Gmail password, and that was enough to let the bad guys
focus on very accurately targeted groups of potential victims.
Phishing attacks like these are getting more sophisticated.
There’s probably no one alive who hasn’t received a Nigerian
scam letter, offering to split ten or twenty million dollars if the
recipient will just send along some bank account information. Those are
so bogus that it’s hard to imagine that anyone would bite, though
apparently some still do. But an attack like the one reported by
Fallows is far more plausible because it is so precisely targeted;
that’s why it’s called spear phishing.
How are accounts cracked? Sometimes it’s through systematic
password guessing, or asking for a password reset, which is what
happened to Sarah Palin’s Yahoo account in 2008. But it seems
that the most common problem is using the same password for more than
one site. If you do that, your security is no better than that provided
by the site with the weakest controls: even if Google does a perfect
job, that won’t help you if some other site can be easily broken
into. A break-in at Gawker in December 2010 stole more than a million
passwords; Fallows says that this might have been the vector for the
attack on his wife.
Our online lives are increasingly controlled by passwords. I tried
counting mine, but gave up at a hundred. Many of these are throwaways,
of course — who cares what my New York Times password is —
but others matter a lot, like the one for posting student grades.
That’s different from what I use in the computer science
department, which in turn is different from one I use at Google, which
is different from the ones for access to my credit card account,
administering my computers, my Internet domains, the wireless router in
my house and on and on and on. A handful of these are used every
day, but most lie dormant for months or even years; there’s no
hope of remembering them and I probably wouldn’t notice if someone
did break in.
An extra complication is that every site, no matter how
insignificant, has a different set of rules for what constitutes a valid
password, a capricious combination of dos and don’ts. For
instance, at the U.S. Copyright Office, where I recently registered my
new book “D is for Digital,” a password must have eight
characters with two letters, one number and one special character (but
not an ampersand!), and no consecutive repeated characters; it must not
include the user name or any part of it, or the names of a spouse,
children, pets, or one’s own name, or any sports teams or players,
or any part of a social security number longer than one digit or (and
this is the killer) “words that can be found in any dictionary,
whether English or any language.” In my case, all this fuss was
for a password that I used exactly once and will likely never use again.
Passwords are out of control. They’re too numerous and too
weak to be the all-purpose authentication mechanism. We need so many
and the rules are so arbitrary that one is forced to write passwords
down, re-use them, and probably create them with some kind of pattern
anyway, all of which adds to the risk. Here’s a question that you
should ask yourself: if a bad guy saw a couple of your passwords, could
he guess more? Electronic break-ins at large sites are not uncommon,
and passwords are one of the marketable results.
There aren’t many good alternatives on the horizon either.
Perhaps biometrics like fingerprints or retinal scans will be practical
some day, but for now the best we have is two-factor devices, where you
have to know something (a password) but also have something (like a
device that generates an additional one-time password whenever
it’s needed). Some companies, including Google, offer this
service, and if you have a smartphone, there’s no extra gadget to
carry; it’s just another app. But if someone steals your phone,
you’re back in the same pickle. Life was much easier when we
didn’t have to spend so much of it trying to remember how to prove
who we are.
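The one-time passwords such devices generate are typically computed with the standard HOTP and TOTP algorithms (RFCs 4226 and 6238). The column doesn't say which scheme any particular company uses, but the idea can be sketched in a few lines of Python:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password from a shared secret and a counter (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """Time-based variant: the counter is the current 30-second interval."""
    return hotp(secret, int(time.time()) // step)
```

Both sides hold the secret, and the code changes every half minute, so an attacker who learns your password alone still can't log in; one who steals the phone, and with it the secret, is another matter.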
“... of making many books there is no end”
When Johannes Gutenberg developed his printing press around 1440, it
didn’t make writing a book any less work, but printing one became
a lot easier. As Gutenberg’s technology has been refined over
nearly 600 years, the words of Ecclesiastes have become even more
apposite.
When I was a grad student here in the late 1960s, people typed their
Ph.D. theses themselves or paid a typist a dollar or two a page to type
it for them. For three drafts of a 200-page thesis, that was serious
money for a starving student, as well as being slow and inflexible.
Being comparatively poor — though not starving — and
basically cheap, I took a technological approach by writing a program
that would format and print my thesis on printers at the Computer
Center, the ancestor of the Office of Information Technology.
It’s probably hard to imagine now, but there was only one
computer on campus at the time, an IBM 7094 that sat in a large
air-conditioned room in the E-Quad. To print a new draft, I handed
three boxes of punched cards to the operator: 1,000 cards of program and
5,000 cards of thesis text. (For the benefit of readers not alive in
such ancient times, a punched card is a piece of stiff paper about 7
inches by 3 inches with holes punched in it to represent information
like letters and numbers; each card holds up to 80 characters.) This
let me make a large number of revisions in short order and perhaps
improved the writing style, though it had no effect on the research
results. In the end I produced what was probably the first
machine-readable and computer-printed thesis at Princeton.
Ever since, I’ve been intrigued by tools for book authors.
After graduation I spent the next decade doing research in document
preparation software and wrote several books where my co-authors and I
did everything except for the actual printing and publication. In the
early days, this meant producing photographic paper with high-quality
images of the pages, which were sent to the publisher, who in turn sent
them to a printer to be turned into printed and bound books, shipped to
warehouses and bookstores and eventually sold to millions of
enthusiastic readers. (That last part is exaggerated, unfortunately,
but the rest is true.)
This approach had real advantages, not the least of which was rapid
turnaround. It was possible to create a clean new draft in no time,
which made it easy to polish the text. It also meant that we were safe
from copy editors, who generally didn’t understand what they were
editing and often converted arcane but correct material into something
just plain wrong. Printers were even worse, since they could never
transcribe computer programs accurately.
We have now entered a new age in the relationship between authors,
their publishers and copy editors and printers and their readers.
Enterprises like Lulu and Amazon’s CreateSpace remove the
intermediaries, making it possible for anyone with a bag of words to
convert them into a book, printed on paper or distributed electronically
or both, without ever going near a traditional publisher. The
technology is remarkably simple — upload content and cover files,
set a price and have the book available within hours. Copies are
printed on demand, not on speculation, which means no inventory and much
lower prices than conventional channels. Royalty rates are much
higher, a win for a successful author.
I tried self-publication for my latest book, “D is for
Digital,” which I published through CreateSpace after a brief
flirtation with Lulu. The elapsed time from when I decided the content
was good enough until it was listed on Amazon and the first copy sold
was about 24 hours. The cover shows that I have no talent as a graphic
designer, but otherwise the book is indistinguishable from one produced
by a regular publisher. Since there’s no intermediary, the $14.95
price is a third of what it would have been. Of course, publishers do
provide useful services, like publicity, which I’m not good at.
The ease with which one can publish a real book is already changing
publishing, and it’s going to change other things as well. For
example, in some disciplines it’s expected that an assistant
professor will publish a book as a necessary part of the tenure process.
But what if publishing is no more complicated than reformatting
one’s thesis and designing a cover? The result is clearly a book:
It has an ISBN, it can be purchased through Amazon — how does that
differ from a monograph laboriously shepherded through a university
press? Not at all, it would appear. We’re at the stage where
anyone with an idea and enough words can have a book in print in a week.
With electronic versions, it’s even easier; I converted all my
previous ‘Prince’ columns to a Kindle book in a couple of
days. (Advertisement: you can buy it on Amazon!) Will “I
published a book” lose its cachet? Stay tuned. I’m working
on a book about it.
This Thursday through Saturday, and next week too, the campus will be
crawling with youngsters (comparatively) wearing lanyards and exuding a
combination of cockiness, feigned sophistication, innocence and
wide-eyed wonder. It’s Princeton Preview, when a horde or two of
admitted but not yet decided high school students visit Princeton to see
if they want to be Princetonians instead of heading off to Stanford or
Harvard or, all heavens forfend, Yale. How do people make this weighty
decision, especially if it’s a free choice among equally good
schools? I’ve talked to enough prospectives over the years to
know that it’s not always a slam dunk, and little things can wind
up making a big difference.
One obviously irrelevant but influential factor is the weather. I
dimly recall that when I visited Princeton long ago trying to decide on
a grad school, it was cold and rainy most of the time, and I never saw
the campus in its amazing spring splendor. Luckily the awful weather
didn’t dissuade me from what, in retrospect, was clearly the right
choice, but I’ve often wondered whether some data miner in the
depths of West College is looking for correlations among temperature,
sunshine and acceptance rate. If there is a correlation, then surely
the famous Princeton Weather Machine will be turned on for Preview as it
almost always is for commencement.
Do campus tours make a difference? I remember going on tours with my
son when he was making similar decisions a few years ago. One tour of a
small New England liberal arts college took us through an actual dorm at
about midday, an experience that made me realize messy rooms were not a
phenomenon unique to our teenager. No other tour ever revealed so much
dirty laundry, both literally and figuratively; perhaps this is one of
several sound reasons why Princeton tours never go through dorms. Of
course, prefrosh stay in dorms with their hosts, where they will quickly
learn about a range of housekeeping standards.
I happened to be visiting Yale on the first day of spring this year
and got a chance to see both tours and weather at work. It was a
perfect day: warm, sunny, the trees in full flower and the campus full
of happy sunbathers. It was actually so warm that at one point I went
inside the Beinecke Library to cool off, check out the new Shakespeare
exhibit and pay my respects to the Gutenberg Bible that is pretty much
on permanent display there. I sat down in one of the comfortable
leather chairs a few feet away and watched no fewer than five campus tour
groups come through in half an hour, each one getting a good close-up
look at this amazing historical document while listening to a pitch on
the rich resources that Yale offers. I don’t know whether many of
the somewhat bored-looking high schoolers in the groups realized how
remarkable this access was, but the adults certainly did, and it
won’t hurt Yale’s yield a bit.
Do all the information sessions make a difference? For my sins,
I’ve been drafted into being on a Preview panel where the
panelists answer questions, mostly from parents (if experience is a
guide) and less often from students. I did this once before and
remember none of the audience questions, only that it was fun. One of
the other panelists was a woman who had been in my class a couple of
years earlier, and it was nice to talk to her again, an added benefit.
I’m going to guess that for the most part, whatever is said in
this setting won’t change many minds. The event does seem geared
more to parents than students, and realistically, it’s the student
who ought to be making the decision.
That leads to another question, of course: do parents make a
difference? Probably they do, though one suspects that it’s
sometimes negative: if Mommy and Daddy lobby too hard for a particular
school, there’s a natural tendency to rebel by choosing someplace
else. Fortunately, most parents are wise enough to know this and bite
their tongues throughout the decision process.
The bottom line for most college applicants is that if you have
thought hard about what matters to you and made sure that the schools
you applied to will do well in those things, then it doesn’t
actually much matter where you go. The experience will be different for
sure, but not in any predictable way, and for the most part, it will be
what you make of it, independent of weather, tours, panelists and
parents. So let’s hope all our visitors applied to schools that
really make sense for them, thought hard about what they want in a
school and, in the end, make the right decision: to come to Princeton.
For 10 years, The Daily Princetonian’s weekly Opinion column was
written by John Fleming, who retired in 2006 after 40 years in the
English department. Professor Fleming’s column was called
“Gladly lerne, gladly teche,” a phrase from the
Clerk’s Tale in Geoffrey Chaucer’s “Canterbury
Tales.” I first met Fleming at a dinner at Forbes, one of those
occasions, all too infrequently attended by undergrads, where a
distinguished faculty member would grab a hasty bite then deliver a talk
on some academic subject and lead a brief discussion. It was the first
such talk I ever went to, and I no longer remember the topic, though a
vague memory remains of something in prospect narrow and dry as dust.
As it turned out, what I saw was an academic stand-up comic, doing
hysterically funny riffs on some esoteric bit of medieval literature. I
don’t remember any of the punch lines, but I still recall
chuckles, giggles and outright belly laughs. How could a man with such
awesome academic credentials be at the same time so side-splittingly
funny? My late father-in-law was a professor in Princeton’s
English department, so the breed wasn’t totally unfamiliar, but
this breadth of talent was new to me.
I got to know John better over the next few years. He used to open
the pool at Dillon Gymnasium every morning, so I often ran into him on
my walk to the Wa to get a paper, and we sometimes fell to talking. As
I learned more about the remarkably erudite man behind the witty
‘Prince’ columns, it was clear that I just didn’t
belong; I felt like a puppy running beside a real dog. It was a
struggle to keep up the pretense that I was a member of the same
faculty.
So you might imagine my feelings when five or six years ago the
Opinion editor asked me about writing a regular column to continue the
tradition that Professor Fleming had set. The editor said that several
faculty members would write in rotation, so the load would be divided
among us and so would the responsibility for filling Fleming’s
shoes. But it was still a daunting prospect, and sharing the shoes with
my current fellow columnists, all writers of great talent, does not make
it easier.
John Fleming continues to write. His blog lets him adjust length and
frequency to his taste and to include pictures; it continues to be
erudite and witty and often very funny. There’s no keeping up
with him.
On our last night in England after a long (though too short) vacation
this summer, my wife and I stayed in Southampton, where I discovered a
modern statue of an important person from long ago: John Le Fleming,
mayor of the city in the early 1300s. I took a couple of pictures and
sent them off to Professor Fleming in the hope that he might find them
amusing. There followed in short order a note of thanks and a new post
in his blog, about famous Flemings, starting with Southampton’s Le
Fleming. It was entertaining, instructive and funny.
How is it possible to keep up with someone so exceptional? I
don’t belong here.
There is a common psychological disorder called “the Impostor
Syndrome.” Wikipedia says of its victims, “Regardless of
what level of success they may have achieved in their chosen field of
work or study or what external proof they may have of their competence,
they remain convinced internally that they do not deserve the success
they have achieved and are really frauds. Proofs of success are
dismissed as luck, timing or otherwise having deceived others into
thinking they were more intelligent and competent than they believe
themselves to be.” Sounds familiar to me.
And it might sound familiar to you as well. Everyone here is
very good at something and often very good at several or even a bunch of
things. Indeed, some of you are simply off-scale in your
accomplishments. Faculty advise presidents or write best sellers or win
Nobel prizes. Undergrads routinely win Rhodes scholarships or lead the
University orchestra or win an NCAA championship or make a credible stab
at solving environmental crises. A few do several of these
simultaneously. And most manage to do it while remaining really nice
people and even have lives at the same time. How did normal people ever
get here? We don’t belong!
But maybe we do. If you probe your superstar friends and colleagues
a bit, you will find that they often suffer from the impostor syndrome,
too. It affects the newly arrived most of all, of course — new
faculty members and especially new freshmen — but everyone has a
twinge from time to time. So take pleasure in having amazing people as
friends, and take solace from the fact that you’re not entirely an
impostor either.
When I was very young I often read a cartoon series called “The
Watchbirds” that gently discouraged various kinds of childhood
misbehavior. Each strip included a couple of sketches. One was a
“watchbird” watching a child; the other said “This is
a watchbird watching you.” The meaning was clear: anything bad
would be noticed. I was reminded of watchbirds this summer while on
vacation in England. England does a remarkable amount of routine
surveillance of its citizens and visitors. Orwell would surely have
been surprised at the number of closed-circuit TV cameras on streets, in
buildings, at landmarks and, for all I know, in private homes. Indeed,
“CCTV in use” was one of the most common signs.
Most of the cameras are meant to discourage theft, vandalism and
other anti-social behaviors. Sometimes, however, an outsider may not
know what’s illegal. As George Bernard Shaw once said, England
and America are two countries separated by a common language. I’m
still uncertain about a large sign in a parking lot in York that said
“No fly-tipping.” Right beside it was a CCTV camera; if I
had inadvertently tipped a fly, it would have been recorded.
There are thousands of speed cameras on English highways as well, to
encourage drivers to stick to the speed limit. The locations of most
are not only well known, but published; my AA map listed thousands, and
my GPS warned of them all the time, with a muted but irritating ding
whenever one was coming up. I was never caught speeding, but a
newspaper story claimed that 20 percent of all drivers had received an
automated ticket in the previous year, a stunning number if true.
Cameras are at least familiar. There’s another kind of
pervasive surveillance that goes on all the time, and not just in
England. The Internet has become a worldwide system for watching what
we do online, quietly recording pretty much everything. Kinnari Shah
’14 and Matt Dolan ’13 wrote about this in articles in
September; let me add my two cents’ worth here.
The basic Internet tracking mechanism is the cookie, a bit of text
that is deposited on your computer when you browse to a website. The
next time you return to that site, the cookie is sent back to the site
where it originated. This gives the site a way to recognize you as a
return visitor, and thus to know that you are already logged in and
perhaps have some items in a shopping cart.
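The round trip can be sketched with Python's standard http.cookies module; the identifier here is made up:

```python
from http.cookies import SimpleCookie

# First visit: the site deposits a cookie in its response.
response = SimpleCookie()
response["visitor_id"] = "abc123"          # hypothetical identifier
set_cookie = response["visitor_id"].OutputString()
print(set_cookie)                          # visitor_id=abc123

# Next visit: the browser sends the cookie back, and the site
# recognizes you as a return visitor.
request = SimpleCookie()
request.load(set_cookie)
print(request["visitor_id"].value)         # abc123
```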
That sounds benign, but cookies are used extensively for tracking
because what you might think of as a single site often includes
components like images from other sites, each of which gets to deposit
and retrieve cookies too. Aggregating that information makes it
possible to build up a very detailed picture of the sites you’ve
visited.
But wait — there’s more. Most sites today also include
trackers based on Javascript, the programming language that browsers use
to provide dynamic effects. Javascript trackers can track your return
visits, but they can also do more invasive monitoring, including
reporting every mouse click and even where your mouse moves as you poke
around a page.
Javascript tracking seems to be on the increase. A browser add-on
called Ghostery blocks trackers, using a long list of known offenders;
that list had more than 1,100 trackers a few days ago and continues to
grow. One of Ghostery’s services, besides the vital one of
eliminating the trackers before they can do anything, is to report on
what it found. It used to be that a typical page might have four or
five trackers, and the most I had encountered as of two years ago was
12. Today a dozen trackers per page is common, and I’ve seen as
many as 18. (If you see more, let me know; this contest has no real
upper limit on the score.)
Many Internet companies make their money by learning as much as they
can about us and selling that information to others. That’s the
quid pro quo: we get valuable services like search and social networks
for free; advertising revenue pays for them. The downside is that
we’re being watched all the time. It’s possible to reduce
tracking by turning off cookies and using blockers like Ghostery, but it
requires continuous vigilance, and if everyone did it, the free services
would eventually go away, a high-tech version of the tragedy of the
commons.
At the moment the surveillance is for commercial purposes, but
it’s not a giant leap to imagine governmental scrutiny as well.
That certainly happens in other countries, and there have been plenty of
attempts here by various agencies to obtain information about individual
users from companies like Google, Facebook and Twitter.
That brings us back to watchbirds. What are the limits on tracking
and monitoring? How do we know when we’re being watched and for
what purposes? This is not a new question: 1900 years ago, Juvenal
asked, “Quis custodiet ipsos custodes?” It’s even more
important today when we’re all connected all the time. Who is
watching the watchbirds?
To P/D/F or not to P/D/F, that is the question. And today is the
deadline for deciding. P/D/F is a good idea. In theory it offers a
way to take a course that might be interesting but challenging, without
jeopardizing one’s grade point average if the course turns out to
be too tough. The truth, however, at least as far as I can tell
from a dozen years of teaching and advising, is that in practice P/D/F
is used primarily to lighten the workload — by choosing to P/D/F,
you can get by with significantly less work than if you’re
striving for a good grade.
I’m strongly in favor of low-risk academic exploration, and
pretty sympathetic to workload management, though as I remind students
in my course every fall, P/D/F has three possible outcomes, only one of
which is good, and it’s unwise to probe the boundary between P and
D too closely.
Having three outcomes is a lot better than plain pass/fail. From the
instructor’s side, coming up with fair grades is hard enough and
an F is a serious sanction that one is reluctant to use without very
clear grounds. D is often a suitable middle ground: it sends a strong
message, and it sticks to the student’s academic record, but
it’s not fatal.
How many people P/D/F a given course? Those numbers are certainly
known in registrar-land, but they seem to be a closely held secret. By
virtue of being on the right university committee, I once saw the
statistics for my own course for a five- or six-year period. I
don’t now recall the exact details, but if memory serves,
P/D/F-ers varied from about 25 percent to as high as 75 percent, a
surprisingly large range. Would it change the way that faculty teach if
we were to learn ex post facto how many students had P/D/F’ed a
course? It seems unlikely, but my attempts to get this aggregate
information have thus far been to no avail. Everything is anecdotal,
and one learns only from individual students who volunteer the
information. Of course I can make a guess from the apparent effort
that a student is putting in, but that’s not very reliable; some
of the hardest-working people in my class are worried about passing, not
about whether they’ll get an A.
The P/D/F election deadline used to be much earlier in the semester,
before midterm week and thus before most students had received any real
feedback about their likely success in a class. That probably
wasn’t fair, and every so often a proposal is floated to move the
P/D/F deadline even later. In the limit, say some, why not let students
make their choice after their final grades are received? This
isn’t likely to fly with the faculty, however, since it amounts to
a game in which the dealer plays with all cards face up, and the end
result would be a weird sort of Gresham’s Law in which the only
grades were A and P.
Perhaps the deadline could be after the final exam but before the
grade is revealed? That’s only marginally better, and at least
seems just as unlikely to happen. The current deadline is a decent
compromise.
I recently had a visit from a student in this year’s class.
She had been planning to take advantage of the P/D/F option, but was
pleasantly surprised by a better than expected midterm grade. Now she
was wondering whether to take it for a letter grade after all. There
was a real chance of doing well, but she worried that a poor grade would
hurt her GPA and thus diminish her future prospects. It’s never
going to be an easy decision in such a situation, but since COS 109:
Computers in Our World is a QR course, let’s do some quantitative
reasoning. Suppose that she’s a strong A student and will sustain
a 3.95 GPA to the end. If she does horribly for the rest of the
semester (most unlikely, given her obvious effort) and only manages a
B–, her new GPA will be 3.91. She would take a hit, but it
wouldn’t make the difference between grad school at Stanford and
flipping burgers at McDonald’s. It’s still a tough
decision, but not nearly as fraught as she might have feared.
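In the spirit of the QR exercise, the arithmetic behind that 3.95-to-3.91 drop can be sketched in a few lines of Python. The total course count of 31 is my assumption (roughly a full Princeton transcript); a B– is worth 2.7 grade points on the standard 4.0 scale.

```python
# A sketch of the GPA arithmetic above. The course count (31) is an
# assumption, roughly a full transcript; B- counts as 2.7 grade points.
def new_gpa(current, course_grade, total_courses):
    """GPA after one course at course_grade, with the remaining
    courses averaging the current GPA."""
    return ((total_courses - 1) * current + course_grade) / total_courses

print(round(new_gpa(3.95, 2.7, 31), 2))  # 3.91
```

Even a worst-case grade in a single course moves the average by only a few hundredths, which is the whole point.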
So today is the big day. For most people, it’s a non-event:
the large majority of students never use all of the four P/D/Fs they are
entitled to. For those of you still on the fence, my advice is simple:
Do whatever you like. You might get luckier than you thought and pull
off a decent grade, especially if you do work a little harder. Or it
might be better to focus your time on other courses or on having a life;
there’s no shame in that. Just don’t slack off too much.
Remember, I also have to make a decision: to D or not to D.
A few years ago, a big push called “Major Choices” tried,
quite reasonably, to get students to explore the many fine options among
smaller departments, rather than to follow the path of least resistance
into the big ones like economics. As the dep rep for the computer
science department at the time, I went to any number of lunch and dinner
gatherings in one residential college or another and described the
wonders of concentrating in computer science. The program is still
active; mail about it last week pointed to a website with useful
information and helpful comments from current majors. (“Why would
anyone want to date a sociology major? We are naturally empathetic, so
we can give you what your friends don’t know you need.”) But
Major Choices didn’t have a discernible effect on computer
science; our number of concentrators continued to hover in the low 30s
per year.
Something has changed in the past couple of years, and computer
science has gone from well down in the pack to a scarily big department.
We used to have 60 majors; now we have 165.
And that’s just majors. A lot of students seem to find our
courses appealing even if they have no plans to concentrate in computer
science. COS 126: General Computer Science has 458 registered, more
than the traditional leader, ECO 100: Introduction to Microeconomics,
which has 437. Enrollments for this semester are astoundingly high
overall, with well over 1,700 in CS courses, whereas last spring it was a
little over 1,300, and in spring 2009 about 750. We’ve more than
doubled in four years.
There’s a handy rule of thumb for exponential growth patterns
like this, called the Rule of 72. Divide the doubling time (say four
years) into 72, and the result is the approximate rate of growth, in
this case about 18 percent per year. If this continues, it will be only
a few more years before everyone at Princeton is taking computer science
courses every semester.
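The rule is easy to check in a couple of lines of Python: the exact rate comes from solving (1 + r)^4 = 2, and the Rule of 72's estimate lands within a percentage point of it.

```python
# Rule of 72: growth rate (percent per year) times doubling time
# (years) is roughly 72. Compare the rule's estimate with the
# exact rate implied by doubling in four years.
doubling_years = 4
rule_estimate = 72 / doubling_years                   # 18.0 percent
exact_rate = (2 ** (1 / doubling_years) - 1) * 100    # about 18.9 percent

print(rule_estimate, round(exact_rate, 1))
```

Close enough for a mental estimate, which is all the rule promises.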
Naturally the giant increase in enrollments is causing angst within
the CS department. How do we find preceptors for a class of 450? How
do we grade assignments for a class that big? The administration is not
going to let us hire more faculty for what might be a short-term
transient — faculty slots are one of the most rigorously
controlled of all resources — but something has to give. In our
case, that means more lecturers and, something not often seen at
Princeton, undergrad graders for the introductory courses.
Why is computer science so popular all of a sudden? Leaving aside
its intrinsic interest, of which there is some, perhaps to concerned
undergrads and their parents it seems like a path to a good job in
uncertain economic times. The latest exit survey from Career Services
suggests that a typical CS graduate has no trouble finding a job and
might well pull down $100K a year. For adventurous souls, a solid
computer science background might offer a chance to do something neat,
like a startup. The barriers to entry for creating a new business are
almost non-existent: hardware, software and scalable infrastructure are
somewhere between free and dirt cheap. All you need is a good idea and
someone to implement it.
If you do have a good idea, however, it’s better to do the
implementation yourself than to contract it out to a random programmer,
who might well have ideas of his or her own, or the ability to adapt
yours and leave you behind. You may recall the movie “The Social
Network,” which describes (presumably with artistic license) the
early days of Facebook. Without in any way judging whether the
Winklevoss twins had a legitimate case, they might have done better if
they had spent less time rowing and more time programming. If you
aspire to be as successful as Mark Zuckerberg, it surely helps to have
his intelligence and drive, but it’s also highly desirable to have
some of his programming skill.
The University publishes enrollments in the big departments every so
often, and it always says, “Undergraduate concentration patterns
have remained fairly constant over the years.” That seems true
enough: the top pair is usually econ and politics, with WWS, history and
molecular biology not far behind. Naturally there’s more churn
further down. CS wasn’t even in the top 10 in 2011-12, but if
nothing else changed, we would be number three in 2012-13.
The Elements of Programming Style (with P. J. Plauger)
Software Tools (with P. J. Plauger)
Software Tools in Pascal (with P. J. Plauger)
The C Programming Language (with D. M. Ritchie)
The AWK Programming Language (with A. V. Aho and P. J. Weinberger)
The Unix Programming Environment (with R. Pike)
AMPL: A Modeling Language for Mathematical Programming
(with R. Fourer and D. M. Gay)
The Practice of Programming (with R. Pike)
The Go Programming Language (with A. A. A. Donovan)
Understanding the Digital World
Advise and consent
October 2, 2006
Can you hear me now?
November 6, 2006
The sounds of music
December 4, 2006
"Creep in our ears: soft stillness and the night
Become the touches of sweet harmony."
"With sweetest touches pierce your mistress' ear
And draw her home with music."
Too precious to waste on the young?
February 5, 2007
What do they teach in professor school?
March 5, 2007
Lost and found
April 2, 2007
The Sons of Martha
May 7, 2007
The Sons of Mary seldom bother, for they have inherited that good part;
But the Sons of Martha favour their Mother of the careful soul and the troubled heart
And because she lost her temper once, and because she was rude to the Lord her Guest,
Her Sons must wait upon Mary's Sons, world without end, reprieve, or rest.
It is their care in all the ages to take the buffet and cushion the shock.
It is their care that the gear engages; it is their care that the switches lock.
It is their care that the wheels run truly; it is their care to embark and entrain,
Tally, transport, and deliver duly the Sons of Mary by land and main.
They have cast their burden upon the Lord, and — the Lord He lays it on Martha's Sons!
Sometimes the old ways are best
September 17, 2007
By the dawn's early light
October 15, 2007
Other Princes
November 19, 2007
Making the grade
January 21, 2008
Where the books are
February 18, 2008
The bleeding edge
March 24, 2008
By the numbers
April 21, 2008
Native guide
September 11, 2008
Leading indicators
October 20, 2008
What would Orwell do?
November 24, 2008
The ratings game
January 7, 2009
Washington crossings
February 16, 2009
Gladly learn, and gladly teach
March 23, 2009
What makes smart phones smart?
April 20, 2009
How was your summer?
September 21, 2009
Home alone
October 19, 2009
A sense of where you are
November 23, 2009
Millions, billions, zillions
February 22, 2010
Action figures
March 29, 2010
Friends and family
April 26, 2010
Call me Ned
September 27, 2010
We're number one!
October 25, 2010
Sic transit
November 20, 2010
Anti social
January 31, 2011
Enter to grow in wisdom
March 1, 2011
Read any good books lately?
April 4, 2011
Spring is sprung
May 2, 2011
Gather ye rosebuds while ye may,
Old Time is still a-flying:
And this same flower that smiles to-day
To-morrow will be dying.
Powerless
October 3, 2011
Give me your undivided attention
November 7, 2011
Who goes there?
December 5, 2011
So you want to write a book
March 12, 2012
Ecclesiastes 12:12
Preview of coming attractions
April 16, 2012
I don't belong here!
September 24, 2012
Watchbirds
October 22, 2012
D construction
November 26, 2012
The very model of a modern major major
February 3, 2013