A life lived well, cut short


A park in Cambridge Square dedicated to the memory of Danny Lewin, an alumnus of MIT. Photo copyright 2013 by Todd Van Hoosear.

Thirteen years ago today, a 31-year-old man–a former Israel Defense Forces commando, a brilliant computer scientist, a successful Internet startup entrepreneur, and a husband and father of two small boys–tried to stop terrorists from hijacking a Boeing 767 and flying it into the North Tower of the World Trade Center.  Sadly, he was killed in the attempt.  He is believed to have been the first person to die on that horrible day.

His name was Daniel Lewin.  He couldn’t single-handedly prevent the atrocity.  But the algorithms he developed as co-founder of Akamai Technologies were critical in keeping the Internet from crashing that day.  Lewin’s research as a grad student at MIT focused on improving the efficiency of online content retrieval through web caching, in which content (particularly images and videos) is replicated across a number of servers for faster retrieval.  On the day of the attacks, Lewin’s colleagues at Akamai observed a five-fold increase in traffic on company servers as millions of Internet users frantically refreshed news sites and clicked on videos, trying to learn what they could as the day’s events unfolded.  But thanks to Lewin’s work, the servers stayed up.

I’m not the sort of person who reminisces yearly about what I was doing on terrible days like September 11, 2001.  It was an awful time, and one I don’t particularly want to relive.  But today I’ll make an exception, for Danny Lewin’s sake.  That day at the Enrico Fermi Institute, we were supposed to have had a rehearsal for a DOE site visit later in the week (which wouldn’t take place for another month or so, because all flights were canceled for the rest of the week).  Instead, we ended up wandering around in confusion all day long, some people watching the news coverage on a small portable TV in the conference room, others calling everyone they knew–family members and friends on the East Coast, especially–to make sure they were all right and accounted for.

I was one of those people obsessively checking all the news sites over and over for updates, reading forum postings, trying to sort out genuine information from noise.  Behind me was a north-facing window with a view of the Chicago Loop five miles away.  We were in a major metropolitan area and economic center, and at the time, the possibility that Chicago could also be a terrorist target was all too real.  I kept looking over my shoulder through that window to reassure myself that the Sears Tower was still standing and that the skyline was intact.  I was panicked and teary and in despair, but in retrospect, I think being able to see new information as it came in, making the picture less murky, gave me comfort and a sense that there was still order in the universe, and made me less fearful for the immediate future.  The Internet was my constant companion that day.

It was something I took for granted then, that in the face of disaster one could reliably go online and find information, knowledge, perspective, community.   Since then, we’ve observed repeatedly how critical social media and other online communication media can be for people dealing with disasters and crises in real time, and how important it is to keep those channels open, responsive, and dependable.  Danny Lewin gave his life trying to save his fellow passengers.  But in the years since his death, the technology he was instrumental in developing may have helped save countless more.

G-D of overflowing compassion, Who lives in the highest and all worlds, give limitless rest to him who is now under your holy sheltering spiritual wings, making him rise ever more purely, through the Light of your brilliance.


…or maybe it’s just turtles all the way down.


Another extremely popular creation narrative that captures our era’s zeitgeist.

What’s left for theoretical physics after splitting atoms and locating the Higgs boson?  Finding the pixels of the universe, apparently.

At least, that’s what Fermilab is attempting, using a new device called the Holometer, built to test a hypothesis inspired by the work of Juan Martin Maldacena, a Princeton string theorist whose holographic conjecture suggests that our universe is actually a two-dimensional hologram, consisting of infinitesimally tiny pixels, that becomes three-dimensional only when observed (by us).  In other words, the Universe we know and love, and everything in it, are just a massive aggregation of information.

In other words, we are Big Data.

I can’t help but think that we are witnessing the formation of yet another anthropomorphic creation story.  Tribal religions posited that the universe was formed, created, excreted, or regurgitated by some giant version of an animal significant to their survival.  Yahweh, the God of the Old Testament, was a cantankerous, egomaniacal old patriarch who–well, let’s face it–was likely modeled on the original patriarchs of the culture that would come to be known as the Israelites, except that he was also omniscient and omnipotent (superpowers those grim old men could only wish they had).  Our notions of deity tend to be circumscribed by our own deepest fears and desires, which is why, from St. Paul to Augustine to Julian of Norwich to Jonathan Edwards to Reinhold Niebuhr to Fred Phelps, God seems always to be verbalizing the morality that His/Her/Its self-proclaimed spokespeople wish they could impose on the world.

But theologians aren’t the only ones in the business of creating ontological narratives.  The idea that we are simply highly evolved creatures in a kind of cosmological Game of Life playing out on a massive scale has been around for some time and has given rise to movies like The Matrix and its sequels, in which many of the very human, flesh-and-blood-seeming characters encountered by Neo, the series’ protagonist, are nothing more than programs or routines in the virtual reality in which his physical body has been submerged.  Perhaps that’s what we are…software, or objects created by software, running on a colossal supercomputer, originally written by a Divine Programmer who never bothered to comment his/her own code.

Now that our technological focus has shifted from computational performance to the generation and curation of information, this notion of the universe as a hologram, as an enormous, highly complex mass of data, seems to be the next iteration of our creation narrative.  It’s an interesting idea, a novel way of thinking about the universe.  I’m guessing it’s probably way off base.  But it’s a good measure of the current potential of our collective imagination.



Thoughts in late August…

The huge and alarming push toward online education and distance learning in this country has been going on for a long while.  It’s not all necessarily a bad thing.  I’ve sat in plenty of courses where the instructor did very little but read from PowerPoint slides projected on a screen at the front of the room–bonus negative points when the slides came prefabricated from the textbook company.  There are subjects that one can learn quietly at home on a laptop with a textbook open beside one–most of my computer programming courses are in that category.  And the potential for making an education available to students who wouldn’t otherwise have one because of distance or cost or lack of time is immeasurable.

But something is definitely lost when students don’t see their professors and fellow classmates face-to-face a few times a week.  When I was taking a sequence of IT administration courses at Parkland, our local community college, there were certain courses that required students to be present for every session, often because we were administering our own machines or doing hands-on work in class.  I got to know a lot of those other students, and we developed a kind of camaraderie over the course of about three semesters.  We were all either already in tech-related jobs or planning to get them, and we engaged in a lot of shop talk as a result.  I think it was an important experience for these students to have, because it helped expose them to the kind of work culture–and the kind of future co-workers–that they would likely encounter once they finished their degrees.

Last week Slate and the Atlantic Monthly published excerpts from William Deresiewicz’s intriguing book, Excellent Sheep:  The Miseducation of the American Elite, which was apparently released today.  Deresiewicz’s criticism focuses mainly on Ivy League schools, which, he maintains in the Atlantic excerpt, offer an education that has become so rarefied that their students–supposedly the crème de la crème of the American education system–become paralyzed with fear of failure and therefore unable to take any sort of intellectual risk:

The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk. You have no margin for error, so you avoid the possibility that you will ever make an error.

This curse doesn’t just afflict Ivy League students, however.  Most high-achieving students face it at one time or another, especially those under a great deal of pressure to succeed.  Perhaps they’re the first in their family to attend college.  Perhaps they–and/or their parents–have taken on an enormous amount of debt.  And in an era when finding employment–any kind of employment–has become challenging for recent college graduates, the fear of screwing up and destroying one’s post-graduation prospects is absolutely terrifying.  These kinds of pressures produce graduates who, in Deresiewicz’s words, “are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.”  They are students who are unable, say, to make the leap from directed lab work to initiating and executing their own original research, who don’t know how to passionately debate or defend ideas because they’ve never passionately engaged with them, having always been too busy protecting their precious GPAs to commit to any real, all-consuming intellectual work.

Online courses, for the most part, seem tailor-made to exacerbate this outcome.  Most are poorly constructed and vulnerable to cheating.  Some attempt at peer engagement is usually made, but often it seems forced and minimal.  And students may never interact once with their instructors.  This isn’t true of all such courses, of course, but they do tend to be more about delivering content than introducing students to new ideas, cultivating new ways of thinking.  And they can be deeply isolating.

In the Slate excerpt of his book, Deresiewicz makes an eloquent argument for college classrooms — not as sites of content delivery — but as places that enable a professor to “model and shape the mental skills she’s trying to instill”:

She conducts a discussion about the material, but she doesn’t simply let you talk. She keeps the conversation focused. She challenges asser­tions, poses follow-up questions, forces students to elaborate their one-word answers or clarify their vague ones. She draws out the timid and humbles (gently) the self-assured. She welcomes and en­courages, but she also guides and pushes. She isn’t there to “answer questions,” at least not for the most part; she’s there to ask them.

And Deresiewicz goes on to point out that classrooms are places where students learn how to learn together, whether in a seminar on Jane Austen or, as I found during my experience at Parkland, a course in introductory computer forensics.

But perhaps the most important aspect of the classroom is the human aspect–the one that’s so easy to forget for science professors facing classes of forty or more anxious, grade-conscious science majors while trying to do their own research and generate funding for their graduate students, or English department lecturers confronting five classes of twenty-five hostile, reluctant freshman writers in a semester.  It’s easy to see students as an indistinguishable blizzard of predictable behaviors and attitudes and flaws — and those of my generation in particular, having been the first to see our individual and collective prospects decline, seem to have taken to heart the pitiless pronouncement of Tyler Durden, rather than cater to the whims and demands of what we’ve been warned is an overprotected, overpraised, self-entitled generation.  It’s an extraordinary, counterintuitive, and for some, probably infeasible challenge:  “You have to get to know your students as individuals—get to know their minds, I mean—and you have to believe completely…in each one’s absolute uniqueness.”


Catastrophic

Bruce Schneier on Heartbleed.

Transport Layer Security (TLS) and Secure Sockets Layer (SSL) are used to encrypt Internet communications–everything from email and VoIP to Web browsing and financial transactions.  The TLS Heartbeat Extension is a standard, implemented in both protocols, that enables an encrypted connection to be kept open by regularly requesting and sending small amounts of information (“heartbeats”).  Each transmission (or payload) is required to be no larger than 16KB.  In most versions of OpenSSL, this parameter is hard-wired and cannot be changed.

However, a vulnerability in the heartbeat-processing code of OpenSSL 1.0.1 allows a requester to claim a payload size of up to 64KB while actually sending far less.  The responder dutifully echoes back the claimed number of bytes, filling out the response with whatever happens to be sitting in adjacent memory–and because no logs are kept on heartbeats, the data leak goes undetected.  And that 64KB can contain anything, from user credentials to session keys and private keys.
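The mechanics are simple enough to sketch in a few lines of toy Python.  This is purely illustrative–it is not OpenSSL’s actual code, and the function names and buffer layout here are invented–but it shows the essence of the bug: the vulnerable handler trusts the request’s self-reported payload length, while the patched handler drops any request whose claimed length exceeds the real payload, which is essentially what the OpenSSL fix does.

```python
# Toy illustration of the Heartbleed logic (NOT OpenSSL's code).
# `memory` stands in for the process's heap: the request payload is
# copied to the front, with unrelated secrets sitting right after it.

def heartbeat_response(memory: bytes, payload: bytes, claimed_len: int) -> bytes:
    """Vulnerable handler: echoes back `claimed_len` bytes, trusting the
    request's own length field instead of the actual payload size."""
    buf = bytearray(memory)
    buf[:len(payload)] = payload      # copy the request payload in
    # BUG: no check that claimed_len <= len(payload)
    return bytes(buf[:claimed_len])   # over-read leaks adjacent memory

def heartbeat_response_patched(memory: bytes, payload: bytes, claimed_len: int) -> bytes:
    """Patched handler: silently discard requests whose claimed length
    exceeds the real payload, so only the payload itself can be echoed."""
    if claimed_len > len(payload):
        return b""
    return bytes(payload[:claimed_len])
```

A request carrying a 4-byte payload but claiming 20 bytes gets back its 4 bytes plus 16 bytes of whatever lay beyond them in memory; the patched version returns nothing at all.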

Why is this so serious?  Because everyone relies on OpenSSL–banks, social media websites, schools, hospitals, software sites, even routers…the list is endless.  Up until now, OpenSSL was extremely trustworthy.  Now that trust is gone.  Even though a patch has been issued, it’s impossible to feel completely safe anywhere.



We were light once.

In my house, which I share with three cats and one astronomer, there has been a great deal of excitement surrounding this discovery.  (Actually, it doesn’t take much to excite the cats, particularly Hector, who apparently has developed an affinity for hand-held power tools in recent weeks during some minor household renovations.)  According to my husband, Paul, confirmation of the theory of cosmic inflation, which accounts for the faster-than-light expansion of the universe in an infinitesimal fraction of a second after the so-called “Big Bang,” is a discovery on par with that of the Higgs boson.

Jorge Cham, who draws Ph.D. Comics, together with Jon Kaufman, a member of the BICEP2 team, which made the discovery, presents this monumental finding as a web comic.  It really is true that the best explanations are often the simplest and the most visually concrete.


Getting in touch with my inner hedgehog


In one way or another, every new project is a learning experience for me–this is true whether the learning curve has to do with a new technology, field of research, set of requirements, or way of collaborating.  That’s one reason I love being a technical writer, although the speed at which I often have to process new information can sometimes be a little frustrating.  Often I encounter ideas and concepts that, personally,  I would like to take the time to grasp at a more fundamental level and on whose implications I would like to reflect a bit more deeply.  It’s the old conflict of fox versus hedgehog — being the fox–who knows many things–is definitely stimulating, but I think that even for academic support staff, having the opportunity, once in a while, to be the hedgehog–and view the world through one primary, unifying perspective–is helpful, and sometimes I rather miss that.  I think the last time I truly experienced that, however–perhaps the last time a lot of us did–was while studying for my Ph.D. exam.  And I suspect that even that startling moment of clarity was itself a kind of fata morgana.

I’m hoping to have a chance over the next couple of months to create and organize a kind of repository of resources I can draw on for future projects, and to formalize the routines and procedures I’ve found most helpful for making the outset of a project a bit easier and less overwhelming.  I’ve never really done this before because the projects I’ve fielded have been so varied that each one has required a more or less completely blank slate.  But I’m beginning to see now that there are some patterns and common needs emerging that I can anticipate more easily than I used to.  And the kinds of expectations I’ve encountered on projects have changed as my own work has evolved–from “We need you to learn this technology and provide us with this deliverable” to “How do you think we ought to proceed on this?  You’re the expert.”  What has caught me off guard isn’t so much that I am expected to provide expertise as well as support, but the realization that, after almost fifteen years of learning from other people and adapting to the way they work, I have come to possess a certain kind of expertise…and quite a lot of it, at that.

Maybe I’m more of a hedgehog than I had originally thought.


Well, we survived.

“They’re just so sad,” said a faculty acquaintance of mine on Friday afternoon, having come back from lunch amid the teeming hordes of undergraduates in ill-fitting green t-shirts and shiny green plastic Mardi Gras beads. “All wandering around in those stupid shirts, pretending they’re having fun.”

I have to admit that if Unofficial St. Patrick’s Day had been “a thing” at Penn State when I was an undergraduate, I probably would have spent the evening hiding out in the university library reading Faulkner or Joyce. Or maybe rounded up some equally nonplussed friends and gone to a movie or a concert.

But my friends and I largely inhabited a different kind of culture from that of most of our classmates. Lately, I’ve been watching the excellent Sherlock, a wonderful homage with which Sir A. C. Doyle, I think, would have been quite pleased. The title character is his arrogant, insufferable, brilliant self, exactly as one imagines him from the original stories, despite having been relocated to 21st-century London. But there’s an exchange in the pilot episode, meant to annoy viewers (or, perhaps, simply to make them cringe at Sherlock’s lack of emotional intelligence), where his faithful foils Watson and Lestrade aren’t following his neural leaps and he stops and demands, “Dear God. What is it like in your funny little brains? It must be so boring.” And that kind of describes the way I often feel on days like Unofficial St. Patrick’s Day. Is this the way students really want to spend the precious time they have, when their brains are still fluid and flexible and their lives and hopes are still spread out before them like an enchanted carpet? Surely not.

It would appear that Illinois is not the worst university in the nation with regard to Unofficial St. Patrick’s Day hooliganism. This year, that honor goes to the University of Massachusetts at Amherst, where police in riot gear waded into a crowd of four hundred drunk students who had descended upon a student apartment complex and were apparently threatening to dismantle it brick by brick. They got pelted with beer bottles and snowballs for their troubles and ended up making 73 arrests. I was also relieved to find that my alma mater, Penn State, where binge drinking, along with know-nothing football cultism, has apparently evolved into a tradition handed down from generation to generation, managed to convince the State College bars to shut down entirely on Friday. They don’t need any more bad publicity, believe me.

Curiously, the students will still be around on the actual St. Patrick’s Day, which is next Monday, a week from today. I’ll probably stay in and watch a movie like The Dead, The Quiet Man, or my favorite, The Secret of Kells, and we’ll have smoked salmon on brown bread for dinner and toast with Irish whiskey my departed ancestors, who showed up at Ellis Island or the port of Philadelphia at various points in the nineteenth and twentieth centuries to escape encroaching famine, civil war, bigotry, and ennui and who found themselves confronted with all the same problems on this side of the pond, save the famine. Will these students ever look on St. Patrick’s Day as a day of remembrance and reflection, to say nothing of an actual religious holiday? Or will it always be simply a day to get drunk and behave like the savage drunken munchkins of American conventional wisdom? One can always hope for progress.


See the Blue Waters Supercomputer up close on March 15!

In conjunction with the University of Illinois’ annual Engineering Open House, NCSA is inviting the public to the National Petascale Computing Facility (NPCF) for a first-hand look at Blue Waters, one of the fastest supercomputers in the world!  With a sustained speed of over 1 petaflop and a peak speed of more than 13 petaflops, Blue Waters is capable of performing quadrillions of calculations per second. You can also tour the amazing, massive infrastructure required to support this powerful machine.  The self-guided open house will take place Saturday, March 15, 2014, at the NPCF, at the corner of Oak St. and St. Mary’s Road.  Come any time between 9am and 3pm and stay as little or as long as you’d like.  Free parking is available across the street in Lot E-14.


Edward Tufte — HERE! — April 10!

From "After the Storm:  Considerations for Information Visualization." M. Baker and C. Bushell,  IEEE Computer. Graphics and Applications, vol. 15, no. 3, 1995, pp. 12-15.

From “After the Storm: Considerations for Information Visualization.” M. Baker and C. Bushell, IEEE Computer. Graphics and Applications, vol. 15, no. 3, 1995, pp. 12-15.

I just found this in my mailbox downstairs.  Wow!  To technical communicators, information designers, and scientific visualization folks everywhere, Edward Tufte is a rock star.  OK, he’s a god.  And, sponsored by NCSA, he’ll be giving a public talk in Foellinger Auditorium on April 10 at 7:00 PM.  (I’m guessing the only reason we’re not hosting him in the NCSA auditorium is that it’s just too small for the audience he’s likely to get.)

Maybe you know his work?  His critical analysis of Microsoft PowerPoint and its possible role in the Columbia space shuttle disaster?  His volumes of beautiful, data-rich illustrations and visualizations?  Or, if you were at NCSA before my time, you might know his work from the amazing image above (which appears on the cover of his third book, Visual Explanations) — a still from a famous 1989 thunderstorm visualization on which he collaborated with NCSA research programmers.  (A PDF of the article is here.)  At any rate, his talk, titled “The Thinking Eye,” will “discuss seeing, reasoning, and producing in science and art” and touch on “evidence and inference, strategies for identifying excellence, and practical advice for seeing better in the real world and on the glowing flat rectangle of the computer screen.”

Sorry, I’ll calm down now.  Just as soon as I get over to Foellinger and set up camp in the front row for the next two months or so.  (Just kidding. Sort of.)


International HPC Summer School 2014 – Budapest, Hungary, June 1-16, 2014

Grad students and postdocs are invited to apply—and take note:  meals and housing will be covered by the program, as well as travel for participants from the U.S., Canada, and Japan.   Faculty will include prominent computational scientists and HPC experts, who will provide instruction on general and discipline-specific HPC fundamentals and challenges, data-intensive computing, scientific visualization, and other critical topics.  The deadline for applications is March 9, 2014.  

2014 HPC Summer School
