The case for mortality

Marvin Minsky’s Society of Mind was my first exposure to the notion that our early concepts — those things we learned to believe first — are the hardest to change. The reason is that so much else depends on them. They are like our mental alphabet, and modifications risk collapsing the structure of our thoughts and beliefs. (John Holland and others showed similar results for genes, but that’s a story for a different day.) The more complex and experienced our minds become, the more inflexible. It’s not biology so much as it is mathematics. So we are born with an expiration date. As long as the world changes around us, we will continue to be left behind. Whether this is by accident or design, we are unlikely to alter it.

My grandfather was born to a world without cars, radio, or electricity.* He didn’t get running water inside the house until he was in his sixties, and still preferred the walk to the outhouse when weather permitted. I suppose relieving oneself in the house is a concept that requires some getting used to, if you didn’t grow up with it. I don’t think he ever had a telephone, and he managed to live eighty-four years without ever setting foot on an airplane.

My grandfather and me in the “initial food prep” area of his farm. The smokehouse is in the background, with his impressive collection of walking sticks. Just out of the frame to the right is a spigot that delivered the only running water on the farm.

My parents’ generation was forged in a global depression sandwiched between the two most destructive wars our civilization has ever experienced, harbingers of the promise and potential horror of globalization and technology. Fortitude, stability, and duty were their watchwords. Their world was a hard and dangerous place. The wise were prepared for anything. They believed in citizenship and strong social institutions. If they were the greatest generation, it is likely because they lived the greatest challenges.

My lot grew up in the Cold War. Whether we were playing cops and robbers, cowboys and Indians, Allies and Nazis, or something else, there were good guys and bad guys, and no question of which was which. Science and technology had started and ended WWII, cured polio and smallpox, and were helping us beat the dirty Russkies to the moon. Most families had one car and one working parent. Cokes were a dime and comic books were 12¢. The ‘¢’ symbol was common. We roamed pretty much wherever we wished as children, with the only restrictions that we look both ways before crossing the street, try not to put anyone’s eye out, and be home for dinner.

I notice two phenomena as I age. The world moves away from us, in my case with globalization, new technology, complicated politics, and 24-hour cable news (rapidly being replaced by Internet channels of all kinds). And we become less interested in keeping up with progress, or maybe it’s that progress seems more of an illusion. I learned enough model numbers as a teenager trying to prove I knew stereo equipment. I have very little interest in keeping up with the newest smartphone features. A dear friend’s father retired from architecture a few years early when his firm computerized, choosing not to bother learning a new way to do a job he had done all his life, and had done better than almost anyone. Like most his age, he eventually learned to e-mail and surf the Internet, but computer technology is neither his friend nor his constant companion.

While many of us never lose our fear of death, we begin to lose our place in life. The present becomes at once lonelier, more confusing, and more mundane. We fill the void by investing heavily in memories of the past and hopes for the future.

On the other hand, no one appreciates a day like a person who doesn’t know how many more they will see. The countability of our moments gives them meaning. I think it’s impossible for the young to fully appreciate this, but it is no problem for someone who attends funerals as regularly as the rest of us go out to dinner. Every time I think of how I miss my father, I am reminded.

The result (and possibly the cause) of all this is that the little things — family, friends, the sun in your face on a perfect day — become the big things in life. The title on a business card, h-index, or the number of Twitter followers and Facebook friends fade in comparison. At least that’s how it seems to be working for me.

I’ve always believed that we should enjoy our days, because the clock is ticking. It’s only been in recent years that I have decided I’m okay with that.

* Technically, all of these things existed when my grandfather was born. They just weren’t at all common. Sort of like Segways.

Beware of being too worried about the IKEA effect

I wrote the post about the IKEA effect as a sort of personal admission that the software architecture I have been developing was just not that interesting or useful. A few days after I published that post, I received notification that a paper I wrote on the topic had been deemed interesting enough for conditional acceptance at a conference that accepts fewer than one in five submissions. I still believe it’s important not to allow oneself to get too attached to one’s own ideas, but it is also important to trust in our own vision. Like most things, I suppose the secret is finding a balance.

Beware the IKEA Effect

The IKEA Effect describes a psychological bias through which people tend to place higher value on a product if they expended some effort creating it. The name is from a study showing that people valued furniture more highly if they assembled it themselves, even if they admittedly did a poor job. The effect was reduced if the products were not finished, or if participants built and then destroyed their creations.

I try to watch for biases, and this one is especially insidious, in that it likely applies to ideas. This would imply that the more we work toward completing something we came up with ourselves, the better an idea we are likely to think it is. I have tried to apply this newfound knowledge to a software architecture I’ve been working on for a few years now. The results of my informal internal analysis were revealing and humbling. I’ve been dealing increasingly with people who don’t seem to understand the value of some of the more esoteric points of this system. While that’s not particularly unusual for scientific research, or necessarily bad, I have decided to take a fresh look at the various aspects of this project, and look for validation along the way.

Increasing access to scholarly research

Last week the White House issued a memorandum directing federal agencies that fund more than $100 million in research to create plans for making the studies they fund freely available to the public. The positive responses from both academic publishers and access advocates lead me to believe that the solutions will primarily be some form of author-funded publication of this research, with the fees paid by the funding agencies.

Access to taxpayer-funded research has been a sore point with many researchers for years, but the issue gained new visibility following the suicide of Internet activist Aaron Swartz. Swartz’s case brought into stark relief the conflict between the U.S. government’s responsibility to make taxpayer-funded research available to its citizens, and its need to protect the lucrative industry that exists to publish much of this same research.

Whatever plans are put forward by the various agencies, I would be surprised if they acknowledge two realities of publishing today. The first is the growing importance of direct self-publication (i.e., without a publishing house involved) in society at large. Academics may be understandably slow to adopt this model for publication of research, but it is likely that a large number of open access journals will arise without the involvement of established academic publishers. Any one that attracts a sufficient number of distinguished scholars could prove successful.

The second factor is the impending obsolescence of print publications, whether paper or electronic. In the humanities, digital technology is quickly becoming the most popular way to publish research. As humanists become more familiar with technology (and students read less), interactivity will follow quickly and the linear narrative will likely fade. Science research has been ignoring the man behind the curtain for decades now. The litmus test for published scientific results is supposed to be whether they can be reliably repeated, but very few conventional journal publications include (or can include) more than a vague description of the data, and even less information concerning the apparatus and software codes that were used to produce the results.

E-books are already suffering declining sales, as interactive tablets and media rise quickly. The future of academic publishing is likely to be very different from its present.


First Melete nodes are installed

Melete is a small research cluster being developed at CCT to explore concepts, opportunities, and challenges associated with interactive high-performance computing. The first nodes are installed and … mostly generating heat. A series of software installs in the coming weeks should produce the first tangible results. Funding for Melete is from the National Science Foundation and CCT. The principal investigator is Brygg Ullmer.


Some things are just too big to be indoors. Take the 363-foot Saturn V rocket, which weighed in at over six and a half million pounds when ready to shoot people to the moon.* Because it weighed as much as a good-sized freighter, and got mileage of around 5 inches per gallon, the rocket was designed to barely lift its own initial weight. As the weight of the fuel diminished, the acceleration would increase.

The Saturn V Center at the Kennedy Space Center in Florida. The building is about 50 stories long, and in addition to showing countless visitors the marvel of the moon rocket, it housed three or four thousand VIPs for every shuttle launch.

This made a Saturn V launch quite a dramatic thing to watch. The big engines would light, the tower would fall away, the big locky things at the bottom would unlock, and the rocket would … mostly just sit there. It only moved a few feet in the first several seconds.
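That sluggish liftoff can be checked with a quick back-of-envelope calculation. The figures below are rounded public values, used purely for illustration (first-stage thrust of roughly 34.5 meganewtons at sea level, a liftoff mass of about 2.97 million kilograms, and a propellant burn rate near 13,000 kg/s), not precise specifications:

```python
# Back-of-envelope Saturn V liftoff kinematics (approximate figures).
G = 9.81          # m/s^2, gravitational acceleration
THRUST = 34.5e6   # N, first-stage thrust at sea level (approx.)
MASS0 = 2.97e6    # kg, liftoff mass (~6.5 million lb)
BURN = 13.0e3     # kg/s, propellant consumption rate (approx.)

def altitude_after(seconds, dt=0.01):
    """Integrate straight-up motion, ignoring drag and gravity variation."""
    m, v, h, t = MASS0, 0.0, 0.0, 0.0
    while t < seconds:
        a = THRUST / m - G   # net acceleration rises as mass burns off
        v += a * dt
        h += v * dt
        m -= BURN * dt
        t += dt
    return h

# Thrust-to-weight ratio at ignition: barely above 1.
print(round(THRUST / (MASS0 * G), 2))   # → 1.18
print(round(altitude_after(10)))        # on the order of 100 meters
```

With a thrust-to-weight ratio just under 1.2, the stack climbs only about a hundred meters in its first ten seconds, roughly its own height, which is why it appears to hover on the pad.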

I find that a lot of new technology development progresses this way. Enormous effort is poured into the early stages, often yielding very little visible progress, other than a lot of heat and steam. Any loss of momentum at this stage can be catastrophic, which is a shame, because slow-starting projects are easy to ignore.

Super slow-mo video of Apollo 11. If you haven’t seen this, it’s worth the four minutes.

If development can surpass this first phase and build a little momentum, people will start to pay attention. The best reaction one can get to a technology demonstration (other than immediate additional funding) is a list of things it should do. This means people are engaged, and imagining how they would use this new thing.

It’s often only after too much effort and too much time has been spent, often bringing a project to the brink of oblivion, that a new idea will seem to be going anywhere. After this, the sky’s the limit. That is, assuming things stay on course, and that the idea doesn’t burn out too soon.

* What happened to us? We used to send people to the moon. I was sure my flying car and robot servants were just around the corner. Now we can’t even keep our schools and bridges from falling down.

Social sugar

My position often finds me acting as facilitator between science professors and industry types, between faculty and staff at our center, or between faculty from different fields or departments. It is difficult to overstate the interpersonal impedance mismatch that often occurs between these groups. From basic expectations of timeliness and adherence to deadlines, to the relative values of knowledge and utility, academics and industry professionals share less ground than either group understands.

This communication minefield is very often made worse by the social awkwardness of science and mathematics researchers. Some of this is due to background or specific … idiosyncrasies, or real disabilities, but some is due to a personal disregard of the importance of social niceties. Programming languages often include what computer scientists term syntactic sugar, which comprises structures, tokens, and other constructs intended to make the language easier to read, program, or understand. A sizable fraction of computer scientists view syntactic sugar as wasteful or counterproductive, since it can obscure the internal workings of the language. For a number of researchers in all areas of science and mathematics, this attitude extends to personal interactions as well. They would prefer to interact in the most honest and efficient way possible, and do not understand why others insist on more indirect or subtle styles of intercourse.
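For a concrete sense of what “syntactic sugar” means, consider Python’s decorator syntax, which is pure sugar for an ordinary function call. (This is a generic illustration, not drawn from any particular codebase.)

```python
# Syntactic sugar: the "@" decorator line and the explicit call
# further down are exactly equivalent; the sugared form just reads
# more cleanly, at the cost of hiding the underlying mechanism.
def shout(fn):
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

# Desugared version: same behavior, workings fully exposed.
def greet_plain(name):
    return f"hello, {name}"
greet_plain = shout(greet_plain)

print(greet("ada"))        # → HELLO, ADA
print(greet_plain("ada"))  # → HELLO, ADA
```

The sugar-skeptic’s complaint is visible here: the `@shout` line does real work (rebinding `greet`), but nothing about its appearance says so.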

In industry, valuable research scientists with this attitude are often accompanied by “handlers” when they leave the lab. These are project managers or business development people whose responsibility it is to ensure that their charge neither offends collaborators nor commits to unnecessary work in the name of logical sense. Many years ago, in the minutes before a meeting at a national laboratory, I heard a young computational scientist ask his manager, “Is this one of those meetings where I’m supposed to sit quietly and not speak unless you tell me to?” She nodded her head affirmatively and took her seat.