Finding your passion

This is commencement week at LSU. Young people around campus are busy planning their graduation blowouts, and realizing that they should have been looking for a job already.

The last time I cared about who was speaking at commencement was my own graduation, but I am sure whoever it is this year will exhort the new graduates to “follow your passion.”  I remember a news story published last year that centered around this universal commencement advice. One new graduate, concerned that he had not yet found his one true calling, sought advice from an economist about what to do. As our young protagonist pointed out, some people find their passion early in life, while others search for decades, or forever. (Spoiler alert: this is another area where economists were sure they could provide an answer, but in the end were as clueless as the rest of us.)

Working with first-rate researchers, I am quite familiar with people who found their calling at an early age. Most of my colleagues were firmly on their current career path before they graduated high school. I was a member of that other group. I drifted in and out of institutions of higher learning and through several careers, trying to find my one true calling, the thing that would motivate me to realize the high potential I had always been told I possessed. Each attempt eventually ended with me bored and disillusioned, and I grew increasingly pessimistic that I would ever find my intended vocation.

It was a series of personal setbacks that finally turned things around for me. Unemployed and back in school, I found myself too busy trying to survive to worry about my passion. I focused my efforts on accomplishing each new objective, and on solving problems new and old. I had no time for self-absorption. I was content with the satisfaction that came with learning something new, acquiring a new skill, or making a new friend. I learned to enjoy my own company, and to appreciate the days as they passed.

Along the way, I fell in love with computer science. I realized that my passion had always been learning new things, understanding how the world works, and thinking hard, and CS is a perfect fit. Twenty-five years later, I am still excited to come to work every day. I don’t wish to imply that I would not have cared for computer science if I had not struggled, but I think that focusing on what I have and enjoying who I am, rather than what I wish I had or who I would rather be, has helped me to stay committed and content with my chosen path. Corny, I know, but no worse than “follow your passion.”

The Future is Friday


Baton Rouge’s Red Stick Festival is getting a makeover. This year’s event will focus on listening to the community about which elements should be emphasized in future years.

It’s free and open to the public. And a great chance to see our new Digital Media Center.


Congratulations Class of 2013

Graduation was — well, a few weeks ago now. We see them come and go three times a year, and often forget what a landmark it is for those participating. This Spring, one of the students in our lab was chosen as the student commencement speaker for the College of Engineering. A video of this speech is here.

Congratulations, Andre. And the rest of the Class of 2013!

The case for mortality

Marvin Minsky’s Society of Mind was my first exposure to the notion that our early concepts — those things we learned to believe first — are the hardest to change. The reason is that so much else depends on them. They are like our mental alphabet, and modifications risk collapsing the structure of our thoughts and beliefs. (John Holland and others showed similar results for genes, but that’s a story for a different day.) The more complex and experienced our minds become, the more inflexible. It’s not biology so much as it is mathematics. So we are born with an expiration date. As long as the world changes around us, we will continue to be left behind. Whether this is by accident or design, we are unlikely to alter it.

My grandfather was born to a world without cars, radio, or electricity.* He didn’t get running water inside the house until he was in his sixties, and still preferred the walk to the outhouse when weather permitted. I suppose relieving oneself in the house is a concept that requires some getting used to, if you didn’t grow up with it. I don’t think he ever had a telephone, and he managed to live eighty-four years without stepping on an airplane.

My grandfather and me in the “initial food prep” area of his farm. The smokehouse is in the background, with his impressive collection of walking sticks. Just out of the frame to the right is a spigot that delivered the only running water on the farm.

My parents’ generation was forged in a global depression sandwiched between the two most destructive wars our civilization has ever experienced, harbingers of the promise and potential horror of globalization and technology. Fortitude, stability, and duty were their watchwords. Their world was a hard and dangerous place. The wise were prepared for anything. They believed in citizenship and strong social institutions. If they were the greatest generation, it is likely because they lived the greatest challenges.

My lot grew up in the Cold War. Whether we were playing cops and robbers, cowboys and Indians, Allies and Nazis, or something else, there were good guys and bad guys, and no question of which was which. Science and technology had started and ended WWII, cured polio and smallpox, and were helping us beat the dirty Russkies to the moon. Most families had one car and one working parent. Cokes were a dime and comic books were 12¢. The ‘¢’ symbol was common. We roamed pretty much wherever we wished as children, with the only restrictions that we look both ways before crossing the street, try not to put anyone’s eye out, and be home for dinner.

I notice two phenomena as I age. The world moves away from us, in my case with globalization, new technology, complicated politics, and 24-hour cable news (rapidly being replaced by Internet channels of all kinds). And we become less interested in keeping up with progress, or maybe it’s that progress seems more of an illusion. I learned enough model numbers as a teenager trying to prove I knew stereo equipment; I have very little interest in keeping up with the newest smartphone features. A dear friend’s father retired from architecture a few years early when his firm computerized, choosing not to bother learning a new way to do a job he had done all his life, and had done better than almost anyone. Like most people his age, he eventually learned to e-mail and surf the Internet, but computer technology is neither his friend nor his constant companion.

While many of us never lose our fear of death, we begin to lose our place in life. The present becomes at once lonelier, more confusing, and more mundane. We fill the void by investing heavily in memories of the past and hopes for the future.

On the other hand, no one appreciates a day like a person who doesn’t know how many more they will see. The countability of our moments gives them meaning. I think it’s impossible for the young to fully appreciate this, but it is no problem for someone who attends funerals as regularly as the rest of us go out to dinner. Every time I think of how I miss my father, I am reminded.

The result (and possibly the cause) of all this is that the little things — family, friends, the sun in your face on a perfect day — become the big things in life. The title on a business card, h-index, or the number of Twitter followers and Facebook friends fade in comparison. At least that’s how it seems to be working for me.

I’ve always believed that we should enjoy our days, because the clock is ticking. It’s only been in recent years that I have decided I’m okay with that.


* Technically, all of these things existed when my grandfather was born. They just weren’t at all common. Sort of like Segways.

Beware of being too worried about the IKEA effect

I wrote the post about the IKEA effect as a sort of personal admission that the software architecture I have been developing was just not that interesting or useful. A few days after I published that post, I received notification that a paper I wrote on the topic had been deemed interesting enough for conditional acceptance at a conference that accepts fewer than one in five submissions. I still believe it’s important not to get too attached to our own ideas, but it is also important to trust our own vision. Like most things, I suppose the secret is finding a balance.

Beware the IKEA Effect

The IKEA Effect describes a psychological bias through which people tend to place higher value on a product if they expended some effort creating it. The name is from a study showing that people valued furniture more highly if they assembled it themselves, even if they admittedly did a poor job. The effect was reduced if the products were not finished, or if participants built and then destroyed their creations.

I try to watch for biases, and this one is especially insidious, in that it likely applies to ideas. This would imply that the more we work toward completing something we came up with ourselves, the better an idea we are likely to think it is. I have tried to apply this newfound knowledge to a software architecture I’ve been working on for a few years now. The results of my informal internal analysis were revealing and humbling. I’ve been dealing increasingly with people who don’t seem to understand the value of some of the more esoteric points of this system. While that’s not particularly unusual for scientific research, or necessarily bad, I have decided to take a fresh look at the various aspects of this project, and look for validation along the way.

Increasing access to scholarly research

Last week the White House issued a memorandum directing federal agencies that fund more than $100 million in research to create plans for making the studies they fund freely available to the public. The positive responses from both academic publishers and access advocates lead me to believe that the solutions will primarily be some form of author-funded publication of this research, with the fees paid by the funding agencies.

Access to taxpayer-funded research has been a sore point with many researchers for years, but the issue gained new visibility following the suicide of Internet activist Aaron Swartz. Swartz’s case brought into stark relief the conflict between the U.S. government’s responsibility to make taxpayer-funded research available to its citizens, and its need to protect the lucrative industry that exists to publish much of this same research.

Whatever plans are put forward by the various agencies, I would be surprised if they acknowledge two realities of publishing today. The first is the growing importance of direct self-publication (i.e., without a publishing house involved) in society at large. Academics may be understandably slow to adopt this model for publication of research, but it is likely that a large number of open access journals will arise without the involvement of established academic publishers. Any journal that attracts a sufficient number of distinguished scholars could prove successful.

The second factor is the impending obsolescence of print publications, whether paper or electronic. In the humanities, digital technology is quickly becoming the most popular way to publish research. As humanists become more familiar with technology (and students read less), interactivity will follow quickly and the linear narrative will likely fade. Science research has been ignoring the man behind the curtain for decades now. The litmus test for published scientific results is supposed to be whether they can be reliably repeated, but very few conventional journal publications include (or can include) more than a vague description of the data, and even less information concerning the apparatus and software codes that were used to produce the results.

E-books are already suffering declining sales, as interactive tablets and media rise quickly. The future of academic publishing is likely to be very different from its present.