Thoughts from a Bench
Trying not to fool oneself
Quick thoughts on proving yourself wrong
Summer is flying by…
I thought I’d share an interesting perspective on sabbaticals that I learned from author and futurist Kevin Kelly. This resonated with me as I push through my graduate school applications and wonder when the next period of rest is due (the answer: I’ll plan it when I’m less busy….).
“I think we overemphasize our productivity and efficiency, but […] the best thing for your work ethic is to have a rest ethic.”
Emphasizing rest, he finds that taking a sabbatical every few years is an ideal way to recharge and reset your perspective. A productive, successful career doesn’t need to be a constant march forward; in fact, that constant march can be less fruitful. Returning from a sabbatical, you might be ready to make a big leap you couldn’t have made while taking one step at a time.
There are two keys to a good sabbatical:
1. It has a completely different rhythm and structure from what you normally do. (Kevin is a prolific writer; during one sabbatical, he let himself do nothing but read books all day, which opened his mind to some really unique ideas.)
2. It lasts at least six weeks (but ideally several months).
During my sabbatical in a few years, I see myself reading, writing, and spending as much time away from my dual monitors as possible.
The following post was originally published on Oct. 16th, 2021 on my own blog. It’s one of my favorites, though, and I thought I would re-share it.
“Everything that needs to be said has already been said. But since no one was listening, everything must be said again.” (André Gide)
Have a good week!
“The first principle is that you must not fool yourself—and you are the easiest person to fool.” (Richard Feynman)
How often should you try to prove yourself wrong?
I think I’ve learned the answer for myself: more often.
I recently worked on a healthcare analytics project in which we were looking into de-identified data of patients prescribed cancer therapies. We thought a seasonal spike in patient requests for services was caused by Medicare's peculiar coverage structure.
This belief was both intuitive and observable in the data; the chart plotted out cleanly on a Tableau dashboard. The only problem?
The belief was wrong.
The uptick was actually caused by a software upgrade in the system providing the historical data. The upgrade had introduced a batch of duplicate records that happened, coincidentally, to be entered during that time of year.
The only way we eventually got this right was by challenging our original belief (which admittedly wasn’t easy to do) and letting go of what we wanted to believe was true, of what was merely intuitive, of what simply fit a narrative. We could not have uncovered such a counterintuitive explanation otherwise.
This story reminds me of how crucial it is to pressure-test beliefs, not only accepting those ideas that align with intuition. In his book Think Again, Adam Grant refers to this as "thinking like a scientist." Scientists (the good ones at least) don’t just search for facts that support their existing beliefs; they aim to prove their beliefs wrong until they no longer can. What survives this rigorous disconfirmation process is that much more likely to be true.
Good scientists are aware that doing otherwise is a recipe for what Richard Feynman called “fooling yourself.” The consequences can be costly.
Many states fooled themselves when they invested heavily in “scared straight” programs in the 1980s and ’90s to reduce reoffending rates among juvenile delinquents. The popular 1978 documentary Scared Straight had ostensibly “proved” to these states, via much profanity and histrionics, that dragging juvenile delinquents to the local state prison to be berated, threatened, and terrorized by the inmates serving time there would deter them from a future in crime.
"Cause that's all you are when you come to somebody's prison. A number. You lose everything. But this is what you clowns want!!"
As well-intentioned as these programs were, the evidence supporting their efficacy was almost entirely anecdotal. And the states that invested in and implemented these programs at scale did not realize their mistake until decades later, when subsequent experimental studies, conducted by scientists who tested rather than trusted intuition, showed that the programs were ineffectual, even harmful.
And ironically, Angelo Speziale, one of the juveniles in the original documentary, would later be convicted of a rape and murder he committed only four years after Scared Straight. He even ended up in the same Rahway State Prison that had supposedly “scared him straight” when he was 16 years old.
Had the states run even a single controlled experiment before investing in “scared straight” programs, they might have developed the skepticism to invest those taxpayer dollars more wisely elsewhere.
How much of what you believe could actually be wrong, because you haven’t pressure-tested those beliefs to find out? What would change, for the better, if you found this out?
There is certainly a greater likelihood of truth on the other side. As the scientist Louis Pasteur wrote,
It is indeed a hard task, when you believe you have found an important scientific fact and are feverishly anxious to publish it, to constrain yourself for days, weeks, years sometimes to fight with yourself, to try and ruin your own experiments and only to proclaim your discovery after having exhausted all contrary hypotheses. But when, after so many efforts, you have at last arrived at a certainty, your joy is one of the greatest which can be felt by a human soul, and the thought that you will have contributed to the honor of your country renders that joy still deeper.