Here is a theory about our sluggish economic recovery:
The old bust-to-boom time frames no longer work as templates.
Recessions used to take a long time to play out, because industries didn’t have the moment-to-moment vertical integration we see now in supply chains.
Inventories that sat idle were not only signs of wasted resources; they were also a buffer that slowed the signals of economic activity.
This is the first downturn of a hyper-connected market. We’re shedding jobs like we never have, because small-to-medium businesses now have access to the relevant data that informs layoffs faster. Those looking at the slope are seeing the same data as before, but compressed in time.
The recovery will not be quite as compressed, because there will be re-organization and a slower buildup, much like gas prices, which rocket up and float down.
The significance of this is that I wrote it nearly three years ago, on Cringely’s blog. Seems like ages ago.
It seems to have held up, because the economy has been rather laggard. That’s not proof at all that I am right, but it is a nice data point about the power of observation. At the time, there was a zero percent chance that I could be proven correct. Now, it’s still under 5%, but at least the predicted outcome makes the conjecture a valid one to investigate.
Why bring it up? Well, it popped up in my Google notifications as a new mention of my site. It likely means that Cringely just did some housecleaning on his page, or enabled a new WordPress theme, and all the pings went ping-y again. But there is a larger point about what we call “science.”
In my case, I have a simple conjecture that might not even rise to a theory. The universe of alternate explanations for something as complex as an economy is way too large. It’s enough to say that I haven’t been proven wrong, but it will take a long time to winnow out other likely candidates. So why do we get led astray so often?
Stephen Colbert may or may not have made a lot of money making up words like “Truthiness,” but there is something rather profound in that concept. Just like the way we’re seduced by truth-like substance, there are people who will brandish statistics and formulae in order to explain what they see. Never mind that there might be equally-valid alternate explanations – we got the numbers to work so it’s Science, so there.
Don’t confuse measurement with Science. Measurement is an important part of it. Without measurement, you have no basis for comparison, observation, or even replication of findings. I see this quite a bit in my role as a social media guy, with people brandishing statistics as though the numbers are all that matter.
- Are you measuring the right things?
- Are you seeing causation, or just correlation?
- Is this something you can replicate?
- Are you even asking the right questions?
- Are there simpler, alternative explanations?
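The causation-versus-correlation question is the one I see botched most often, and it's easy to demonstrate. Here is a toy sketch in Python (the numbers are invented for illustration): two series that correlate perfectly, not because one causes the other, but because both are driven by a hidden third variable.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Invented numbers: both series are functions of a hidden third variable
# (temperature), not of each other.
temperature = [10, 15, 20, 25, 30]          # hypothetical monthly averages
ice_cream   = [2 * t for t in temperature]  # sales track temperature
drownings   = [t + 5 for t in temperature]  # so do swimming accidents

print(pearson(ice_cream, drownings))  # ≈ 1.0: perfect correlation, zero causation
```

A graph of those two series would look like airtight evidence that ice cream causes drownings. That is exactly the trap: the statistic is computed correctly, and the conclusion is still wrong.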
If you see “results” posted with no sign that the above questions were baked into the methodology, then you’re just guessing, like I did in March 2009. That’s not to say you’re wrong, but publishing numbers and graphs doesn’t make you right.