My recurring nightmare at university wasn’t sleeping through an exam. It was way more embarrassing: an infinite loop of internships.
Each time I started a new “dream job” it went well. It always seemed like I’d score an offer to return full-time. But after every 4-, 8-, 12-… even 20-month internship, my boss would book an exit interview, and then it was time to head back to school. My dream of being a brilliant analyst, a dream that could exist only in the Glass Bead Game that is business school, crushed. Again, and again, and again.
Writing Methodology: Human (sleep deprivation mode).
Systems Thinking has a Problem
It’s kinda silly to glamorize being an analyst. In my defense, I grew up watching James Bond and The Da Vinci Code, and later on, movies like The Big Short and Moneyball. While I never found much inspiration in buff Schwarzenegger types, there was plenty to resonate with in sleeker, still impactful protagonists. Bond’s skillset comprised mostly social skills and analytical chops, rather than pure brawn. The heroes of Moneyball weren’t on baseball cards.
Now I’m about 10 years into a hodgepodge analytical career. At first glance it’s been a lot like my recurring, embarrassing collegiate nightmare – a few months here, a couple years there – but overall, going well. I’ve gotten my fair share of chances to LARP as an extra in a spy or psychological thriller film, from 6am helicopter rides to extremely tense boardrooms. NDAs hide the nerdiest details, but not the gloss.
If I have a regret, or a cautionary tale, it might be reading Thinking In Systems: A Primer by Donella Meadows.
Meadows was apparently a wonderful woman, and she wrote a solid book. But if you’re a twenty-one year old desk jockey, like I was, systems thinking is basically a hard drug. Books like Meadows’, or The Fifth Discipline by Peter Senge, to name another example, will juice you up with a dangerous amount of hubris and delusion about your agency.
Why? Well, it’s one thing to know where to look for leverage points in a system (places where you can make a lot of change without a lot of effort, basically). But there are some complicating factors:
Your model of the system will be way too simplistic.
Awareness does not imply ability to execute.
Many of the things that look like patterns are actually emergent and cannot be predictably affected, let alone engineered.
Survivorship bias – leverage points aren’t identified through gigabrain insights, but through tinkering.
It’s not just Meadows as an author. Systems thinking books have a nasty ability to transform analysts into Don Quixotes, who then hurl themselves at macro-level projects with reckless abandon.
The ambition is fine. This isn’t a call to be cynical about the prospect of large-scale change. It’s just about calibrating your ego to the tools you have at hand. But why am I talking about ego? Isn’t this essay supposed to provide career advice for analysts?
Corporate Self-Help
A lot of people have the job title of “_____ Analyst”. The majority aren’t quants. Most analysts deal with numbers on a regular basis, but rarely dip into statistics or probability. The math is rather trivial, even if high-stakes. Rather, the main task is qualitative research and analysis. Looking at scenarios in a particular light. Getting information from people. Drawing lessons from historical case studies. Consequently, a lot of business books are more about personal development – helping you see differently – rather than actual tools, techniques, or methodologies (unless you’re a quant).
When you read a corporate self-help book like Thinking in Systems you adjust your perspective. You don’t just swap out your lenses, either. It’s also your view of your place in the world that changes. It’s a necessary part of the job and some people are excellent at it. An analyst plays devil’s advocate, researcher, enforcer, salesman, chief marketing officer, financial planner… all with the same title.
A kind of plasticity is required. Qualitative analysts face the challenge of being contortionists – books, frameworks, and conceptual aids help them stretch into new perspectives. But therein lies the fatal flaw of systems thinking material – it situates readers in the driver’s seat by default.
As an analyst, especially a young one, you are definitively not behind the wheel. Even a wizened, wrinkled éminence grise is not in charge. Perhaps you have a keen eye for the underbelly of a system or its leverage points, but no access to fulcrums, or levers long enough to move the system.
Furthermore, despite the language it’s cloaked in, systems thinking puts you in a minigame mindset. Immediately you’re looking for shortcuts. Leverage points = silver bullets. The only game worth playing is the long game.
Business books, like self-help books, are about personal development. And that’s hard. Think about how long it takes to change one’s lifestyle, how many new habits must be tried on, how many New Year’s Resolutions must be failed… just for a slight improvement?
Even then, can you chalk up any improvement to a NYT Best Seller? How much? Or is it mostly that you got older, wiser, more tired, and a little less cocky? Maybe there is some long, boring work that is a prerequisite to get the most out of systems thinking books.
Minimum Reading Age
Perhaps the most damning externality of systems thinking is its tendency to spark an urgent search for causality at a macro level. People already have that tendency – especially smart young people. If anything, as an aspiring analyst, you might want to dampen your own compulsions to do so. Jumping to conclusions is dangerous, especially if your conclusions are disguised as an intermediate analytical step.
Aside: I’ve been told that, just as your frontal lobe doesn’t fully develop until 25, your ability to hold information, including emotions, before processing it doesn’t really kick in until about forty years old. That’s probably around the right age to start looking at books that deal with macro-level dynamics. I’ll let you know when I get there…
Abstracting a system into a model of stocks, flows, and feedback loops can be helpful, but it’s a loaded process. Such models contain a lot of presumptions. Of course, that’s a necessary evil. But qualitative, back-of-the-napkin models like those used by systems thinkers present as way too lucid. An aesthetic but rotten model is a perfect agency trap. It will delude you into thinking that you can do more than you can.
If you give an analyst a model they’ll be wrong for a day. Teach an analyst how to model and they’ll always be wrong, but sometimes useful.
I think this delusion of agency is a source of burnout for analysts. A silly systems model can turn someone into a corporate Don Quixote. Pretty soon they’ll blow up an emotional Achilles tendon railing against a complex system.
My (perhaps overly daoist) interpretation of the phrase “the obstacle is the way” is neither that you must go through the boulder in your path, nor that you must go around it. Rather, the solution will take the shape of your obstacle. Most of the time, systems thinking leads you to draw a drastically inaccurate silhouette of a problem space – with far too much confidence.
Blind Spots
My final critique of systems thinking is that it limits your analogical range, which is more or less your key asset as an analyst. Your net worth is your met work – how many metaphors, analogies, and frames you can muster to help make sense of a problem. And not in a shallow way. A few good analogies will blow a huge range of weak ones out of the water. Analogical range is a product of breadth × depth and is impossible to mimic.
Some of the best advisors I know will hammer the same three ideas over and over. Things like bottlenecks, feedback loops, adoption curves, efficiency-thoroughness trade-offs, antifragility, process modeling, stakeholder mapping, etc. You don’t need that many to do good work – although it’s fun to have a wide repertoire, and others do that very well.
Systems thinking deteriorates your analogical range because it reads (to the inexperienced, as I was) as a universally useful model. You can use it to model anything, so why not just get really good at that one thing?
The blind spots of systems thinking are under-discussed, and I wish I had known them.
It’s great at flows and stocks, but not good at picking up on absences like safety, health, or security.
Tail risks are discounted because they don’t show up in the stable dynamics of a model.
Heterogeneous cultures are ignored, because systems thinking fails to account for the various directions stakeholders are pulling in with their macro-level aspirations.
When you model a system, you rarely place yourself in that system.
Symbolic comfort tricks you into thinking you have a good feel for a system despite not having gone into its depths and edges.
It fetishizes causal connections, even at scales where emergent behaviors are most prevalent.
Systems thinking, at its worst, is a Great Man trap for analysts – the temptation to specialize in a cool, cerebral methodology, rather than collect a useful set of tools, ideas, knowledge, and contacts. At its best, it’s just another notation. Limited, always wrong, and sometimes useful.
What’s the antidote?
Well, if you’ve already corrupted your analytical brain with a systems thinking tome – that’s alright. I’m still recovering from such books myself. Not to mention my business degree (just kidding, Gustavson). There’s a lot of unlearning to do. Particularly with regard to “leverage points”.
At the macro level, there are no levers. There are only strings. The further up the micro-meso-macro scale you go, the less you can push and the more you have to pull. You could call this the stringiness coefficient. For example, when you’re trying to launch a new field of study into the world, it doesn’t matter how much money you push at it. You need to create things that draw people, resources, and influence in – but in a way that doesn’t lead to capture.
At the micro level, levers are so plentiful that you’re best off looking for big TAMs and making some clever marginal improvements. But things like “nudge theory” don’t work – they don’t scale to systems-level change in any reliable manner.
You have to unlearn your micro and macro enough to focus on the meso: orgs, units, small market dynamics, political conflicts at city scales, etc.
I have to get back to making a course on this stuff, so if I had to give you a simple antidote it would be: get really good at a few technical forms of analysis, like process modeling or market size estimation or bottleneck analysis… and upgrade your charisma. Get better at talking, presenting, marketing – but don’t push on a string.
P.S. I saw some version of the Bullshit Jobs hypothesis floating around. As much as I like Graeber’s works, especially The Dawn of Everything, his take on office work is too cynical for me. There are better explanations for the explosion of weird jobs – like the Baumol Effect or this piece Most Work is Translation. It’s easy to dunk on analysts… and silly to think that their jobs are useless because they can’t clearly explain them to you. Analysts already have to do translation all day; you really expect them to translate it again, in a new language?
Weekly Puzzle
The FIDE Grand Swiss wrapped up this week. Anish Giri and Matthias Bluebaum placed first and second, respectively, and qualified for the Candidates tournament next year. Bluebaum was a dark horse, at least to me. There was also a 14-year-old grandmaster who absolutely cooked, which was fun to watch.