What is Systems Thinking?

According to the latest paper by WPI professors Jamie Monat and Thomas Gannon, it’s an approach to engineering that incorporates the full context of the problem at hand.

And it might have helped avoid some of the most notorious disasters in engineering history.

Here’s a summary of five catastrophes Monat and Gannon contend wouldn’t have happened if those involved had engaged in Systems Thinking a little more actively.

1. The Microsoft Zune

You might not think of the failed attempt at competing with Apple’s iPod as a “disaster,” but it cost $289 million. Imagine if that money had gone to the Bill & Melinda Gates Foundation instead.

What happened: Cited more often in marketing classrooms than as an engineering case study, the Zune is still the butt of jokes in pop culture – notably in 2017’s Guardians of the Galaxy Vol. 2 – for lacking the iPod’s aesthetic appeal or “cool” factor. In fact, the Zune, according to Slate tech columnist Farhad Manjoo, was “perfectly fine” as an isolated piece of equipment.

Unfortunately, users don’t experience anything as “an isolated piece of equipment” anymore, especially not a pre-streaming audio player that, at the time, required users to maintain their own media library across an ecosystem of devices and services. Microsoft didn’t adequately situate the Zune within the context of a full User Experience System, while Apple’s various iPod rollouts did exactly that. By leveraging systemic benefits like intuitive and stylish design, common parameters across multiple devices, a library of music licensed for downloading, and an easy-to-understand pricing scheme, Apple made short work of the Zune, which lasted only five years before being discontinued.

How Systems Thinking could have helped: By building the Zune as just one functional component of a complete User Experience System rather than as a stand-alone device, Microsoft might have cemented a better reputation as a hardware company, instead of eliciting a chuckle every time somebody says “Zune.”

2. The Water of Ayolé

The newly constructed water supply infrastructure of a small West African village broke down after three years, forcing residents to use parasite-infested river water.

What happened: The rural village of Ayolé, Togo, relied on the Amou River as a water source, exposing residents to guinea worms, parasites that cause excruciating pain. Government and international aid organizations responded to the crisis by digging and installing new wells. After a few years of regular operation, the wells shut down.

How? The village was simply not equipped to handle the normal wear and tear on its new infrastructure. There were no spare parts available, no technical expertise on hand to fix or maintain the pumps, and no money to pay for repairs.

How Systems Thinking would have helped: Stakeholders eventually applied Systems Thinking after the initial well-building project had treated the water issue as merely an engineering problem. Togolese extension agents trained villagers in well maintenance and repair, the local hardware store established a supply chain for repair parts, and the village’s women organized an agricultural production and sales system to help pay for those parts. This illustrates one of the more important lessons of Systems Thinking: problems are best solved by incorporating the interrelationships among engineering, socio-economic conditions, logistics, and the users themselves.

3. 20 Fenchurch Street, London

The curved façade of this high-rise office building focuses the sun’s reflection off its windows into a concentrated “death ray.”

What happened: Designed by Rafael Viñoly and completed in 2014, this 38-story London office tower has a concave, roughly parabolic façade that reflects a huge swath of sunlight onto a small area at street level for several hours each day, producing storefront temperatures exceeding 200°F. An automobile was partially melted, and a reporter fried an egg on the sidewalk. The thermal behavior led locals to nickname the building “the Fryscraper.”
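The underlying effect is just geometry: a concave mirror gathers sunlight from a large area and dumps it onto a small one. As a rough illustration only (every number below is an assumption chosen for the sketch, not a measurement of 20 Fenchurch Street or a figure from Monat and Gannon’s paper), a quick back-of-envelope calculation shows how fast a reflective façade can multiply ordinary sunlight:

```python
# Back-of-envelope sketch of how a concave, reflective facade concentrates
# sunlight. Every number here is an assumption made for illustration, not a
# measurement of 20 Fenchurch Street.

SOLAR_IRRADIANCE = 1000   # W/m^2, typical direct sunlight on a clear day
REFLECTIVITY = 0.4        # fraction of light the glazing reflects (assumed)
FACADE_AREA = 500         # m^2 of glass aimed at roughly the same spot (assumed)
SPOT_AREA = 10            # m^2 of street receiving the focused glare (assumed)

concentration = (FACADE_AREA / SPOT_AREA) * REFLECTIVITY
flux_at_spot = SOLAR_IRRADIANCE * concentration

print(f"Concentration factor: about {concentration:.0f}x ordinary sunlight")
print(f"Flux on the street:   about {flux_at_spot:,.0f} W/m^2")
```

With these assumed figures, the sidewalk sees roughly twenty times ordinary sunlight, which is well into egg-frying territory, and the geometry that produces it was knowable at design time.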

How Systems Thinking would have helped: Systems Thinking means incorporating the interrelationships with relevant environmental components — such as “the sun is hot” — into the design. The failure to do so was especially galling in this case, given Viñoly’s involvement with the similarly sizzling Vdara Hotel in Las Vegas only a few years before the Fryscraper.

4. The Russian K-141 Kursk Submarine Disaster

What began as a training exercise in August of 2000 became the greatest tragedy in the history of the Russian Navy, resulting in the loss of 118 crewmen.

What happened: Hydrogen peroxide (H2O2) leaking from one of the ship’s torpedoes reacted with contaminants in the torpedo tube; concentrated H2O2 decomposes violently into hot steam and oxygen when it meets a catalyst such as rust, and the resulting explosion and fire set off the ship’s ammunition. The submarine flooded and sank within minutes, condemning the few crew members who survived the initial blast to a horrific fate.

How Systems Thinking would have helped: Systems Thinking incorporates advance planning for the eventual “Controllers” and “Maintainers” of a system, which in this case would have been the cash-strapped Russian Navy of the early 2000s. The risks associated with hydrogen peroxide torpedo propulsion were known and well documented, but the cost of removal or cleanup proved prohibitive. Rather than triggering an alert to scale back naval activity or to decommission submarines carrying H2O2, the danger was simply ignored.

5. “Galloping Gertie,” A.K.A., the Tacoma Narrows Bridge

Everybody’s seen the iconic footage of this well-known engineering disaster.

What happened: Wind shears surging through the Tacoma Narrows induced extreme aeroelastic torsional flutter (a technical term for “wobbling like a real-life cartoon”) in this ill-fated suspension bridge. Slight twists increased the amount of surface area exposed to gusting winds, acting as a force multiplier that twisted the bridge further and required greater elasticity for it to return to its original shape.

The oscillations were exacerbated by two major factors: a deck that was not stiff enough to damp the twisting, and vortices shed downwind of the bridge that essentially turned the deck into a giant flapping airplane wing. After a cable finally snapped, Gertie had its last gallop, collapsing in November 1940, only four months after opening to traffic and two years after construction began in 1938.
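One way to picture that feedback loop is as an oscillator whose damping has effectively gone negative: the wind feeds energy into each twist faster than the structure can dissipate it, so a barely perceptible wobble grows instead of dying out. The toy model below illustrates only that general idea; every parameter is invented for the sketch, and none of it is an engineering analysis of the actual bridge.

```python
# Toy model of self-excited torsional oscillation ("flutter" in spirit):
# a single twisting degree of freedom where the wind term acts like
# negative damping. All values are invented for illustration and are not
# properties of the real Tacoma Narrows Bridge.

I = 1.0          # torsional inertia of a deck section (arbitrary units)
k = 4.0          # torsional stiffness (restoring torque per radian of twist)
c_struct = 0.05  # structural damping (energy the deck dissipates)
c_aero = 0.15    # energy the wind feeds in per unit twist rate (assumed)

dt, steps = 0.01, 6000          # simulate 60 seconds
theta, omega = 0.01, 0.0        # start with a barely perceptible twist (rad)
window_peak = 0.0

for i in range(1, steps + 1):
    # Net damping (c_struct - c_aero) is negative, so oscillations grow.
    alpha = (-k * theta - (c_struct - c_aero) * omega) / I
    omega += alpha * dt          # semi-implicit Euler keeps the oscillator stable
    theta += omega * dt
    window_peak = max(window_peak, abs(theta))
    if i % 1000 == 0:
        print(f"t = {i * dt:4.0f} s   peak twist over last 10 s ~ {window_peak:.3f} rad")
        window_peak = 0.0
```

Run it and the printed peak twist grows steadily from one ten-second window to the next; swap the values of c_aero and c_struct and the same initial wobble quietly fades away, which is the behavior a stiffer, better-damped deck would have shown.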

How Systems Thinking would have helped: “Environment” is a key input to Systems Thinking, and in this case the environment included predictable wind forces acting on system components strongly enough to cause structural failure. Cost concerns led to cutbacks on the initial design, which had called for stiffening trusses that would have prevented the collapse. But the failure to evaluate the interdependence of system components ultimately doomed the bridge.

 

Does anticipating and averting disaster sound like the right career path for you? Check out WPI’s graduate certificate program in Systems Thinking.

 
