(by Jeffrey Kluger, Hyperion, ISBN 978-1-4013-0301-3)
Cholera outbreak in London; John Snow (physician/investigator) knew there was a source to the outbreak, but needed the data to prove it; canvassed the neighborhood, tracked it to a pump on Broad Street. Knew two things: Plagues were fantastically complex things--with the illness working myriad horrors in the body and spreading across the landscape in myriad ways. But diseases moved through simple choke points too--one person's handkerchief, one person's handshake, one handle on one foiled pump--all of them bottlenecks in the pathogen's flow. Seal off one, stop the disease. The complex illness could collide hard with the simple fix.
M.Coy bookshop (Michael Coy and Michael Brasky) in Seattle vs Amazon in the complex art/science of book recommendations. The machines aren't parsing data as much as steam-shoveling it, digging up vast masses of information; the likes of Brasky and Coy do the very same thing, but finely/instantly using human clues. It's easy to say that one of the two approaches is more complex than the other; it's a lot harder to say which one.
Complexity is a slippery idea, one that defies almost any effort to hold it down and pin it in place. Things that seem complicated can be preposterously simple; things that seem simple can be dizzyingly complex. Manufacturing plant vs houseplant. Colony of garden ants vs community of people. Sentence vs book, couplet vs song, hobby shop vs corporation. Human beings are not wired to look at matters that way; we're suckers for scale. Things that last for a long time impress us more than things that don't; things that scare us by their sheer size strike us more than things we dwarf. Star vs guppy. Consider a pencil: cedar (wood for the dowel), bauxite (aluminum sleeve), coal (graphite), polymers (eraser); who feeds Paris?
Psych is waking up to complexity too. Six-child family vs one-child family, complex in different ways. Political science plays the simplicity-complexity game, too; equally complex: the stock market, crowds, geology, biology, even politics.
Trying to distill this down to a working definition of complexity and simplicity is hard.
Brain is a real-time machine, constantly scanning for input and assembling it into impressions and actions; that was useful for survival, but it can mislead us now, causing us to overfocus on the most conspicuous feature of a thing and be struck--or confused--by that quality. Thus we are confused by beauty, by speed, by big numbers, by small numbers, by our fears, ....
(Confused by Everyone Else)
Treasury Secretary James Baker III, trip to Bonn to press Germany to lower its interest rates and juice the mark; the German finance minister did not agree, and Baker was unhappy: "not particularly pleased". This spooked the stock market, which triggered the crash of '87. Nothing smart or sensible had taken place that day--but then smart and sensible forces had not been at work; market forces had been. For every market analyst who sees traders as the informed and educated people they surely can be, there are scientists who see them another way: as unthinking actors who obey not so much the laws of economics as the laws of physics. Investors react not so much to variables that are in their interests as to those that are in everyone else's interests. When the tide of the market shifts, most of us shift with it; when it flows back the other way, we do the same. We like to think we're informed by trends, but as often as not, we're simply fooled by them--snookered by what everyone else is doing into concluding that we ought to do the same. "Economic models begin with the assumption of perfect rationality, of a universe of logical people all doing what they can to maximize their utility; physicists studying economics begin with the assumption that people can't think." (J. Doyne Farmer, Santa Fe Institute in New Mexico). "The term we use is zero-intelligence investors." (John Miller, economist at Carnegie Mellon University)
One of Murray Gell-Mann's favorite ways of deciding whether something is simple or complex is to ask a decidedly unscientific question: How hard is it to describe the thing you're trying to understand? "Start with the minimum description length as your first inquiry. The shorter it is, the simpler the thing is likely to be. In most cases, you don't even have to take your language from scientific discourse. It can come from ordinary speech." But description depends on context, and context changes everything. "Imagine an anthropologist approaching a civilization with which he shares a common language, but which is naive of any culture outside of its own. Now imagine trying to explain to that community a tax-managed mutual fund. What do you think the preamble to that explanation would be?" Description length has its limits; the problem is that it begins with an assumption of a clean and consistent line, with simplicity and short descriptions at one end and complexity and long descriptions at the other. A clean line, however, doesn't really capture things as much as a somewhat messier arc does.
Complexity scientists like to talk about the ideas of pure chaos and pure robustness--and both are exceedingly simple things. Either extreme is uninteresting. Where you'd find real complexity is somewhere between those two states, the point at which the molecules begin to climb from disorder, sorting themselves into something interesting and organized, but catching themselves before they descend down the other side of the complexity hill. The more precisely the object can balance at the pinnacle of that arc, the more complex it is. "It's the region between order and disorder that gives you complexity, not the order and disorder at the ends."
Things are not even as straightforward as that explanation suggests, since any one system is not necessarily composed of just one point on the arc. Many different points come into play, and the question of complexity turns on which one you choose. A foot-long copper pipe might be nearly as static as frozen carbon, but take in the vast array of skyscraper plumbing (of which it is just a part) and things look a lot more complicated.
Same holds true for human behavior. Consider all that goes into a handshake.
Any system (chemical, physical, cultural, fiscal) must be seen at all levels before you can begin to make a real determination about whether it can truly be called complex. "Ask me why I forgot my keys this morning and the answer might be simply that my mind was on something else. Ask me about the calcium channels in my brain that drive remembering or forgetting and you're asking a much harder question."
Blake LeBaron (Brandeis University) has an entire (virtual) stock exchange of his own. Over the years he's developed algorithms that allow him to simulate any kind of market--bull, bear, static, active, mixed--and then release simulated investors into that environment to see how they behave. On the whole, artificial traders behave precisely like the real ones, which is to say that they never show much imagination. "At the beginning of a run, all the traders just wobble around a bit, looking for guidance. Then someone will try a new strategy and do quite well at it and the others will notice. Pretty soon, a few more are trying it and it starts to get popular, like a clothing fad. Ultimately everyone starts to converge on that strategy and it dominates the market, precisely the way real markets behave." But it lasts only so long--if everyone is chasing the same dollar in the same way, it takes only a few players to cash out before all of the other shares start to lose value. This is the classic bubble-popping phenomenon, one that's familiar to all investors. "The market first becomes too stable, too robust, then it collapses into instability."
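The convergence half of that fad dynamic can be sketched in a few lines. This is an illustrative toy of my own, not LeBaron's actual algorithms, and it leaves out the cash-out collapse: traders who blindly copy better-performing peers pile onto one dominant strategy.

```python
import random

def fad_market(n_traders=100, n_strategies=10, steps=100_000, seed=42):
    """Toy imitation market (a sketch in the spirit of LeBaron's runs,
    not his model): each strategy has a fixed payoff; each step a random
    trader compares notes with a random peer and copies the peer's
    strategy if it has been doing better. No analysis, pure imitation."""
    rng = random.Random(seed)
    payoff = [rng.random() for _ in range(n_strategies)]       # fixed strategy returns
    held = [rng.randrange(n_strategies) for _ in range(n_traders)]
    for _ in range(steps):
        i, j = rng.randrange(n_traders), rng.randrange(n_traders)
        if payoff[held[j]] > payoff[held[i]]:
            held[i] = held[j]                                  # imitate the winner
    top = max(held.count(s) for s in set(held))
    return top / n_traders                                     # dominant strategy's market share

share = fad_market()
print(f"dominant strategy share: {share:.0%}")
```

With enough steps the whole population converges on a single strategy--"too stable, too robust"--which is exactly the precondition for the bubble to pop once a few players cash out.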
Fishkeepers (aquarists) can learn a lot about chemistry, botany, animal behavior by watching what takes place on the other side of the glass. Study how schools of fish manage their fluid movements without any evident alpha fish leading. (Bird flocking, too?) In order for a traveling fish school to remain cohesive, all of the individuals must maintain a position no more and no less than about a single body length from every individual around them. Get too close, and you collide and compete. Drift too far apart, and you begin to stand out--never a good idea when predators lurk. The most important thing the fish thus keep in mind is not where they're going or who's leading them there, but making sure they don't fall out of ranks along the way.
But who's in charge of the motion? Schools do require at least a few leaders ("informed individuals"), but not many. In an average-sized school, it takes only about 5% of the members to know the proper route and set out in that direction. What's more, as the school grows, that leadership share actually shrinks, so that as few as one in a hundred need have any idea what the goal is and yet they still lead all the others.
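The informed-minority effect can be seen in a heavily simplified sketch (inspired by collective-motion models such as Couzin's, but my own assumptions throughout: every fish sees the whole group's average heading rather than just its neighbors, and informed fish simply blend in a fixed goal direction):

```python
import math
import random

def school_heading(n=200, informed_frac=0.05, goal=(1.0, 0.0),
                   rounds=500, seed=3):
    """Toy informed-minority school: uninformed fish turn toward the
    group's average heading; the informed few also pull toward a fixed
    goal. Returns the final average heading as a unit vector."""
    rng = random.Random(seed)
    n_informed = max(1, int(n * informed_frac))
    headings = []
    for _ in range(n):                               # random initial headings
        a = rng.uniform(0.0, 2.0 * math.pi)
        headings.append((math.cos(a), math.sin(a)))
    for _ in range(rounds):
        mx = sum(h[0] for h in headings) / n         # current group consensus
        my = sum(h[1] for h in headings) / n
        new = []
        for k in range(n):
            dx, dy = mx, my
            if k < n_informed:                       # the ~5% who know the route
                dx, dy = dx + goal[0], dy + goal[1]
            norm = math.hypot(dx, dy) or 1.0
            new.append((dx / norm, dy / norm))
        headings = new
    mx = sum(h[0] for h in headings) / n
    my = sum(h[1] for h in headings) / n
    norm = math.hypot(mx, my) or 1.0
    return (mx / norm, my / norm)

hx, hy = school_heading()
print(f"final group heading: ({hx:.2f}, {hy:.2f})")  # drifts toward the goal direction
```

Even with 95% of the fish knowing nothing about the destination, the small informed pull is enough to swing the whole school's heading toward the goal, because everyone else is only tracking the consensus.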
The jellybean contest: The average of everybody's guesses will come significantly closer than any one person's guess. "It happens virtually every time; it's freakish and it still amazes me." (Brooke Harrington, professor of public policy, Brown University) More than just laws of probability here--we could all guess high or guess low. Instead, we all guess in such a precise distribution around the target that together we practically hit it. The greater the number of people participating, the closer the collaborative guess comes.
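The error-canceling arithmetic behind the jellybean effect is easy to demonstrate. A sketch under an assumed guess model (each person's guess is the true count scaled by a random personal factor; the model is mine, not from the book):

```python
import random
import statistics

def jellybean_contest(true_count=850, n_guessers=500, seed=7):
    """Crowd-averaging sketch: individual guesses are noisy, but the
    high and low errors largely cancel in the mean. Returns the crowd
    average's error and the typical individual's error."""
    rng = random.Random(seed)
    # assumed guess model: true count times a personal factor in [0.5, 1.5]
    guesses = [true_count * rng.uniform(0.5, 1.5) for _ in range(n_guessers)]
    crowd = statistics.mean(guesses)
    crowd_error = abs(crowd - true_count)
    typical_error = statistics.mean(abs(g - true_count) for g in guesses)
    return crowd_error, typical_error

crowd_err, individual_err = jellybean_contest()
print(f"crowd error: {crowd_err:.1f} beans; typical individual error: {individual_err:.1f} beans")
```

The crowd's error runs a small fraction of the typical individual's, and (matching Harrington's more-is-more finding) it shrinks further as `n_guessers` grows, since independent errors cancel roughly with the square root of the group size.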
Harrington wanted to pursue this more-is-more finding further; examined investment clubs (bringing largely inexperienced traders together in the hopes that they will collaboratively choose better). Confirmation: In general, the average group did better than the average individual; and larger, more diverse groups did better than smaller, more homogeneous ones.
One of the great bits of stock market wisdom is that investors of all kinds are jumpy and easily frightened by bad headlines. In 1989, economists selected the 49 most newsmaking events from 1941 to 1987 (including Pearl Harbor, JFK's assassination, the attempted assassination of Reagan, Chernobyl). Did the news shake the market? Using the Standard & Poor's Index as their yardstick, what they found was that bad news in the papers did not have to mean bad news on Wall Street. The biggest drop of all 49 days was 6.62%, in 1955, when President Eisenhower had a heart attack. (Compare: the 23% losses of 1929 and 1987.) The next biggest drops were 5.38% when North Korea invaded the South in 1950, and a 4.37% loss after Pearl Harbor. The study then looked the other way, taking the market's biggest drops and searching for the news of those days. In only nine of the cases did a nonfinancial news story like international tensions or a political shake-up seem to account for the market movement.
One other exceedingly complex and uniquely emotional variable individual traders bring to the table: fairness--what it looks like or feels like when something seems a square deal or when the bad guys are about to get away with something. Researchers have long wondered just how powerful the human fairness impulse is. Samuel Bowles (professor of economics, University of Siena, Italy) conducted a fairness test: Two volunteers, V1 and V2, are given a quantity of money to share, which they get to keep if they can agree on how to divide it. V1 proposes the split; V2 accepts or rejects it (a rejection leaves both with nothing). Perfectly rational actors will accept any nonzero split, since it's by definition more than what they had when they came in, but the average accepted split is 57%/43%. "When you ask them why they reject a 20% share, they answer, 'Because that son of a bitch was getting 80%.' So people are willing to pay 20% just to punish a son of a bitch."
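The payoff logic of that game fits in a few lines. A sketch with illustrative numbers (the rejection threshold is my assumption, set so that the 20% share from the quote gets rejected; it is not Bowles's measured cutoff):

```python
def ultimatum_round(pot, offer_to_responder, rejection_threshold=0.25):
    """One round of the ultimatum game: a fairness-driven responder
    rejects any offer below the threshold, leaving BOTH players with
    nothing; a purely 'rational' responder would set the threshold to 0.
    Returns (proposer's take, responder's take)."""
    share = offer_to_responder / pot
    if share < rejection_threshold:
        return 0, 0                      # pay to punish: everyone walks away empty
    return pot - offer_to_responder, offer_to_responder

print(ultimatum_round(100, 20))          # the 80/20 split is rejected: (0, 0)
print(ultimatum_round(100, 43))          # the average accepted split: (57, 43)
```

The point of the sketch: the responder who rejects is knowingly giving up 20 to deny the proposer 80, which no zero-intelligence (or perfectly rational) actor would do.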
The key in all of these situations is to recognize that there is indeed great wisdom in what everyone else knows. But there is sometimes greater wisdom in knowing what only you know. There may be no such thing as mastering the perfect balance of those skills, but there is such a thing as becoming powerfully--and in the case of the markets, profitably--good at it.
(Confused by Instincts)
(9/11, Ed Schmitt, and his decision to leave his building (Tower Two) when the first plane struck Tower One, when so many others stayed.) Ultimately, we are misled by our most basic instincts--the belief that we know where the danger is and how best to respond to keep ourselves alive, when in fact we sometimes have no idea at all. It's the job of the people who think about such matters to tease all these things apart and put them back together in buildings and vehicles that keep their occupants alive; it's the job of those occupants to learn enough about the systems so that they have the sense to use them.
20,000 people in a building or half a million in a coastal city occupy the same spot on the complexity spectrum as air molecules filling a room--moving randomly and chaotically in all directions, filling all the available space more or less uniformly. Very active, but also very simple and disordered. Send the same people on a stampede down stairways or highways and things quickly grow overloaded and grind to a halt, jumping to the other end of the complexity arc--robust, unchanging, frozen in place, but every bit as simple as the ever-shifting air molecules. It's in the middle of the arc, where the molecules just begin to take some shape (or the people just begin to move to the exits), that true complexity begins to emerge.
Best way to understand the manner in which people move en masse may be to understand the way water does the same, particularly how it navigates around obstacles or breakwaters. A foundered boat or tumbled boulder creates turbulence; a seemingly nonsensical post in the middle of the floor does the same. Placed near an exit, that turbulence staggers people's arrival slightly, allowing them to stream through the opening in a reasonably controlled flow, rather than colliding there at once and causing a pileup. The obstacle keeps you at the top of the complexity arc, preventing you from reaching the frozen end. "By adding a little noise to the system you produce coherence in the flow."
This, of course, assumes people behave sensibly. For one thing, people have different levels of decision-making skills, with some behaving more rationally than others. For another, all of us have a tendency to believe that the rest of the group knows what it's doing, and thus will gravitate toward a popular exit simply because other people have chosen it, even if the alternative is perfectly safe and much less congested. Finally, information tends to get distributed unevenly, with some people learning about an emergency first and acting before the others.
Simulations and models can't simulate emotion/fear; for example, the only thing that ought to count in stairways is pure speed; anything short of a stampede should get the building emptied in a hurry. But pure speed is hard to maintain. It doesn't take much for a smooth stream of downward-flowing evacuees to turn turbulent. For one thing, people from middle floors who enter the stairwell somewhere in mid-current can cause things to slow or stop. The delay flows back along the queue like a ripple through water or cars entering a highway. Sequential evacuation is the best way to handle this problem, with people on lower floors leaving first and each higher floor following successively; but nobody pretends that people fearing for their lives would wait patiently at their desks until their floor is called. The design of the stairways themselves must provide the extra margin of speed. (Guiderails; lighting; fluorescent tape.) But stairways were too narrow, not taking into account that, as of 2001, two-thirds of Americans were overweight or obese.
Social norms about who makes decisions govern day-to-day life, and all these norms are broken in an emergency. What appears in a time of emergency is something complexity researchers call the emergent norm--that is, a whole new set of rules.
(Confused by Social Structure)
(Confused by Payoffs)
(Confused by Scale)
(Confused by Objective)
(Confused by Fear)
(Confused by Silence)
(Confused by Flexibility)
(Confused by False Targets)
(Confused by Loveliness)
Last modified 19 September 2022