

Are You Smarter Than a Forecaster - Chapter 2


Excerpt of Chapter 2 from the book Cultural Cycles (Oct 9, 2017) by Eric Wilson, with contributions by Justen Collins, available at http://www.culturalcycles.com

Even though what I do – and forecasting in general – may seem to many unreliable at best and implausible at worst, it is also intuitive to most. Look at the chart below (figure 1) and guess where the next point will fall.

(figure 1)

Overwhelmingly, people predict the next point will fall below the center line and answer “C.”


Congratulations! You are now a forecaster. Barely 700 words from where this book began, you have arrived at the same place it took me three decades – studying statistics, obtaining certifications, continuing my education, and building a career practicing analytics and forecasting – to reach. What this shows is that predicting the future is not crazy but rooted in science. As much as it is math and algorithms, it is equally the art of prediction and the logic we apply.

The core of predictive analytics relies on capturing relationships between explanatory variables and predicted variables from past occurrences, then exploiting those relationships to predict unknown periods or outcomes. Often the unknown event of interest is in the future, but analytics can be applied to any type of unknown – whether it is in the past, present, or future – for example, identifying suspects after a crime has been committed or detecting credit card fraud as it occurs.
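
To make that workflow concrete, here is a minimal sketch in Python: learn a relationship from past observations, then apply it to an unknown period. The data, variable names, and the straight-line model are illustrative assumptions, not anything from the book.

```python
# Minimal sketch: capture a relationship from past occurrences, then
# exploit it to predict an unknown period. All numbers are invented.
import numpy as np

# Past occurrences: explanatory variable (period) and observed outcome.
periods = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
observed = np.array([10.2, 11.1, 11.9, 13.2, 13.8, 15.1, 15.9, 17.0])

# Capture the relationship with a simple straight-line fit.
slope, intercept = np.polyfit(periods, observed, deg=1)

# Exploit it to predict the next, unknown period.
next_period = 9.0
prediction = slope * next_period + intercept
print(f"Predicted value for period {next_period:.0f}: {prediction:.1f}")
```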

With the right data and analytics, a large big-box retailer can even predict when someone is pregnant and when she may be due. Such was the case for a Minneapolis father who found out – from Target of all places – that his fifteen-year-old daughter was pregnant. After shopping at Target, the girl began receiving mail at her father’s house advertising baby items like diapers, clothing, cribs, and other baby-specific products. Her father was incensed at the company’s apparent attempts to “encourage” pregnancy in teens and complained to management. It turns out Target was collecting point-of-sale data, clustering it, and comparing it to demographics. By looking at past purchases and building descriptive models of the patterns, Target could decide which coupons to send to which customers. A few days after the irate father called Target, an embarrassed dad phoned the manager back to apologize. It appeared his daughter actually was pregnant. [1]
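
For readers curious what “clustering point-of-sale data” can look like in practice, below is a rough sketch using k-means from scikit-learn. It is not Target’s actual model; the customers, product categories, and spend figures are hypothetical.

```python
# Rough sketch of clustering point-of-sale data (illustrative only).
import numpy as np
from sklearn.cluster import KMeans

# Each row is a customer; each column is spend in a product category
# (say, unscented lotion, supplements, cotton balls).
purchases = np.array([
    [0.0, 1.0, 0.5],
    [8.5, 6.0, 4.0],
    [0.2, 0.0, 1.0],
    [9.0, 7.5, 3.5],
    [0.1, 0.5, 0.0],
])

# Group customers with similar purchase patterns into two clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(purchases)
print(labels)  # customers in the same cluster get similar coupon mailings
```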

So, why are we discussing Target or how to predict sales of widgets for company XYZ? I am hoping to convey that society already uses predictive analytics – as well as intuition – to help anticipate events and occurrences. Many companies are using data and observations to spot patterns and drive behaviors. We are not suddenly waking up in a world of pattern dependencies; they have always been around us, waiting to be discovered and tapped. It is not a leap of faith to look for these same patterns in life and society to better understand where we have come from and – equally important – what may be next.

Using sound principles, my experience, your logic, and a little history, we can look at cultures and the world we live in and possibly gain insights into what is to come.

Keeping this in mind – what if I told you that major and measurable events in world – and American – democratic civilizations and cultures have been occurring on a roughly eighty-two-year cycle for centuries? What if the concentration of wealth or the level of business output and activity beat at a recurring rhythm that no one ever taught you in school? What if we see repeating gilded ages and times of religiosity come and go with seeming predictability? What if things like wars and their death and casualty rates are not that random?

Now, look at the same graph you previously used to forecast the next point with some additional information added (figure 2):


(figure 2)

Where do you imagine the next point will be? Where do you feel like we are heading? The three highlighted times and events in the bottom half of (figure 2) were not chosen at random. It was during the three bottom points that Americans experienced eras that saw over twenty-five percent reductions in business activity and double-digit inflation or unemployment. Sadly, these were also the only eras in U.S. history where conflicts resulted in one percent or more of the population being tallied among the dead, wounded, or missing in combat-related action.

April 19, 1775 – the shot heard ‘round the world and the beginning of the Revolutionary War, which claimed close to 20,000 lives and produced many more casualties, representing over one percent of the colonies’ population. Almost eighty-six years to the day later – on April 12, 1861 – Confederate forces under General P.G.T. Beauregard fired on Union soldiers at Fort Sumter. So began the Civil War, the deadliest war on U.S. soil, with 650,000 lives claimed or wounded – over three percent of the population. Eighty years later – on December 7, 1941 – America was attacked and entered World War II. This was a day that will live in infamy… until eighty to eighty-two years later when…


(figure 3) Sources: data taken from the “data-Fig4B” tab of the September 2013 update of the spreadsheet appendix to Piketty and Saez (2003); Emmanuel Saez and Gabriel Zucman, “Wealth Inequality in the United States Since 1913: Evidence from Capitalized Income Tax Data,” NBER Working Paper No. 20625, Cambridge, Mass.: National Bureau of Economic Research, 2014. Wartime casualties and deaths from Wikipedia: https://en.wikipedia.org/wiki/United_States_military_casualties_of_war
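
As a quick sanity check on the spacing of the dates cited above, the intervals can be computed with Python’s standard library. The dates come from the text; the calculation itself is only illustrative.

```python
# Intervals between the three war dates cited above.
from datetime import date

lexington = date(1775, 4, 19)    # the shot heard 'round the world
fort_sumter = date(1861, 4, 12)  # start of the Civil War
pearl_harbor = date(1941, 12, 7) # U.S. entry into World War II

for start, end in [(lexington, fort_sumter), (fort_sumter, pearl_harbor)]:
    years = (end - start).days / 365.25
    print(f"{start} -> {end}: about {years:.1f} years")
```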

It is not by chance or accident that America has experienced great cataclysms or “crises” about every eighty-two years. Look back before the shot heard ‘round the world and the founding fathers signing the Declaration of Independence: eighty-seven years separate the Anglo-American “Glorious Revolution” of 1689 from Independence Day. Go back a somewhat longer period, and you reach the English naval victory over the Spanish Armada – a turning point in England’s history. Another century or so before that takes you to the end of the Wars of the Roses, a bloody civil war whose passing enabled Tudor England to emerge as a modern nation-state.

Can we apply these same principles and logic to the progression (or prediction) of behavior in education trends, business cycles, political swings, religion, or even people, cultures, and societies? Of course we can! We see it back in figure 3, with the income and wealth gap growing to a peak and then declining approximately every eighty-two years. We saw the last spike just prior to the stock market crash of 1929. Just shy of eighty years before that, around the panic of 1857, the share of income going to the top 10% of the population peaked at 35% and was followed by a steep drop. Go just over eighty years earlier still – and prior to the Revolution – and we see another pinnacle of the wealth gap during the credit crisis of 1772. Forecasting future events is not voodoo; it is simply applying sound principles, reason, statistics, and modeling to generate a probable conclusion.
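
The same back-of-the-envelope arithmetic applies to the wealth-gap peaks named above. The peak years come from the chapter; the gap calculation below is a sketch of the reasoning, not the author’s model.

```python
# Spacing between the wealth-gap peak years cited in the chapter.
peak_years = [1772, 1857, 1929]

gaps = [later - earlier for earlier, later in zip(peak_years, peak_years[1:])]
average_gap = sum(gaps) / len(gaps)

print("Gaps between cited peaks:", gaps)          # [85, 72]
print(f"Average spacing: about {average_gap:.1f} years")
```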

So, what does this mean for what comes next? In this book, we must start peeling away at this onion and look deeper into the data and patterns to find the causal variables driving these cycles and rhythms. We must use principles, logic, and intuition to better understand what has happened and what will happen next.



[1] Kashmir Hill, “How Target Figured Out a Teen Girl Was Pregnant Before Her Father Did,” Forbes, February 16, 2012.