Forecasting isn’t something only experts do when predicting rate increases or recessions; we all do it virtually every day of our lives. We want to know whether we need an umbrella or a jacket for a picnic, or whether one contractor will be more reliable than another for a home renovation. We choose a particular way home because we believe it will get us there faster than the route we took yesterday. And even though forecasts are never certain, they shape present choices and behavior. If you predict rain, you pack an umbrella. If a recession is looming, you rein in spending. Either way, we can’t help but make predictions of all sorts.
As central as forecasting is in our personal lives, it matters even more for organizations, especially in a post-pandemic era in which economic and political volatility has become the new normal. If one thing has become clear in the past two years, it’s that forecasting accuracy is now a top priority for every organization, whether public or private, for-profit or nonprofit. No entity can afford to rest on vague assumptions about what may or may not occur.
The value of a forecast is that it provides practical answers, yet while everyone knows they need to forecast, decision makers are often unaware of how easily forecasts go awry. The problems play out in several ways. For instance, forecasters often use ambiguous terms such as “likely” or “rarely” to indicate the probability of an event, but such words are far too vague. When forecasters fail to express their level of confidence or uncertainty in specific terms, people are left guessing and risk being unduly aggressive or cautious. Part of the problem is that the definitiveness of an estimate depends on the extent of the knowledge on which it’s based, and people will be misled if they assume more or less knowledge than is warranted.
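To make the point concrete, here is a small, purely illustrative Python sketch (every payoff and probability figure in it is hypothetical) of how two readers can translate the same word, “likely,” into very different numbers and reach opposite decisions, while a stated probability leaves nothing to guess:

```python
# Illustrative sketch (all figures hypothetical): how the vague word "likely"
# can push two readers of the same forecast toward opposite decisions.

def expected_value(p_success: float, payoff: float, loss: float) -> float:
    """Expected value of acting on a forecast with probability of success p_success."""
    return p_success * payoff + (1 - p_success) * loss

payoff, loss = 100_000, -150_000   # hypothetical upside and downside of a project

# Two team members hear "the launch will likely succeed" and translate
# "likely" into very different numbers.
for reader, p in [("cautious reader", 0.55), ("aggressive reader", 0.85)]:
    ev = expected_value(p, payoff, loss)
    print(f"{reader}: assumes p = {p:.2f}, expected value = {ev:,.0f} -> {'go' if ev > 0 else 'no-go'}")

# A specific forecast such as "a 70% chance of success" removes the guesswork:
print(expected_value(0.70, payoff, loss))   # 25,000 -> both readers reach the same call
```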
But even when confidence (or a lack of it) is stated, team members may still be left guessing whether the forecaster has overstated or understated how much is actually known. Like other biases psychologists have noted, these problems can have motivational and cognitive roots. For instance, bias can lead to overconfidence or to an avoidance of public commitment and accountability, a problem exacerbated by the fact that people are poor judges of their own accuracy. In fact, overconfidence is the most commonly observed tendency, even among experts. Finally, predicted events may be so complex that it’s difficult to know which parts of them to take seriously, and under such conditions it’s unclear whether analysts are predicting anything at all.
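One practical antidote is to score forecasts against outcomes rather than rely on self-assessment. The short Python sketch below uses the Brier score, a standard accuracy measure for probabilistic forecasts; the forecasts and outcomes in it are invented purely for illustration:

```python
# Minimal sketch of scoring forecasts so calibration isn't left to gut feel.
# The probabilities and outcomes below are made up for illustration.

def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    """Mean squared error between stated probabilities and actual outcomes
    (0 = perfect, 0.25 = no better than always saying 50%, 1 = maximally wrong)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Probabilities this forecaster assigned to "event will happen" ...
stated = [0.90, 0.80, 0.85, 0.70, 0.95]
# ... and whether each event actually happened (1) or not (0).
happened = [1, 0, 1, 0, 1]

print(f"Brier score: {brier_score(stated, happened):.3f}")
# High-confidence calls that keep missing push the score up, making
# overconfidence visible instead of debatable.
```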
Unfortunately, the most extensive study of expert prediction ever completed shows that even professional forecasters fall well short of the ideal, often making the same mistakes the rest of us do when predicting the future. However, the same research also highlights tangible ways forecasters can improve, which would be a major advantage for any organization trying to navigate the turbulent times in which we live. What emerged is that the best forecasters tend to be eclectic thinkers who are tolerant of counterarguments and who hedge their bets by not straying too far from base-rate probabilities (how often things of this sort happen in situations of this sort). What makes them good is less what they are than what they do: the hard work of research, the careful thought and self-criticism, the gathering and synthesizing of other perspectives, the granular judgments, and the relentless updating.
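The idea of anchoring on base rates and then updating can be made concrete with a small sketch. The Python snippet below starts from a hypothetical base rate and nudges it with project-specific evidence using Bayes’ rule; every number in it is illustrative, not drawn from the research:

```python
# Minimal sketch of "start from the base rate, then update" (hypothetical numbers).

def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Posterior probability after one piece of evidence, via Bayes' rule."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Outside view: suppose roughly 20% of projects of this type finish on schedule.
p = 0.20

# Inside view: evidence specific to this project, each nudging the estimate.
# (The likelihoods below are illustrative, not measured values.)
evidence = [
    ("experienced team",        0.70, 0.40),  # P(evidence | on time), P(evidence | late)
    ("scope already expanding", 0.30, 0.60),
]

for name, if_true, if_false in evidence:
    p = bayes_update(p, if_true, if_false)
    print(f"after '{name}': P(on schedule) = {p:.2f}")
```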
In addition to identifying the traits that mark the best forecasters, the research revealed 10 specific ways a person can dramatically improve his or her skills:
Rule 1: Triage – focus on areas where your hard work is more likely to pay off.
Rule 2: Break seemingly impossible problems into manageable sub-problems (see the sketch after this list).
Rule 3: Strike the right balance between an inside and outside view.
Rule 4: Strike the right balance between under- and overreacting to evidence.
Rule 5: Look for the clashing causal forces at work in each problem.
Rule 6: Strive to distinguish as many degrees of doubt as the problem permits, but no more.
Rule 7: Strike the right balance between under- and overconfidence, between prudence and decisiveness.
Rule 8: Look for errors behind your mistakes, but beware of rearview-mirror hindsight biases.
Rule 9: Bring out the best in others and let others bring out the best in you.
Rule 10: Master the error-balancing bicycle.
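As a brief illustration of Rule 2, here is an entirely hypothetical Python sketch that breaks one hard question into sub-estimates, each of which can be researched and revised on its own:

```python
# Hypothetical sketch of Rule 2: decompose a hard question into sub-estimates.
# Question: "How many support tickets will we receive next quarter?"
# None of these figures are real; they only show the decomposition pattern.

active_customers        = 1_200      # sub-estimate 1: current customer count
tickets_per_customer_mo = 0.8        # sub-estimate 2: avg tickets per customer per month
months_in_quarter       = 3
seasonal_multiplier     = 1.15       # sub-estimate 3: this quarter tends to run hotter

estimate = active_customers * tickets_per_customer_mo * months_in_quarter * seasonal_multiplier
print(f"Roughly {estimate:,.0f} tickets next quarter")   # ~3,312

# Each sub-estimate can now be challenged, sourced, and updated on its own,
# which is far easier than debating the headline number directly.
```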
For more details on the 10 rules and what makes a bad and good forecaster, download the white paper Why Most Forecasts Are Poor and How To Get Better.
Jeana has been in the software industry for 15+ years specializing in ERP reporting solutions. She has decades of experience in creative content development and marketing and enjoys exercising, traveling & spending time with her husband & twin boys.