I wrote a memo a week for six weeks starting on March 3, but I’ve skipped the last three weeks.  First, the string had to end sometime.  And second, I try to adhere to the principle that if I don’t have anything additive to say, I don’t write.  Hopefully you’ll find this one worth reading.

Our inability to know the future is a theme I’ve touched on repeatedly over the years, but now I’ve decided to devote an entire memo to it.  Being at home for nearly two months means I’ve had a lot of time on my hands, like everyone else.  And it’s a good thing, because getting philosophical musings down on paper is a lot harder than writing about current events and what to do about them. 

And while I’m explaining myself, I’ll apologize up front for the number of citations and their length – but there’s so much wisdom I want to share.

All We Don’t Know

As everyone knows, today we’re experiencing unprecedented (or at least highly exceptional) developments in four areas: the pandemic, the economic contraction, the oil price collapse and the Fed/government response.  Thus a number of considerations make the future particularly unpredictable these days:

  • The field of economics is muddled and imprecise, and there’s good reason it’s called “the dismal science.” Unlike a “real” science like physics, in economics there are no rules that one can count on to consistently produce a given outcome, as in “if a, then b.” There are only patterns that tend to repeat, and while they may be historical, logical and often-observed, they’re still only tendencies.
  • In some recent memos, I’ve mentioned Marc Lipsitch, Professor of Epidemiology at Harvard’s T.H. Chan School of Public Health. In my version of his hierarchy, there are (a) facts, (b) logical inferences from past experience and (c) guesses. Because of the imprecision of economics, there certainly are no facts about the economic future. Economists and investors make inferences from past patterns, but these are unreliable at best, and I think in many cases their judgments fall under the heading of “guesses.”
  • These days, I’m often asked questions like “Will the recovery be V-shaped, or a U, W or L?” and “Which of the crises you’ve lived through does this one most resemble?” Answering questions like those requires a historical perspective.
  • Given the exceptional developments enumerated above, however, there’s little or no history that’s relevant to today. That means we don’t have past patterns to fall back on or to extrapolate from. As I’ve said, if you’ve never experienced something before, you can’t say you know how it’s going to turn out.
  • While unique developments like those of today make forecasting unusually difficult, the presence of all four elements at once probably renders it impossible. In addition to the difficulty of understanding each of the four individually, we can’t be sure how they’ll interact. For example:
    • Will the massive, multi-faceted Fed/Treasury program of loans, grants, stimulus and bond buying be sufficient to offset the unparalleled damage done to the economy by the fight against Covid-19?
    • To what extent will the reopening bring back economic activity, and to what extent will that cause the spread of the disease to resume, and the renewal of lock-downs?

For investors, the future is determined by thousands of factors, such as the internal workings of economies, the participants’ psyches, exogenous events, governmental action, weather and other forms of randomness.  Thus the problem is enormously multi-variate.  Take the current situation with its four major components (Covid-19, the economy, oil and the Fed), and consider just one: the disease.  Now think about all the questions surrounding it:

  • How many people have it, including those who are asymptomatic?
  • How likely is contact with someone who’s infected to create another case?
  • To what degree will distancing and masks deter its spread?
  • Will the cases be severe, mild or asymptomatic? Why?
  • Will the supply of protective gear for medical personnel, hospital beds and ventilators be adequate?
  • Will a treatment be developed? To what extent will it speed recovery and prevent fatalities?
  • What will the fatality rate be relative to age, gender and pre-existing conditions? Will the impact of the disease on young people worsen?
  • Will people who’ve had it and recovered be immune? Will their immunity be permanent?
  • Will the virus mutate, and will immunity cover the new forms?
  • Will it be possible to inject antibodies to prevent infection?
  • How many people have to be immune for herd immunity to effectively stop the further spread?
  • Will social distancing delay the achievement of herd immunity? Is the Swedish approach better?
  • Will a vaccine be invented? When? How long will it take to produce and deliver the needed doses? Where will the U.S. stand in the line to get it?
  • How many people will refuse to be vaccinated? With what effect?
  • Will vaccination have to be renewed annually?
  • Will the virus succumb to warm weather and humidity?
  • Will the virus be with us permanently, and will it be controllable like “just another seasonal disease”?

Where am I going with this?  My point is that very few people can balance all these considerations to figure out our collective risk.  And that’s just Covid-19.  Now think about the many questions that pertain to each of the three other factors.  Who can respond to this many questions, come up with valid answers, consider their interaction, appropriately weight the various considerations on the basis of their importance, and process them for a useful conclusion regarding the virus’s impact?  It would take an exceptional mind to deal with all these factors simultaneously and reach a better conclusion than most other people.  (I believe a computer couldn’t do so either, especially given all the subjective decisions required in the absence of historical precedent.)

The challenge lies in trying to be above average in assessing the future.  Why is that so hard?

First of all, forecasting is a competitive arena.  The argument for the difficulty of out-forecasting others is similar to the argument for market efficiency (and thus the limitations of active management).  Thousands of others are trying, too, and they’re not “empty suits.”  Many of them are educated, intelligent, numerate, hard-working, highly motivated and able to access vast amounts of data and computing power.  So by definition it shouldn’t be easy to be better than the average.

In addition, since economics is imprecise, unscientific and inconsistent in its functioning, as described above, there can’t be a method or process for forecasting that works consistently.  To illustrate randomness, I say that if, when I graduated from business school, I had been offered a huge budget, an army of PhDs and lavish financial incentives to predict the coin toss before each Sunday’s football games, I would have been a flop.  No one can succeed in predicting things that are heavily influenced by randomness and otherwise inconsistent.

Now consider the possibility that reaching conclusions is especially difficult in times of stress like today:

  • [Recent advances in neuroscience] suggest that we are no more than “inference machines” with various degrees of sophistication in how we explain our thoughts.  In other words, we use a lot of pattern-driven guesswork as we go about our daily lives or to fill in the gaps in an incomplete narrative. 
  • This is especially true in times of stress, as many of the mental processes that govern our reactions are associated with an urgent search for patterns to determine our moves.  That is our snap reaction in economic or financial crises and why we cling to our repertoire of charts of V, U or L-shapes of recovery, among many.
  • But, in very dislocated environments, we find serious limitations to this approach.
  • Looking at the current environment, with disruptions to supply, demand, health and liquidity tensions, we could build an ensemble of the Spanish flu, the Fukushima earthquake and components of the 2008 crisis, for example.  But given the very specific contexts of each event, we may run into endless combinations of the lessons learnt from these events. 
  • As a matter of fact, in a side-by-side comparison of many economic forecasts, even similar assumptions drive very different outcomes on how this crisis will play out.  This may be a case of the “Anna Karenina principle” coined by Professor Yossi Sheffi at Massachusetts Institute of Technology.  Paraphrasing Tolstoy, while happy economies are all alike, every unhappy economy is unhappy in its own way. 
  • We can’t assume that the response to public health or financial interventions will be similar across vastly different contexts.  The root cause of this mistake is to look at average responses from past events.  But the reality is not like that.  (Juan-Luis Perez, head of research, Evidence Lab and Analytics, at UBS, the Financial Times, April 22, emphasis added)

So forecasting is difficult for a large number of reasons, including our limited understanding of the processes that will produce the future, their imprecise nature, the lack of historical precedent, the unpredictability of people’s behavior and the role of randomness.  These difficulties are exacerbated by today’s unusual circumstances.

Senior economics correspondent Neil Irwin put it together very well in The New York Times on April 16:

  • The world economy is an infinitely complicated web of interconnections.  We each have a series of direct economic interrelationships we can see: the stores we buy from, the employer that pays our salary, the bank that gives us a home loan.  But once you get two or three levels out, it’s really impossible to know with any confidence how those connections work.
  • And that, in turn, shows what is unnerving about the economic calamity accompanying the spread of the novel coronavirus.
  • In the years ahead we will learn what happens when that web is torn apart, when millions of those links are destroyed all at once.  And it opens the possibility of a global economy completely different from the one that has prevailed in recent decades. 

I couldn’t agree more with what Irwin says.  Or, to use one of my all-time favorite quotes, from John Kenneth Galbraith:

  • We have two classes of forecasters:  Those who don’t know – and those who don’t know they don’t know.

While I’m on the subject of favorite quotes, I’ll take advantage of the occasion to share some others on this subject that I’ve stored up over the years (I think the first one may be the greatest ever):

  • No amount of sophistication is going to allay the fact that all of your knowledge is about the past and all your decisions are about the future.  (Ian E. Wilson, former Chairman of GE)
  • Those who have knowledge don’t predict; those who predict don’t have knowledge.  (Lao Tzu)
  • People can foresee the future only when it coincides with their own wishes, and the most grossly obvious facts can be ignored when they are unwelcome.  (George Orwell)
  • Forecasts create the mirage that the future is knowable.  (Peter Bernstein)
  • I never think of the future – it comes soon enough.  (Albert Einstein)
  • The future you shall know when it has come; before then forget it.  (Aeschylus)
  • Forecasts usually tell us more of the forecaster than of the future.  (Warren Buffett)

I think you get the point.  I seem to be in good company in my belief that the future is unknowable.

Having made that assertion, I’ll admit that it’s an extreme oversimplification and not entirely correct.  There actually are things we know about the macro future.  The trouble is that, mostly, they’re things everyone knows.  Examples include the fact that U.S. GDP grows about 2% per year on average, that heating oil consumption increases in winter, and that a great deal of shopping is moving online.  But since everyone knows these things, they’re unlikely to be much help in the pursuit of above average returns.  As I’ve described before, the things most people expect to happen – consensus forecasts – are by definition incorporated into asset prices at any point in time.  Since the future is usually a lot like the past, most forecasts – and especially macro forecasts – are extrapolations of recent trends and current levels, and they’re built into prices.  Since extrapolation is appropriate most of the time, most people’s forecasts are roughly correct.  But because they’re already reflected in security prices, most extrapolations aren’t a source of above average returns.

The forecasts that produce great profits are the ones that presciently foresee radical deviations from the past.  But that kind of forecast is, first, very hard to make and, second, rarely right.  Thus most forecasts of deviation from trend also aren’t a source of above average returns.

So let me recap: (a) only correct forecasts of a very different future are valuable, (b) it’s hard to make forecasts like that, (c) such unconventional predictions are rarely right, (d) thus it’s hard to be an above average forecaster, and (e) it’s only above average forecasts that lead to above average returns.
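
To make the arithmetic behind that recap concrete, here is a minimal sketch – with purely hypothetical numbers, offered only as an illustration – of why a forecast that merely matches the consensus can’t produce above average returns, while only a correct variant forecast can:

```python
# Toy illustration (hypothetical numbers): once the consensus expectation is
# built into the price, realizing the consensus earns only the required return.

required_return = 0.07        # return investors demand for holding the asset
consensus_cash_flow = 100.0   # cash flow everyone expects in one year

# The market sets today's price so that the consensus outcome delivers
# exactly the required return.
price = consensus_cash_flow / (1 + required_return)

def realized_return(actual_cash_flow: float) -> float:
    """Return earned buying at today's price if the given cash flow arrives."""
    return actual_cash_flow / price - 1

print(f"Consensus comes true: {realized_return(100.0):+.1%}")  # about +7.0% (no edge)
print(f"Positive surprise:    {realized_return(120.0):+.1%}")  # about +28.4%
print(f"Negative surprise:    {realized_return(80.0):+.1%}")   # about -14.4%
```

Only the correct, non-consensus call earns more than the market already offered – which is exactly why such calls are both valuable and rare.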

So there’s a conundrum:

  • Investing is the art of positioning capital so as to profit from future developments.
  • Most professional investors strive for above average returns (i.e., they want to beat the market and earn their fees).
  • However, according to the above logic, macro forecasts shouldn’t be expected to lead to above average returns.
  • Yet very few people are content to invest while practicing agnosticism with regard to the macro future. They may on some level understand the difficulty entailed in forecasting, but their reluctance to admit their ignorance of the future (especially to themselves) usually overcomes that understanding with ease.
  • And so they keep trying to predict future events – and the investment industry produces a large volume of forecasts.

As I’ve expressed in recent memos, I feel the process through which most of us arrive at our view of the future is highly reflective of our biases.  Given the unusually wide chasm between the optimistic and pessimistic cases at this time – and the impossibility of choosing between them based on facts and historical precedents (since there are none) – I continue to think about the role of bias.

One of the biggest mistakes an investor can make is ignoring or denying his or her biases.  If there are influences that make our processes less than objective, we should face up to this fact in order to avoid being held captive by them.

Our biases may be insidious, but they are highly influential.  When I read articles about how difficult it will be to provide adequate testing for Covid-19 or to get support to small businesses, I’m pleased to see my wary views reinforced, and I find it easy to incorporate those things into my thinking.  But when I hear about the benefits of reopening the economy or the possibility of herd immunity, I find it just as easy to come up with counter-arguments that leave my concerns undented.  This is a clear example of “confirmation bias” at work:

  • Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it.  Confirmation bias suggests that we don’t perceive circumstances objectively.  We pick out those bits of data that make us feel good because they confirm our prejudices.  Thus, we may become prisoners of our assumptions.   (Shahram Heshmat, Psychology Today, April 23, 2015)

As Paul Simon wrote 50 years ago for the song The Boxer, “. . . a man hears what he wants to hear and disregards the rest.”

While I didn’t know the name for it, I’ve long been aware of my bias.  In a recent memo, I told the story from 50 years ago, when I was Citibank’s office equipment analyst, of being asked who the best sell-side analyst on Xerox was.  My answer was simple: “The one who agrees with me most is so-and-so.”  Most people are unlikely to think highly of anyone whose views they oppose.  So when we think about which economists we quote, which investors we respect, and where we get our information, it’s likely that their views will parallel ours. 

Of course, taken to an extreme, this has resulted in the unfortunate, polarized state in which we find the U.S. today.  News organizations realized decades ago that people would rather consume stories that confirm their views than those that challenge them (or are dully neutral).  Few people follow media outlets that reflect a diversity of opinion.  Most people stick to one newspaper, cable news channel or political website.  And few of those fairly present both sides of the story.  Thus most people hear a version of the news that is totally unlike the one heard by those on the other side of the debate.  When all the facts and opinions you hear confirm your own beliefs, mental life is very relaxed but not very enriching.

What’s the ideal?  A calm, open mind and an objective process.  Wouldn’t we all be better off if those things were universal?

In Praise of Doubt

Another favorite theme of mine – and I’m mildly apologetic for its repetition in these memos – is how important it is to acknowledge what we don’t know. 

First of all, if we’re going to out-invest the rest, we need a game plan.  There are a lot of possible routes to success on which to base your process: in-depth research into companies, industries and securities; arbitrage; algorithmic investing; factor investing; even indexation.  But if I’m right about the difficulty of macro forecasting, for most people it shouldn’t be the route they choose.

Second, and probably more importantly, excessive trust in forecasting can be dangerous to your financial health.  It’s never been put better than in the quote that’s often attributed to Mark Twain, but also to several others:

  • It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.

Just a few words, but a great deal of wisdom.  No statement that starts with “I don’t know but . . .” or “I could be wrong but . . .” ever got anyone into big trouble.  If we admit to uncertainty, we’ll investigate before we invest, double-check our conclusions and proceed with caution.   We may sub-optimize when times are good, but we’re unlikely to flame out or melt down.  On the other hand, people who are sure may dispense with those things, and if they’re sure and wrong, as the quote suggests, the outcome can be catastrophic.

Investing is challenging in this way, as in so many others.  Active investors have to be confident.  Yale’s David Swensen said it as well as it can be said (that’s why I go back to this quote so often in my memos and books):

  • Establishing and maintaining an unconventional investment profile requires acceptance of uncomfortably idiosyncratic portfolios, which frequently appear downright imprudent in the eyes of conventional wisdom.  (Pioneering Portfolio Management)

To do better than most, you have to depart from the crowd.  As I said in my April 6 memo Calibrating, echoing Swensen, all great investments begin in discomfort, since the things everyone likes and feels good about are unlikely to be on the bargain counter.  But to invest in things that are out of favor – at the risk of standing out from the crowd and appearing to have made a big mistake – takes confidence and resolve.  It also requires confidence to hold onto a position when it declines, and perhaps add to it at lower prices, in the period before one’s wisdom becomes clear and it turns into a winner. And it takes confidence to continue holding a highly appreciated investment you think still has upside potential, at the risk of possibly giving up some of the gains to date.

But when does reason-based confidence turn into hubris and obstinacy?  That’s the key question.  Holding and adding to declining positions is only a good idea if the underlying thesis turns out to be right and things eventually go as expected.  In other words, when do you allow for the possibility that you’re wrong?

From the very beginning of my investing career, I’ve felt a sense of uncertainty.  But I don’t think that’s a bad thing:

  • “Investing scared” – a less glamorous term than “applying appropriate risk aversion” – will push you to do thorough due diligence, employ conservative assumptions, insist on an ample margin of safety in case things go wrong, and invest only when the potential return is at least commensurate with the risk. In fact, I think worry sharpens your focus.  Investing scared will result in making fewer mistakes (although perhaps at the price of failing to take maximum advantage of bull markets).
  • When I started investing in high yield bonds in 1978, and when Bruce Karsh and I first targeted distressed debt in 1988, it seemed clear that the route to long-term success in such uncertain areas lay in limiting losses rather than targeting maximum gains. That approach has permitted us to still be here, while many one-time competitors no longer are.
  • I can tell you that in the Global Financial Crisis, following the bankruptcy of Lehman Brothers, we felt enormous uncertainty. If you didn’t, there was something wrong with you, since there was a meaningful possibility the financial system would collapse. When we started buying, Bruce came to me often, saying, “I think we’re going too slow,” and then the next day, “I think we’re going too fast.” But that didn’t keep him from investing an average of $450 million per week over the last 15 weeks of 2008. I think Bruce’s ability to grapple with his doubts helped him arrive at the right pace of investment.

The topic of dealing with what you don’t know brings me to a phrase I came across a few years ago and think is very important: intellectual humility.

Here’s part of the article that first brought it to my attention:

  • “Intellectual humility” has been something of a wallflower among personality traits, receiving far less scholarly attention than such brash qualities as egotism or hostility.  Yet this little-studied characteristic may influence people’s decision-making abilities in politics, health and other arenas, says new research from Duke University. . . .
  • As defined by the authors, intellectual humility is the opposite of intellectual arrogance or conceit.  In common parlance, it resembles open-mindedness.  Intellectually humble people can have strong beliefs, but recognize their fallibility and are willing to be proven wrong on matters large and small, Leary said.  (Alison Jones, Duke Today, March 17, 2017, emphasis added)

To get a little more technical, here are a couple of useful paragraphs from a discussion of the paper cited above:

  • The term, intellectual humility (IH), has been defined in several ways, but most definitions converge on the notion that IH involves recognizing that one’s beliefs and opinions might be incorrect. . . .  Some definitions of IH include other features or characteristics – such as low defensiveness, appreciating other people’s intellectual strengths, or a prosocial orientation . . .
  • One conceptualization defines intellectual humility as recognizing that a particular personal belief may be fallible, accompanied by an appropriate attentiveness to limitations in the evidentiary basis of that belief and to one’s own limitations in obtaining and evaluating relevant information.  This definition qualifies the core characteristic (recognizing that one’s belief may be wrong) with considerations that distinguish IH from mere lack of confidence in one’s knowledge or understanding.  IH can be distinguished from uncertainty or low self-confidence by the degree to which people hold their beliefs tentatively specifically because they are aware that the evidence on which those beliefs are based could be limited or flawed, that they might lack relevant information, or that they may not have the expertise or ability to understand and evaluate the evidence.  (The Psychology of Intellectual Humility, Mark Leary, Duke University, emphasis added)

“Attentiveness to limitations in the evidentiary basis” (or to the limitations imposed by future uncertainty) is a very important further concept.  Here’s how I discussed it in my book Mastering the Market Cycle:

  • Most people think the way to deal with the future is by formulating an opinion as to what’s going to happen, perhaps via a probability distribution. I think there are actually two requirements, not one. In addition to an opinion regarding what’s going to happen, people should have a view on the likelihood that their opinion will prove correct. Some events can be predicted with substantial confidence (e.g., will a given investment grade bond pay the interest it promises?), some are uncertain (will Amazon still be the leader in online retailing in ten years?), and some are entirely unpredictable (will the stock market go up or down next month?). It’s my point here that not all predictions should be treated as equally likely to be correct, and thus they shouldn’t be relied on equally. I don’t think most people are as aware of this as they should be.

In short, we have to have a realistic view of the probability that we’re right before we choose a course of action and decide how heavily to bet on it.  And anyone who’s sure about what’s going to happen in the world, the economy or the markets is probably deceiving himself. 
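
To illustrate that point – with invented probabilities and payoffs, purely as a sketch – the expected edge from acting on a forecast shrinks toward zero as the chance of being right falls toward a coin flip:

```python
# Hypothetical sketch: weight a forecast by the chance it's correct before
# deciding how heavily to act on it.  All numbers are invented for illustration.

def expected_edge(p_right: float, gain_if_right: float, loss_if_wrong: float) -> float:
    """Expected outcome of acting on a forecast that is right with probability p_right."""
    return p_right * gain_if_right - (1 - p_right) * loss_if_wrong

forecasts = {
    "high-confidence call (bond pays its coupon)":    0.95,
    "uncertain call (industry leader in ten years)":  0.60,
    "unknowable call (market up or down next month)": 0.50,
}

# Same payoff profile for each call: +20% if right, -20% if wrong.
for label, p in forecasts.items():
    print(f"{label}: expected edge {expected_edge(p, 0.20, 0.20):+.1%}")

# Output: +18.0%, +4.0%, +0.0% - with symmetric payoffs, a coin-flip forecast
# carries no edge at all, which is why it deserves little or no weight.
```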

It all comes down to dealing with uncertainty.  To me, that starts with acknowledging uncertainty and having an appropriate degree of respect for it.  As I quoted Annie Duke this past January, in my memo You Bet!:

  • What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place.  They understand that they can almost never know exactly how something will turn out.  They embrace that uncertainty and, instead of focusing on being sure, they try to figure out how unsure they are, making their best guess at the chances that different outcomes will occur.  (Thinking in Bets)

To put it simply, intellectual humility means saying “I’m not sure,” “The other person could be right,” or even “I might be wrong.”  I think it’s an essential trait for investors; I know it is in the people I like to associate with.

As so often happens when I’m thinking about a memo, I recently got an incredibly helpful note from my friend Leslie Lichtenstein at the University of Chicago, connecting the concept of humility to the current episode.  Here’s what she wrote:

  • This morning I read an article from Behavioral Scientist by Eric Angner [professor of practical philosophy at Stockholm University] called “Epistemic Humility – Knowing Your Limits in a Pandemic,” which made me think of you and several of your recent memos. The article opens with a quote from Charles Darwin in 1871 – “Ignorance more frequently begets confidence than does knowledge.” It goes on to say, “Being a true expert involves not only knowing stuff about the world but also knowing the limits of your knowledge and expertise.” (Emphasis in Leslie’s note)

I couldn’t agree more.  People who are always sure are no more helpful than people who are never sure.  The real expert’s confidence is reason-based and proportional to the weight of the evidence.  Leslie’s note sent me to the original of the article she cited, and I found so much to share:

  • In the middle of a pandemic, knowledge is in short supply.  We don’t know how many people are infected, or how many people will be.  We have much to learn about how to treat the people who are sick – and how to help prevent infection in those who aren’t.  There’s reasonable disagreement on the best policies to pursue, whether about health care, economics, or supply distribution.  Although scientists worldwide are working hard and in concert to address these questions, final answers are some ways away.
  • Another thing that’s in short supply is the realization of how little we know. . . . 
  • Frequent expressions of supreme confidence might seem odd in light of our obvious and inevitable ignorance about a new threat.  The thing about overconfidence, though, is that it afflicts most of us much of the time.  That’s according to cognitive psychologists, who’ve studied the phenomenon systematically for half a century.  Overconfidence has been called “the mother of all psychological biases. . . .” 
  • The point is not that true experts should withhold their beliefs or that they should never speak with conviction.  Some beliefs are better supported by the evidence than others, after all, and we should not hesitate to say so.  The point is that true experts express themselves with the proper degree of confidence—meaning with a degree of confidence that’s justified given the evidence. . . . 
  • [Compare what you hear on TV against a tweet from medical statistician Robert Grant]: “I’ve studied this stuff at university, done data analysis for decades, written several NHS guidelines (including one for an infectious disease), and taught it to health professionals.  That’s why you don’t see me making any coronavirus forecasts. . . .”
  • The concept of epistemic humility is . . . an intellectual virtue. It is grounded in the realization that our knowledge is always provisional and incomplete—and that it might require revision in light of new evidence.  Grant appreciates the extent of our ignorance under these difficult conditions; the other characters don’t.  A lack of epistemic humility is a vice – and it can cause massive damage both in our private lives and in public policy.
  • Calibrating your confidence can be tricky. As Justin Kruger and David Dunning have emphasized, our cognitive and metacognitive skills are intertwined.  People who lack the cognitive skills required to perform a task typically also lack the metacognitive skills required to assess their performance.  Incompetent people are at a double disadvantage, since they are not only incompetent but also likely unaware of it [Galbraith’s forecasters “who don’t know they don’t know”!].  This has immediate implications for amateur epidemiologists.  If you don’t have the skill set required to do advanced epidemiological modeling yourself, you should assume that you can’t tell good models from bad.
  • . . . it’s never been more important to learn to separate the wheat from the chaff – the experts who offer well-sourced information from the charlatans who offer little but misdirection.  The latter are sadly common, in part because they are in greater demand on TV and in politics. It can be hard to tell who’s who.  But paying attention to their confidence offers a clue.   People who express themselves with extreme confidence without having access to relevant information and the experience and training required to process it can safely be classified among the charlatans until further notice. . . .
  • Again, it is fine and good to have opinions, and to express them in public – even with great conviction.  The point is that true experts, unlike charlatans, express themselves in a way that mirrors their limitations.  All of us who want to be taken seriously would do well to demonstrate the virtue of epistemic humility.  (Eric Angner, Behavioral Scientist, April 13, emphasis added)

The more I think about it, the clearer the bottom line becomes:

  • The world is an uncertain place.
  • It’s more uncertain today than at any other time in our lifetimes.
  • Few people know what the future holds much better than others.
  • And yet investing deals entirely with the future, meaning investors can’t avoid making decisions about it.
  • Confidence is indispensable in investing, but too much of it can be lethal.
  • The bigger the topic (world, economy, markets, currencies and rates), the less possible it is to achieve superior knowledge.
  • Even our decisions about smaller things (companies, industries and securities) have to be conditioned on assumptions regarding the bigger things, so they, too, are uncertain.
  • The ability to deal intelligently with uncertainty is one of the most important skills.
  • In doing so, we should understand the limitations on our foresight and whether a given forecast is more or less dependable than most.
  • Anyone who fails to do so is probably riding for a fall.

As Neil Irwin wrote in the article cited above:

  • It would be foolish, amid such uncertainty, to make overly confident predictions about how the world economic order will look in five years, or even five months. 

Or maybe Voltaire said it best 250 years ago: “Doubt is not a pleasant condition, but certainty is absurd.”

May 11, 2020
