Nobody trusts a backtest.
Of course, in professions other than trading, historical data matters.
If we were considering taking lessons from a tennis pro who'd given over 2,000 lessons, we would trust that instructor. If we came across a doctor who'd performed 2,000 surgeries, we'd feel confident going under the knife.
But if we came across a researcher who'd tested trading systems for over 2,000 hours, we'd be skeptics.
Is it because the internet is full of liars? Is it because we've been told that backtesting never works in real life? Is it because we're all very nervous when it comes to money?
Whatever the reason, the fact remains.
Nobody trusts a backtest.
And that leads us to a big problem. Of all the things that go into trading a system successfully, trust is most important. If we don't trust our system, we lose. There's no way around that.
So how do we learn to trust?
If we know that our system wins 60% of the time, we don't worry when we lose a few trades. If we know we take 5 big losses per year, we don't flinch when the big loss comes around. If we know our max drawdown is $5,000, we don't worry when our account goes down a couple thousand.
Without evidence, we throw our system out after a loss. With evidence, we confidently soldier on.
Yet even when we know that evidence is the key, we still remain pessimistic.
Sure, that backtest looks good. But it won't work in real life.
We say we're looking for the truth, but when it comes around we say, "Go away. I'm looking for the truth."
What's the solution? Real-life results.
What if the backtesting turned out to be nearly identical to what happened in real life? That could be a game-changer.
For example, if someone tested a system for over 2,000 hours, but the real-time results didn't match the testing, then our confidence would be zero.
If, however, someone tested for over 2,000 hours and real-life matched up, then confidence would be high.
Do we have any examples of this?
Let's look at the Hornet. I've traded live with the Hornet longer than any other robot, so it has the most evidence. That should make a good case study.
First, let's examine the time I made 100% in a year. Back in 2014, I decided I'd tested enough and I wanted to see what a robot could do. So, in May of that year, I cranked up the leverage as high as possible (Forex regulations were a little less stringent back then) and tried to get 50% in a year. Obviously, I didn't get 50%. I did much better.
From May 2014 to May 2015, my $2,000 Tradestation account turned into $4,000 (I have screenshots from my account statements if anyone is interested). That was real life.
How does that compare with the backtesting?
Going back to the Tradestation data, we can see that it's very similar to my actual results. Keep in mind that this uses the compounding technique I've used in many posts.
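The exact compounding rules aren't spelled out in this post, so here's a rough illustrative sketch (not necessarily the author's exact method): a fixed-fractional scheme where trade size is recalculated from current equity after every trade, so wins and losses compound instead of staying fixed-size.

```python
# Illustrative sketch only: assumes a simple fixed-fractional compounding
# scheme, since the post doesn't define the technique here.

def compound_equity(start_equity, trade_returns, risk_fraction=0.5):
    """Apply each trade's return to the fraction of equity at risk."""
    equity = start_equity
    for r in trade_returns:
        # Position is sized from *current* equity, so results compound.
        equity += equity * risk_fraction * r
    return equity

# Hypothetical example: a $2,000 account and a made-up string of trade returns.
final = compound_equity(2000.0, [0.10, -0.05, 0.08, 0.12, -0.03])
print(round(final, 2))
```

The point of the sketch is just that sizing off current equity is what lets a small account double in a year when the edge holds up.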
Two things jump out. One, the account dropped like a rock right off the bat. By the end of month number two, my account was down about 20%. How many of us would quit at that point?
Two, the end result is almost the same as my real account. The testing showed I should have been up 91% after twelve months and I was up 100% at that time.
The thousands of hours I'd spent testing had paid off. The expectations set by the data had been realized in live trading.
So let's fast forward and look at 2017 results.
Using a tiny Hornet portfolio with trade sizes set pretty aggressively, the Tradestation data says we should be up 14.6% on the year. It's been a fairly chaotic year and many of the best traders in the world are negative so far in 2017. Still, the data says we should be up about 14%.
According to this myfxbook account (trading the exact same portfolio), we're up 20.6% (as of Monday, August 28, 2017). Once again, real-life results are better than the testing.
Testing shows a 14% gain. Real-time shows a 20% gain.
Why is that? The main reason is that I build conservative estimates of trading costs into all of my research. I want my backtests to look worse than live trading actually is.
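As a minimal sketch of that idea (the specific cost figures below are assumptions, not the author's actual model), every simulated trade can be padded with a pessimistic spread-plus-slippage estimate so the backtest understates what live trading should achieve:

```python
# Illustrative sketch: pad each simulated trade with deliberately
# pessimistic costs so backtest results err on the low side.

ASSUMED_SPREAD_PIPS = 2.0    # assumed, worse than typical live spreads
ASSUMED_SLIPPAGE_PIPS = 1.0  # assumed extra buffer per round trip

def net_pips(gross_pips):
    """Gross simulated trade result minus the padded round-trip costs."""
    return gross_pips - (ASSUMED_SPREAD_PIPS + ASSUMED_SLIPPAGE_PIPS)

# A raw backtest winner of +10 pips is booked as only +7 pips.
print(net_pips(10.0))
```

With costs padded this way, live results tend to come in at or above the tested numbers, which is exactly the 14% tested versus 20% live pattern described above.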
But how about MT4 results? What does that look like?
I ran a backtest on MT4 for all of the members in the portfolio, and MT4 shows that we should be up 18.97% for the year. Again, real-life results are better than that.
One of the big skeptical arguments against trusting backtesting is that real-life spreads can vary wildly. In fact, we've had some idiotic spread-widening lately around 5 p.m. EST. This has turned some would-be winners into actual losers.
However, even with the problem with spread-widening (and news events and holidays), real-life still beats the testing on both MT4 and Tradestation (my preferred testing platform).
Make no mistake, trust is the secret to successful trading. Our job is to figure out who and what we're going to believe in.
And then let it rip.
No Thursday webinar this week, but next week we'll look at how the Heron's results match up.
Get the Heron Course here.
Note: The summer session of Thursday webinars starts this week. Register here: https://attendee.gotowebinar.com/register/586982361387971587
My new website is here: https://www.scottwelsh.me/
My new eBook, The Inevitability of Becoming Rich: An Interview with a Master, is available on Amazon, and you can get it here.
For information on all my trading courses, go here: https://www.scottwelsh.me/courses/
The recordings of the Thursday webinars go on my YouTube channel. You can Subscribe by going here: https://www.youtube.com/channel/UCxAWDDaTLVy_diMVJCkGl3A
If you'd like a copy of my free eBook, go to https://www.scottwelsh.me/free-ebook/ and just fill out the form.
Follow me on Twitter @swelsh66.