LOUIS MENDELSOHN’S SPEECH AT THE HARVARD BUSINESS SCHOOL ALUMNI DINNER

Louis Mendelsohn is the guest speaker at the Harvard Business School Alumni Club Dinner

October 8, 1997

First of all, I just wanted to tell you a little about my company and my own role in the financial industry. I founded Market Technologies in 1979. I was a hospital administrator in Tampa at the time and had been an active commodity futures trader for many years before that. My professional situation evolved to the point where commodity trading was more interesting to me than being a hospital administrator.

As I look back at it now, I am very happy that I made that decision in my life. At that point in time, microcomputers were just coming on the scene. I became very interested in their application to financial market analysis and began developing trading software for the financial futures markets, first for my own trading purposes and then for commercial purposes, licensing the software to other traders. Of course, at that time there was very little software available to futures traders, and I felt there was a ripe opportunity for someone to come into the industry and make a real impact.

Basically, the company has continued along those lines and is essentially an R&D company which licenses proprietary software to traders throughout the world. Market Technologies has clients in around 35 countries. As I indicated, my background was as a commodities trader, and I evolved into software development through that effort.

The evolution of software for technical analysis in the financial futures markets is a very interesting one. In the late 70’s and early 80’s, computers were really being used as nothing more than glorified calculators. They were number crunchers. There was very little in the way of decision-making software available. There were primitive rule-based programs being marketed at the time, known as black box systems: someone would develop a program and have other parties use it without knowing the underlying rules, or any of the parameters or variables that went into generating the buy and sell signals. It was really a very primitive arena at that point in time.

In 1983, after several years of development, I introduced into the industry for the first time, commercially, a capability in software known as system testing and optimization. This was something that had been done in the 70’s on mainframe computers by large institutional firms, but it had never before been available at the microcomputer level. Basically, that capability was kind of like a time machine.

Now, you are probably familiar with software that performs system testing. It essentially means taking past historical data, developing some type of analytic trading system, and then testing that system on the past data to see if it works. If it doesn’t work on past data, there’s not much likelihood that it’s going to work in the future. The optimization capability then involved being able to tweak the system, to change parameter values, to custom-tailor the trading approach to specific markets or to specific market conditions. With the advent of microcomputers, that became feasible for individual traders and smaller money managers, whereas previously, because of the cost involved, that level of capability was beyond their reach.
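To make the idea concrete, here is a minimal sketch in Python of what system testing amounts to; the function names and the price data are purely illustrative, not the interface of any product from that era.

```python
# A minimal sketch of system testing: apply a trading rule to past prices
# and measure how it would have fared. All names and data are illustrative.

def moving_average(prices, length):
    """Simple moving average ending at each day (None until enough history)."""
    return [None if i + 1 < length
            else sum(prices[i + 1 - length:i + 1]) / length
            for i in range(len(prices))]

def backtest_crossover(prices, short_len, long_len):
    """Hold long whenever the short average is above the long average;
    return the total points the rule would have captured."""
    short_ma = moving_average(prices, short_len)
    long_ma = moving_average(prices, long_len)
    pnl = 0.0
    for i in range(1, len(prices)):
        if (short_ma[i - 1] is not None and long_ma[i - 1] is not None
                and short_ma[i - 1] > long_ma[i - 1]):
            pnl += prices[i] - prices[i - 1]  # yesterday's signal earns today's move
    return pnl

# Hypothetical daily closes; a real test would use years of market data.
closes = [100, 101, 103, 102, 104, 107, 106, 108, 110, 109, 111, 114]
print(backtest_crossover(closes, short_len=3, long_len=5))
```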

System testing, while at first not too well received by the industry, within 2 to 3 years really became the backbone of commercially available trading software. Today, it has been incorporated into virtually every trading software program in the industry. You may be familiar with some of the more popular mass-marketed programs like MetaStock from Equis International and TradeStation from Omega Research, which, incidentally, just went public on the NASDAQ last week. They all base their programs on this concept that I had introduced years earlier, which allows traders to do their own testing and optimization.

Of course, the underlying assumption is that history repeats itself: that somehow by looking at past data, by doing some work on that past information, by modeling a market in that respect, you’re going to be able to make money in the future. Needless to say, that’s an assumption that hasn’t fully proven itself in the real world. Nevertheless, that’s all that analysts and traders have to go by. Unfortunately, there’s really not much new in the mass-marketed trading software area. Most of the technical indicators in software today are rehashes of indicators that have existed for decades, since at least the 70’s and early 80’s.

Examples are things like moving averages. While they are very good at identifying trends, they by their very nature tend to lag the market. Of course, there’s been a great effort over the years by technical analysts in the futures markets to tweak the moving averages, to try to reduce the lag in their response to the market. They’ve done that with variations like weighted moving averages and exponential moving averages.
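For illustration, here is a short Python sketch of those variants, assuming simple lists of closing prices; the weighted and exponential forms lean harder on recent prices, which is what trims the lag.

```python
# Three moving-average variants. The weighted and exponential forms give
# recent closes more influence, reducing the lag of the simple average.

def sma(prices, n):
    """Simple moving average: equal weight to each of the last n closes."""
    return sum(prices[-n:]) / n

def wma(prices, n):
    """Weighted moving average: linearly rising weights, newest close heaviest."""
    window = prices[-n:]
    weights = range(1, n + 1)
    return sum(w * p for w, p in zip(weights, window)) / sum(weights)

def ema(prices, n):
    """Exponential moving average: each new close blended into a running value."""
    alpha = 2 / (n + 1)
    value = prices[0]
    for p in prices[1:]:
        value = alpha * p + (1 - alpha) * value
    return value

closes = [100, 101, 103, 102, 104, 107, 106, 108, 110, 109]
print(sma(closes, 5), wma(closes, 5), ema(closes, 5))
```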

There’s been a tremendous effort by technical analysts trying to tweak these various technical indicators that have been used for decades. Even displaced moving averages, which I find kind of interesting: basically, you take something like a 5-day moving average, compute its value as of tonight’s close, and then just displace it out, maybe 2 days or 4 days, into the future, making the assumption that the value 4 days from today is always going to be what today’s value is.

It is an extremely primitive forecast. But at least I saw in it an effort toward forecasting rather than always looking at trend following. We were at least beginning to look at some form of trend anticipation, or price anticipation, looking forward rather than just backward. I felt, of course, that there had to be better solutions to the problem than just using things like displaced moving averages.
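A sketch of that displaced moving average shows just how flat a forecast it is; the numbers here are hypothetical.

```python
# A displaced moving average: compute today's 5-day average and simply
# assert that same value for several days into the future -- a flat forecast.

def displaced_ma(prices, length=5, displacement=4):
    """Today's moving average, carried forward `displacement` days unchanged."""
    today_value = sum(prices[-length:]) / length
    return today_value  # the "forecast" is just today's value shifted ahead

closes = [100, 101, 103, 102, 104, 107, 106, 108, 110, 109]
print(displaced_ma(closes))  # claimed as the 5-day MA four days from now
```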

And, of course, there are other limitations in technical analysis software today, such as the whole problem of curve fitting with system testing, which you may or may not be familiar with. Basically, it relates to the fact that you can take a trading system, whatever that system may be, and it could be as simple as a 5-day moving average crossing a 10-day moving average, where you’re long when the short average is above the long and vice versa, and tweak the lengths of the moving averages to optimize them to a specific market.

Of course, the more parameters you put into the software and the more optimizing you do, the more you can end up simply curve fitting the model to the past data, and then the ability of that model to function without decaying in the future, as you apply it in real time, is very suspect. More important than that, I found that looking at just one market by itself, in isolation, is perhaps the most severe narrow focus and limitation that has continued to exist in trading software in the financial futures arena.
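To show the trap in miniature, here is a sketch that builds on the backtest function above: it sweeps the crossover lengths, keeps the pair that scored best on past data, and then checks that same pair on data it never saw. Everything here is synthetic and illustrative.

```python
# Curve fitting in miniature: optimize the crossover lengths on past data,
# then test the winning pair out of sample. Reuses backtest_crossover from
# the earlier sketch; all data here is synthetic and illustrative.

def optimize(prices, short_lengths, long_lengths):
    """Return (score, short, long) for the best in-sample combination."""
    best = None
    for s in short_lengths:
        for l in long_lengths:
            if s < l:
                score = backtest_crossover(prices, s, l)
                if best is None or score > best[0]:
                    best = (score, s, l)
    return best

# Hypothetical daily closes, split into a fitting period and a test period.
closes = [100 + i + (3 if i % 7 < 3 else -2) for i in range(120)]
in_sample, out_of_sample = closes[:80], closes[80:]

score, s, l = optimize(in_sample, range(2, 10), range(5, 30))
print("best in-sample result:", score, "with lengths", (s, l))
print("same rule out of sample:", backtest_crossover(out_of_sample, s, l))
```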

People are still just looking at one market by itself. They look at the Bond market. They run all kinds of tests, studies, and analyses on Treasury Bonds. They go back 5 years, 10 years, but all they’re looking at is Treasury Bonds. They have their blinders on, and that’s all they see. The markets have changed over the years. Due to telecommunications capabilities and computers and so forth, the financial markets have become much more interdependent, not just here domestically but throughout the world, globally. And yet analysts still continue to look in a very narrow context.

In the mid-80’s, I began to take the posture that intermarket analysis was something that needed to be implemented in software. Around 1986, I introduced a commercial program, which in my opinion now was of course very primitive, that implemented a form of intermarket analysis: it attempted to look at various markets and their relationships to one another and to discern the impact that these related markets would have on a target market. I did it, as I say, in a very primitive sense, in that I was only looking at directional relationships. I really did not have a mathematical means at the time to identify, in a quantitative way, the effects that these various markets, all simultaneously, might have on a specific market.

So I was kind of frustrated with what I had implemented and I was looking around in the mathematical arena to see if there were some capabilities or tools that I might be able to implement to further advance what I was trying to do with intermarket analysis.

By that time, of course, the crash of 1987 had illustrated to most futures traders, at least, that the markets were very much interrelated, and that you cannot continue to just look at the stock market by itself, or the Bond market by itself, or the domestic markets in isolation from what’s going on globally. More and more people began to pay attention to intermarket analysis. A friend and colleague of mine, John Murphy at CNBC in New York, was himself beginning to do research in that area around that time. He wrote a very interesting book on intermarket analysis in 1990.

While most people acknowledge the intermarket dynamics that go on in the financial markets, even today I believe there’s very little effort being made on a substantive level, and surely not on a quantitative level, to actually discern these intermarket relationships. Most traders in the futures arena, at best, might subjectively look at charts; they may look at the CRB Index when they’re trading Bonds, or peek over and take a look at the S&P Index. But it’s still being done on a very subjective and intuitive level, mostly just visually looking at other charts, nothing really serious in the way of intermarket analysis. And most traders are still focusing predominantly on just one market. That, I feel, puts them at a very severe disadvantage.

That brought me to neural networks. They came into vogue just around the time I discovered them myself, although they had been around for many years in other fields. They became very popular in the late 1980’s and early 90’s in the futures arena. There were a number of neural network software developers from outside the financial industry who began to promote neural networks to the trading community.

But of course they had little background themselves in the financial area, and it was a case of taking a tool and not really knowing how to apply it. The results were somewhat frustrating, in that a lot of expectations were raised for what neural networks would do, and those expectations were not realized. At one point it was even thought that maybe this was the holy grail that had finally been found. Maybe the neural network is the answer. There’s artificial intelligence. The computer is now the thinker, the brains behind everything. You just push a couple of buttons, and you just go to the bank. That is just not the way it works.

There was, as I said, a lot of hype. And, of course, there was a lot of resistance within the financial community, and I can only speak from the futures side of things because that’s where I’m involved. The resistance came from the status quo, the parties that have a vested interest in what I consider older technologies. Neural networks can be somewhat intimidating; there’s a lot of math involved. So this was again similar to what had happened with system testing back in the early 80’s. There was kind of a backlash within the industry, trying to say that neural networks were not the answer and they should just go away: let’s get back to what we’ve always been doing, chart formations and support and resistance lines and things of that sort.

I’ve always kind of joked about it: if you draw a support or resistance line with your #2 pencil on a chart, then depending on when you last sharpened your pencil, that can determine where your stops are in the futures markets. That can be very costly if you didn’t sharpen your pencil just right. There’s still a lot to be done in the industry, but neural networks certainly, I believe, have demonstrated over the last 4 or 5 years that they have a role to play. They’re not the holy grail. They’re not the answer. But they certainly have a very significant role to play. You don’t throw the baby out with the bath water.

There’s a lot of trial and error involved in developing an effective neural network-based trading program. The more you know about the underlying mathematics of neural networks, and of course the more you know about the financial markets, the more successful the outcome will be. This was one of the reasons why, in 1990, I formed a subsidiary of my company known as the Predictive Technologies Group, where I was able to bring together people with skills in areas where I myself did not feel up to speed.

I have people working for me who have Ph.D.’s in math, and I certainly don’t feel comfortable reading books that don’t have any English in them; these guys read them as bedtime reading. So I formed a group and began to put a research team together so that we could pursue the application of neural networks to intermarket analysis, and that’s what we’ve been doing since 1990.

There are lots of things that go into effectively developing a neural network system, a lot of decisions that have to be made by the researchers: the network architecture, the various learning laws involved, the selection of the input data, the determination of what output you’re actually trying to predict, the error measures you use, and the protocols for out-of-sample testing you want to employ in order to determine which final network you actually want to use in the markets. All of these things obviously affect the end result, in terms of the effectiveness of the neural network system. Therefore, the more expertise you can bring to bear on the task, the more likely you are to be successful.
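For the curious, here is a bare-bones Python sketch that puts those decisions in one place: a single hidden layer as the architecture, a learning rate standing in for the learning law, squared error as the error measure, and a held-out segment for out-of-sample testing. It is an illustration only, assuming synthetic data, and not a description of any network we actually use.

```python
# A toy feedforward network trained by gradient descent. Every choice here
# (architecture, learning rate, error measure, data split) is illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs (e.g. 10 intermarket series) and the output to predict.
X = rng.normal(size=(200, 10))
y = (X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=200)).reshape(-1, 1)

# Out-of-sample protocol: hold back the most recent data for testing.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

hidden = 8                                  # architecture decision
W1 = rng.normal(scale=0.1, size=(10, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, 1))
b2 = np.zeros(1)
lr = 0.01                                   # learning-rate decision

for epoch in range(500):                    # training: iterative optimization
    h = np.tanh(X_train @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y_train                    # error measure: squared error
    dW2 = h.T @ err / len(err)              # backward pass: gradients
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X_train.T @ dh / len(err)
    db1 = dh.mean(axis=0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= lr * grad

test_pred = np.tanh(X_test @ W1 + b1) @ W2 + b2
print("out-of-sample MSE:", float(((test_pred - y_test) ** 2).mean()))
```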

Neural networks are very nice in that they are not limited with respect to the kind of input data that can be used. You can use strictly technical data, fundamental data, intermarket data, data on weather, literally any type of data: day of week, time of day, week of month, month of year, whatever it is, any kind of effect you’re examining to see if there’s some potential pattern or influence that that particular piece of data might have on the output you’re trying to predict. Basically, neural network training is really an iterative optimization procedure, through which the networks learn to find patterns and relationships in data that would otherwise be perceived as very disparate.
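As a tiny illustration, with every name and number hypothetical, a single day’s inputs might be assembled like this:

```python
# Assembling disparate inputs -- technical, intermarket, calendar -- into a
# single feature row for one day. All names and values are hypothetical.
from datetime import date

def feature_row(day, target_closes, related_closes):
    """One day's network inputs for a target market."""
    return [
        target_closes[-1] - target_closes[-2],    # technical: target's price change
        related_closes[-1] - related_closes[-2],  # intermarket: related market's change
        day.weekday(),                            # day-of-week effect
        day.month,                                # month-of-year effect
    ]

print(feature_row(date(1997, 10, 8), [112.5, 113.1], [241.0, 240.2]))
```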

A human being would not be able to just look at that data and find those underlying patterns and influences within it. I’m not going to go into too much detail on neural networks; I don’t want to put you to sleep. My firm has a web site on the internet at www.profittaker.com. There is an extensive amount of information there on neural networks and their application to the financial markets that I’ve written over the years, in book chapters and in various articles. So there’s a lot of material there if you want to pursue the subject, and there are certainly additional references you can look at as well.

The question is: what’s next? Where do we go from here? Obviously, neural networks, I believe, will continue to play a role in the financial industry. There are other information technologies as well, genetic algorithms, chaos theory, a whole host of technologies that I think will continue to be explored over the next several years and that hopefully will prove to be fruitful. Clearly, the internet is going to play a role in terms of providing a centralized database that analysts can draw upon and use to do various types of tests and analyses on financial data.

Unquestionably, the research is going to continue, as traders and analysts strive to model the markets more accurately and more quickly, because serious money can be made if you can predict the direction of the markets, or of prices in the markets, even just a day ahead, or 2, 3, or 4 days ahead.

I have given a lot of thought to the question of how accurate these predictions can become. Can we have 100% accuracy at some point down the road, 50 years from today, with the microcomputers or mainframe computers that may exist at that point in time? I don’t think so. There’s randomness in the markets that can’t be predicted, and there are of course unforeseen events. When Alan Greenspan decided to wake up this morning and make a little speech before a Congressional committee, the market immediately responded. Those types of things certainly cannot be foreseen in advance and certainly could not be modeled by any type of modeling program, whether it’s neural network-based or not.

In my opinion, probably 80% or maybe 85% accuracy is the maximum theoretical level that could ever be achieved. But I don’t know, because we’re not there yet. My company is involved in making forecasts of moving averages, and we can do that with about 76% accuracy with up to 4 days’ advance notice. The problem, of course, is that most traders are, as I said, still using the mass-marketed software, which is very limited with respect to the indicators it uses. Those indicators are linear-based, whereas neural networks are non-linear in nature.

There’s been a lot of substitution of glitz and glamour for the underlying substance of technical analysis. Fancy colored charts and things of that sort are thought by many to be an indication of sophistication in software, and that’s certainly not the case. That is probably why the financial fatality rate in the futures markets is as high as it is. It’s quoted that 90-95% of all futures traders lose their money. It’s shocking.

If 95% of all airline passengers were killed in airplane crashes, it would be very hard to get me on a plane, I can tell you that right now. The fatality statistics are really very shocking. Yet because of the leverage involved, and because of the big money that can potentially be made, this industry just continues to attract increasing attention throughout the financial community. I do believe that those traders and analysts who make the effort and do the hard work necessary to apply more robust quantitative approaches to the markets will be richly rewarded.

I just want to take a few moments to briefly tell you a little bit about my company’s software. VantagePoint, the software program that Market Technologies licenses, is not a neural network training platform. It doesn’t allow someone to design and develop their own neural network program; there are other software programs on the market that do that. Instead, VantagePoint is a turn-key program. It has been completely designed and trained by my research staff. Therefore, it requires no expertise on the part of the user with respect to neural networks or intermarket analysis. But it does, through our licensing arrangement, give traders throughout the world access to a technology that they otherwise would not be in a position to have access to themselves.

There are presently 21 VantagePoint programs available. They cover the interest rate markets, the stock indices, the energies, and the foreign currencies, and each has been individually designed and trained for its respective market. Each program currently uses 10 different markets as data inputs, and 5 separate neural networks are comprised within each program. In other words, for the Treasury Bond program there are actually 5 neural networks. It looks at 10 markets: the CRB Index, gold, the US dollar, various currencies, and the S&P 500 Index, as well as the Treasury Bond data itself. It puts all of that into those 5 neural networks and generates, from an intermarket perspective, the prediction the software makes each day.
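Purely as an illustration of that structure, and not of VantagePoint’s actual design, an arrangement like the one just described might be organized as follows; the market list is partial, and the output names beyond the two moving-average forecasts mentioned below are unspecified stand-ins.

```python
# A hypothetical organization of the structure described above: ten markets'
# data feeding five separate networks that together produce one program's
# daily forecast. This illustrates the shape only, not the actual design.

INPUT_MARKETS = ["Treasury Bonds", "CRB Index", "Gold", "US Dollar",
                 "S&P 500 Index"]  # ... plus five more related markets

class MarketProgram:
    """One target market's program: five trained networks over shared inputs."""
    def __init__(self, target, networks):
        self.target = target        # e.g. "Treasury Bonds"
        self.networks = networks    # five nets, each producing one output

    def daily_forecast(self, intermarket_data):
        """intermarket_data maps each input market to its recent price data."""
        return {name: net(intermarket_data) for name, net in self.networks.items()}

# Stand-in "networks": trained models would go here; lambdas are placeholders.
networks = {
    "10-day MA, 4 days ahead": lambda data: 114.2,
    "5-day MA, 2 days ahead": lambda data: 113.8,
    "output 3": lambda data: 0.0,   # remaining outputs unspecified here
    "output 4": lambda data: 0.0,
    "output 5": lambda data: 0.0,
}
bond_program = MarketProgram("Treasury Bonds", networks)
print(bond_program.daily_forecast({m: [] for m in INPUT_MARKETS}))
```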

As I said, one of the things we’re predicting is moving averages. They’re very popular in the futures area, and yet, because of their lagging nature, they have deficiencies, which I feel have been overcome through our forecasting efforts, by actually predicting moving average values for the future. Two of the outputs generated by VantagePoint are a prediction of the 10-day moving average of closing prices, made for 4 days in the future, and a prediction of the 5-day moving average, made for 2 days in the future. These predicted moving averages are then used in conjunction with other data to generate buy and sell signals, but on a prospective basis, looking ahead up to 4 days in the future, rather than simply using regular moving averages, which obviously lag the market.
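To make the forecasting target concrete: for each training day, the network’s desired output is the moving average as it will stand several days later. Here is a sketch of building those targets and, as my own illustration rather than the product’s actual rule, of comparing a predicted future average with today’s actual average.

```python
# Sketch of the forecasting target: for each day t, the desired output is
# the 10-day moving average as it will stand 4 days later. The signal rule
# at the end is an illustration, not the product's actual logic.

def moving_avg(prices, end, length):
    """Average of `length` closes ending at index `end` (inclusive)."""
    return sum(prices[end - length + 1:end + 1]) / length

def training_pairs(closes, ma_len=10, horizon=4):
    """(inputs as of day t, target: the ma_len-day MA at day t + horizon)."""
    pairs = []
    for t in range(ma_len - 1, len(closes) - horizon):
        target = moving_avg(closes, t + horizon, ma_len)
        pairs.append((closes[:t + 1], target))
    return pairs

# Once trained, a predicted future MA above today's actual MA suggests a
# rising trend ahead, and vice versa.
def signal(predicted_future_ma, actual_ma_today):
    return "buy" if predicted_future_ma > actual_ma_today else "sell"

closes = [100, 101, 103, 102, 104, 107, 106, 108, 110, 109, 111, 114, 113, 115, 117]
pairs = training_pairs(closes)
print(len(pairs), "training examples; sample target:", pairs[0][1])
print(signal(predicted_future_ma=112.4, actual_ma_today=111.0))
```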

So, through intermarket analysis coupled with the forecasting capability of neural networks, we have been able to eliminate the lag in moving average analysis, and I think that in that one respect VantagePoint really has taken technical analysis to a new level, and the industry has become very receptive to that.

Thank you very much.