SHAWNEE VICKERY, Feature Editor, Eli Broad Graduate School of Management, Michigan State University
Empirio-Criticism: Some Thoughts About Empirical Research in Operations Management
by Gyula Vastag, Kenan Institute
Ernst Mach, a distinguished Austrian physicist, whose studies of motion led him to a position that to some extent anticipated Einstein's theory of relativity, was concerned with the problem of knowledge and with the nature of physics. He devoted a book to perception. He came to the conclusion that all we know at first hand is sense-perceptions. (These days, almost a century later, Andre Agassi may be the best-known Machist since he tried to convince everyone that perception is everything.)
Mach believed that events in the physical world are complexes of sensations and that physics is the science of discovering the laws governing these complexes and their interconnections.
"Sense-perceptions are ultimate elements, whose relations we have to establish empirically. Physics has only to deal with sense-perceptions and their functional relations," he wrote. His followers were called phenomenalists or empirio-criticists, who recognized sense-perceptions alone as incontrovertible data.
If we substitute Operations Management, or empirical research in Operations Management (where more and more subjective or perceptual data are used), for physics, Mach's views on science are still valid today. All of us involved in empirical research would like to discover the hidden laws governing the complexes that we study and their interconnections.
All analysts are looking for interesting issues. Interesting for us, otherwise we would not do it. Interesting for others in academia and business, otherwise we could not do it or finance it. Ultimately we are looking for listeners whom we would like to influence.
In order to be able to attract listeners we have to establish credibility through understanding and predicting phenomena. Understanding and prediction require that all three elements of empirical research (theory, data, and methods) be present. Traditionally, we were taught that we must have a theory to start with, and based on this theory we can select the appropriate methodology, collect and analyze data, then report our findings.
This school of thought implied a hierarchical pyramid structure with theory sitting on top and controlling the methods and data. Moreover, it also implied a certain sequence of events (theory, methods and data) and implicitly assumed that the researcher has full control over all three.
I think that this view should be revised. The approach that I am proposing instead of the pyramid is a Venn diagram of three interconnected circles representing theory, methods and data, where the empirical research focuses on the common set of these three circles. This view implies that although we have to have all three elements present simultaneously, this process can be started anywhere and the researcher is not necessarily in control of all three elements.
As the number of available data sets increases, a researcher may choose to analyze data collected by others. If the studied phenomenon is a complex one, it may not be feasible financially or logistically for an individual researcher to get involved in data gathering. While the number of available public data sets is growing in the United States, data scarcity still remains a problem in international empirical research. If there is a data set complex enough (I will define this term later), it may be used to test several theories or methods.
For example, the data collected by the Global Manufacturing Research Group are distributed with the book to facilitate new ideas and results (see Global Manufacturing Practices: A Worldwide Survey of Practices in Production Planning and Control, D. Clay Whybark and Gyula Vastag, editors, 1993). The data included in this book are complex in the following sense.
First, the size of the data set is large: there are almost seven hundred observations, or records, in it.
Second, the dimensionality of the data, the number of variables, is also high, recalling R. E. Bellman's comment about "the curse of dimensionality," meaning that the higher the dimensionality, the sparser and more spread apart are the data points. Ten points on the unit interval are close neighbors, but ten points in a 10-dimensional unit cube are like oases in the desert. The complexity of a data set increases rapidly with increasing dimensionality.
Third, it has a non-standard structure, meaning that it is a mixture of different data types.
Fourth, the studied phenomenon, manufacturing practices, is nonhomogeneous, that is, the relationships among variables in different parts of the measurement space (e.g., in different countries) are, or can be, different simply because the survey was carried out in more than twenty countries and two industries.
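Bellman's "curse of dimensionality" is easy to see numerically. The short sketch below (plain Python with only the standard library; the function name and the fixed random seed are my own illustration, not part of the GMRG data) compares the average nearest-neighbor distance among ten uniform random points on the unit interval with that among ten points drawn in a 10-dimensional unit cube.

```python
import math
import random

def avg_nearest_neighbor_distance(n_points, dim, seed=0):
    """Average distance from each point to its nearest neighbor,
    for n_points drawn uniformly from the unit hypercube in `dim` dimensions."""
    rng = random.Random(seed)
    pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
    total = 0.0
    for i, p in enumerate(pts):
        # Distance to the closest of the other n_points - 1 points.
        nearest = min(math.dist(p, q) for j, q in enumerate(pts) if j != i)
        total += nearest
    return total / n_points

# Ten points on the unit interval sit close together...
d1 = avg_nearest_neighbor_distance(10, 1)
# ...while ten points in a 10-dimensional unit cube are "oases in the desert."
d10 = avg_nearest_neighbor_distance(10, 10)
print(f"avg nearest-neighbor distance: 1-D = {d1:.3f}, 10-D = {d10:.3f}")
```

With the same ten points, the typical gap between neighbors grows by an order of magnitude when the dimension rises from one to ten, which is exactly why methods tuned to low-dimensional data can fail on a data set like this one.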
The analysis of large, complex data sets to discover governing laws requires new methods of analysis and new statistical techniques. Speaking of statistics, you have probably heard the story about Malcolm Forbes, who once got lost floating for miles in one of his famous balloons and finally landed in the middle of a cornfield. He spotted a man coming toward him and asked, "Sir, can you tell me where I am?"
The man said, "Certainly, you are in a basket in a field of corn."
Forbes replied, "You must be a statistician."
The man was surprised: "That's amazing, how did you know that?"
"Easy," said Forbes, "your information is concise, precise and absolutely useless."
As I described earlier, complexity means much more than a larger data set, and the traditional methods may not be able to reveal the underlying structure in the data, resulting in a useless analysis.
Most of the tools in data analysis were developed in the first half of this century and they do not rely on computer technology. The methods I am proposing are modern, computer-age versions of the traditional tools. Their common feature is that they are very computer intensive and they place a much greater emphasis on the graphical representation of data. Graphical representation should mean more than dozens of three-dimensional color pie charts, which are products of computer technology rather than analysis that enhances results.
These new methods, to name only a few of them, may include newer and expanded versions of the familiar linear regression, such as robust locally weighted regression and Classification and Regression Trees (CART) or Automatic Interaction Detection (AID). Bootstrapping may be used to get dependable confidence intervals instead of the familiar error estimate.
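As one small illustration of the bootstrap idea, the sketch below computes a percentile-bootstrap confidence interval for a sample mean using only the Python standard library. The data values and all names are hypothetical, chosen for illustration; they are not taken from the GMRG survey.

```python
import random

def bootstrap_ci(sample, stat, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic each
    time, and take the empirical alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    stats = sorted(
        stat([rng.choice(sample) for _ in range(n)])
        for _ in range(n_resamples)
    )
    lo = stats[int((alpha / 2) * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical data: e.g., delivery lead times (in days) from a plant survey.
data = [4, 7, 5, 9, 6, 12, 5, 8, 7, 6, 10, 5]
low, high = bootstrap_ci(data, mean)
print(f"sample mean = {mean(data):.2f}, 95% bootstrap CI = ({low:.2f}, {high:.2f})")
```

The appeal of the method is precisely what the article argues: it trades closed-form error formulas, which may not exist for a complex statistic or a non-standard data structure, for raw computing power.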
Philosophical consequences of Mach's views can be debated. Lenin, for example, fought fiercely against them in his 1908 book, Materialism and Empirio-Criticism. But this philosophical debate does not change Mach's truth in stating the goal of science, that is, to understand the laws governing the surrounding world and predict its future.