Comprehending Object and Process Models: An Empirical Study
July/August 1999 (vol. 25, no. 4)
pp. 541-556

Abstract—Although prior research has compared modeling performance across different systems development methods, little research has examined the comprehensibility of the models those methods produce. In this paper, we report the results of an empirical study comparing user comprehension of object-oriented (OO) and process-oriented (PO) models. The fundamental difference between the two is that OO models tend to focus on structure, whereas PO models tend to emphasize behavior, or processes. Proponents of the OO modeling approach argue that it lends itself naturally to the way humans think. Evidence from research in cognitive psychology and human factors, however, suggests that human problem solving is innately procedural. Given these conflicting viewpoints, we investigate empirically whether OO models are in fact easier to understand than PO models. As suggested by the theory of cognitive fit, however, model comprehension may be influenced by task-specific characteristics. We therefore compare OO and PO models based on whether the comprehension activity involves: 1) only structural aspects, 2) only behavioral aspects, or 3) a combination of structural and behavioral aspects. We measure comprehension through subjects' responses to questions designed along these three dimensions. Two experiments were conducted, each with a different application and a different group of subjects. Each subject was first trained in both methods and then participated in one of the two experiments, answering several questions relating to his or her comprehension of an OO or a PO model of a business application. The comprehension questions ranged in complexity from relatively simple (addressing either structural or behavioral aspects) to more complex (addressing both structural and behavioral aspects). Results show that for most of the simple questions, no significant difference in model comprehension was observed. For most of the complex questions, however, the PO model was found to be easier to understand than the OO model. In addition to describing the process and the outcomes of the experiments, we present the experimental method employed as a viable approach for conducting research into various phenomena related to the efficacy of alternative systems analysis and design methods. We also identify areas where future research is needed, along with recommendations of appropriate research methods for empirical examination.

References
[1] R. Agarwal, A.P. Sinha, and M. Tanniru, “Cognitive Fit in Requirements Modeling: A Study of Object and Process Methodologies,” J. Management Information Systems, vol. 13, no. 2, pp. 137–162, 1996.
[2] R. Agarwal, A.P. Sinha, and M. Tanniru, “The Role of Prior Experience and Task Characteristics in Object-Oriented Modeling: An Empirical Study,” Int'l J. Human-Computer Studies, vol. 45, pp. 639–667, 1996.
[3] G. Booch, Object-Oriented Analysis and Design with Applications. Reading, Mass.: Addison-Wesley, 1994.
[4] F.P. Brooks Jr., The Mythical Man-Month, 20th anniversary ed. Reading, Mass.: Addison-Wesley, 1995.
[5] S.N. Cant, D.R. Jeffrey, and B. Henderson-Sellers, “A Conceptual Model of Cognitive Complexity of Elements of the Programming Process,” Information and Software Technology, vol. 37, no. 7, pp. 351–362, 1995.
[6] P. Coad and E. Yourdon, Object-Oriented Analysis, second ed. Englewood Cliffs, N.J.: Yourdon Press, 1991.
[7] L.L. Constantine, “Object-Oriented and Structured Methods: Toward Integration,” American Programmer, vol. 2, no. 7/8, pp. 34–40, Aug. 1989.
[8] T.D. Cook and D.T. Campbell, Quasi-Experimentation: Design and Analysis Issues for Field Settings. Boston, Mass.: Houghton Mifflin, 1979.
[9] L.J. Cronbach, Essentials of Psychological Testing, third ed. New York: Harper and Row, 1970.
[10] T. DeMarco, Structured Analysis and System Specification. Englewood Cliffs, N.J.: Prentice Hall, 1978.
[11] T. DeMarco, Controlling Software Projects: Management, Measurement, and Estimation. New York: Yourdon Press, 1982.
[12] H.J. Einhorn and R.M. Hogarth, “Behavioral Decision Theory: Processes of Judgment and Choice,” Decision Making: Descriptive, Normative, and Prescriptive Interactions, D.E. Bell, H. Raiffa, and A. Tversky, eds. Cambridge, England: Cambridge Univ. Press, pp. 113–146, 1988.
[13] H. Eriksson and M. Penker, UML Toolkit. Somerset, N.J.: John Wiley & Sons, 1997.
[14] M. Fayad, W. Tsai, and M. Fulghum, “Transition to Object-Oriented Software Development,” Comm. ACM, vol. 39, no. 2, pp. 108–121, Feb. 1996.
[15] R. Fichman and C. Kemerer, “Object-Oriented and Conventional Analysis and Design Methodologies: Comparison and Critique,” Computer, vol. 25, no. 10, pp. 22–39, Oct. 1992.
[16] M. Fowler, UML Distilled: Applying the Standard Object Modeling Language. Reading, Mass.: Addison-Wesley, 1997.
[17] W. Gemino and Y. Wand, “Empirical Comparison of Object-Oriented and Dataflow Models,” Proc. 18th Int'l Conf. Information Systems, pp. 446–447, Dec. 1997.
[18] J. Iivari, “Object-Orientation as Structural, Functional and Behavioural Modelling: A Comparison of Six Methods for Object-Oriented Analysis,” Information and Software Technology, vol. 37, no. 3, pp. 155–163, 1995.
[19] I. Jacobson, Object-Oriented Software Engineering: A Use Case Driven Approach. Wokingham, England: Addison-Wesley, 1992.
[20] S.L. Jarvenpaa, “The Importance of Laboratory Experimentation in IS Research,” Comm. ACM, vol. 31, no. 12, pp. 1504–1507, 1988.
[21] S.L. Jarvenpaa, G.W. Dickson, and G. DeSanctis, “Methodological Issues in Experimental IS Research: Experiences and Recommendations,” MIS Quarterly, vol. 9, no. 2, pp. 141–156, 1985.
[22] Y. Kim and S. March, “Comparing Data Modeling Formalisms,” Comm. ACM, vol. 38, no. 6, pp. 103–115, 1995.
[23] J.H. Larkin and H.A. Simon, “Why a Diagram Is (Sometimes) Worth Ten Thousand Words,” Cognitive Science, vol. 11, pp. 65–99, 1987.
[24] T. Moynihan, “An Experimental Comparison of Object-Orientation and Functional Decomposition as Paradigms for Communicating System Functionality to Users,” J. Systems and Software, vol. 33, pp. 163–169, 1996.
[25] A. Newell and H.A. Simon, Human Problem Solving. Englewood Cliffs, N.J.: Prentice Hall, 1972.
[26] S. Papert, Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books, 1980.
[27] D.E. Perry, N.A. Staudenmayer, and L.G. Votta, “People, Organizations, and Process Improvement,” IEEE Software, vol. 11, no. 4, pp. 36–45, 1994.
[28] A. Porter and L. Votta, “What Makes Inspections Work?” IEEE Software, vol. 14, no. 6, pp. 99–102, Nov. 1997.
[29] B. Ratcliffe and J.I.A. Siddiqui, “An Empirical Investigation into Problem Decomposition Strategies Used in Program Design,” Int'l J. Man-Machine Studies, vol. 22, no. 1, pp. 77–90, 1985.
[30] C.L. Chang, R.A. Stachowitz, and J.B. Combs, “Validation of Nonmonotonic Knowledge-Based Systems,” Proc. IEEE Int'l Conf. Tools for Artificial Intelligence, Nov. 1990.
[31] P. Shoval and I. Frumermann, “OO and EER Conceptual Schemas: A Comparison of User Comprehension,” J. Database Management, vol. 5, no. 4, pp. 28–38, 1994.
[32] A.P. Sinha and I. Vessey, “Cognitive Fit: An Empirical Study of Recursion and Iteration,” IEEE Trans. Software Eng., vol. 18, no. 5, pp. 368–379, May 1992.
[33] P. Slovic and S. Lichtenstein, “Preference Reversals: A Broader Perspective,” American Economic Review, vol. 73, no. 4, pp. 596–605, 1983.
[34] E. Soloway, J. Bonar, and K. Ehrlich, “Cognitive Strategies and Looping Constructs: An Empirical Study,” Comm. ACM, vol. 26, no. 11, pp. 485–492, 1983.
[35] A. Tversky, S. Sattath, and P. Slovic, “Contingent Weighting in Judgment and Choice,” Psychological Review, vol. 95, no. 3, pp. 371–384, 1988.
[36] I. Vessey, “Cognitive Fit: A Theory-Based Analysis of the Graphs versus Tables Literature,” Decision Sciences, vol. 22, no. 2, pp. 219–240, 1991.
[37] I. Vessey and S.A. Conger, “Requirements Specification: Learning Object, Process, and Data Methodologies,” Comm. ACM, vol. 37, no. 5, pp. 102–113, May 1994.
[38] I. Vessey and D. Galletta, “Cognitive Fit: An Empirical Study of Information Acquisition,” Information Systems Research, vol. 2, no. 1, pp. 63–84, 1991.
[39] I. Vessey and R. Weber, “Structured Tools and Conditional Logic: An Empirical Investigation,” Comm. ACM, vol. 29, no. 1, pp. 48–57, 1986.
[40] S. Wang, “Two MIS Analysis Methods: An Experimental Comparison,” J. Education for Business, pp. 136–141, Jan./Feb. 1996.
[41] C. Welty and D.W. Stemple, “Human Factors Comparison of a Procedural and a Non-Procedural Query Language,” ACM Trans. Database Systems, vol. 6, no. 4, pp. 626–649, 1981.
[42] S.B. Yadav, R.R. Bravocco, A.T. Chatfield, and T.M. Rajkumar, “Comparison of Analysis Techniques for Information Requirement Determination,” Comm. ACM, vol. 31, no. 9, pp. 1090–1097, 1988.

Index Terms:
Cognitive fit, experimental method, human factors, model comprehension, object-oriented modeling, process-oriented modeling.
Ritu Agarwal, Prabuddha De, Atish P. Sinha, "Comprehending Object and Process Models: An Empirical Study," IEEE Transactions on Software Engineering, vol. 25, no. 4, pp. 541-556, July-Aug. 1999, doi:10.1109/32.799953