Civil-Comp Proceedings
ISSN 1759-3433
CCP: 74
Edited by: B.H.V. Topping and B. Kumar
Paper 4

A Cognitive Complexity Measure for the Design Process

S. Sinha+, A.I. Thomson* and B. Kumar+

+Department of Civil Engineering, *Department of Design, Manufacturing & Engineering Management, University of Strathclyde, Glasgow, United Kingdom

Full Bibliographic Reference for this paper
S. Sinha, A.I. Thomson, B. Kumar, "A Cognitive Complexity Measure for the Design Process", in B.H.V. Topping, B. Kumar, (Editors), "Proceedings of the Sixth International Conference on the Application of Artificial Intelligence to Civil and Structural Engineering", Civil-Comp Press, Stirlingshire, UK, Paper 4, 2001. doi:10.4203/ccp.74.4
Keywords: competitiveness, complexity management, design management, human resources, planning.

Design projects are typically plagued by schedule and cost overruns [1,2]. One factor contributing to these overruns is that projects turn out to be more complex than originally anticipated. Although research has been carried out in this area, there is currently no method for measuring the complexity of the design process. In order to estimate costs and schedule projects more accurately, it is essential that the complexity of the design process can be measured. At present, assessment of design process complexity is highly subjective. The objective of this paper is to quantify the complexity of the design process through the establishment of a 'Complexity Index' for the design process, thus removing the subjective aspect. The complexity of the design process has been defined as the "amount of information processing" within a particular context.

In order to bring objectivity to the complexity of the design process, a comprehensive list of factors, known as Complexity Generating Factors (CGFs), has been framed. The approach is to measure the information content associated with the identified CGFs. According to the mathematician Von Neumann [3], complexity can be measured numerically, like any other system observable, if it is related to such things as the dimension of a state space, the length of a programme, or the magnitude of a 'cost' in money or time.

For this purpose, the concept used as a measure of complexity is entropy. Well known in the field of thermodynamics, the entropic measure quantifies the disorder that arises in a system due to variety and uncertainty. Entropy is also rooted in information theory, where it is defined as the expected amount of information necessary to describe a system. 'Expected' means that the amount of information is actually an average over the various possible states of the system.
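The entropic measure described here can be illustrated with a short sketch. The following is a generic Shannon-entropy computation over a hypothetical distribution of CGF states, not the paper's actual Complexity Index; the state probabilities are illustrative assumptions only.

```python
import math

def shannon_entropy(probabilities):
    """Expected amount of information (in bits) needed to describe a
    system whose states occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely states of a hypothetical CGF: maximum uncertainty,
# hence maximum entropy for four states (2 bits).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0

# One dominant state: the outcome is nearly certain, so little
# information is needed on average and the entropy is low.
print(round(shannon_entropy([0.97, 0.01, 0.01, 0.01]), 3))
```

Averaging over states in this way is what makes entropy an "expected" information content: highly uncertain (disordered) CGFs contribute more to the measure than near-deterministic ones.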

Initial results show that the qualitative level of information associated with the different states (of a selected CGF, expressed as a probability of working observations) varies linearly with the amount of information processing done at that state on account of that CGF.

The model has some limitations: it is process specific, information could not be quantified for all of the CGFs, and the results depend on the expertise of the designer and the context in which the complexity is measured.

[1] Norris, K.P., "The accuracy of project cost and duration estimates in Industrial R & D", R & D Management, 2, pp 25-36, 1971. doi:10.1111/j.1467-9310.1971.tb00091.x
[2] Murmann, P.A., "Expected development time reductions in the German Mechanical Industry", Journal of Product Innovation Management, 11, pp 236-252, 1994. doi:10.1016/0737-6782(94)90006-X
[3] Gidado, K.I., "Project Complexity: The focal point of construction production planning", Construction Management and Economics, Vol. 14, pp 213-225, 1996. doi:10.1080/014461996373476
