PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON THE APPLICATION OF ARTIFICIAL INTELLIGENCE TO CIVIL AND STRUCTURAL ENGINEERING
Edited by: B.H.V. Topping and B. Kumar
A Cognitive Complexity Measure for the Design Process
S. Sinha+, A.I. Thomson* and B. Kumar+
+Department of Civil Engineering, *Department of Design, Manufacturing & Engineering Management, University of Strathclyde, Glasgow, United Kingdom
S. Sinha, A.I. Thomson, B. Kumar, "A Cognitive Complexity Measure for the Design Process", in B.H.V. Topping, B. Kumar, (Editors), "Proceedings of the Sixth International Conference on the Application of Artificial Intelligence to Civil and Structural Engineering", Civil-Comp Press, Stirlingshire, UK, Paper 4, 2001. doi:10.4203/ccp.74.4
Keywords: competitiveness, complexity management, design management, human resources, planning.
Design projects are typically plagued [1,2] by schedule and cost overruns. One contributing factor is that projects often turn out to be more complex than originally anticipated. Although research has been carried out in this area, there is currently no method for measuring the complexity of the design process. To estimate costs and schedule projects more accurately, it is essential that the complexity of the design process can be measured. At present, assessments of design process complexity are highly subjective. The objective of this paper is to quantify the complexity of the design process by establishing a 'Complexity Index' for the design process, thus removing this subjectivity. The complexity of the design process is defined as the "amount of information processing" within a particular context.
To bring objectivity to the complexity of the design process, a comprehensive list of factors, known as Complexity Generating Factors (CGFs), has been framed. The approach is to measure the information content associated with the identified CGFs. As the mathematician Von Neumann argued, complexity could be numerically measured, like any other system observable, if it were related to such things as the dimension of a state space, the length of a programme, or the magnitude of a 'cost' in money or time.
For this purpose, the concept used as a measure of complexity is entropy. Well known in thermodynamics, the entropic measure quantifies the disorder that arises in a system due to variety and uncertainty. Entropy is also rooted in information theory, where it is defined as the expected amount of information necessary to describe a system; "expected" means that the amount of information is an average over the various possible states of the system.
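As a minimal sketch of the entropic measure described above, the following computes the Shannon entropy of a discrete state distribution. The example probabilities are illustrative only and are not drawn from the paper's data:

```python
import math

def shannon_entropy(probabilities):
    """Expected amount of information (in bits) needed to describe
    a system whose states occur with the given probabilities:
    H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A factor observed in four equally likely states carries more
# uncertainty (higher entropy) than one dominated by a single state.
uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])  # 2.0 bits
skewed = shannon_entropy([0.85, 0.05, 0.05, 0.05])   # < 2.0 bits
```

The comparison reflects the "variety and uncertainty" intuition: the more evenly spread the possible states, the more information is required, on average, to describe the system.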
Initial results show that the qualitative level of information associated with the different states of a selected CGF (expressed as a probability derived from working observations) varies linearly with the amount of information processing done at that state on account of that CGF.
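One plausible way to combine per-CGF entropies into a single Complexity Index is to sum the entropy of each CGF's observed state distribution. This aggregation, the CGF names, and the probabilities below are all hypothetical assumptions for illustration; the paper does not specify this formula:

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of one CGF's state distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def complexity_index(cgf_observations):
    """Hypothetical aggregation: sum the entropies of all CGFs'
    observed state distributions into one index."""
    return sum(entropy(probs) for probs in cgf_observations.values())

# Illustrative CGFs with made-up probabilities of observed states.
cgfs = {
    "number_of_interfaces": [0.5, 0.5],
    "requirement_volatility": [0.7, 0.2, 0.1],
    "designer_experience": [0.9, 0.1],
}
index = complexity_index(cgfs)
```

Under this sketch, a process whose CGFs each sit predictably in one state scores near zero, while one with many uncertain factors scores higher, matching the "amount of information processing" definition of complexity.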
The model has some limitations: it is process-specific, the information associated with some of the CGFs could not be quantified, and the results depend on the expertise of the designer and the context in which the complexity is measured.