
5.0 DESIGN PROCESSING

5.1 THE DESIGN PROCESSOR

Advisors must work within a computer environment that provides explicit information for evaluations. As has been seen in previous sections, some prerequisite processing must be done on the available CAD data to provide feature objects and parameters for rule-based evaluation. While preparation of information into representations suitable for various advisors may be largely automated, there is little doubt that design systems, for the time being, must include the human element.

In the DFM system, the human element is a foremost consideration. A great deal of information processing and management is required to prepare and analyze design data. During the development of this project, as in any constantly evolving system, the flow of information from one application to another depends not only on a particular order of application invocation, but also on often imprecise interface specifications between applications.

Including the human element facilitates application integration when development must proceed against uncertain application behavior. Once the interface between applications settles, the requirements for automating the flow of information from one application to another often become evident and easier to implement.

In the course of exploring comfortable human-computer interface paradigms for design advisor activities, the Design Processor paradigm was developed. As with a word processor, which places the document at the center of all activities, the Design Processor focuses upon the design as an artifact and treats process as peripheral to the design. It shall be seen how DFM provides an interface that anticipates the inclusion of new tools, allows consistent use of older tools, and permits incremental automation of portions of the design process.

Figure 37 recapitulates the elements of the DFM system. It incorporates another aspect of the system: a feed into finite-element analysis utilities such as C-Flow™ and C-Cool™. As with PIMES, the feedback from the numerical simulation is merely that of a critic. Even though synthesis of a process is occurring, those applications provide no alternatives directly; that is, no part of the program tells the designer which specifications of the model must be changed.

Figure 37 Design for Molding Processing and Data Hierarchy
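
The critic role can be made concrete with a small sketch. The names below (Critique, Critic, Model, WallThicknessCritic) are hypothetical illustrations, not the actual DFM, PIMES, or C-Flow interfaces; the point is only that each analysis reads the model and returns remarks, and never a modified model or a prescribed change.

    from dataclasses import dataclass
    from typing import List, Protocol


    @dataclass
    class Critique:
        """One observation about the design; it names a concern, not a fix."""
        source: str    # which analysis produced it, e.g. "PIMES" or "C-Flow"
        subject: str   # the feature or region the remark applies to
        message: str   # text presented to the designer


    class Critic(Protocol):
        """Anything that can comment on a model without altering it."""
        def evaluate(self, model: "Model") -> List[Critique]: ...


    @dataclass
    class Model:
        """Stand-in for the part under evaluation: feature names mapped to wall thicknesses in mm."""
        walls: dict


    class WallThicknessCritic:
        """A toy rule-based critic that flags walls outside a nominal range.

        evaluate() only reads the model; it proposes no alternative geometry.
        """
        def __init__(self, low: float = 1.0, high: float = 4.0):
            self.low, self.high = low, high

        def evaluate(self, model: Model) -> List[Critique]:
            return [
                Critique("rule-based", name,
                         f"wall thickness {t} mm lies outside [{self.low}, {self.high}] mm")
                for name, t in model.walls.items()
                if not (self.low <= t <= self.high)
            ]

A numerical simulation would be wrapped the same way: its results become a list of critiques attached to features, and any change to the model remains entirely the designer's decision.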

The original display of the DFM Design Processor is reproduced in Figure 38. This display was only one of a series of screens depicting each state of the transformation of representations required to polygonize the IGES file into a Noodles model, abstract feature information and images, evaluate the PIMES critiques, and provide the user with an interactive presentation of the solid model and its evaluations.
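
Written out as a single routine, that sequence of representations amounts to the sketch below. The step functions are placeholder stubs standing in for the actual DFM utilities (the IGES polygonizer, the Noodles modeler, the feature abstractor, and PIMES); the names are hypothetical, and only the ordering and the hand-off of one representation to the next are the point.

    def polygonize_iges(iges_path):      # IGES surfaces -> polygonal approximation
        return {"polygons": iges_path}

    def build_noodles_model(polygons):   # polygons -> Noodles solid model
        return {"solid": polygons}

    def extract_features(solid):         # solid model -> feature objects and images
        return {"features": solid}

    def run_pimes(features):             # feature objects -> PIMES critiques
        return ["critique of " + repr(features)]

    def critique_part(iges_path):
        """Monolithic sequencing: the intermediate representations never surface."""
        polygons = polygonize_iges(iges_path)
        solid = build_noodles_model(polygons)
        features = extract_features(solid)
        return run_pimes(features)

A hard-wired chain of this kind hides the intermediate steps, but it also hides the ability to inspect, replace, or extend them; the original interface of Figure 38 went to the other extreme and exposed every state on its own screen.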

The drawbacks of the display in Figure 38 stem primarily from the lack of separation of the representations from one another, and from the explicit iconic depiction of the processes that act on them. On the implementation side, new agents and representations may also be added, requiring an adjustment of the visual structure. For example, the inclusion of the Medial Axis Transformation representation and its utilities would have required a re-coordination of node and link icons to accommodate the new addition, as well as a new screen.

Figure 38 One Screen of the Original User Interface of DFM

With the original user interface, the process flow was of foremost importance. But the designer should not have to care about the intermediate steps toward critiquing a part; there are elements of the transformation that could, or should, be hidden from the user. To the designer, the ideal system would appear as a direct interface to the model. While this may seem over-simplified, it serves as a guide for reducing the complexity of the entire system. Although the requirements for transformation are tacitly acknowledged, the processing between the CAD model and its critique and visualization is simply extraneous activity to the user, who wants to see results and should not have to care about how they were obtained.

Automation allows a monolithic sequencing of processes in which the intermediate transforms are hidden. But the applications that perform those transforms are subject to change and improvement, or may simply be prototype utilities. A fully automated sequence does not offer the flexibility required to swap out applications or add new ones, especially within an evolving environment.

It became clear that a new system, in which transformations could be invoked independently with explicit results and status available, would have to be artifact-centered. The result was the design processing model, which takes the word processor as its metaphor. With the design processor, the methods of transformation and analysis are separated from the artifact objects of design.
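
As a rough illustration of that separation, the sketch below, with hypothetical names throughout, keeps the representations and the status of each step on the artifact, while the transformations themselves are registered with a separate processor and invoked one at a time.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, Tuple


    @dataclass
    class DesignArtifact:
        """The design itself: its representations plus the status of each step applied to it."""
        representations: Dict[str, object] = field(default_factory=dict)
        status: Dict[str, str] = field(default_factory=dict)   # step name -> "ok" or "failed: ..."


    class DesignProcessor:
        """Keeps transformation methods apart from the artifact, as a word processor
        keeps its commands apart from the document."""

        def __init__(self):
            self._transforms: Dict[str, Tuple[str, str, Callable]] = {}

        def register(self, name: str, source: str, target: str, fn: Callable) -> None:
            """Adding or swapping a tool is a registration, not a change to a fixed sequence."""
            self._transforms[name] = (source, target, fn)

        def invoke(self, name: str, artifact: DesignArtifact) -> None:
            """Run one transformation independently and record its result and status."""
            source, target, fn = self._transforms[name]
            try:
                artifact.representations[target] = fn(artifact.representations[source])
                artifact.status[name] = "ok"
            except Exception as exc:   # record the failure on the artifact rather than aborting
                artifact.status[name] = f"failed: {exc}"

A step such as polygonization would then be registered once, for example processor.register("polygonize", "iges", "polygons", polygonize_iges), and invoked when the designer asks for it; replacing a prototype utility touches only its registration, and a portion of the process can later be automated simply by invoking several registered steps in sequence.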
