HIGH LEVEL DESIGN 
QUALITY ASSURANCE CHECKLIST

Mechanics

ENTIRE DOCUMENT

[ ] Are the design document format guidelines followed?
[ ] Are there fewer than three errors in spelling, punctuation, and grammar per page?
[ ] Are all components of the design consistent with each other and with the SRS?
[ ] Does all narrative text meet the class writing standards (WPE level 4)?

DESIGN OVERVIEW

  1. [ ] Is the design overview included as HTML comments in the javadoc overview section?
  2. [ ] Does the design overview contain all the sections required in the document format guidelines?
  3. [ ] Has the design overview been proofread for clarity by a technical person not familiar with your project?
  4. [ ] Are all significant design tradeoffs documented in the design rationale section?
  5. [ ] Do all design tradeoffs favor maintenance over performance (unless pre-approved by the instructor)?
CLASS DIAGRAM
  1. [ ] Does the diagram conform to standard class diagramming notation (e.g. UML)?
  2. [ ] Is the diagram neat and easy to read?
  3. [ ] Is the diagram free from duplicate association lines?
  4. [ ] Do the names and relationships match the class definitions (below) exactly?
  5. [ ] Are the classes, attributes, and methods consistent with the Data Dictionary and Data Model from the Specification?
  6. [ ] Are all Data Dictionary items included in the diagram?
  7. [ ] Has the Data Dictionary been updated to show new items that were added during design?
STRUCTURE CHART
  1. [ ] Does the diagram conform to standard structure chart notation?
  2. [ ] Is the diagram neat and easy to read?
  3. [ ] Do the module names match the method definitions in the class skeletons (below) exactly?
  4. [ ] Does the vertical dimension in the chart show decreasing levels of abstraction?
  5. [ ] Does the chart show proper decomposition? Is each module "part of" its parent module?
  6. [ ] Does the chart show only decomposition and NOT procedural information?  It does NOT show the order in which tasks are performed. It does NOT illustrate an algorithm.
  7. [ ] Does each block contain a verb phrase, e.g., "Validate Username"?
CLASS DEFINITIONS (a.k.a. Class Skeletons, Javadocs)
  1. [ ] Does each class in the class diagram have an associated definition file that compiles? (Not needed for 3rd party classes, obviously.)
  2. [ ] Does each class header include the documentation required in the document format guidelines (especially an @author tag and a well written descriptive header)?
  3. [ ] Does each method header include the documentation required in the document format guidelines?
  4. [ ] Is each method name a verb phrase in active voice?
  5. [ ] (Design-By-Contract teams) Does each method have documented pre/post conditions that are clear, complete, and correct? (Exception: accessor methods normally don't require pre/post conditions).
  6. [ ] (non-DBC teams) Does each method document how all possible error conditions will be handled?
  7. [ ] Does the name for each abstraction and its description use terms from the problem domain and not the implementation? E.g., GameBoard, not Grid. CustomerList, not CustomerArray.
  8. [ ] Are meaningful names chosen for parameters and data attributes? See coding standard.
  9. [ ] Are the data attributes consistent with the Data Dictionary?
  10. [ ] Has the range of data values been constrained so as to disallow values that are meaningless in the problem domain? E.g., don't use int for data that can never be negative; instead, use something like a Natural class.
  11. [ ] Do collection classes handle their own persistent data?
  12. [ ] Are all instance variables private?
  13. [ ] Is there a comment included for each design element that traces it back to the SRS?
  Horstmann's "5 C's"
  1. [ ] Cohesion - Do all methods logically fit together to support a single, coherent purpose?
  2. [ ] Completeness - Does the class support all operations that are part of the abstraction that the class represents?
  3. [ ] Convenience - Does the interface make it easy to perform common tasks?
  4. [ ] Clarity - Do the method names clearly express their intent and contain simple calling protocols?
  5. [ ] Consistency - Are all methods consistent with each other with respect to names, parameters, return values, and behavior?
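To make several of the items above concrete, here is a minimal sketch of a class skeleton that constrains a data range as item 10 suggests. The class name Natural comes from the checklist; everything else (the author tag, the pre/post conditions, the exact invariant) is hypothetical, not a required implementation.

```java
/**
 * A natural number: an integer value that can never be negative.
 * Constrains the range of data values to those that are meaningful
 * in the problem domain (checklist item 10).
 *
 * @author A. Student (hypothetical)
 */
public class Natural {
    /** The wrapped value; the class invariant guarantees value >= 0. */
    private int value;

    /**
     * Creates a natural number.
     *
     * @param value the non-negative value to wrap
     * @pre value >= 0
     * @post getValue() == value
     * @throws IllegalArgumentException if value is negative
     */
    public Natural(int value) {
        if (value < 0) {
            throw new IllegalArgumentException("value must be non-negative");
        }
        this.value = value;
    }

    /** @return the wrapped value, always >= 0 */
    public int getValue() {
        return value;
    }
}
```

Note how the skeleton also illustrates items 2, 4, 8, and 12: a descriptive class header with an @author tag, a verb-phrase constructor description, meaningful names, and a private instance variable.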
INTERACTION DIAGRAMS
  1. [ ] Do the diagrams conform to standard diagramming notation (e.g. UML)?
  2. [ ] Is a diagram provided for each use case or functional requirement that involves more than two objects or more than four steps?
  3. [ ] Does each diagram have a title?
  4. [ ] Is each diagram consistent with the other design documents? Do all the class and method names match exactly?
DATA STRUCTURES
  1. [ ] Is there a javadoc comment explaining the structure of each composite data structure in your solution?
  2. [ ] Has your implementation choice for each data structure been explained in "Design Issues"?
  3. [ ] Is javadoc run with the -private flag?
  4. [ ] Is the format and content of each external file shown in detail?
APPENDIX
  1. [ ] Is a complete FTR Review Summary report included?
  2. [ ] Is the requirements traceability matrix included?
  3. [ ] Are listings included of the compilation script showing compilation of all classes, and of all javadoc output?
  4. [ ] Do the listings use a non-proportional font?
  5. [ ] Is the output free of compile errors and javadoc errors?

Conformance to Design Principles

ABSTRACTION
  1. [ ] Does the design make effective use of abstraction? Does the structure of the software design mimic the structure of the problem domain? Whenever possible, the elements of the solution domain should map directly onto elements of the problem domain. E.g., people waiting in line for a bank teller should be represented by a queue, not an array or vector. The bank teller never needs to see the 3rd person in line.
  2. [ ] Does the design isolate the data from higher level functions? E.g., no control processes that belong in the collection class should be included with the elemental data.
  3. [ ] Are User Interface classes completely disjoint from other classes? Avoid embedding user interface features within the internal data representation.
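The bank-teller example from item 1 can be sketched in Java. The class and method names here are hypothetical illustrations, not part of any required design; the point is that the interface exposes only the operations the problem domain needs.

```java
import java.util.ArrayDeque;
import java.util.Queue;

/** The line of customers waiting for a teller. */
class TellerLine {
    // A queue mirrors the problem domain: customers join at the rear
    // and are served from the front. Because the teller never needs to
    // see the 3rd person in line, no indexed access is exposed.
    private final Queue<String> waiting = new ArrayDeque<>();

    /** A customer joins the end of the line. */
    public void join(String customerName) {
        waiting.add(customerName);
    }

    /** The teller serves (and removes) the customer at the front. */
    public String serveNext() {
        return waiting.remove();
    }

    /** @return true if no customers are waiting */
    public boolean isEmpty() {
        return waiting.isEmpty();
    }
}
```

Had the design used an array here, clients could inspect or reorder the middle of the line, operations with no counterpart in the problem domain.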
DECOMPOSITION
  1. [ ] Is the solution thoroughly decomposed? Is each method at the level of 'atomic' components? That is, it should perform just a single task that can be coded in fewer than 30 lines (approximately). "Miracle" methods are a symptom of inadequate decomposition.
  2. [ ] Is the design highly factored? Each task the system needs to perform should be implemented in only one place.
  3. [ ] Does the design decomposition consist of interfaces that reduce the complexity of connections between modules and with the external environment (low coupling)? The operations shouldn't have hidden interdependencies. Minimize the number of associations between classes.
  4. [ ] Is the design decomposed into methods that exhibit high cohesion? The number of methods that do not access any of the same instance variables as the others should be small (or zero).
  5. [ ] Does the design distribute the "intelligence" so that no single component has a disproportionate amount of responsibility?
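A small sketch of what "atomic" decomposition (item 1) can look like; the registration scenario and the specific validation rules are hypothetical, chosen only to show the shape.

```java
/** A registration step decomposed into atomic, single-task methods. */
class Registration {
    /** Top-level method only delegates; it performs no detailed work itself. */
    public boolean register(String username, String password) {
        return isValidUsername(username) && isValidPassword(password);
    }

    /** Single task: check the username against (hypothetical) domain rules. */
    private boolean isValidUsername(String username) {
        return username != null && username.matches("[A-Za-z0-9]{3,16}");
    }

    /** Single task: check the (hypothetical) password length rule. */
    private boolean isValidPassword(String password) {
        return password != null && password.length() >= 8;
    }
}
```

A "miracle" version would inline all of the validation logic into one long register method; decomposing it keeps each method testable and under the 30-line guideline.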
INFORMATION HIDING
  1. [ ] Does the design respect encapsulation? Don't expose private class data with public get/set methods, unless absolutely necessary.
  2. [ ] Do the abstractions successfully separate interface from implementation? That is, no implementation details should have public visibility.
  3. [ ] Has an appropriate implementation been chosen for each piece of data?
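One way to satisfy item 2 is to declare a Java interface in the design and keep the concrete data representation behind it. The Roster names below are hypothetical examples, not part of any required design.

```java
import java.util.ArrayList;
import java.util.List;

/** The abstraction: all that clients may know about a roster of players. */
interface Roster {
    /** Adds a player to the roster. */
    void add(String playerName);

    /** @return the number of players on the roster */
    int size();
}

/** The implementation: the choice of a List is hidden behind the interface. */
class ListRoster implements Roster {
    // Private and never exposed, so the representation can change
    // (e.g., to a Set or a database) without affecting clients.
    private final List<String> players = new ArrayList<>();

    public void add(String playerName) {
        players.add(playerName);
    }

    public int size() {
        return players.size();
    }
}
```

Clients declare variables of type Roster, so no implementation detail has public visibility.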

General

  1. [ ] Is the design complete? Have all functional requirements and problem data from the system specification been included? Can you prove it? Can you trace each requirement into the design element which includes it? Did you complete the requirements traceability matrix and include it in the appendix?
  2. [ ] Are the design priorities (as documented in the design issues) consistent with the non-functional requirements? For example, if memory requirements are not a Quality Attribute, don't worry about "wasted space" in your data structures.
  3. [ ] Does the design exhibit an architectural structure that has been created using recognizable design patterns?
  4. [ ] Does the design exhibit uniformity and integration? It should appear as though designed by a single individual.
  5. [ ] Is it easy to create automated tests for each component of the design?
  6. [ ] Does the design anticipate and accommodate unusual circumstances, and if it must terminate, do so in a graceful manner?
  7. [ ] Does the design lend itself to evolutionary implementation and testing (a.k.a. "staged delivery")?
  8. [ ] Is the inheritance tree as shallow as possible? Deep inheritance trees are difficult to comprehend.  Consider object composition as an alternative.



If there are any items not checked, provide an explanation on an attached page.  As a general rule, checklists with more than five unchecked items are not ready for submission. 
 
Tip: When Dr. Dalbey evaluates your deliverable, he will begin comprehending your design by reading three key sections first. The quality of writing in those sections is paramount; they deserve your absolute best effort at crystal-clear writing. If they are poorly written, it is impossible to get a good grade.


Document History
 
 
Date       Author  Changes
2/20/2008  JD      Added items for non-DBC teams. Added more examples of appropriate data structures. Added "design priorities" item.
11/24/03   JD      Added writing tip
11/22/03   JD      Added requirements traceability to appendix
11/12/03   JD      Major reorganization into categories of design principles.
11/11/03   JD      Reworked data implementation examples
10/25/02   JD      Added Explanation section
11/17/01   JD      Revised Mechanics
11/1/01    JD      Reworked to separate General issues from Mechanics
5/15       JD      Removed PSP forms
11/15/00   JD      Variable declarations must be commented. Defect Logs must be included.
10/1       JD      Revised for Fall 2000