The HDL Complexity Tool

Version 0.5.04, licensed under the GPL v3
A tool that parses large complex hardware projects' source code to produce useful complexity results.
This complexity score is intended to help verification teams drive test plans. We hope RTL designers can also use this tool to manage design complexity and as a guide for efficiently learning the structure of existing designs.

The HDL Complexity Tool is a simple tool that provides measurement data. The driving concept is that you cannot control what you cannot measure. We intend to use existing research to build a tool that performs well on a set of real projects.

Actual defect data will be used to test complexity as a technique for identifying risky components. Real designs will be measured to determine the practical uses of hct. In the end, this tool should be practically useful to anyone designing and/or verifying a complex hardware project.


 1. Download the latest sources from
 2. Untar the files into a directory
 3. Execute: perl ./ in that directory
 ** To install system-wide on a *nix system, run the installer as root
 4. Follow the installer's instructions


The HCT is continuously evolving. We are starting with McCabe cyclomatic complexity analysis to understand branch complexity, and will then improve on it with more sophisticated complexity scores calibrated against real defect data.
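To make the starting point concrete, here is a minimal sketch of a McCabe-style count for a Verilog snippet. It is only an illustration, not HCT's implementation: a real tool needs a full HDL parser, while this sketch simply counts branch keywords with a regular expression and applies M = decisions + 1.

```python
import re

# Branch points in Verilog-like source: if, case/casex/casez, for,
# while, and the ternary '?' operator. A regex scan over raw text is a
# rough approximation; comments and strings are not excluded here.
DECISION_RE = re.compile(r"\b(if|case[xz]?|for|while)\b|\?")

def cyclomatic_complexity(source: str) -> int:
    """McCabe cyclomatic complexity: number of decision points + 1."""
    return len(DECISION_RE.findall(source)) + 1

verilog = """
module mux(input a, b, sel, output reg y);
  always @(*) begin
    if (sel) y = a;
    else     y = b;
  end
endmodule
"""

print(cyclomatic_complexity(verilog))  # one 'if' -> 2
```

A straight-line module with no branches scores 1; each additional decision point raises the score by one.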

Good background on this topic is available in "Measuring the Complexity of HDL Models" by Michael Shaefers. Below are a few excerpts that we use as design criteria for HCT.

There are a few HDL complexity factors that are defined in that paper:

- size
- nesting
- control flow
- information flow
- hierarchy
- locality
- regularity
- modularity
- coupling (of modules or instances)
- concurrency
- timing
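One way to picture how these factors could fit together is a per-module report record. The sketch below is purely hypothetical, with illustrative field names rather than HCT's actual output format:

```python
from dataclasses import dataclass, asdict

# Hypothetical per-module score record covering the factors listed
# above. Field names and the choice of integer scores are assumptions
# for illustration only.
@dataclass
class ModuleScore:
    name: str
    size: int = 0              # e.g. lines or statements
    nesting: int = 0           # maximum nesting depth
    control_flow: int = 0      # e.g. McCabe count
    information_flow: int = 0  # fan-in/fan-out of signals
    hierarchy: int = 0
    locality: int = 0
    regularity: int = 0
    modularity: int = 0
    coupling: int = 0          # of modules or instances
    concurrency: int = 0       # parallel processes / always blocks
    timing: int = 0

score = ModuleScore(name="mux", size=7, nesting=2, control_flow=2)
print(asdict(score)["control_flow"])  # -> 2
```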

That paper introduces the idea of the psychological complexity of HDL and analyzes the common aspects of, as well as the differences between, software and hardware design complexity. The point is that a good complexity score will adhere to six rules:

1. The measure has to be based on a formal foundation
2. The measure has to be intuitive
3. Models under development have to be measurable
4. A structured model has to be measured as less complex than an unstructured version
5. Adding new parts to the model has to increase the complexity
6. Replacing one part of the model by a more complex part has to increase complexity
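Rules 5 and 6 are the easiest to check mechanically. As a toy sketch (not HCT code), here is what validating rule 5 against a trivial "statement count" measure might look like; a real validation would run analogous checks for every rule on real designs:

```python
# Toy measure: complexity as the number of statements in the model.
# Rule 5 says adding a new part must increase the measured complexity.
def toy_measure(statements):
    return len(statements)

base = ["y = a & b", "z = a | b"]
extended = base + ["w = a ^ b"]   # the same model with one new part

assert toy_measure(extended) > toy_measure(base)
print("rule 5 holds for the toy measure")
```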

Those are six lofty goals when you start to think about them. The ordering these rules imply is going to be tough to achieve. However, we are shooting for it and should use it as a guiding light.

We'll need users to correlate scores with their defect history to refine our attempts. Iterating on a pure and simple generic parsing and computation framework is key to effective iterative design. We must focus on the software architecture and the elegance of our chosen implementation language.

The end goal is to properly score the psychological complexity of hardware blocks in any HDL and to use that score to predict defect rates and schedule risks.
Last updated on October 10th, 2008
[Screenshot: The HDL Complexity Tool usage message]
