
Posts tagged with 'heuristics'

I came across an abstract and slides (PDF) about using AOP to detect code smells. It got me thinking: are clean code and SOLID architecture themselves a cross-cutting concern? Usually "good, maintainable code" isn't ever written down explicitly as a requirement (functional or otherwise); it's just sorta assumed that developers will write the best code they can. Obviously that doesn't always happen, and the customer probably won't know one way or the other until after release.

As a developer, sometimes it's hard for me to be objective when looking at my code and the choices that I've made. Pair programming is one way to help alleviate this: I can get instant feedback from another developer as I'm coding and making decisions. Test driven development also helps, by forcing me to write code that's easy to test (and therefore loosely coupled). Not every project or code base has the luxury of either of these things: maybe there's only one developer, or maybe it's a legacy code base. Whatever the reason, another approach to take is code analysis: code metrics like cyclomatic complexity and the maintainability index. There are also heuristics, aka "code smells", that (not always, but usually) indicate that there might be a problem.
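To make the first of those metrics concrete, here's a quick illustration of my own (the class and the numbers are made up, not from any particular tool): cyclomatic complexity is roughly the number of independent decision points in a method, plus one, so even a small method can rack up a surprising score.

```csharp
// Illustration only: this little validator has three decision points
// (two ifs and one ||), so its cyclomatic complexity is 3 + 1 = 4.
public static class OrderValidator
{
    public static bool IsValid(decimal total, int itemCount, bool isRushOrder)
    {
        if (total <= 0)                      // decision point 1
            return false;

        if (itemCount == 0)                  // decision point 2
            return false;

        return !isRushOrder || total >= 50m; // decision point 3
    }
}
```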

There are three code smells addressed in Juliana Padilha's slides, none of which I'd heard of before:

  • Divergent change: this sounds like the opposite of Single Responsibility, i.e. the class has more than one reason to change, and thus its responsibility is diverging (see the sketch after this list).
  • Shotgun surgery: I've not heard this term, but I've certainly seen it (and been guilty of it myself). Making a change requires touching a handful of different classes instead of just one or two.
  • God class: I actually have heard of this, and if you consider classes with 300+ line Page_Load methods in ASP.NET to be God classes, then I've certainly seen it and done it.
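Here's a small, hypothetical sketch of what divergent change looks like. The class names are mine; the point is simply that one class has two unrelated reasons to change.

```csharp
// Hypothetical sketch of divergent change: InvoiceManager has to change when
// the database schema changes AND when the report layout changes. Splitting
// those responsibilities into separate classes removes the smell.
public class InvoiceManager
{
    // Reason to change #1: persistence details
    public void SaveToDatabase(Invoice invoice)
    {
        // ... open a connection, map fields, execute the insert ...
    }

    // Reason to change #2: presentation details
    public string RenderAsHtml(Invoice invoice)
    {
        // ... build an HTML table of line items ...
        return "<table>...</table>";
    }
}

public class Invoice { /* properties omitted */ }
```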

The metrics that she uses to find these smells are not traditional metrics, but "concern-driven" metrics, meant to identify code "scattering" and "tangling" (i.e. the code that AOP is meant to help refactor). They include:

  • Concern Diffusion over Class (CDC)
  • Concern Diffusion over Operation (CDO)
  • Number of Concerns per Class (NCC)
  • Concern Diffusion over Lines of Code (CDLOC)

These metrics weren't defined in the slides, but I found them in another white paper from Columbia.
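Just to give a feel for what a metric like CDC is counting, here's a rough sketch of my own (the mapping and class names are invented, and this isn't the paper's formal definition): if you know which classes contain code for a given concern, a CDC-style number is just how many distinct classes that concern is scattered across.

```csharp
// Rough sketch (my own illustration, not the paper's formal definition):
// given a hand-built map of concern -> classes that contain code for it,
// a CDC-style value is the count of distinct classes for that concern.
using System;
using System.Collections.Generic;

public static class ConcernMetrics
{
    public static int ConcernDiffusionOverClass(
        IDictionary<string, ISet<string>> concernMap, string concern)
    {
        return concernMap.TryGetValue(concern, out var classes) ? classes.Count : 0;
    }

    public static void Main()
    {
        var concernMap = new Dictionary<string, ISet<string>>
        {
            ["Logging"] = new HashSet<string> { "OrderService", "InvoiceManager", "UserController" },
            ["Caching"] = new HashSet<string> { "ProductRepository" }
        };

        // "Logging" is scattered across 3 classes -- a high count suggests a
        // cross-cutting concern that might be a candidate for an aspect.
        Console.WriteLine(ConcernDiffusionOverClass(concernMap, "Logging")); // prints 3
    }
}
```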

Matthew D. Groves

About the Author

Matthew D. Groves lives in Central Ohio. He works remotely, loves to code, and is a Microsoft MVP.
