Cycle 1.5 – Measurement Planning

Data Dogs meets every first Monday at Hawthorne Lucky Lab from 6pm-8pm.

Registration is not required but content for the monthly meetup is published here:

Measure Twice, Cut Onc∑!

Measurement: The process or act of empirically assigning numbers, categories, or symbols to an entity with the intention of describing only the measured attribute and no other variables.

  • Direct Measurement – a metric that does not depend on any other attribute.
  • Indirect Measurement – a measurement derived from a set of direct measures.
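As a small sketch of the direct/indirect distinction, consider defect density: defect count and lines of code are direct measures read straight off the bug tracker and the codebase, while defects per KLOC is derived from them. The function name and numbers below are invented for illustration.

```python
def defect_density(defects: int, lines_of_code: int) -> float:
    """Defects per thousand lines of code (KLOC) -- an indirect measure."""
    return defects / (lines_of_code / 1000)

# Direct measures (illustrative numbers):
defects = 12
lines_of_code = 4800

print(defect_density(defects, lines_of_code))  # 2.5 defects per KLOC
```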

Measurement Planning: Documenting the strategic business requirements along with internal and external customer requirements. These requirements must first be validated to ensure you are measuring what you think you are measuring; the higher the stakes, the more important this validation becomes. Metrics that work well locally may still fail globally. (Is customer retention only measurable by subscription renewal?)
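The retention question can be made concrete: two plausible operationalizations of "retention" can yield the same headline number while counting different customers. The customer records below are invented for illustration.

```python
# Hypothetical customers: "renewed" and recent activity are both candidate
# operationalizations of retention.
customers = [
    {"renewed": True,  "active_days_last_90": 60},
    {"renewed": False, "active_days_last_90": 45},  # active but lapsed renewal
    {"renewed": True,  "active_days_last_90": 0},   # renewed but inactive
    {"renewed": False, "active_days_last_90": 0},
]

renewal_rate = sum(c["renewed"] for c in customers) / len(customers)
activity_rate = sum(c["active_days_last_90"] > 0 for c in customers) / len(customers)

print(renewal_rate, activity_rate)  # 0.5 0.5 -- same number, different customers
```

Validating the measurement means deciding which of these definitions actually captures the attribute the business cares about, before the number is reported.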

  • Confounding Variable – a lurking variable that explains away some or all of the correlation between an independent and a dependent variable.
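A quick simulation shows a confounder at work, using the classic ice cream/drownings example where temperature drives both. The data are simulated; holding the confounder roughly constant makes the apparent correlation largely disappear.

```python
import random

random.seed(0)
temps = [random.uniform(0, 35) for _ in range(1000)]
ice_cream = [t * 2 + random.gauss(0, 3) for t in temps]    # driven by temperature
drownings = [t * 0.5 + random.gauss(0, 2) for t in temps]  # also driven by temperature

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Raw correlation looks strong...
print(round(corr(ice_cream, drownings), 2))

# ...but within a narrow temperature band (confounder held roughly constant)
# it largely vanishes.
band = [(i, d) for t, i, d in zip(temps, ice_cream, drownings) if 20 <= t <= 22]
print(round(corr([i for i, _ in band], [d for _, d in band]), 2))
```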

Instrumentation: Counting. Matching. Comparing. Timing. What natural variation may exist in these techniques? Many attributes we wish to measure have no generally agreed-upon instrumentation method. Customer satisfaction can be instrumented by counting, comparing, and timing, but there is little guidance on whether you should collect one, two, or all three at once, in one sample, in many samples in one day, or across multiple days. The notion of tracking several related dimensions together is called a balanced scorecard (Kaplan and Norton), which can provide a more complete picture.
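A balanced-scorecard style summary might keep those dimensions side by side instead of collapsing them into one number. The dimensions and figures below are invented for illustration.

```python
# Hypothetical scorecard: counting, comparing, and timing instrumentation
# of "customer satisfaction", reported as separate dimensions.
scorecard = {
    "counting":  {"support_tickets_per_100_customers": 4},
    "comparing": {"nps_vs_last_quarter": 7},
    "timing":    {"median_response_hours": 1.5},
}

for dimension, measures in scorecard.items():
    for name, value in measures.items():
        print(f"{dimension:>10}: {name} = {value}")
```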

Organizational Measurement: It's easy to measure a few results people sense would be “good” in order to “know how we're doing”, treating them as the sum (∑) of everything we do; it's much harder to measure well. A good set of measures aligns the organization from top to bottom and links causes to effects. This kind of measurement analysis can help align the whole organization with its overall strategy.

Ian’s Philosophical Tidbits:

  • Measurement presupposes something to be measured. (Hyman’s Maxim)
  • You’ll never learn more about a measurement than by directly observing, as closely as you can, when and how it is taken.
  • If indeed causation is the basis for all empirical inference/prediction, then all empirical claims will follow from measuring causality.
  • When you don’t plan your measurements and you get negative results, you should actually be happy, because random chance should predict nothing! A model which does no better than random chance isn’t a model.
  • A boat goes nowhere when everyone inside it rows at his or her best; it produces chaos. Measure meaningful things which don’t demand excellence at every iteration.
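The random-chance point above can be checked directly: a classifier is only informative if it beats the best constant guess (the majority class). The labels below are invented for illustration.

```python
from collections import Counter

# Hypothetical labels: a "model" that always predicts the majority class.
actual    = ["churn", "stay", "stay", "stay", "churn", "stay", "stay", "stay"]
predicted = ["stay",  "stay", "stay", "stay", "stay",  "stay", "stay", "stay"]

accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
baseline = Counter(actual).most_common(1)[0][1] / len(actual)

print(accuracy, baseline)  # 0.75 0.75 -- the "model" is no better than chance
```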



1. Software Engineering Metrics

2. Level of Measurement Theory

