Alice and Bob, famous for their secret communications, are now working together on their own security product, which blocks all traffic from “Foo”, an evil hacking organisation.
The product is called “Foo Bar”.
Bob wrote out all the requirements he had gathered from interviewing many potential clients, Alice built a prototype, they tested it on beta users and started scaling out, always using feedback to build new features.
Time passed. Features were shipped. Clients were happy. The roadmap ahead looked grand.
Then something started happening. Bob kept hearing reports that Foo Bar wouldn’t always block traffic, but the reports were inconsistent in their detail even though the customers were consistent in their outrage: complaints of “Your product is meant to provide us with 100% Foo blocking” and that ilk.
But Alice can see nothing wrong. Every requirement they wrote out is in the product, every use case is covered, every unit test and functional test passes. Nothing is wrong. It works. The tests they wrote say so.
Yet the reports come in. Not in a flurry, but every now and again. And this occasionally gets the team down.
Alice is left with a permanent “cannot replicate” issue on her list, while Bob has a list of “must call” clients to whom he has nothing concrete to say, all for lack of information about what is going on.
And in a client meeting someone says that the software isn’t “high quality”.
In a flash, Bob and Alice both think of the pages of specification, the hundreds of unit tests and the hours of testing. If that didn’t ensure high quality, then what would? Yet still, after all the specification and continuous tests, where is this “quality”?
Software quality is extremely hard to define and, therefore, very hard to ensure.
Waterfall specifications, which chew up months of the domain experts’ time, are often hated by the implementation team, who catch the specification as it is thrown over the wall. Agile methods, which encourage continuous deployment into production and thereby skirt the waterfall, require capturing as much detail as possible from production so the team can react rapidly.
All this is meant to ensure “value for the user”: features which provide value, business propositions which provide value, integrations which provide value.
But not all value and quality in software comes from outside specifications and direction; some of the quality just happens in the build of the software and this is unavoidable. It comes out of coding habits, best practices which aren’t quite on-point, and the organic nature of many software projects.
Software is the best specification of itself; to specify a piece of software 100% is to write the software. As the Agile Manifesto asserts: “working software is the primary measure of progress”.
If you cannot 100% specify software from the outside, where does that leave guaranteeing quality?
In a modern, continuous-deployment world this shifts the measure of value and quality from upfront testing of all possible scenarios to upfront testing of all understood scenarios plus detailed monitoring of all actual scenarios.
While Alice followed best practices by writing tests and being customer-led with new features, she also needed to see what was happening when the software ran in the wild. Bob needed to stop asking customers for “steps to replicate the issue” and instead ask “when did it happen?”
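What that shift looks like in practice is recording every decision the product makes, timestamped, so a customer’s “it happened around 3pm on Tuesday” can be answered from the logs rather than from a replication attempt. A minimal sketch in Python (the function names, event fields and the in-memory list are illustrative assumptions, not anything from a real Foo Bar codebase; a production system would ship these events to a log store):

```python
from datetime import datetime, timezone

# Hypothetical event store; in production this would be a log
# shipper or time-series database, not an in-memory list.
EVENT_LOG = []

def record_decision(source, blocked, reason):
    """Append a timestamped record of every block/allow decision."""
    EVENT_LOG.append({
        "at": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "blocked": blocked,
        "reason": reason,
    })

def events_between(start, end):
    """Answer 'when did it happen?': return every decision whose
    ISO-8601 timestamp falls inside the given window."""
    return [e for e in EVENT_LOG if start <= e["at"] <= end]
```

With something like this in place, a report of a missed block becomes a query over a time window, and an “allowed” event with its recorded reason is the test case no one wrote.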
This subtle shift makes a profound change to the pressures on the product team. More emphasis goes to the tools of monitoring, repeatability and the replayability of previous events, and less to the political language of “sufficient testing”, firefighting and placating angry customers.
Software quality is usually defined as “lacking in defects” or “matching the spec”, but that isn’t the whole picture. Software quality includes the unknown behaviours of the software in the wild. The test cases no one wrote. These usually arrive as “feedback” from clients and users, but if we factor this unknowable aspect into software quality, it will change how we approach continuous integration, continuous deployment and the other tools which ship software quickly.