Agility as an Expression of Empiricism

It is in the layers of middle management that fortresses of rationalism, determinism, and command-and-control mentality still have the strongest foothold.

The field of software development is relatively young, but it has progressed very quickly. In just a few decades it has undergone maturing processes that have taken much longer in other fields. During the last decade Agile methodologies have gained widespread acceptance and are becoming mainstream. The transition to Agile is undertaken both by well-established “traditional” software development organizations and by more rudimentary “cowboy-style” development shops. In this post I examine the reasons why this transition is taking place, and what consequences follow from it.

From Rationalism to Empiricism

Peter Wegner [WEGNER-1997] examines the evolution of computer technology. He is concerned with computational complexity and with proving that interaction cannot be expressed by algorithms, tracing the evolution from Turing Machines to Interaction Machines and broadening the realm of computability to include interactive computation. His observations provide an interesting perspective and help interpret the reasons behind the current transition towards Agile methodologies.

Wegner discusses the evolution from philosophy to the natural sciences, noting how it took 2000 years to go from Plato’s ideal world to empiricism, through Descartes, Hume, Hobbes, Hegel, Kant, Marx, Russell, Boole, Hilbert and Gödel. Wegner states: "Modern empirical science rejects Plato’s belief that incomplete knowledge is worthless."

Managing incompleteness is a key distinction between philosophical rationalism and scientific empiricism. The fact that software deals with uncertainty is well established. In this sense, uncertainty can be considered a “lighter” form of incompleteness. More importantly, it can even be thought of as a concrete manifestation of incompleteness, and in particular one that can be “experienced” by human intellect and psychology.

Uncertainty is part of everybody’s everyday experience. Uncertainty is more concrete and tangible than rational ideas.

Uncertainty and Incompleteness

Uncertainty is inherent in software development, as noted by many scholars.

Watts Humphrey stated the Requirements Uncertainty Principle [HUMPHREY-1995]:

For a new software system, the requirements will not be completely known until after the users have used it.

Hadar Ziv expressed the Maxim of Uncertainty in Software Engineering (also known as “Ziv’s MUSE”) [ZIV-1996]:

Uncertainty is inherent and inevitable in software development processes and products.

Earlier, Harlan Mills came to the conclusion [MILLS-1972]:

There is no such thing as an absolute proof of logical correctness. There are only degrees of rigor […] which are each informal descriptions of mechanisms for creating agreement and belief in a process of reasoning.

A notable consequence of Mills’s statement is that uncertainty is implied by the need to reach “agreement” and “belief.”

Wegner’s Lemma

[WEGNER-1997] makes the most powerful statement, later formalized and known as Wegner’s Lemma in [WEGNER-1999]:

Before Gödel, the conventional wisdom of computer scientists assumed that proving correctness was possible (in principle) and simply needed greater effort and better theorem provers. However, incompleteness implies that proving correctness of interactive models is not merely difficult but impossible.

Wegner’s study of the consequences of combining interaction with computational incompleteness implies a powerful approach to problem solving. Elsewhere, [WEGNER-2006] explains the limitation inherent in traditional approaches:

The restriction of problem solving to thinking and question answering excludes interactive forms of problem solving that depend on the behaviour of the world rather than on a priori human beliefs.
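
To make the distinction concrete, here is a deliberately simplified sketch in Python. It is purely illustrative and not drawn from Wegner’s papers: the function names and the toy “world” are hypothetical, invented only to contrast the two styles of problem solving.

    import random

    def algorithmic_solver(requirements):
        # Algorithmic style: all inputs are known up front and the output is a
        # function of those inputs alone; correctness could in principle be proven.
        return ["feature for: " + r for r in requirements]

    def ask_the_world(product):
        # Hypothetical stand-in for users exercising the system; deliberately
        # unpredictable, because the behaviour of the world is not known a priori.
        return random.choice(["missing feature", "wrong workflow", "good enough"])

    def interactive_solver(initial_guess, iterations=3):
        # Interactive style: each step depends on feedback from the world, which
        # cannot be enumerated in advance, so no fixed input/output proof applies.
        product = list(initial_guess)
        for _ in range(iterations):
            feedback = ask_the_world(product)
            if feedback != "good enough":
                product.append("rework after: " + feedback)
        return product

    print(algorithmic_solver(["login", "reports"]))
    print(interactive_solver(["login"]))

Run twice, the algorithmic solver always produces the same result, while the interactive solver’s outcome depends on what the “world” answered along the way.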

Rationale for Empiricism in Software Methods

Agile methods acknowledge the role of interactivity, both between the humans participating in the process and between humans and the systems under development. Notably, one of the values of the Agile Manifesto is “individuals and interactions [over processes and tools].” This value implies broadening the scope of concern of any agile methodology to include psychology (individuals) and sociology (interactions), and therefore such an approach becomes more comprehensive than conventional, purely technical approaches.

Agile, by taking into account psychology and sociology, covers “incomplete” (excuse the pun!) areas of traditional approaches.

Interactions imply empiricism. Jeff Sutherland, co-creator of the SCRUM methodology, often cites Humphrey’s Requirements Uncertainty Principle, Ziv’s MUSE and Wegner’s Lemma as theoretical foundations for SCRUM. Referring to Wegner’s Lemma, [SUTHERLAND-2001] states: "Here was mathematical proof that any process that assumed known inputs, like the waterfall method, was doomed to failure."

In the same vein, [SCHWABER-2001] describes how SCRUM was conceived as an empirical process, in contrast to simpler “defined” processes, and refers to process control expert [BABATUNDE-1994]:

It is typical to adopt the defined (theoretical) modeling approach when the underlying mechanisms by which a process operates are reasonably well understood. When the process is too complicated for the defined approach, the empirical approach is the appropriate choice.

On this basis, Schwaber suggests that an empirical approach is more appropriate.
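
As an illustration of the contrast Schwaber draws, the following minimal Python sketch compares a defined process with an empirical one. The execute/inspect/adapt helpers are hypothetical stand-ins; this is not an implementation of SCRUM or of Ogunnaike’s process-control models.

    def execute(step):
        return "built: " + step

    def inspect(done):
        # Hypothetical acceptance check standing in for a review of real results.
        return "needs refinement" if len(done) < 3 else "acceptable"

    def defined_process(plan):
        # Defined control: the mechanism is assumed well understood, so a fixed
        # plan is executed start to finish without adjustment.
        return [execute(step) for step in plan]

    def empirical_process(goal, max_cycles=5):
        # Empirical control: short cycles of execute -> inspect -> adapt,
        # appropriate when the process is too complicated to model up front.
        done, backlog = [], [goal]
        for _ in range(max_cycles):
            if not backlog:
                break
            done.append(execute(backlog.pop(0)))
            if inspect(done) == "needs refinement":
                backlog.append("refinement")  # re-plan from observed results
        return done

    print(defined_process(["spec", "design", "code", "test"]))
    print(empirical_process("working product"))

The defined variant succeeds only if the plan was right to begin with; the empirical variant discovers the remaining work by inspecting intermediate results.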

Other Contemporary Precedents

Just as incompleteness caused the evolution from rationalism to modern empirical science, one could interpret that incompleteness is playing a similar role in provoking the transition from traditional methods to Agile ones.

While invoking the transition from rationalism to empiricism as an explanation for the transition from traditional to Agile methodologies might seem far-fetched, there are other, more recent precedents that support this as a general trend. In [NERUR-2007], Sridhar Nerur observes how the conceptual shift from traditional to Agile has been experienced in other fields, such as Architecture and Strategic Management.

Empiricism in Architecture

In Architecture, design problems were originally treated as a well-defined sequence of steps that merely had to be spelled out and followed. However, Nerur tells us that:

As early as 1963, continuous feedback among phases, communication, and iterative cycles of analysis and synthesis were recognized as key aspects of design.

Empiricism in Strategic Management

Similarly, in Strategic Management, Nerur notes that there has been:

A shift from a mechanistic perspective to a perspective that acknowledges the existence of environmental uncertainty and complexity.

With regard to strategic management, contemporary approaches have abandoned the notion that the world is unchanging and foreseeable, and have replaced analysis and reasoning with incremental learning, participatory decision making, and feedback, effectively transforming strategy formulation into an emergent process. Nerur concludes:

The trend in management thinking, moving from a deterministic/mechanistic view of problem solving to a dynamic process, characterized by iterative cycles and the active involvement of all stakeholders, is reflected in software development as well.

Considerations for Software Engineering Management

If the changes happening in the world of software engineering management are just a manifestation of the more general cultural transition from rationalism to empiricism, then there is yet a lot more ground to cover.

The “mapping” exercises that are often undertaken to find common ground between traditional (e.g. PMBOK/CMM) and agile approaches seem like a compromise, and compromises are never a good solution. The underlying conflicts remain hidden beneath the compromise. To resolve the conflict, some assumptions have to be shown to be flawed.

For example, one objection often raised by the traditional camp is that agile methods do not scale. While it is true that the origins and the anecdotal success stories on the agile side are in the small scale, there are signs that even more aggressively agile processes can produce results that dwarf the scale of the largest traditional projects. For instance, [DENNING-2008] enumerates several prominent examples: the Internet, the World Wide Web and Linux; the technologies developed by Google, Amazon and Apple; and even some large-scale banking applications. Furthermore, Denning gives evidence of how large-scale evolutionary agile processes outperform traditional ones, citing a World Wide Consortium for the Grid experiment that compared evolutionary methods against standard acquisition processes. The conclusion:

[…] after 18 months, the standard process delivered only a concept document that did not provide a functional architecture, had no working prototype, deployment plan or timeline, and cost $1.5M. The agile method produced a “good enough” immediately usable 80% success for 1/15 the cost of the standard method, which seemed embarked on the typically long road to disappointment.

While this is just one example (relating to scalability), it illustrates the point: in the inevitable clash of ideas and powers between rationalism (traditional methods) and empiricism (agile methods), the proof of the pudding is in the eating.

Naturally, there are trends to take into account, such as the inevitable change that agile methods themselves are subject to, and the forceful counter-reactions and influences of rationalistic approaches. Already, [GRIFFITHS-2007] observed: "Agile methods are increasingly misapplied, watered down, and then criticised when projects fail."

Unlike typical management theories, agile has its origins in the field, led by programmers inspired by excellence in craftsmanship. The engineering practices of eXtreme Programming and the influences of Open Source Software development have increasingly improved the production capability of software teams. In fact, [ANDERSON-2008] observes that the “locus of interest” of Agile has moved from “programming” to “project and product management”. Anderson explains that the reason focus is no longer on coding is that: "Programming and programmers are not the constraining factor on improved performance!"

On the one side, we can expect agile to further move the “locus of interest” to yet higher levels of responsibility, where agile approaches are increasingly challenging the traditional ones as an alternative way to handle risk and still deliver.

On the other side, the fact that the agile movement was initiated down at the lowest link of the food chain (by the humble programmer) is most significant. It implies that any kind of agile transformation, or agile adoption programme, needs to be supported by a deep understanding of the nature of software development. While programmers, coders and hackers (in the honorable sense of the word) have an intuitive understanding of the nature of software, once we escape the realm of the programming practitioner, the nature of software becomes an ephemeral concept.

It is here that the almost philosophical contribution of [WEGNER-1997] becomes crucially important. Accepting incompleteness as an inescapable aspect of any software related project, and thereby moving from rationalism to empiricism, is the key.

Currently, the struggle is at the project/middle management level. As explained by [NERUR-2007], contemporary management theories already incorporate concepts and approaches that are typically inspired by empiricism. So while strategic management “gets it,” and the humble programmer “groks it,” it is in the layers of middle management that fortresses of rationalism, determinism, and command-and-control mentality still have the strongest foothold. Once those layers have gone through the transition, we will have come full circle, and the next constraint will have to be sought.

Published: March 25, 2012