Management trends
My pursuit of a management degree arose from curiosity and a growing sense that I needed a better understanding of management and leadership as context for software development. General and project management trends shaped software development in the early years and in every era that followed.
Inflexible cultures were the norm during the Wild West era: hierarchical, command-control, focused on making plans and executing them. Great strides were being made in engineering, and its assumed predictability crept into management thinking. Businesses were universally measured in financial terms, driven, as always, by Wall Street. Software project success was measured by completion, schedule, and cost. Just getting software delivered counted as success, though schedule mattered too; cost was certainly important, but secondary to getting systems up and operating.
Businesses operated on the premise, correct or not, that the world was nominally predictable, and that if plans failed to materialize, the problem was execution, not planning. Good managers and executives got things done, end of story. The nascent IT world was less predictable, which put IT executives in the hot seat because general management made little allowance for the still experimental nature of computers and software.
Looking at how management evolution impacted software development, I found four factors important: industry evolution, work type, management style, and worker category.
As the industrial age blossomed in the early twentieth century, researchers like Frederick Winslow Taylor introduced the term scientific management, extolling the virtues of precise measurements and rigorous, prescriptive job duties. The view of the organization as machine became embedded in the management culture, and optimizing those machines became a key management goal.
Later, management theory began to change based on the work of individuals like Douglas McGregor and Peter Drucker. We hear about GOATs (greatest of all time) in various categories, but who would take the prize in literature? While it might depend on whose list you use, the general consensus is In Search of Lost Time by Marcel Proust.21 If there is a GOAT in management theory, it could well be Peter Drucker. Over his lifetime, Drucker wrote 39 books and coined the term knowledge work in 1959. Called the father of modern management, he defined management as follows: “Management is a multi-purpose organ that manages business and manages managers and manages workers and work” (Drucker, 1954). This succinct definition helps us assess changes over time as work changes, workers change, managers change, and managers of managers change. Drucker’s coining of the term knowledge work signaled that the very nature of work was changing.
“Organization as machine”—this imagery from our industrial past continues to cast a long shadow over management. Managers assumed stability was the normal situation and change was the “unusual state,” writes Rita McGrath in a 2014 Harvard Business Review article. McGrath identifies three ages of management—execution, expertise, and empathy. “If organizations existed in the execution era to create scale and in the expertise era to provide advanced services, today many are looking to organizations to create complete and meaningful experiences” (McGrath, 2014). These management style categories bring another dimension to our discussion of the software development eras.22
Unfortunately, I never found further McGrath material on these three ages beyond her Harvard Business Review article. Moreover, there is debate about the empathetic style.23 Even so, I liked the words McGrath used to categorize management periods. While the label command-control has often been applied to traditional management, none of the recent style names has emerged as “the” term. Names such as leadership-collaboration, adaptive leadership, Agile leadership, Management 3.0, servant leadership, and others have all appeared in the last two decades. So, I will nominate McGrath’s “empathy” as the best name for modern management.
KLAUS SCHWAB, founder and executive chairman of the World Economic Forum, proposed a way of looking at the evolution of work. Schwab’s four ages are centered on the advances of science and technology:
First: The Age of Mechanical Production
Second: The Age of Science and Mass Production
Third: The Digital Revolution
Fourth: The Imagination Age
As the Age of Science and Mass Production24 got under way, organizations grew larger and needed ways to manage multilayered hierarchies, from ground-level supervisors to executives. Practices such as standardized processes, quality control, and specialization of labor were widely applied. The goal was optimization: efficiency, consistency, measurability, predictability. This approach, dubbed command-control management, defined the Execution Age, an age in which industrial workers performed physical work.
With the Digital Revolution, computer technology evolved from mainframes to minicomputers to personal computers, broadening access to computing power. Concepts from other disciplines, such as psychology and sociology, began to creep into management theory, but this age primarily brought expertise into play, characterized by reengineering, Six Sigma, and management by objectives (MBO).
Software development would add its own terms in this period, namely waterfall and Monumental Methodologies. As the use of technology exploded, in software, medicine, materials, and computing devices, so did the need for knowledge workers. As knowledge work expanded, employees rebelled against existing manager–subordinate relationships, which drove early agilists to focus on building person-centric workplaces. In recognition of this change, Adaptive Software Development (Highsmith, 2000) used the term “leadership-collaboration” management, in contrast to the earlier “command-control,” to characterize the practices of this age.
Schwab doesn’t set the time frame for the fourth industrial age, nor does he explicitly name it the Imagination Age, although there are references to imagination and innovation in his work. He defines this age by the velocity of change, the breadth and depth of change caused by the rapid evolution and integration of technology, and the systems impact, referring to international sociological systems. To prosper in this era, we will need to define “work” yet again, understand the differences between knowledge workers and innovation workers, and know how to lead, organize, and manage in an empathetic way that encourages imagination and creativity.
The Imagination Age is the period beyond the Digital Revolution, in which creativity and imagination become the primary creators of economic value as technologies such as artificial intelligence, biotechnology, robotics, and quantum computing become integrated into our world.
Eventually, workers were classified into three types: industrial, knowledge, and innovation. As the nature of work changed, the types of workers required changed, which in turn changed the way managers and executives (managers of managers) viewed and interacted with the workforce.
REMEMBER HOW YOU MIGHT have felt about the certainty of the future before the COVID-19 pandemic. And now? The ripple effects of the pandemic are unknown and largely unknowable until they fully play out. Many of these changes were emerging before 2020, and the pandemic just accelerated them. As uncertainty has increased, people have begun to theorize ways to model uncertainty and devise tools and methods to manage it.
Stephan H. Haeckel, who worked at the IBM Advanced Business Institute, published a Harvard Business Review article in 1993 and went on to further explain his ideas in his book Adaptive Enterprise (Haeckel, 1999). His message: Organizations needed to move from a plan-and-execute approach to the future to a sense-and-respond one. Sense-and-respond enables organizations to sense the outside world, respond quickly, and use feedback to initiate the next cycle. Organizations dedicated to plan-and-execute become so plan-obsessed that deviations from the plan are treated as mistakes rather than opportunities.
Why didn’t Kodak respond to the digital camera threat? Did digital cameras appear overnight, or did Kodak miss market cues? Why did Netflix oust Blockbuster? Didn’t the latter pick up on Netflix’s rising share of the movie rental market? Sensing, in our fast-moving business and technology environment, can be extremely difficult. What is noise? When does accumulated noise rise to the level of alarm? In her latest book, Seeing Around Corners: How to Spot Inflection Points in Business Before They Happen (2019), Rita McGrath provides insight into this difficult question. In attempting to sort through and analyze streams of data, you need context: what arena are you playing in?
Dave Snowden devised a way to think about uncertainty in a context that supports decision making. In 1999, Snowden introduced the Cynefin model, derived from his study of complexity theory. The model has been embraced and widely used by the agile community. For each category of change, Snowden proposed a practice type to use (see the sketch after this list). His model identifies five categories, or types, of change:
Obvious, for which best practices suffice
Complicated, for which good practices are used
Complex, for which emergent practices are best
Chaotic, which requires novel practices
Disorder, for which practices might be unknown
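For the programmatically minded, Snowden’s pairing of change category and practice type amounts to a simple lookup. The sketch below is illustrative only; the enum, dictionary, and function names are mine, not part of Cynefin itself.

```python
from enum import Enum, auto

class ChangeType(Enum):
    """Snowden's five categories of change."""
    OBVIOUS = auto()
    COMPLICATED = auto()
    COMPLEX = auto()
    CHAOTIC = auto()
    DISORDER = auto()

# Snowden's pairing of change category to practice type,
# as summarized in the list above.
PRACTICE_FOR_CHANGE = {
    ChangeType.OBVIOUS: "best practices",
    ChangeType.COMPLICATED: "good practices",
    ChangeType.COMPLEX: "emergent practices",
    ChangeType.CHAOTIC: "novel practices",
    ChangeType.DISORDER: "unknown; first determine which domain you are in",
}

def recommended_practice(change: ChangeType) -> str:
    """Return the practice type Cynefin suggests for a change category."""
    return PRACTICE_FOR_CHANGE[change]

print(recommended_practice(ChangeType.COMPLEX))  # -> emergent practices
```

The point of the exercise is that the practice type follows from a diagnosis of the change type, which is exactly how the framework gets used in the transition from structured to agile development discussed next.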
As economies, businesses, and technologies evolved from somewhat complicated in the 1980s to complex, and then to chaotic in the 2000s, Snowden’s framework helps us understand the role that combating uncertainty played in the transition from structured to agile development. In this book, I will use the Cynefin model as an indicator of strategic, high-level changes in the business and technology worlds. At the tactical, project, and product levels, I will introduce the exploration factor (EF) in Chapter 6. These two “methods”—Cynefin and EF—provide tools for managing uncertainty.
Table 2.1 summarizes the changes in these factors over the four software development eras and helps us understand why methods and methodologies evolved as they did. During my evolution from structured to agile methods, these frameworks helped me put useful context around my work.
Table 2.1 Management and Work Evolution: Key Factors and Thinkers

Software Era | Management Style (McGrath) | Work Type (Schwab) | Worker Category (Drucker) | Type of Change (Snowden)
Wild West | Execution | Science and Mass Production | Industrial | Obvious/complicated
Structured | Execution/expertise | Digital Revolution | Knowledge | Complicated
Roots of Agile | Expertise | Digital Revolution | Knowledge | Complex
Agile | Empathy | Imagination | Innovation | Chaotic/disorder
TOWARD THE LATTER HALF of the Wild West era, I began to delve into project management practices. While project management had a long history, practices relevant to software development emerged only in the 1950s and 1960s. Gantt charts (task and schedule) were used successfully on projects such as the Hoover Dam in the early 1930s. Other large projects in these early years included the Manhattan Project, which developed the atomic bomb in the 1940s. Bernard Schriever, while in the U.S. Air Force, was credited with originating the term project management in 1954.
The cornerstone of modern project management techniques was the Program Evaluation and Review Technique (PERT), popularized by the Navy’s successful use of it on the Polaris submarine program. PERT and the Critical Path Method (CPM), the latter invented at Du Pont in 1958, began to be used in the U.S. aerospace, construction, and defense industries. The use of work breakdown structures (WBS) began in the early 1960s. The Project Management Institute (PMI) was founded in 1969 to research and promote project management practices. The most famous project undertaken in the 1960s was the Apollo Project (1963–1972), in which NASA successfully led six missions to explore the moon. Even though I had an exceedingly small part in the Apollo mission, this experience provided me with a happy quip: “My first project was a success.”
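For readers curious about what CPM actually computes, here is a minimal sketch in Python. The task names, durations, and dependencies are invented for illustration and are not drawn from the Du Pont or Navy work; the algorithm simply finds the longest chain of dependent tasks, which sets the shortest possible project schedule.

```python
# Minimal critical-path (CPM) sketch: tasks, durations (days), and
# prerequisites form a directed acyclic graph; the critical path is
# the longest-duration chain through that graph.
tasks = {                       # task: (duration, prerequisites)
    "design":  (5, []),
    "code":    (10, ["design"]),
    "test":    (4, ["code"]),
    "docs":    (3, ["design"]),
    "release": (1, ["test", "docs"]),
}

finish = {}   # memoized earliest finish time per task
path = {}     # back-pointer to the critical predecessor

def earliest_finish(task: str) -> int:
    """Earliest finish = duration + latest earliest finish of prerequisites."""
    if task in finish:
        return finish[task]
    duration, prereqs = tasks[task]
    start = 0
    for p in prereqs:
        if earliest_finish(p) > start:
            start = earliest_finish(p)
            path[task] = p      # this prerequisite is on the critical path
    finish[task] = start + duration
    return finish[task]

# Walk back from the task that finishes last to recover the critical path.
end = max(tasks, key=earliest_finish)
chain = [end]
while chain[-1] in path:
    chain.append(path[chain[-1]])
print("critical path:", " -> ".join(reversed(chain)), "| total:", finish[end], "days")
# -> critical path: design -> code -> test -> release | total: 20 days
```

The “docs” task, with its three days of slack, never appears on the critical path; shortening it does nothing for the schedule, which is exactly the insight that made CPM valuable to the aerospace, construction, and defense industries.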