Software development has been going on for a long time. As with any discipline, it has evolved. When I began my career in software development, a friend loaned me a book entitled “The Mythical Man-Month” by Frederick P. Brooks Jr. There have been many editions of this book since.
It remains a stark reminder that throwing more resources at a software development project will not necessarily make it go faster. Stories abound of software projects that spun out of control on a grand scale, running orders of magnitude over budget and schedule. As I wrote in my final BPTrends Column for 2013 (Process Automation from Scratch), there are a vast number of new architectures and tools that aid in the rapid development of applications.
This year, my column will explore application development from a process perspective. This topic should be relevant to anyone involved in business process management. Ultimately, process optimization hits a brick wall without automation. Once you introduce technology to a process, it doesn't really matter whether you're using a business process management system (BPMS) or building it from scratch; you are going to be developing a solution.
Those outside of the software development world tend to automatically assume that the major challenge will be building the solution. This could not be further from reality. If you've ever been involved in developing a software solution, you probably know that the devil is in the details. Those of us involved in business process management (BPM) know almost every process has exceptions that must be managed. The path that does not contain these exceptions is often called the “happy path” by business analysts.
Too often, software is developed to support the happy path and fails to deliver real-world results. We will be exploring how to mitigate this tendency in the projects you undertake. We will learn from some real-life examples and try to put things in the proper context for BPM professionals. I happen to believe that BPM strategies can be harnessed to deliver first-rate process automation, but I also recognize that there are other viewpoints, and we will endeavor to evaluate some of these as well.
A Brief History of Software Development
When I started in the computer industry, it was 1978. Software programmers had been around for over a decade by then, but it was roughly the first time that someone could learn a programming language and write software while sitting at their kitchen table. Prior to the birth of the microcomputer, software was developed by large teams of programmers sitting outside the air-conditioned rooms that housed the computers.
Projects of any significance used what has become known as the Waterfall Model of software development. It is so called because everything flows downhill: requirements flow into design, design into construction, construction into testing, and testing into deployment and maintenance.
This approach seemed to make perfect sense. You can't build something until you understand the environment and the problem you're trying to solve.
Naturally, gathering requirements comes first. This has never changed. What has changed is the process used to gather requirements and the level of detail at which they are captured.
Using the waterfall model means that all the information that analysts gather during this phase will provide the universe of knowledge that will be used to design the solution.
Until fairly recently, the tools available to build software required extensive effort to do anything complex. Before a team would put in that sort of effort, they needed to make sure they were building the right thing. The best way to accomplish this was to write detailed specifications.
Business process automation almost always requires collection, manipulation and reporting of data. This made it easy to decide the first step of the design. Designers began with a data model. This informed the program logic, interface and outputs. Functional design documents traditionally contain all four of these elements.
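To make that design sequence concrete, here is a minimal sketch in Python. The names (`Invoice`, `outstanding`, `report`) are hypothetical, invented purely for illustration; the point is that the data model comes first, and the program logic and outputs are shaped by the fields it exposes:

```python
from dataclasses import dataclass
from datetime import date

# Step 1: the data model -- everything downstream is derived from it.
@dataclass
class Invoice:
    number: str
    customer: str
    issued: date
    amount: float
    paid: bool = False

# Step 2: program logic, shaped by the fields the model exposes
# (the "collection and manipulation" of data).
def outstanding(invoices):
    """Return only the unpaid invoices."""
    return [inv for inv in invoices if not inv.paid]

# Step 3: output/reporting, again driven by the model's fields.
def report(invoices):
    unpaid = outstanding(invoices)
    lines = [f"{inv.number}  {inv.customer}  {inv.amount:.2f}" for inv in unpaid]
    total = sum(inv.amount for inv in unpaid)
    return "\n".join(lines + [f"TOTAL OUTSTANDING: {total:.2f}"])

invoices = [
    Invoice("A-001", "Acme", date(2014, 1, 5), 1200.0, paid=True),
    Invoice("A-002", "Acme", date(2014, 1, 9), 450.0),
]
print(report(invoices))
```

Notice how a change to the model (say, adding a currency field) would ripple through the logic and the report — which is exactly why designers of that era started with the data model.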
The challenge has always been writing these documents for two very different constituencies. Before software can be constructed, the “client” must sign off on the design by validating that it will effectively meet his/her requirements. Then, that same document must be used by programmers to actually build the application.
This dichotomy of purpose is the major failure point of the Waterfall Model. Some have solved this problem by writing two sets of design documents, one for each constituency. Of course, this raised the cost and time associated with design, and few clients have the stomach for such added expense. The usual compromise was a single document that neither let the client effectively visualize how the application would look to the end user, nor provided enough detail for programmers to code the software without further clarification.
Working from the available design documentation, programmers began to build the application. As the inevitable ambiguities emerged, the team had three options:
1. The programmer chooses the best course of action
2. The programmer asks the analyst, who chooses the best course of action
3. The programmer asks the analyst, who then asks the client to choose the best course of action
As you can see, there are more words in each subsequent option. There is also more time (and thus delay) associated with each option. As you might imagine, option #1 is the preferred choice when a project is under growing time pressure, often with disastrous results.
Multiply these choices by the many ambiguities encountered even with good design documentation (let alone bad documentation, which is far more common), and it's not hard to see why projects can go so very badly off course. Keep in mind that programmers often worked for months or even years without presenting any work product to the client.
Once the application is built, there are two major things that need to be determined:
1. Does the product meet the operational details of the specifications?
2. Does the product actually work within the process for which it was designed?
Many decisions will have been made without consulting the client during development. In a perfect world, this wouldn't be the case, but I know of nobody who lives in such a place. In the real world, some decisions will have been better than others; just how well the application meets #2 will depend on those “less than better” decisions.
Using the Waterfall Model, the application will typically be tested and debugged based exclusively on #1 before what is known as client acceptance testing (#2) takes place. This means that great effort may be put in to make the software function “properly” before the client ever lays eyes on it. It was not uncommon for testing and debugging to take 50%-200% of the time it took to build the application in the first place, but who wants to show the client substandard work?
When the client finally gets the software, the team finds out for the first time how well they did. If their decisions were good, the application will be well received and additional effort to make it work within the specified process (or processes) will be minimal. If not…use your imagination.
Deploy and Maintain
The application ultimately needs to be made to work within the production environment in which it will live. These environments are rarely static. They contain multiple components from multiple vendors, and each vendor is regularly updating its products. What works together today will not necessarily stay that way. So even if the application itself never changes, there may be a need to update it to work effectively in a dynamic environment.
This all assumes that the application remains the same. That would be great if the process remained the same, but that's not the way the marketplace likes it. Competition and advancing technology continually force us to improve our processes, and eventually the application that worked so well is inadequate to the task at hand. Depending on how well the application was constructed, it may be possible to expand and enhance it to handle evolving processes. Eventually, it will probably need to be tossed, but building it to last will spare you the joy of doing it all again.
A New Dawn
The Waterfall Model was used for many years. In 2001, a group of developers who had been experimenting with a better approach coined the term “Agile Development.” Since that time, “Agile” has grown to be the predominant approach. As the name implies, it is better suited to our modern “turn on a dime” world. In my next installment, we will take apart the Agile approach and compare it to the Waterfall Model.