Hi my name is Peter and I’m a developer, now playing an architect, and still trying to figure out what that means. But that’s a post for another day. Probably many posts.

To put this post in context, my background is mostly ‘front end’ software development in DOS, Windows, Web, and Mobile: usually software with a user interface, but occasionally not.


I have always struggled with software development complexity.

I love software development. Communicating and collaborating with people and computers, thinking, planning, modelling, designing, delving into my creative mind and soul, engineering and crafting software that fulfils a need, creating is … awesome!

Software development is complex, but I believe we make it more complex than it needs to be. In our quest to make software development better, we lose focus, and we end up creating complexity.


Complexity can exist at many different steps in the software development journey. What do you do with something that’s complex? You create a process to perform it of course.

Perhaps the most infamous process is the waterfall model.

‘The waterfall model is a sequential design process, often used in software development processes, in which progress is seen as flowing steadily downwards (like a waterfall) through the phases of Conception, Initiation, Analysis, Design, Construction, Testing, Production/Implementation, and Maintenance.‘

Sounds great, doesn’t it? Follow a detailed plan, err … ‘phases in a waterfall’, and at the end of the process you will have software that fulfils your need. People have been following it for decades. There’s only one problem.

‘The first formal description of the waterfall model is often cited as a 1970 article by Winston W. Royce, although Royce did not use the term “waterfall” in this article. Royce presented this model as an example of a flawed, non-working model.‘

But this didn’t stop people from using it. Still hasn’t. Perhaps there is perceived safety in process. If you follow instructions, surely you will achieve the desired end result?

On my first agile project we delivered a greenfield software system in 8 months, something that would have taken years using our traditional software development process, and that included learning an agile process, tools, and techniques. After the project ended I returned to one of the traditional software development areas. Two things struck me like physical blows, and left me with a feeling of complete loss and bewilderment.

The first was that email was my process again. I sent more emails in that first day than in the previous 8 months. The second was when I was sitting in my nicely partitioned little cubicle and overheard a conversation between two developers - “Reject defect #537. That’s not a defect because…” I forget the specifics, but I remember that it was two developers, an absurd defect number, and that they literally talked about it for 10 - 15 minutes. It never occurred to them that they should go and find the tester and talk to them.


Processes can be complex, and so can the dizzying universe of technology, which is created and wielded by people.

Programming languages are used to build software, and there are thousands of them, with more being created every year. Each one has its own syntax (form) and semantics (meaning) for expressing its instructions to a machine.

There are different types of software. For example, operating systems manage hardware and provide services for other software. Platforms, frameworks, libraries, and applications build on operating systems, and on each other.

Each piece of software can be complex, and working software can be a complex web of relationships between many of them. Just how complex is highlighted by a seemingly absurd comparison.

‘For decades computer scientists have strived to build machines that can calculate faster than the human brain and store more information. The contraptions have won. … Biology does a lot with a little: the human genome, which grows our body and directs us through years of complex life, requires less data than a laptop operating system.‘

Mark Fischetti, Computers versus Brains
Scientific American, November, 2011

It takes time to understand a technology, when to use it, and how to apply it, and people often come away with a different understanding of it, and a different level of skill in using it.

Early in my career I was working on a project when it was decided that a well known services company (name withheld) would come in, save the day, and show us how to do it properly. There was one recognised expert I will always remember. He sat quietly aloof at his desk every day, apart from the occasional very important closed-door meeting. Each day he would take a book from the vast array of ‘software development’ books on his desk (his personal collection), sit with it in front of his computer, and implement some new form of best-practice code or pattern for the part of the project he was working on. He was seen as highly productive.

When he left, his masterpiece was given to me, with no handover. You can guess the outcome. Either it was the most brilliant code I have ever seen in my entire life, and I was unworthy to begin to comprehend its magnificence, or it was a pile of shit. The truth was probably somewhere in the middle.

The sad thing is that I’ve seen this behaviour, to varying degrees, consistently over the years. It’s something I have been guilty of as well: people wielding technology, rather than collaborating with people using technology.


When I think about software development, I think about the following words in a positive, great context.


I put people first. People are the enduring thread that ties man to machine and gets it to do what they want. Process and technology seem to forget that sometimes.

I put technology second last. Without everything before it, it is useless.

I put simplicity last. I believe it will emerge if you have the rest.

If only we had all of these things, all of the time.

You can create software without some of these things, but not great software.