If you are around my age, work as a programmer, and took some classes on the subject since 2009, you were probably subjected to Robert C. Martin's book Clean Code: A Handbook of Agile Software Craftsmanship. (As of today, the first edition from 2008 hasn't been updated. Clearly, Clean Code must be a—nay, the—perfect book, which does not require improvements or adjustments after all those years.)
I bought the book out of personal interest, probably in 2009. Like many others, I was fascinated by it at first, but never managed to read more than one or two chapters. I also rarely bothered to read the specific sections offering solutions to problems I faced on a daily basis. Probably the solutions offered weren't too helpful, even though I worked as a Java programmer at the time, and Clean Code is very much about the kind of Java we wrote back then. So Clean Code collected dust on my bookshelf.
Clean Code and the Clean Code Cult
When I studied computer science at the local technical college, I was exposed to Clean Code again. Or rather, it was shoved down the students' throats as a gospel by disciples of the Clean Code Cult, as I like to call them.
The members of the Clean Code Cult strengthen their belief by joining up for a ritual called Clean Code Shaming, where they superficially look at a piece of code they don't understand on first sight, and then just yell «Clean Code!!!1» at its author in order to give proof of their superiority and sophistication.
Remember: Code you initially don't understand is always just bad code and certainly not a chance to improve your understanding of programming, especially if pointless techniques like Memoization or Lexical Closures are used, i.e. techniques you haven't been exposed to yet.
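For the record, neither technique is black magic. Here is a short, hypothetical Python sketch of both at once: memoization implemented by means of a lexical closure (the `cache` dictionary lives on between calls because the inner function closes over it):

```python
def memoize(fn):
    """Memoization via a lexical closure: `cache` persists across
    calls because `wrapper` closes over it."""
    cache = {}

    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]

    return wrapper


@memoize
def fib(n):
    """Naive recursive Fibonacci; tolerable only thanks to memoization."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)


print(fib(30))  # 832040, computed without exponential blowup
```

Code like this is exactly the kind that looks «unclean» on first sight and perfectly reasonable on second.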
My friend meillo pointed out the cult-like nature of Clean Code roughly at that time when its disciples came after me. A leader being called «Uncle Bob», a scripture that doesn't require a second edition after many years (but spawns sequels such as The Clean Coder, Clean Architecture, Clean Agile, and Clean Craftsmanship), disciples willing to align themselves into grades and wear bracelets for self-castigation: If you still don't think that Clean Code is a cult, just talk to one of its disciples, point out a contradiction in the Clean Code book, and witness the angry reaction caused by your blasphemous remark.
But wait, a contradiction in Clean Code? That's impossible! Or maybe not?
Bumbling Boomer Bob on Function Arguments
Working as a programmer for almost twenty years, I am still a layperson when it comes to the exegesis of Clean Code, because I obviously still haven't been enlightened by this masterpiece yet. Trying to slay a straw man—and failing to do so!—will hopefully bring me back to the Right Path, so that I can finally abandon my wrongthink and give up on my hellish ends.
Let's hear what The Enlightened has to say about function arguments (Clean Code, Chapter 3, p. 40):
The ideal number of arguments for a function is zero (niladic). Next comes one (monadic), followed closely by two (dyadic). Three arguments (triadic) should be avoided where possible. More than three (polyadic) requires very special justification—and then shouldn’t be used anyway.
This is misleading on so many levels that I need to dissect it in multiple paragraphs.
Uncle Bob™ uses the terms niladic, monadic, dyadic, triadic, and polyadic for functions with arities of 0, 1, 2, 3, and n, respectively. There certainly is a qualitative difference between 0, 1, and n (nothing, something, and many things), but the difference between 2, 3, and n is only of a quantitative nature. Judging by the terms being used, however, the author sees a qualitative difference between all those arities, to wit (Clean Code, p. 42):
A function with two arguments is harder to understand than a monadic function.
Functions that take three arguments are significantly harder to understand than dyads.
This difference is clearly of a quantitative, not of a qualitative nature, because adding another argument only makes the function «harder to understand», no matter whether you go from one to two, or from two to three arguments. You just move up one step on a continuum.
Having been exposed to Haskell for a couple of hours, I'd expect to read about Curried Functions here: functions of arity n that return a function of arity n-1 when being invoked with a single argument. But obviously those Haskell guys must be stupid, because they also bother with Partial Function Application, which only makes sense when you have multiple arguments, i.e. dyadic, triadic, or even—Bob forbid—polyadic functions!
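To illustrate the idea for readers outside the Haskell bubble, here's a rough Python sketch (the function names are made up for the example): partial application fixes some arguments of a polyadic function, while a curried function takes its arguments one at a time:

```python
from functools import partial


def volume(length, width, height):
    """A perfectly honest triadic function: volume of a box."""
    return length * width * height


# Partial application: fix two arguments, obtain a monadic function.
base_fixed = partial(volume, 2, 3)  # length=2, width=3 already supplied
print(base_fixed(4))  # 2 * 3 * 4 = 24


# Manual currying: each call with one argument returns a function
# of arity n-1 until all arguments have been supplied.
def curried_volume(length):
    return lambda width: lambda height: length * width * height


print(curried_volume(2)(3)(4))  # 24
```

In Haskell this needs no ceremony at all, since every function is curried by default; the sketch merely shows that the concept is neither exotic nor hard.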
This must also be the reason why SICP makes for such a bad introductory textbook: the Professors Abelson and Sussman clearly hadn't read Clean Code when coming up with this abomination (Structure and Interpretation of Computer Programs, Exercise 1.32, Chapter 1, p. 61):
(accumulate combiner null-value term a next b)
Six arguments, are you kidding me? Polyadic ad nauseam! Bob hates it.
(If you didn't figure out where my exegesis went from serious to sarcastic, stop reading this text and just forget about it. Put on the Clean Code bracelet of the day and refactor that cryptic this.x += 3; statement into a clean increaseXByThree(); niladic method instead.)
But Bumbler Bob is here to help (Clean Code, p. 43):
When a function seems to need more than two or three arguments, it is likely that some of those arguments ought to be wrapped into a class of their own.
I wonder how many arguments a constructor for such a class might require. Certainly, the introduction of the Builder Pattern would be The Right Solution™ for this issue. Much clearer than having a function with four arguments. (Behold, the enlightenment is kicking in!)
Boomer Bob is clearly familiar with the concept of Pure Functions, otherwise he wouldn't object so strongly to side effects (Clean Code, p. 44):
Side effects are lies. Your function promises to do one thing, but it also does other hidden things. Sometimes it will make unexpected changes to the variables of its own class. Sometimes it will make them to the parameters passed into the function or to system globals. In either case they are devious and damaging mistruths that often result in strange temporal couplings and order dependencies.
Unless Bobby-O considers lies good, he clearly speaks out against side effects here, as he spoke out in favour of functions without arguments before. So we should all be writing side-effect free functions without arguments. But what can such a function return?
- A constant value
- A random value
Unless the function also operates on global variables or on the properties of an object. But then those functions (or methods) are not really side-effect free, because their semantics are influenced by the side effects of other functions/methods.
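A niladic, side-effect-free function really is that limited. A small, contrived Python sketch of the two cases:

```python
import random


def answer():
    """Niladic and side-effect free: all it can ever do is
    return a constant."""
    return 42


def dice():
    """Niladic and seemingly harmless, but it secretly depends on
    (and mutates) the global state of the `random` module, so it is
    not side-effect free after all."""
    return random.randint(1, 6)
```

The constant is useless as a function, and the «random» one only works by smuggling in hidden state, which is precisely the kind of lie Bob rails against.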
In order for a function to do something useful without side effects, function arguments are needed. The number of arguments needed is determined by the domain of the function, i.e. by what the function is actually supposed to do. Shall a function compute the solutions to a quadratic equation? Then the coefficients a, b, and c are needed. Shall a function draw an arc on a canvas? Then you need to define the coordinates of the circle's center (x and y), its radius (r), start and end angle, and whether or not the arc shall be drawn clockwise or counter-clockwise.
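The quadratic equation makes this concrete: the three coefficients are dictated by the problem itself, not by sloppy API design. A sketch (the function name is my own):

```python
import math


def solve_quadratic(a, b, c):
    """Real solutions of a*x^2 + b*x + c = 0. Triadic by necessity:
    the three coefficients are inherent to the problem."""
    d = b * b - 4 * a * c  # discriminant
    if d < 0:
        return []  # no real solutions
    if d == 0:
        return [-b / (2 * a)]  # one double root
    return [(-b - math.sqrt(d)) / (2 * a),
            (-b + math.sqrt(d)) / (2 * a)]


print(solve_quadratic(1, -3, 2))  # x^2 - 3x + 2 = 0 -> [1.0, 2.0]
```

Wrapping a, b, and c into a Coefficients class would not make the function any easier to understand; it would merely move the three values one level of indirection away.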
Complexity: Inherent and Accidental
Admittedly, Clean Code offers some useful advice to make such APIs easier to understand, e.g.:
- Instead of having two loose parameters x and y, a Point abstraction might come in handy (see Argument Objects, p. 43).
- Instead of passing flag arguments (clockwise/counter-clockwise), provide two functions (see Flag Arguments, p. 41).
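Both pieces of advice can be sketched in a few lines of hypothetical Python (the drawing routines here just return their parameters, standing in for a real canvas API):

```python
from dataclasses import dataclass


@dataclass
class Point:
    """Argument object: bundles the two loose coordinates x and y."""
    x: float
    y: float


def draw_arc_clockwise(center: Point, radius: float,
                       start: float, end: float):
    # One function per orientation, so no boolean flag is needed.
    return {"center": center, "radius": radius,
            "start": start, "end": end, "clockwise": True}


def draw_arc_counter_clockwise(center: Point, radius: float,
                               start: float, end: float):
    return {"center": center, "radius": radius,
            "start": start, "end": end, "clockwise": False}


arc = draw_arc_clockwise(Point(0.0, 0.0), 1.0, 0.0, 90.0)
```

Note that even after both refactorings, each function is still polyadic: four pieces of information go in, because four pieces of information are required.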
But Boomer Bob entirely misses the point: the difference between inherent and accidental complexity. Solving a quadratic equation requires three arguments. An arc is defined by its mid-point, radius, angles, and curve orientation. This is complexity inherent to the problem at hand.
The thickness, colour, and opacity of an arc being drawn on a canvas all have to do with a specific application of the concept. So while I consider it good advice to wrap drawing details (thickness, colour, opacity) into an argument object, or to use a Point abstraction instead of two loose arguments, there's no reasonable way to deal with arcs using niladic or monadic functions, save for Curried Functions, which clearly aren't on Babbling Bob's mind here.
You might also separate the computation of an arc from actually drawing it. Here, the computation returns the coordinates to be drawn, which you can pass to the draw method of the canvas object, maybe together with the drawing details. A pair of monadic methods for setting coordinates and drawing details on that object won't make anything clearer, but only introduce more side effects: accidental side effects this time, which change the state of the object without any palpable benefit. Calling the draw method to actually draw on the canvas is the only desired side effect, the complexity introduced thereby being of an inherent nature.
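A sketch of this separation, with invented names: arc_points is a pure function carrying all the inherent arguments, while Canvas.draw is the single desired side effect:

```python
import math


def arc_points(center, radius, start, end, steps=16):
    """Pure function: computes the coordinates along an arc.
    Polyadic by nature; no side effects whatsoever."""
    return [
        (center[0] + radius * math.cos(start + (end - start) * i / steps),
         center[1] + radius * math.sin(start + (end - start) * i / steps))
        for i in range(steps + 1)
    ]


class Canvas:
    def __init__(self):
        self.drawn = []

    def draw(self, points, style=None):
        """The one desired side effect: putting points on the canvas."""
        self.drawn.append((points, style))


canvas = Canvas()
quarter = arc_points((0.0, 0.0), 1.0, 0.0, math.pi / 2)
canvas.draw(quarter, style={"colour": "red", "thickness": 2})
```

Splitting arc_points into set_center, set_radius, set_angles and so on would produce the «pair of monadic methods» anti-pattern described above: more state, more temporal coupling, zero gain in clarity.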
More Arguments Can Make For Better Abstractions
Another example: consider a reduce function. (This is a higher-order function, but obviously not at the height of Clean Code, for such concepts are not mentioned in The Masterpiece.) The triadic variant has the following interface:

reduce(combine, values, initialValue)

where combine is itself a function with the interface combine(accumulator, value). The dyadic variant, reduce(combine, values), must assume a value out of the given values (usually its first element) as the initial value to be used for the accumulator. So the dyadic reduce can only return something of the same type as the elements of values. E.g. if values is the integer array [1,2,3,4] and combine sums up the accumulator with the current value, reduce must return an integer.

The triadic reduce function, by contrast, can accept any initial value, as long as the combine function is capable of dealing properly with that type of value. E.g. a triadic reduce can be used to partition the integer array [1,2,3,4] into two arrays of odd and even numbers. The initialValue then could be a tuple of two empty arrays ([], []). Those arrays are filled by the combine function: odd numbers in the first array, even numbers in the second array:
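In Python, the standard library's functools.reduce has exactly this triadic interface, so the partitioning can be sketched directly:

```python
from functools import reduce


def combine(acc, value):
    """Sorts each value into the odd or even bucket of the accumulator."""
    odds, evens = acc
    if value % 2 == 1:
        odds.append(value)
    else:
        evens.append(value)
    return (odds, evens)


# The initial value ([], []) has a different type than the list
# elements, which only the triadic reduce allows.
result = reduce(combine, [1, 2, 3, 4], ([], []))
print(result)  # ([1, 3], [2, 4])
```

The dyadic form, reduce(combine, [1, 2, 3, 4]), would start with the accumulator 1, an integer, and the partitioning would be impossible without contortions.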
Not only is a triadic reduce function more powerful than the dyadic one, it is also more general, i.e. a higher abstraction. If you want to do repeated modifications to a vector in Clojure, you need something like a triadic reduce function! But I guess that Clean Coder Bob figured this out in recent years, judging by his delight in Clojure.
I ranted away half of my Saturday morning on roughly half a page of Bob's Timeless Wisdom. The problem is not that Clean Code is a book with advice that aged poorly, having been written from a mid-2000s Java perspective. The problem is its uncritical fellowship taking this advice at face value, because their perspective is too narrow.
I'm convinced that Robert C. Martin would write a totally different book on the subject nowadays than he did back in 2008. But a second edition of Clean Code would have very little in common with the first edition that is still in print. Rewriting the entire book would probably be less work than re-editing it. And its fellowship would feel cheated reading a book full of advice contrary to the original.
If you feel offended by this text, please take an hour to watch Brian Kernighan's lecture on The Elements of Programming Style and let his advice sink in. A former professor of mine once «improved» Kernighan's code from The C Programming Language (second edition, again…), for the reason you might guess: Clean Code!!!1
So reconsider your habit of yelling «Clean Code!!!1», «Train Wreck!!!1», or «SOLID!!!1» (what does the «L» stand for, again?) at other programmers without first having tried to understand their code and familiarized yourself with the concepts being used therein. Try out a functional programming language or two, e.g. Haskell and Scheme, and consider their respective upsides and downsides compared to, say, Java or C#. Then read Clean Code again (or: actually read it), but with the grain of salt extracted from your recent encounters with different ideas and concepts. Read it critically, not as a gospel, and you'll extract some real value out of it: by carefully considering each piece of advice and its proper area of application—as limited as that might be.