Hi there, lately some thoughts about various subjects have been percolating in my mind, and I think I should just write them out, to solidify what I’m thinking about and keep as a note to remember in future.
So this post contains some thoughts about programming or writing code. I think one of the important things I’ve learned during my bootcamp was code quality. I wrote very briefly about it some time back over here, but I should elaborate more this time.
Quality code is understandable, readable, tested, discoverable, changeable, and documented.
The thing about programming or building software is that you’re essentially writing text or words (code). Lines and lines of them. These lines of code are the building blocks (imagine Lego blocks) that join together and are processed by the computer to perform functions.
In programming languages, there’s a distinction between higher level languages and lower level ones. Lower level languages, like assembly language or machine code, are not easily readable by humans because they are just 1s and 0s. Imagine seeing 1100011111100001010001010. It’s almost impossible to decipher, and takes hours even if you try. Thus, higher level languages like C were invented, built from letters and mathematical symbols, things we can more easily read and understand. Then over the years, even higher level languages were invented, for example Ruby, which reads almost like normal English. This increases the speed at which humans can write, read and understand code, while making fewer errors.
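To illustrate how Ruby reads almost like English, here’s a tiny sketch (the shopping list and messages are invented for illustration); even someone who has never written Ruby can probably guess what it does:

```ruby
# A reminder list: loop over each item and print a message for it.
shopping_list = ["milk", "eggs", "bread"]

shopping_list.each do |item|
  puts "Remember to buy #{item}"
end
```

Compare that with a wall of binary: the intent here is visible at a glance.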
Sorry for digressing. Anyway, when programmers write code, they’re writing it for two audiences: (1) the computer that runs the code, and (2) other programmers.
The media perpetuates a myth of the lone wolf programmer who works in isolation, but in reality, much programming work is done collaboratively. Each person builds a certain module or part of the software, and different parts are built at the same time by different people. It’s almost like construction workers building a HDB flat. So there has to be communication and cooperation between the programmers: they should at least have a general understanding of what the other person is writing, and thus, their code has to be standardised/formatted, and should be written in a way that’s understandable by others.
Just as in English, where you can express the same meaning using different phrases, it’s similar in code. You can get the same function or result using different paths, some of which can be complex or indirect. Usually, programmers will opt for a direct and understandable path, balancing computing speed and efficiency.
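As a small Ruby sketch of this idea, here are two ways to sum the numbers from 1 to 100 (the function names are mine, for illustration). Both give the same answer, but one expresses the intent far more directly:

```ruby
# Indirect path: a manual loop with an accumulator
def sum_loop(n)
  total = 0
  i = 1
  while i <= n
    total += i
    i += 1
  end
  total
end

# Direct path: say what you mean with a built-in method
def sum_direct(n)
  (1..n).sum
end

puts sum_loop(100)   # 5050
puts sum_direct(100) # 5050
```

Both are correct, but a reader can understand `sum_direct` in one glance, which is usually the trade-off programmers make.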
True, there are cases where one person builds the software, writing all the lines by himself or herself, so on the surface, if the computer understands the code, it doesn’t matter if other people don’t. But that forgets that software is meant to outlast the people who wrote it. Programmers will leave or move on, and the new person who comes in will have a tough time trying to understand what the previous person wrote. They’ll feel like they’re defusing a bomb, wondering which parts of the code they can change or remove without crashing the whole system.
In addition, software systems are not foolproof. They are designed by humans to manipulate data and information. Certain scenarios or edge cases might not have been considered when the original programmers wrote the code, like what if a user fills in a form by spelling out his age as ‘Twenty’, instead of using the number ’20’. If the programmers had not considered how to deal with this anomaly, the software system might not be able to handle it, will throw errors, and the user gets stuck.
Besides human errors, there are also computing errors. It’s rare but possible, because computers are physical objects that experience wear and tear, and there are limits to the calculations they can perform. I was surprised to learn of floating point errors when I started learning. Floats are numbers with decimal places. Take 0.1 + 0.2: the answer is simple, 0.3. However, you’ll be surprised, because the answer a computer gives is 0.30000000000000004. Once you perform a series of calculations, the errors accumulate and become more significant, and if your software programme is handling salary payrolls for employees, they may find themselves receiving more (or less) than they should get.
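You can see this for yourself in Ruby’s interactive console:

```ruby
# Floats are stored in binary, so 0.1 and 0.2 are only approximations.
puts 0.1 + 0.2          # 0.30000000000000004
puts (0.1 + 0.2) == 0.3 # false
```

The comparison fails even though, mathematically, the two sides are equal.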
To learn why this happens, you can check out the link I provided. The short answer is that computers perform calculations in binary (1s and 0s), while we use the decimal system, calculations in 10s. So our numbers need to be translated into binary form first, and fractions like 0.1 that are neat in decimal have no exact binary representation, so rounding errors are introduced in the process. To overcome this, programs that handle money usually avoid floats altogether, using decimal arithmetic or whole numbers of cents instead.
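In Ruby, one common option is the standard library’s BigDecimal, which does arithmetic in base 10 (this is a general sketch of the approach, not the only way to handle money):

```ruby
require "bigdecimal"

# BigDecimal works in base 10, so 0.1 + 0.2 is exactly 0.3.
sum = BigDecimal("0.1") + BigDecimal("0.2")
puts sum == BigDecimal("0.3") # true

# Another common approach: store money as whole cents in an integer.
salary_cents = 4_250_00
```

Either way, the key idea is the same: never let binary rounding near a payroll figure.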
What this all means is that software systems are not perfect. They need to be maintained and monitored for errors, and this task takes time and, eventually, more than one person. So whatever code is in the system has to be readable and understandable by others. Thus, writing code is not just writing code, it’s also a form of communication.
You have to consider the point of view of the person who’s going to read your code: will they be able to understand the terms you’ve used? Is what you’ve written clear and yet concise?
Speaking of concise, sorry for the lengthy post. I’ll talk about other stuff next time, like the similarities between writing code and writing TV scripts. There are parallels between the two processes, especially as both involve working in collaboration with others. Until then, have a good day ahead.