“In my vision, the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building.”
– Seymour Papert, Mindstorms: Children, Computers and Powerful Ideas
At the beginning of 2020, I began teaching myself how to code. My motivation came from a variety of sources, but predominantly it was an intellectual exercise: something challenging to try to gain mastery over. Chess is so 1990s.
To me, programming was a project to undertake, one which would add a new string to my professional bow and hopefully a few new concepts to my catalogue of mental models. For the most part, it has been an overwhelming success in all regards. It is something I have become deeply infatuated with.
(It is unbelievable what “passion” and “flow” can be a remedy for. Few, if any, of life’s ails can stand strongly against them.)
Apparently, I should have seen this coming.
I remember a mid-lockdown walk-and-talk with my friend — and co-host on the PhilosophyAu podcast — Josh. I felt like I was baring my soul to him and was sheepish about how obsessive I had become; all I wanted to do was talk, think and read about programming. I divulged this and he said:
‘HA! You… Lyndon… like coding? Add that to the list of “Things That Don’t Surprise Me”!’
As I said, I guess I should have seen it coming.
This continued for a while, but after 10–12 months of tinkering and self-teaching, I wanted to step it up a bit. I decided to go back to school and enrolled in a Bachelor of Artificial Intelligence at Deakin University. I am now nearing the end of my first trimester.
It’s not that I think there is something inherently better about formal education – and something I quite like about the software development world (from what I can tell) is that it is rather meritocratic. You don’t have to have gone to an elite school, or even have a degree, to get a job. For the most part, you just have to be able to write decent code and be a team player.
The reason I went back to school was to put myself on the line a bit more – to have assignments, time pressure and student debt be forces for motivation — and to push me beyond the ever-present risk of plateau that plagues the self-teacher. As it currently stands, it has been a wise and rewarding choice. I have learned an immense amount in the short time I have been studying so far.
To be frank, I am quite a skilled and self-motivated learner, and am proud of the progress I made on my own. Sometimes, though, there just really is value to be found in the structure of and experience captured within a tertiary education setting. It’s not always ideal, but it has worked for me thus far.
Another reason I wanted to do it the formal way was that I wanted to get more in touch with the theory and the deeper concepts as well – how a computer works, what computation is, the mathematical side of things, algorithm analysis and design, and so on. In contrast, though, I have read a number of online posts and comments claiming that a computer science degree is a waste of time and money, and that it is possible to snag a high-paying job with less than 12 months of learning under your belt. “You don’t need the theory, it only slows you down. Just do! Write code and build things.” is the general theme of many of them.
Whether this is true, or how representative these opinions are, I cannot say.
What I can say, though, is that financial gain is not my motivating factor. Would I love a high-paying job? Sure, who wouldn’t! But, as I said, that wasn’t the reason I did this. More than anything, I wanted this process to expand my mind, to confront me with new intellectual challenges and, as a result, force me to think in new ways. I sense that I need a mix of practice and theory to best achieve this. Besides, computers aren’t going anywhere soon — if ever. Learning how to think like, and about, them would be a decently productive way to spend my time; especially if I find it enjoyable and intrinsically rewarding.
Why have I told you all this?
In short, I want to help you to learn some of these key principles and concepts, too.
Of all the responses I have gotten from people when I talk to them about learning to program, or the new degree I am undertaking, two responses have been by far the most common:
- “That’s so cool. I always wished I learned how to program. I would never be able to learn to do it, though; way too hard and wouldn’t know where to start.”
- “Woah… *Insert Matrix reference* *Insert iRobot/Ex Machina/Terminator reference*! So you basically just think in terms of 1s and 0s now, yeah? *Insert some other comment that exaggerates the magical and incomprehensible nature of computers*.”
Neither of these responses exemplifies an accurate understanding of the situation.
To start with, pretty much everyone who has responded with #1 would, in my eyes, be well and truly capable of learning to program. Now, I’m not going to go as far as saying everyone can and should learn to program; only that those who have expressed an interest in it all seemed to be up to the task – again, in my mind.
If this is you, then I hope that you find the series that follows useful.
My goal is to communicate some of the core ideas, allowing you to taste some of the low-hanging fruit, without needing to become an expert programmer or theoretical computer scientist. If you find the content enjoyable, you too can explore the topic more deeply and integrate it further into your own life.
If, however, you are a respondent of type #2, then you will also likely benefit from the series – should it interest you. While computers, software and various forms of artificial intelligence can do some truly amazing things, they are still not “magical.” Computers are governed by (mostly deterministic) rules and principles, which can be studied, understood and then re-implemented — in creative and novel ways. Learning the basics of programming and computer science can take the mysticism out of computers, while preserving the awe.
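To make that last point concrete, here is a tiny sketch in Python (my choice of language for illustration; the specific example is mine, not part of any curriculum). Even a handful of lines follows rules you can trace by hand, and the same input produces the same output every single time — no magic involved.

```python
# A computer only ever follows simple rules, one step at a time.
# Trace this by hand: total starts at 0, then grows by 1, 2, 3, 4, 5.
total = 0
for number in [1, 2, 3, 4, 5]:
    total = total + number

# The same rules always give the same answer: 15.
print(total)
```

Once you can predict what a snippet like this will do before running it, the computer stops being an oracle and starts being a tool.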
This is my goal, at least. You shall be the ultimate judge of how well I fare.
What gives me the right?
By no stretch of the imagination am I an expert on these matters. Far from it. In fact, one of the reasons I want to write these articles and create this series is to help myself learn the content more thoroughly. Few things help you deeply internalise what you only partially know, and recognise what you actually don’t, like trying to teach it to someone else does.
Like I said at the start, I tinkered on my own for about a year, and am now reaching the end of my first few formal subjects. I have not written any earth-shattering apps, nor have I yet been employed as a junior programmer. Compared to the vast majority of the industry, I know nothing.
While that may sound pessimistic, and a horrible way of selling this series to you, I hope you can see it for the positive potential it represents: I am going through this journey, too. I am not yet so blinded by my knowledge that I fail to recognise what the beginner needs or struggles with. I am still close to the start, and as such, can help you take the first few steps — without needing to rely on memories that are years or decades old to tell me what those steps are like.
To paraphrase the philosopher Dewey Finn – who espoused his pedagogical theories while posing as Ned Schneebly – “Those who can, do. Those who can’t, teach programming.”
In the coming weeks and months, I’m going to try and teach you (at least a little) programming.