As an active member of Dev Twitter, I'm more than familiar with the kinds of tweets circulating in the tech community. Code snippets, portfolio highlights, and celebratory posts following successful job interviews are always a pleasure to see. However, a trend I've noticed among so-called "Code Influencers" (Codefluencers?) is the generic numbered list of "Steps to Become a Full Stack Developer", which usually goes something like this:
Stage 1 – HTML, DOM.
Stage 2 – CSS (Grid, Flex.)
Stage 4 – React.
Stage 5 – Node.js.
Stage 6 – MySQL.
Stage 7 – MongoDB.
Stage 8 – The CRUD pattern.
Stage 9 – Back-end XP at your first job.
🏆 – Full Stack developer.
– Ghost Together (@GhostTogether) October 22, 2020
However, I'm thankful that I learned Ruby first.
For those who are unfamiliar with Ruby, it's a language designed to be as readable to humans as possible. Ruby's creator, Yukihiro "Matz" Matsumoto, essentially took features from several languages that he found pleasant to use and wrapped them all nicely in an object-oriented programming language that reads well and abstracts away unnecessary complexity.
Thus, in 1995, Ruby was born. But why learn a relatively obscure and underutilised language over an in-demand technology?
Anyone who has learned to program can likely attest that learning your first programming language is a frustrating endeavour. I felt much the same about Ruby in my first few weeks of working with it, and I came to it with no previous programming experience. Even so, I can say without a doubt that Ruby gave me the confidence I needed to start exploring other languages on my own, and that's largely down to its simple syntax.
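To give a flavour of what that simplicity looks like, here's a small sketch of my own (not taken from any particular tutorial) showing how everyday Ruby tends to read close to plain English:

```ruby
# A tiny illustration of Ruby's readable syntax.
greetings = ["hello", "hi", "hey"]

# Methods like `map` and `each` take blocks, so iteration reads
# as a sentence rather than a loop with counters and indices.
shouted = greetings.map { |word| word.upcase }

shouted.each { |word| puts "#{word}, world!" }

# Even conditionals can trail the statement they guard:
puts "That's a lot of greetings!" if greetings.length > 2
```

There are no type declarations, semicolons, or boilerplate classes required just to print something, which is exactly the kind of friction a first-time programmer doesn't need.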
My advice to those learning to program: Try Ruby. If you find that it isn't for you, then there's nothing lost, and there's a multitude of other tech waiting for you to play with. But don't let Ruby's relative obscurity fool you into passing by an enjoyable and powerful language.