You only need two programming languages to thrive in the modern programming world. This is not to say you shouldn't look at other languages, or that other languages aren't great (in fact, my personal favourite language, Odin, is not among them). But if you only had to learn two, these give you both a foothold in the professional world and the knowledge to cause serious, innovative change in the general computing space.
C: The closest you can get
C is a 50-year-old tank that's still chugging through billions of bits every day. It's stupendous how well adapted this language is for tons of scenarios. It's used for operating systems, games, web browsers, compilers, text editors, and so much more.
The reason is that it's the closest you can get to the hardware. C itself imposes almost no abstraction over what the CPU actually does; any abstraction you use is one you chose to build. That makes the language one of the simplest in existence while allowing for ridiculous power through your choice of abstractions.
In addition, C provides an intimate view into memory, which is the vehicle for change in computers. All programmers really do is manipulate data in memory, so having immediate access to it is one of C's strongest advantages and the reason it's the language of choice for embedded and game development.
C's minimal abstraction and in-depth memory access allow programmers to do some ridiculously impressive things and innovate in spaces that have remained stagnant for decades (e.g. OS development). You owe all of the niceties of technology to C, so it wouldn't hurt to learn it.
The primary problems with C come from its old age. Those problems are:
- The dependency problem. All build systems for C are pretty bad and depend on external languages, so your best bet is to avoid anything that isn't a single-header library.
- The standard library is lackluster at best. For personal pet projects while learning, it's passable, but as you grow as a developer, it's definitely better to develop your own base header for your professional projects. See this video by Sean Barrett for why that's a good idea.
- It's pretty confusing at first glance. Which compiler do I use? Which debugger do I use? What's an atomic volatile static function pointer? These are all valid questions (I still don't know the answer to that last one). I recommend watching the videos I link in the conclusion below, then getting started on some projects.
JavaScript: The everyman's language
In comparison to the brute strength of C, JavaScript seems a bit lacking. The language runs in a VM, far away from the computer's memory. It's often imprecise and really finicky, and debugging it is a bit of a nightmare as well.
However, the language is beautifully simple, making it perfect for both website and desktop scripting. This stems from one reason: there are no static types.
Having no static types sounds like a complete net negative, and in scenarios where control and performance matter, it is. Without a static type system, the language needs a runtime, which impedes performance, and direct memory access becomes much harder. However, we trade these things away in exchange for simplicity.
JavaScript only has a handful of built-in types (setting aside null, undefined, and the rarely used Symbol):
- Number (a 64-bit IEEE 754 float, the same as a double in C)
- String
- BigInt (an arbitrary-precision integer; it can get really big)
- Boolean
- Object, which can contain values of any of these types, including other objects
By forgoing distinct user-defined types and supporting only these few, JavaScript allows for truly flexible and simple code that is perfectly readable without much context. This simplicity makes the language perfect for user-facing scripting, where the programmer wants immediate iteration over perfect performance.
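A quick sketch of those types in action, checking one value of each kind with `typeof`:

```javascript
// One value of each built-in type; typeof reports the kind of each.
var examples = [42, "hello", 123456789012345678901234567890n, true, { x: 1 }];
var kinds = examples.map(function (v) { return typeof v; });
// kinds is ["number", "string", "bigint", "boolean", "object"]
console.log(kinds);
```

No declarations, no generics, no casts: every value simply is one of a handful of kinds, and the code stays readable with zero context.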
On the web, this simplicity demonstrates itself on the frontend. See here and here for examples of minimal web frontend applications. The main characteristics of these are:
- The use of `<div>` in the HTML
- The use of simple functions and `var` in JavaScript
As for server-side programming, I have previously pronounced my disdain for NodeJS. However, my opinions have since become a bit more refined. I still think the way NodeJS is commonly used nowadays, as a diving board into a sea of endless dependencies, is pretty bad, but using the runtime on its own with a minimal number of dependencies shows off its moderate simplicity. Personally, I'm probably going to stick with Golang on the server for personal projects, but I'll still give NodeJS and its friends a shout.
On the desktop front, I wouldn't recommend it for full GUI apps, but it is an apt replacement for shell scripts. A runtime that replaced Batch and Bash with an actual programming language, while keeping their simplicity and being more cross-platform, would fit perfectly as a shell scripting language. In fact, zx already does something similar, though instead of depending on Node, I would depend on QuickJS.
Finally, JavaScript is the language of choice for most business applications, which are usually web-facing. So, if you know that language, you'll be in good shape to get a job. You don't have to conform to their Angular or React ideals, though. If you can show them (through your portfolio of projects) that writing minimal JavaScript with few dependencies works well, they're likely to hire you. If they don't, trust me, you didn't want to work in that bloated environment anyway.
The primary problems with JavaScript come from its ease of use. Those problems are:
- The performance problem. This is primarily an issue on the server side. Running heavy computations in a language not well-suited for them, on top of a JavaScript engine that's already doing plenty of work, doesn't sound like a good plan to me. For slow computations on your Node server, I would recommend writing WebAssembly modules in C.
- There are too many dependencies in the ecosystem. JavaScript solved C's build system problem, which in turn made people dependent on other people's code in massive registries like NPM. This gets really hairy, because most JavaScript developers don't know how their dependencies work and end up using them for things that would take only a dozen lines to implement themselves. My advice is to ignore dependencies altogether. The language is high-level enough that most of what you want to do can be done with your own knowledge, and it is very portable, as opposed to C, where platform layers built on dependencies are more common.
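To sketch the WebAssembly suggestion above: a hot numeric loop written in C. The export macro is guarded, so the same file also builds as plain C; `sum_of_squares` is a made-up example function, not from any library.

```c
#include <stdint.h>

#ifdef __EMSCRIPTEN__
#include <emscripten.h>
#define EXPORT EMSCRIPTEN_KEEPALIVE  /* keep the symbol visible to JS */
#else
#define EXPORT
#endif

// A hot loop worth moving out of JavaScript: sum of the first n squares.
// Plain integer math like this is where C comfortably beats a JS engine.
EXPORT uint64_t sum_of_squares(uint32_t n) {
    uint64_t total = 0;
    for (uint32_t i = 1; i <= n; i++)
        total += (uint64_t)i * i;
    return total;
}
```

With Emscripten you would compile this along the lines of `emcc sum.c -O2 -o sum.wasm` (exact flags depend on your setup) and call `sum_of_squares` from Node through the WebAssembly API, keeping the JavaScript side as thin glue.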
Advice for learning
Stop wasting time and make something; you learn a lot more that way. Your top goal after watching a beginner series should be to make a real project ASAP.
Make a lot of small things. Each beginner project should only take up to a month to program. A good first project in JavaScript would be to make a portfolio site for yourself. Then, you can put it on a domain and link all your other projects to it. A good first project in C would be to make a game from scratch (no libraries other than a platform layer), which requires deep knowledge of memory layouts and tests your problem-solving skills.
Ask about what you don't know. There are plenty of great communities to learn from. Use them. Just... maybe not StackOverflow.
Learn as much as you can. By learning lots of different techniques, you can get a feel for what you like to program and in what environment you like to work. You need a solid foundation in how the computer works first, lest you just follow dogmatic advice with limited substance.
I wish you well on your programming endeavours. Go make some programs!
tl;dr: it's C for knowing how the computer works, high-performance applications, and low-level innovation; and JavaScript for web technologies and user-facing scripting.