For what it's worth -- maybe not that much, but eh -- I'm back on Twitter, or X (an Elon Musk company), as they call it nowadays. Maybe it will turn out to be a waste of time, but then again maybe it won't and in either case there's only one way to find out.
One particularly interesting feature of this X is that you get to hear first-hand stories of decay. Take this for example:
@raysan5: Is there any Computer Science course teaching JS/Python but not C/C++/ASM at some moment? I can hardly believe that... [1]
Agree that teaching C + raylib is an amazing alternative, I'm probably biased, but I taught it for +10 years and my students love it!
Some of the replies include:
@GredinTom: Yes, in France there is accelerated graduation in 6 months for ppl who knows nothing and they only learn how to create react apps. It's sad...
@winger_hunter: You would be surprised, my school switched to JavaScript and c# a few years back because c/c++ were "too hard for new students" thankfully I started at a community college first that taught me both languages. It's ridiculous, seniors and half my class can't work with pointers
@kybernoita: I don't think during my BSc courses I wrote a single line of C or C++. It's always Python. I hate it here.
... and so on and so forth, I shan't press the point. But (yet again) for what it's worth, this is far from a new trend. To exemplify, I shall add my own experience from about a decade ago in Lausanne: while working at a local start-up, my colleagues, most of whom were educated in Eastern Europe, some of them well outside the Bologna Process, kept complaining that most students attending the local job fairs only knew Java, no C. Why, you wonder? Well, because that's what they were teaching at the local tech university! And just in case you're wondering, the folks I worked with were for the most part a bunch of hardcore, seasoned low-level systems nerds: OS, compilers, stack smashing, the works.
So in other words, the field of education has been getting progressively worse over the last two decades or so, much to no one's surprise. Just so you know, in Romania university financing is allocated per headcount, so the trend has been to raise the number of students eligible for bachelor's studies while keeping graduation rates as high as possible, otherwise the politicians would deallocate money, i.e. the number of available places for study would yet again go down. But wait, there's more! Pre-uni studies have consistently degraded, so the quality of the average candidate for, say, ACS has in fact been going down. Now you do the math regarding grade inflation and the like. They just had to make it easier for students to graduate, all the while making everyone else's life more complicated. Believe me, I know, because I worked there.
So getting back to our Twitter thread, here's an interesting point of view:
@chapmanworld: Please forgive my ignorance, I never completed a computer science course. (self taught career dev)
Surely the point of a computer science course isn't to teach any given language, library or framework, but perhaps only to use them to teach the actual computer science?
He does have a point, doesn't he? Why should it matter whether you study JS, C or whatever the fuck other grotesque language in order to learn programming? All the more so since, after enough programming languages, they kinda start repeating.
Anyway, I couldn't help it so I added my reply:
@spykeder: Sure, it's Turing machines all the way down, but contrary to popular opinion, the lowest level [2] of abstraction are also the simplest.
@spykeder: Yossi Kreinin's "low-level is easy" comes to mind: https://yosefk.com/blog/low-level-is-easy.html [3]
I really don't think that "low-level is easy", but I do strongly believe that it's simple; and I also think that this confusion between simplicity and ease is why some programmers never seem to get it right. For the sake of example, let's take two opposites in terms of abstraction layering: x86 assembly and JavaScript. No, I'm not choosing JS by mere chance, you'll see in a moment why.
x86 is dead simple, at least compared to JS; granted, it isn't simple as far as assembly languages go -- they did call it a Complex Instruction Set Computer, didn't they? Yet it's still a whole lot simpler than JS, because it gives the programmer direct access to the CPU registers, while JS hides them beneath layers upon layers of JIT compilation and whatnot. This simplicity matters most in the most critical of moments, say, when you want fine-grained control over the performance characteristics of your system [4]. But at the same time this simplicity doesn't come with ease, in fact quite the opposite: if you want to reproduce the behaviour of a JS system, you'll have to rebuild at least some of its abstractions.
Conversely, JS is dead easy: you write a script, load it in a browser and you're set. But its abstractions are way the hell more complex; just look at the execution model: it's a pain in the ass to simply make your app sleep for a second, an operation that's utterly trivial in traditional imperative programming.
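To make that concrete: Python's asyncio happens to impose the same event-loop discipline that JS does, so here's a minimal sketch of the two sleeping styles side by side, in Python purely for illustration:

```python
import asyncio
import time

def blocking_worker():
    # Traditional imperative code: the thread just stops here for a second.
    time.sleep(1)
    print("blocking worker done")

async def event_loop_worker():
    # Event-loop code, i.e. the model JS forces on you: blocking is forbidden.
    # A time.sleep(1) here would freeze every other task sharing the loop;
    # instead you yield control and ask the scheduler to resume you later.
    await asyncio.sleep(1)
    print("event-loop worker done")

blocking_worker()
asyncio.run(event_loop_worker())
```

The first version is a single self-evident statement; the second drags in a scheduler, coroutines and an `await` keyword just to achieve the same pause.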
I'm not going to pile on more examples, I think I've made my point pretty well. But in any case: simplicity is the opposite of complexity, measured as, say, the number of variables in a system. Ease, on the other hand, is sheer power, or readiness-to-hand -- when I press the "publish" button, I don't need to be conscious of the complexities of HTTP POST calls, PHP interpretation, SQL table insertions and so on and so forth, but regardless, the complexity is wholly there, intricately woven into the implementation of my blog.
That's about it, really.
[1] I'm going to skip the emojis, because what the hell.

[2] I meant levels, but whatever, this is X, so I'm allowed to be in a hurry.

[3] I do recommend that you (re)read Yossi's article if you haven't; I find it still highly relevant. Also mind that it was written in 2008, way before Rust et al. started permeating the markets.

[4] Don't even give me that "compilers are just as good, or perhaps better". Yes, they're better than the average programmer, but no programmer I know in, say, the field of scientific computing would ever use a garbage-collected language, despite all the advancements and everything.
Another useful reference for this topic is Joel Spolsky's article on leaky abstractions: https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/
Any and all fields have this issue with entry-level education: the need to find a good balance between teaching the basics and teaching present-day, relevant skills. Oh, and doing it in a reasonable timeframe, ideally with costs as low as possible.
There are challenges at both ends. Regarding the present-day state of the field: students need to learn the tools of the trade to some degree, so that they are not entirely useless in the workforce if and when they decide to leave academia. Take Python, for example -- sure, one starts with the easy stuff, imperative programming and OOP. But it's not a stretch to use the same language to touch upon deeper stuff, like what the MRO is (method resolution order, a concept useful for all object-oriented languages), how constructors work, or functions as first-class objects (paving the way towards functional programming!). Get to debugging your code and you suddenly touch upon topics such as the call stack, understanding that both objects and code live at some address in memory, the stack versus the heap and so on. Of course, it takes some doing, but then again... what doesn't?
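For instance, here's a minimal sketch of how far plain Python takes you into that "deeper stuff" -- the class names are made up for illustration:

```python
class Logger:
    def log(self):
        print("Logger.log")

class Timestamped(Logger):
    def log(self):
        print("Timestamped.log, then:")
        super().log()  # super() follows the MRO, not simply "the parent"

class Audited(Logger):
    def log(self):
        print("Audited.log, then:")
        super().log()

class Service(Timestamped, Audited):
    pass

# The method resolution order, computed by C3 linearization:
print([cls.__name__ for cls in Service.__mro__])
# -> ['Service', 'Timestamped', 'Audited', 'Logger', 'object']

Service().log()  # each super().log() hops one step along the MRO

# Functions (and methods) are first-class objects: they can be passed
# around, inspected, and live at some address in memory like everything else.
f = Service.log
print(hex(id(f)), f.__qualname__)
```

Stepping through this in a debugger naturally drags in the call stack and object identity -- exactly the topics mentioned above.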
At the other end of the sausage, one needs to teach the basics while trying one's best not to weigh down the process with useless cruft. Sure, teach a student how logic gates work, how C works, and how the bits and bytes are put on the wire to make computers talk to each other over networks. However, do they need to re-learn the entire history of the field? Do they need to know about every failed innovation attempt and every dead language out there?
I can definitely agree with you that one form of decay is failing to teach the basics on the premise that the current tools will suffice. And yet, lurking in the background is the idea that the opposite is somehow also true: that teaching just the basics will suffice, and that once the simple building blocks are known, one can build tools that provide ease of use. Except no, not really, because present-day tools have a history of trial and error behind them, of people who tried to find the good ways to use the simplicity of those building blocks. Fail to take that into account, and you're bound to (inefficiently) re-invent the wheel most of the time.
All in all, until such a time when the tech field stops producing "innovation" at break-neck speed, this debate shall rage on, eternal.
I'll dwell upon the following question:
> do they need to re-learn the entire history of the field?
because I think it's central to the discussion. For example, take:
> Joel Spolsky's article on leaky abstractions
Heheh, the article hasn't aged all that well, has it? Kids nowadays by and large have no clue about COM, LPSTR, ASP.NET and all those other thingamajigs that were somewhat current when he wrote the piece. It's not that no one programs using these technologies anymore -- I'm sure there's a fair number of web apps still written in ASP -- it's just that no one teaches this stuff anymore, just like no one teaches COBOL.
So this brings me to the following point: I don't think this needs to be a debate, and no, I don't think that the masses need to re-learn the entire history of the field. However, the folks who aim to be philosophers in their field are fucked if they don't, know what I'm sayin'?
The problem with technological stuff is that no matter how many abstractions one pours on top, they only end up propagating the problem to the next level -- that's what Spolsky tries to underline in his article. Take Python, for example: it's great that it doesn't require you to do explicit memory management, which I agree is an absolute bitch. But step a bit outside the regular cases you're used to and, trust me, Python's garbage collection is going to come and bite you straight in the ass. For example, what do you do when you have an HTTP library that, instead of freeing your TCP sockets, keeps them in a sort of in-between hanging state (i.e. CLOSE_WAIT) indefinitely? You're going to have to be careful to call close() yourself anyway, aren't you?
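A sketch of the defensive idiom, using the popular requests library as a stand-in for whichever library misbehaves -- the URL is made up, and the failure mode is illustrative rather than a report on requests itself:

```python
import requests  # the well-known third-party HTTP library

url = "https://example.com/big-file"  # hypothetical endpoint

# Naive version: with stream=True the connection stays checked out until the
# response is fully consumed or closed. Let resp go out of scope unclosed and
# the socket's fate is left to the garbage collector, which may not run for a
# while -- meanwhile the peer closes its side and you linger in CLOSE_WAIT.
resp = requests.get(url, stream=True)
first_chunk = next(resp.iter_content(chunk_size=8192))

# Defensive version: release the socket deterministically, GC or no GC.
with requests.get(url, stream=True) as resp:
    first_chunk = next(resp.iter_content(chunk_size=8192))
# the context manager calls resp.close() for us, right here, right now
```

The point being: the abstraction ("don't worry about resources") leaks, and you're back to manual resource management, just with nicer syntax.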
So getting back to my point: sure, a good technician doesn't need to learn all the mistakes of the people who came before. But unless he works alongside someone who is aware of those mistakes, he'll necessarily end up repeating them, just like all the other fools who keep repeating the history they don't know.
We computerfolk are somewhat lucky, though: unlike, say, civil engineering, which dates back before the time of Apollodorus of Damascus, computer history is less than a century old. Sure, if you factor in the fact that you can't actually know computing unless you have a grasp of electronics, electrical engineering, hydrodynamics and... well, the point is that we never run out of things to learn, regardless of which end we approach the field from. It's just that some of us get to reach a deeper understanding of it, while others don't, and either way it's no tragedy.