One Felix Pleșoianu writes a few thoughts on so-called "software culture wars", and in particular, lays down a comparison of "users" of various operating systems, ending his article with the following observation:
People use what they want and make up reasons later. That's fine, really. Please stop pretending.
I somewhat agree with his conclusion, by the way, but I think it doesn't entirely capture the whole situation. Unfortunately, he doesn't have a comment box on his blog[1], so I will leave my own thoughts here, hoping that he'll see them.
First off, I'm quite convinced that for the most part there are no "Windows users", "Mac users" or "Linux users" per se. At the root of this question of self-(proclaimed) identity lies a bigger problem with the (mis)education of Western kids, namely that they don't really get to choose to be "non-binary", "furries", "libertarians", "Metallica fans" or, last but not least, "Linux users" -- and in this sense, Linux users are far worse than Mac users, since the latter are mere fashionistas, while the former also pretend to some imagined moral superiority. In any case, these pretend-choices are strictly imposed by the underlying social environment: kids will just end up "using" whatever the other kids in their group "use", and the usage itself is just half of it, the other half being the sheer post-religiousness of being a "user of".
That aside, I will agree that the competition between various pieces of software has given birth to various subcultures, in the very same sense that the competition between '90s Romanian hip-hop bands did: at the time, kids (band members and listeners alike) would find some arbitrary reason to differentiate themselves, which was naught but a reflection of marketing upon them. Competing software suffers from the same spurious differentiation, and speaking of which, Linux is one of the worst subcultures among them to affiliate with, since kids will gladly "reinstall Arch" in order to "claim more control", and will instead end up with Poettering's sadness, or worse, with Google's abomination. None of them will have even learned how to bootstrap their systems, let alone own them in the only sense in which such ownership may take place.
This second layer aside, things aren't so much different "in the industry", only there, some folks are at least willing to put money (for some arbitrary value of "money") where their mouths (or asses, or both) sit. That is: accounting is to be done under Microsoft Excel's watch, video editing under Apple and/or Adobe's watch, while web sites and (for some really incomprehensible reason) automotive software stacks are to be obtained by raping Linux into some shape or another. Then there are some obscure software platforms capable of real-time processing[2] and a few of them holding internet backbones together and... that about sums it up. I bet you've never heard anyone boasting about how they're a "VxWorks user" or an "IOS[3] user", have you?
So generally speaking, "software culture" is rubbish. Yes, there might have been some greybeards in the '70s who actually knew what they were doing (though I have my reservations), but they're long gone now. Yes, after half a century of "operating systems", everyone's too tired to even make up reasons; they'll just use what Mother Tech is willing to provide.
[1] Kinda reminds me of someone I know.

[2] Although they don't require all that much software -- actually, this is one of the places where it's obvious that less "software" works much, much better than more of it.

[3] No, not that iOS.
It may be that spawning "cultures" and subcultures is a hallmark of nascent mass technologies; at least while the commoditization process is still happening, knowledge is sparse (and more precious!), and various features are still tested out by being taken to their extremes.
As you said, that might just be an artifact of marketing doing its job, but I suspect it may also be consumer post-buy rationalization ("I bought X, therefore brand X is a good choice, and other choices are poor"). Oh, and not to mention that some features may actually be useful to a subset of users.
I mean, automobiles went through a very similar process. From vanity stuff (muscle cars) to some brands shipping useful technical features ahead of their time (e.g. BMW introducing xenon headlights or dual-zone climate control), choosing a car brand used to be a much more complicated decision, and a statement. Nowadays? Cars are a commodity; you can pick mostly on merits, brand irrespective.
With computers, things are just a tad hairier, because hardware is mostly invisible to the eye of the user. It did (and does) sometimes matter: some people turned into staunch defenders of Apple computers based on the quality and calibration of their monitors, for example. Same for software, where some characteristics did make a difference (e.g. using a UI instead of the Linux CLI). With a good heaping of marketing bullshit on top, naturally.
And, much like with cars, the computing landscape nowadays is slowly consolidating. Does it matter which OS you are using? Not to the "regular" user. All of them provide a user interface, all of them have some basic tools, and a lot of power tools have been ported. And, true to my theory, the only place where wars are still happening is the mobile device ecosystem, where there is still trial and error going on.
Hope this makes sense to more than just me.
> but I suspect it may also be consumer post-buy rationalization
Clearly, but one can't help but wonder: why did they buy the thing in the first place? Yes, Joe desires a certain set of characteristics, but once he searches for, say, "body thermometer" on Amazon, what is it that impels him to choose brand X over Y? 'Cause I was Joe this morning and they all looked mostly the same to me. And from where I'm looking, things aren't all that different when it comes to mobile phones, automobiles or what have you.
The problem with the commoditization you mention is that it gives birth to top-down markets. I don't get to choose a product with characteristics A, B, C and D (as I did when I built my own PC from scratch). Instead, I get to choose between maybe tens of products, each of which has a strict subset of {A, B, C, D} and quite often some subset of {E, F, G, H}, some of which are detrimental to me. Take for example the sad story of BlackBerry, which now produces cheap Samsung clones, obviously, because democracy tends to eliminate differentiators.
Anyways, it looks to me rather like this commoditization process is past its days of glory, mainly because of the ongoing supply chain issues. Throughout 2021 I sat and watched Sony struggle with the PS5 and Nvidia with its line of GPUs, and I'm still watching industrial players (who pay hundreds of thousands of dollars per piece, not the mere thousands we're used to) struggle to get hold of hardware. The smarter among them have integrated their production lines (e.g. Apple giving up Intel for ARM), but overall things are not looking great, such that pretty soon we'll be paying premiums for second-hand refurbished stuff. But anyway, this is just hardware we're talking about; commoditization works great in the software world.
Speaking of which, nowadays I'm playing with GarageBand for music production. It's a tad annoying at times, but it gets the job done and it does it fast. But that aside, again, one can't help but wonder: why did I choose a Mac and not a Windows computer for this stuff? Is it just because I heard some music producer praising it to me? And why GarageBand and not, say, Ableton? I'm certainly not part of any "GarageBand user group", but the fact remains: while using Apple's computing environment (read: pseudo-culture, with all its functional, design etc. choices), I'm subject to it and really not the other way around -- which is what Stallman was whining about back in the day when he was asking for "the ability to modify one's software".
[...] My question to Sandy would have then been: who's to say this "intentional stance" hasn't already manifested itself? Say, after I code an agent to solve a particular problem, who's to say that its so-called "intention" isn't then to solve the problem? By the way, can you see how this immediately falls into the discussion on wills and won'ts? Who's to say that an object, any object at all, "wills" something in a so-called "objective" sense, outside of any particular interpretation? Surely, this also runs into that old discussion of culture, as most bipedal monkeys actually learn things in a more or less Pavlovian fashion, through mere osmosis. [...]
[...] So you see, history is just repeating itself over here in our little Balkans. Soon enough Cheloo is going to start singing manele along with his workmate, 'cause really, there. was. nothing. else. The world has changed, indeed, and there's simply no point in resisting it. Sure, you may call it a "cultural war" if you will, since it's definitely way more "cultural" than the one in software. [...]