On computers

April 11, 2020 by Lucian Mogosanu

The whole of computing rests upon two pillars; or rather, there are precisely two things anyone can do with computers.

The first thing anyone can do with computers is to put bits in and take bits out, also known in the field as input/output. Yes, I'm aware "input" and "output" are in fact two distinct items, but one doesn't make much sense without the other except in very particular, degenerate cases, hence they're taken as a whole.

The category above subsumes all copying of data from one place to another; moreover, it includes all interaction with the computer and, perversely enough, between computers, in the sense that from two of them connected together there fractally emerges yet another one, similar to its two subcomponents yet different from each of them taken separately.
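The whole first category can be sketched in a few lines. What follows is a minimal illustration, not anything from the original text: a copy loop that moves bits from a source to a destination, which is all that "input/output" amounts to at bottom.

```python
import sys

# A minimal "copier": everything in the first category reduces to
# some elaboration of this loop -- bits in, bits out.
def copy(src, dst, bufsize=65536):
    while True:
        chunk = src.read(bufsize)
        if not chunk:  # empty read signals end of input
            break
        dst.write(chunk)

if __name__ == "__main__":
    # Wired to stdin/stdout this is essentially Unix cat;
    # wired to a pair of sockets, it is essentially a network relay.
    copy(sys.stdin.buffer, sys.stdout.buffer)
```

The same loop, pointed at different endpoints, covers file copying, user interaction, and machine-to-machine communication alike.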

The second thing anyone can do with computers is to perform fast arithmetic, also known in various fields as amplification. That's also why it's called a computer, by the way.

This second category subsumes pretty much everything expressible in quantitative terms. If, say, in 1980 you could fit a few thousand books in a truck, nowadays you can fit a few tens of thousands in a few gigabytes stored on a two-inch-long USB stick. Or if the average human can do basic arithmetical operations on the order of a few per minute, a 1945 ENIAC went on the order of hundreds per second, while your mobile phone can reach maybe a few billion per second.
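The amplification factors above are worth working out. A back-of-the-envelope sketch, using illustrative order-of-magnitude figures assumed from the paragraph (not measurements):

```python
# Assumed, illustrative rates -- orders of magnitude only:
HUMAN_OPS_PER_SEC = 5 / 60   # a few basic operations per minute
ENIAC_OPS_PER_SEC = 500      # hundreds of additions per second (1945)
PHONE_OPS_PER_SEC = 3e9      # a few billion operations per second

print(f"ENIAC vs human: {ENIAC_OPS_PER_SEC / HUMAN_OPS_PER_SEC:,.0f}x")
# -> ENIAC vs human: 6,000x
print(f"phone vs ENIAC: {PHONE_OPS_PER_SEC / ENIAC_OPS_PER_SEC:,.0f}x")
# -> phone vs ENIAC: 6,000,000x
```

So each step amplifies the previous one by three to six orders of magnitude, which is the sense in which "fast arithmetic" is amplification.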

That's the whole of it, at least as far as computers taken alone are concerned. Surely, there's a lot of interplay between the two categories (e.g. "distributed systems" and the like), but more interestingly, there's a lot "seeping out of" computing. Take gaming, for example: much of it relies on a specialized computer, the graphics processing unit, doing mind-bogglingly fast computation inside your general-purpose computer and delivering it rapidly to your screen (hence all the "PCI express gazillionx" hardware); as well as, if we're discussing multiplayer, delivering some data to some other computers in a timely manner. At the same time, though, a lot of it relies on how humans interact with each other and with the game, not even getting into the various techniques of displaying enjoyable graphics on the screen.

So ever since its inception, this whole copier/communicator and calculator/amplifier has been slowly eating into other human activities. First it was simulations of various natural phenomena, and obscuring messages so that they're hidden from prying eyes, i.e. "encrypted". Then it took over telegraphy and radio to the point where all communication is digital. Then it took over various markets, killing, pillaging and raping dead-tree books, music and movie distribution ("entertainment" as a whole, in fact), all news, the scientific method, money and finance, the job market, and so on and so forth; last but not least, killing and reinventing its own markets at least a couple of times in the process. No, "it" didn't do all this by itself: the whole process was and is entirely driven by (some) humans and their stupid attraction to the siren song of "progress".

Obviously, this has already fractally gone past the Nth snowball and it can't be stopped, do or say what you will. Now, whether it will end up imploding under its self-perpetuated pile of complexity; or on the contrary, whether it will continue consuming more and more of every aspect of human life, thus leaving little to nothing in the place where humanity used to be... well, I don't know and maybe it doesn't make much of a difference either way. Do you remember that scene in Lana&Lilly's Matrix, by the way? the one depicting an endless grid of pods, each holding a human-like individual, each of those powering the whole machinery? I hear ye benevolent rulers finally have the v1.0 beta release rolling.

This about concludes my series of series of words on the subject of computing, or, for that matter, on the recently-halted "industrial revolution" overall. Honestly, I don't see what else there is, especially given recent events. We'll see, I guess.

Filed under: computing.
