Again on "general-purpose" tools

April 20, 2020 by Lucian Mogosanu

[ ... in computing, and quite possibly elsewhere. ]

So, as I was saying:

Which brings us to the semantic engine of computer languages, their type systems: any clear, concise expression of a problem from some given domain may be found in languages containing appropriate primitives. As an aside, this is why I don't believe there is such a thing as a "general-purpose language" in the true sense of the word.

This echoes precisely my later rants about operating systems. Perhaps now's a good time to restate my point, starting from the observation that in a certain (abstract, lol) sense, there's no fundamental difference between "operating systems" and "programming languages". They're both metaphors for (or points of view on, ways of looking at) computer operation tools, mechanisms that may be employed to steer the underlying machine one way or the other. Thus, assuming that the machine is capable of universal computation and that the language/operating system exposes the machine fully, then indeed, we can say that we're dealing with a "general-purpose" language/OS.

There's a catch here, though. Most computer languages make a very clear distinction between "reserved" (i.e. primitive) atoms and general-purpose ones (thus necessarily not pre-defined in the language), the distinction being that the former carry syntactic meaning. Furthermore, this distinction stems directly from the engineering "limitation" described in footnote 7: you have to choose a name for an operation as simple as "add these two values and store them in a(nother) value", and you're going to call that "add" in x86 assembly and "+" in C or other high-level languages -- what else is there? On the one hand, this becomes a problem when the abstraction also obscures some details of the underlying mechanism; on the other, the complexity of the language grows (exponentially?) with the number of such keywords added "for convenience". In other words, there are a gazillion ways to express any form of computation given to us by nature, just as there are a gazillion ways to model this universal computation mechanism.

My point being that, of these gazillion representations, each particular one will bring with it its own baggage, which necessarily implies that certain hardware will solve some problems more efficiently than others -- from the point of view of the engineering process as well as of program run-time. By the way, this limitation, largely ignored by orcs, led to an inflation in "high-level languages" and software in general, and to a poor use of computing hardware; coupled with the "commodity general-purpose hardware" mirage, this is how we've ended up with the shit we have today.

Anyway, any particular representation will necessarily bring with it an intrinsic burden of usage limitations. On one hand this is useful, as a guide to representing both the problem and its solution in correct terms; on the other, it makes it practically impossible to represent all the possible problems. Sure, maybe all of them are representable in principle, but the cost of representing a large enough subset of them is, necessarily, larger than the Earth could afford. Hence the pretense of "general-purposeness" in the computing tools one uses day to day. Then again, if the problems in question aren't easily representable, maybe that's a problem with the problems, n'est-ce pas?

By the way, I believe systems such as Lisp and Forth solved this problem in precisely the right way, by limiting the number of syntactic "keywords" to those which the hardware provides and exposing the hardware exactly as it is. "Common Lisp" on the other hand is yet another mirage; if anything, there's PC-Lisp, HPC-Lisp, Network-Processing-Lisp -- hell, I don't know why you'd need a Lisp for this, just saying -- and other such dialects, each stacked on top of whatever primitives the GPU or the NIC exposes.

Filed under: computing.

One Response to “Again on "general-purpose" tools”

  1. [...] this footnote evolved into its own article as I was writing [...]
