"Open" "source"

March 11, 2023 by Lucian Mogosanu

Given that my recent adventures in computerized music are far from over, I've decided to step outside my comfy place in the world of old synths and do a more thorough scoping of the field. My resources are meagre in the sense that I don't really have access to a Moog or some other analog synth1, while my access to the various digital synths is limited by my time, set against the huge pile of items available on the market. I do however have plenty of experience with software sound processing tools, including MIDI synths, be they commercial or open source. Thus I've decided to continue my quest by revisiting one old program or another2.

This time however I decided to do things properly, that is, by enumerating dependencies and attempting to sanely import everything by building from the actual sources. This might not sound like much to the naïve eye, so if it doesn't make sense to you, think of it as a process of slowly building a library of useful software, irrespective of any other considerations. I even started an article attempting to document this process of gathering; as my endeavour went nowhere, I started documenting the failure instead, only when I revisited my first set of notes I got so angry that I decided to throw the whole thing into the proverbial "Recycle Bin". Do you remember it, by the way? It invariably reminds me of Windows 95, the best Windows besides 3.1. And why do you think that is?

We've visited the subject of systemd in the past, so let's take this example again. It's no news that the systemdization of "GNU"3/Linux distributions turned the Linux init process into a sort of svchost.exe: there are definitely some good ideas behind it, but the execution is a bug-ridden single point of failure that cannot even fail deterministically4 -- and let's not even go back into the messy politics behind it. Maybe this wouldn't be that much of a problem if, as that old goofball Stallman insisted, one were able to hack on their software, as the GPL intended. The GPL, however, is a legalistic document under which the author of some code is exempt from any obligation to hold your hand while you attempt to compile it. Some authors are nice enough to help with this, but most of them simply work around it by having some maintainer package .debs, .rpms or what have you. This in turn creates a supply chain dependency, in the sense that you're now depending on a bunch of (in most cases unpaid) labour to provide you with products instead of the usual $evil_corporation. Regardless, your activity, whichever it may be, now imports this dependency.

This is a problem mainly because it points quite obviously to the observation that there's not that much left of what was once "the open source model". If it's easier for me to install a Debian package instead of procuring and building from source, then this "open source" is de facto a closed source model. Don't even come to me with "use Gentoo", this is well beside the point: if I'm forced to outsource my process to some repository where dependencies propagate non-linearly, then I'm fucked and I'm better off using ye olde Windows or MacOS than killing my time with various software written by "the community".

So, although the search for sane means is far from over, I'm afraid "open" "source" can't stand for much of a criterion to inform it nowadays. This realization may be sad on one hand, but on the other, if the console emulation folks managed to do it, who knows -- maybe a few years from now we'll run a PlayStation 3 emulator inside a Lindows emulator inside an iPhone; or whatever other sad piece of hardware we'll happen to own.


  1. VST simulations don't count, since they're in fact software solutions. Ceci n'est pas une pipe, irrespective of how accurately the image reflects the original object... oh, for fuck's sake, now I'm stuck explaining myself, ain't I?

    The point is, no matter how accurately some piece of digital equipment (be it software or hardware) is able to model its analog counterpart, the former remains a mere reflection of the latter, and not only historically -- the Moog came before digital synths altogether, let alone Moog emulators; I guess this counts for something, doesn't it? -- but also technically. I don't care how they did it: whether they took the original analog circuits and simulated them in a high-tech computerized setup, or otherwise they took a bunch of high-resolution samples and threw them into a machine learning model, either way it's all the same. The problem is first and foremost one of medium: whatever you'd do, you can't fully capture the analog information of, say, a sine (let alone another, more complex) wave in a digital environment, just as you can't compute derivatives using floating point formats the way you do with analog circuits. There's a bridge between the two contemplated items, and it is this bridge which renders them unequal. What can I say, not everything is reducible to bits.

    So let us not confuse categories. 
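    To make the derivative claim concrete, here's a toy sketch of my own (Python; nothing to do with any particular synth): with a finite-difference estimate, shrinking the step size past a point makes the result *worse*, because floating point rounding takes over where an idealized analog differentiator has no such floor.

    ```python
    import math

    def central_diff(f, x, h):
        # Finite-difference approximation of f'(x). Truncation error
        # shrinks like h^2, but rounding error grows like eps/h, so
        # making h ever smaller eventually backfires.
        return (f(x + h) - f(x - h)) / (2 * h)

    x = 1.0
    exact = math.cos(x)  # d/dx sin(x) = cos(x)

    # A moderate step is near-optimal; a tiny step is dominated by rounding.
    err_opt = abs(central_diff(math.sin, x, 1e-5) - exact)
    err_tiny = abs(central_diff(math.sin, x, 1e-13) - exact)
    print(err_opt, err_tiny)  # the "smaller" step gives the larger error
    ```

    None of which proves analog superiority by itself, of course; it merely shows the digital medium has its own, qualitatively different failure modes.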

  2. For the curious: ZynAddSubFx, an old synth from the SourceForge days, just like the SID; only as far as I know this one is still maintained and made available through "public" distribution channels, i.e. OS distribution "ports". 

  3. What sort of GNU is that operating system which maintains a bloated "init system" written by Lennart Poettering, ex-Red Hat employee, now a dude from... Microsoft! 

  4. What else would you call its numerous vulnerabilities?

    Sure, you folks'll rewrite systemd in Rust and all will be well, won't it? Well, thus far I'm unimpressed. Rust seems a definite step backwards, in terms of complexity if for no other reason. 

Filed under: asphalt.

8 Responses to “"Open" "source"”

  1. #1:
    Cel Mihanie says:

    > just as you can't compute derivatives using floating point formats like you do it with analog circuits

    I vehemently disagree with the entire bit in the footnote, and I think this very example undermines your argument. Both the floating-point and "analog" differentiation methods are provably flawed (each in their own way) approximations of the thing in itself, the "noumenon" as they say, which is, in this case, differentiation in the symbolic, abstract, mathematical realm (in practice, you can even sort of work within the latter, philosophical sperging notwithstanding, as long as both the model and the data are closed-form). Point being, the "analog" is not the thing in itself, it too is just a model, an approximation. The very word "analog" implies this separation.

    More specifically, analog audio is often incorrectly imagined as being smooth in time and space, unlike digital which is quantized in those directions, with all the alleged loss of fidelity thereof [1]. But the reality is, analog signals in practice are already quantized in different ways. Their practical resolution is finite, whether we're talking about individual oxide grains on a magnetic tape, the finite number of receptors in the human ear (along with the inherent quantization in all electrochemical systems [2]), the finite number of molecules in the medium through which sound travels, etc. We're not even getting into the nasty theories that posit even time itself might be quantized.

    Just like we pretend that analog resolution is infinite when in reality it's just "good enough", we can do the same for digital, as long as the number of bits is enough and the model is satisfactory. Sure, it's only an approximation of the noumenon, but so is any "analog" way of perceiving and recording it.
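    A toy sketch of the "enough bits" claim, if it helps (Python, my own construction): quantize a full-scale sine at different bit depths and measure the signal-to-noise ratio; each extra bit buys roughly 6 dB, so you can push the quantization noise below any threshold you care about.

    ```python
    import math

    def quantize(x, bits):
        # Uniform quantizer on [-1, 1] with 2**bits levels, clamped at full scale.
        step = 2.0 / (2 ** bits)
        return max(-1.0, min(1.0 - step, round(x / step) * step))

    def snr_db(bits, n=10000):
        # Signal-to-quantization-noise ratio of a full-scale sine, in dB.
        sig = noise = 0.0
        for i in range(n):
            s = math.sin(2 * math.pi * i / n)
            sig += s * s
            noise += (s - quantize(s, bits)) ** 2
        return 10 * math.log10(sig / noise)

    s8, s16 = snr_db(8), snr_db(16)
    print(s8, s16)  # roughly 6 dB gained per extra bit
    ```

    The resolution ceiling is a parameter you choose, in other words, not a fundamental limit of the digital approach.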

    Tbh much of the critique of digital systems I've ever heard is either not based in reason, or can be summarized as people not being happy about a particular choice of resolution or a particular model. Both these choices are the result of commercial and historical considerations (and laziness) and do not really invalidate the digital approach per se.

    TLDR: I posit it is absolutely possible to digitally simulate and represent analog synths, "tube sound" etc., to a degree of fidelity that cannot be rationally contested. That this has not happened yet (allegedly), is likely because, as usual, the people who care a lot about this thing don't have the know-how, and the people who have the know-how don't have the motivation [3].

    Might come back later with some thoughts on open sores if I haven't gotten myself banned yet :P

    --

    [1] I'm intentionally allowing "digital" to be conflated with "quantized". Usually it's the quantization aspect of digitality that people focus on, even though, ironically, it's precisely the other aspects that set analog and digital apart.

    [2] You'd be surprised at how many "analog" systems in the human body are actually quantized or binary. E.g. a muscle fiber, by construction, can only ever be completely on or completely off. The body achieves a seemingly smooth spectrum of light to hard contractions by just varying the number of fibers that get activated. The cells in the retina work by, essentially, counting photons per time in a really over-elaborate way, not too dissimilar to a CMOS sensor. Pretty much anything "analog" inside us is achieved by modulating binary things in time and space.

    [3] Sure, there's also some practical considerations sometimes. You often want a real-time simulation (for MAME etc), and even the beefy Ryzens that we middle-class folks have are, alas, still not beefy enough to do proper simulation justice.

  2. #2:
    spyked says:

    Whoa, wait a second there, I think you're arguing against a point that I haven't made.

    Far be it from me to think that analog computation is equivalent to its algebraic counterpart. I'm comparing analog and digital computation alone and what I'm saying is not even that one allows for better error rates than the other, but that their characteristics are sufficiently divorced from each other that it's impossible to meaningfully compare the two. So what I'm saying is that irrespective of "error rates" or any other considerations, the two cannot be considered equivalent, because... they aren't, what else can I say. Thus you can't say "analog synth" and refer to a software simulation thereof, just like you can't say "piano" and refer to its rather masterfully crafted "digital" copy.

    As to your other points: the Moog for example is what it is, it's an analog synth whose characteristics depend very much on the quality of the materials, its design and so on. I'm sure there exist some cheap and darn good Moog simulations out there, especially since it's such a famous item. I'm even sure that with today's means there exist very accurate simulations, especially in professional equipment in the higher price ranges. In fact that's my whole point at the bottom of the article: as soon as the original items become barely accessible, we'll be left with simulations/emulations which we'll have to make do with. Quite a funny repetition of the Roman aqueduct trope, come to think of it, except what now: our nephews will end up running simulations of our simulations?

    Regarding MAME: I tried compiling it last evening and... failed. Must be because I don't have a C++17 compiler on my system, which is, again, my whole point: if I want a MAME on my system, I'm going to have to import the whole closed-but-pretending-to-be-open spittoon in my universe.

  3. #3:
    spyked says:

    In other lulz, LWN, or more precisely Jonathan Corbet attempts to discuss the issue earnestly and... fails. It's not necessarily his fault though, to the same degree and in the same way that attempting to "discuss" politics during ye olde communist times wasn't, say, Amza Pellea's fault.

    The problem with Jonathan Corbet's attempt is that this type of discussion cannot be waged using the symbols employed in today's so-called "public discourse". In order to have a meaningful discourse on "software during times of war", one must necessarily destroy these symbols, precisely because their engineered introduction into language aims to detract from reality.

    To put this plainly: of course such luxury as "free" software can be had, just as long as the public doesn't interfere with it. In other words, the best time to build and maintain a WoT is... well, yesterday. Maybe this WoT can be maintained through coopting Imperial tools, there's absolutely no problem with that, as technology is politically neutral -- which makes pretty much *any* tool a potential weapon. Which isn't to say that tools of any kind are preconditions for a WoT: the only precondition for a WoT is a bunch of folks, and any sort of tech shall arise only from the culture grown out of said WoT, not arbitrarily, but au contraire, pretty fucking intentionally.

    In other words, the times when we were all spherical humans who, in the words of the great George Costanza, were "living in a society", are nowadays a mere reminiscence of naïve modern times gone. In fact, absent actual human beings, "society" was never much to begin with -- maybe now this puts the whole "non-discrimination" nonsense into perspective for you.

  4. #4:
    Cel Mihanie says:

    Re: analog vs digital

    I remember I had a post on my old blog arguing that equivalence is relative, i.e. whether some things are equivalent is dependent on the context (dinna invent this ofc, I'm sure this is some well-trodden truth that some ancient Greek guy wrote back when the Sun was still a nebula). I claim that there are definitely many contexts where analog and digital are both comparable and often equivalent. If I'm interested only in the final output of an analog synth (the waveform or sound), this is definitely the case. With a bit more work, the physical experience of working with a real analog synth can also be simulated satisfactorily, unless you also want to go into taking it apart and tinkering with it Jarre-style. I'm sure there must also be some contexts where they're not equivalent/comparable, likely something having to do with the historical, spiritual or idiomatic, but I'm a mostly pragmatic creature so it's harder for me to think about those.

    Regarding having to make do with simulations anyway as the "real" thing becomes unavailable, yeah, I too often think about that. It's worse than you think (becoming a catchphrase, innit?). Entropy rots everything, not just metal and flesh but also minds, ideas and societies. You can simulate, rebuild or even restore a machine, but you can't restore the historical context in which it became valuable. You can play a real synth from the 70s that still works, you can wear the 70s clothes and hair (please don't), but you can't make it *be* the 70s again. If the historical context is essential to your experience, you're screwed. An argument for seizing the day, I guess.

    Re: open sores

    Yeah one can find a lot of problems with open source, many of which only became visible once it started being "successful", "non-niche", used for megaprojects a la OpenOffice, GNOME, KDE etc. (the kernel is also a megaproject but has always had special status). As the code and political dependencies inevitably multiply exponentially, so do the problems. But I think we need to make a distinction between all the stuff promised by open source luminaries drunk on their own socialist/libertarian/etc (or worse) Kool-aid, and what is practically possible and should be taken seriously. Frankly, the only practical promises made (and, I'd say, largely kept) by the open source movement are:

    - we won't try to hide the source code
    - we might even take explicit positive steps to make it available, as long as it's not too hard for us (e.g. if we're bankrupt or dead)
    - we won't go after you legally if you publish the source code or make use of it

    That's about it. There is nothing there about, and nobody should have ever been believed if they promised things as:

    - you'll always be able to find the latest source code, in perpetuity, and that of the dependencies
    - it will always be straightforward to compile it
    - you'll have software self-sufficiency (i.e. you'll be able to compile and maintain all the stuff you need without risking interference from anyone else)
    - the project will keep up with all necessary standards, paradigms etc. required to function in the $CURRENT_YEAR
    - etc etc

    All these things are HARD, really hard. You won't get people to do them unless:

    - you pay them (big time)
    - they're in a cult
    - it's one guy who wants to do it for himself (which is kind of a 1-person cult come to think of it, deffo not rational)

    We know all too well how cults end... TLDR, if you want these things, you're on your own and were always on your own. Best go for option 3 methinks, it's what I've been doing myself.

    P.S. Onea these days it might be nice to elaborate on what exactly you mean by Web of Trust and how that would work in practice for addressing the problems in the context of which you were mentioning it. Surely it can't be just the cryptographic concept. Cryptographic trust isn't the same as real trust. Creating a successful society/group isn't about picking which keys to accept. Is it?

  5. #5:
    spyked says:

    > If I'm interested only in the final output of an analog synth (the waveform or sound), this is definitely the case

    I'll take this as a valid perspective assuming the control theoretical premise that all ducks are black boxes that quack given specific inputs. As much as I like control theory, this isn't quite my perspective, mainly because I believe reductionist approaches tend to be useful only in very well delimited utilitarian contexts. This is also more or less why I don't "believe in science" -- all models are reductionist, including and especially scientific ones, aren't they?

    > If the historical context is essential to your experience, you're screwed

    But it is, isn't it? This is why early modernists have built entire institutions dedicated to preserving for example the so-called "classical" currents of music. Of course you can have analog synth music in the 2020s (even by using only simulated instruments), but in order to do that you need some way to bridge the two periods. Something will definitely be lost in the process, but what can you do, entropy rots everything, just like you said.

    I believe that the interesting part lies precisely in the preservation, which as far as I can tell has no measurable utility whatsoever. If I were to argue for it, the only tangible premise I could find would be religious in nature, as if Keith Emerson's voice came back from the past to push me towards the sound of his Moog.

    > But I think we need to make a distinction between all the stuff promised by open source luminaries drunk on their own socialist/libertarian/etc (or worse) Kool-aid, and what is practically possible and should be taken seriously

    I think we agree on that and I think the limitations you mentioned are also the reason why the open source (political, philosophical, ideological etc.) movement will end up forgotten in the proverbial historical dustbin. I certainly don't mind using a Linux that lies at the intersection of various corporate interests much like I don't mind using a Windows or Mac. Let's just not pretend the former is in any way superior to the latter just because Microsoft et al. are still sharing the Linux sources with us plebs -- they'll continue to do so just as long as it serves their interest, namely of employing unpaid labour to do an otherwise very costly job.

    Having said that, I still believe in political movements that use code sharing as a means to amass power (in this field of technology, at least), which brings us to your post-scriptum.

    > what exactly you mean by Web of Trust

    I'm too lazy to write a self-contained article -- and I can be blamed for the same laziness in the case of, say, V -- so I'll just give a brief sketch here. You probably won't enjoy Popescu's take on this either, so I'll leave that aside, while acknowledging the solid influence it's had on me...

    The WoT is as far as I can tell the only type of social network that actually works. It's perhaps a pompous name, but think of Romania in the late '80s, when the economic-social organization enforced by the state broke down entirely and the only way to get ahold of... anything, really, was to know someone who could produce it (or get it directly from the factory, or however he could) for you. In other words, the web of trust is your network of friends/acquaintances whom you can readily evaluate. Now, with regards to implementation: whether you wish to make it public or not, whether you have friends who are willing to back their words cryptographically over the internet, whether you keep a piece of paper or a contact list somewhere, I guess that's entirely up to you. What I mean to say by my previous comment is that in the long term it always pays off to maintain a good quality social circle.

    I know that's just common sense, but a whole lot of this "common sense" has been turned upside down by today's publicity stunts, so I don't think it hurts to spell it out.

  6. #6:
    spyked says:

    Now check out this lulz:

    Remove some entries due to various compliance requirements. They can come back in the future if sufficient documentation is provided.

    So whaddaya know, things turned out just like I said: there ain't no "spirit of openness" anymore and open source contributions have become subject to the same bureaucratic process as every other thing in this damned modern world "of ours". I guess I can't necessarily blame this, since it's one (particularly twisted) way to implement a trust model. The unfortunate news for outsiders is that they're stuck forking the whole thing *along with* the rest of the hardware and software supply chain. It'd be nice, except it's expensive as shit, so at this point most likely the single actor who can afford to do this is China.

  7. [...] mine. Now let us go back in time to March 11, 2023 and do a quick recount of what I said, including commentary. I said that: a. the open source model [...]

  8. [...] I do recommend that you (re)read Yossi's article if you haven't, I find it to be of extreme actuality. Also mind that this was written in 2008, way before Rust et al. started permeating the markets. ↩ [...]
