<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: From the fabulous world of fantasy consoles: PICO-8</title>
	<atom:link href="http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8/feed" rel="self" type="application/rss+xml" />
	<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8</link>
	<description>"Now I feel like I know less about what that blog is about than I did before."</description>
	<pubDate>Sun, 12 Apr 2026 17:35:00 +0000</pubDate>
	<generator>http://thetarpit.org</generator>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<item>
		<title>By: Sanitarium &#171; The Tar Pit</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-7134</link>
		<dc:creator>Sanitarium &#171; The Tar Pit</dc:creator>
		<pubDate>Sat, 07 Feb 2026 20:21:23 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-7134</guid>
		<description>[...] just as I did last weekend, just as I expect I'll do in 2046. The GOG version comes wrapped in a ScummVM which works almost perfectly on any modern system. Make sure to get it before the rights to the [...]</description>
		<content:encoded><![CDATA[<p>[...] just as I did last weekend, just as I expect I'll do in 2046. The GOG version comes wrapped in a ScummVM which works almost perfectly on any modern system. Make sure to get it before the rights to the [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1865</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Wed, 23 Mar 2022 17:57:04 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1865</guid>
		<description>Let's bring this full circle then.

&gt; That was explicitly labelled an experiment, so don't judge it as something in which I have more confidence.

There's nothing wrong with that; I'm just pointing out that I've seen very few systems that lacked essential complexity and at the same time were... useful. One of them was my old Romanian ZX Spectrum clone, which sometimes took an hour to load the larger 8-bit games off the magnetic tape; but then, it didn't even have an ethernet port, and I suspect the cost of adding ethernet alone (without the rest of the networking stack) would be non-trivial.

&gt; I try to prefer implementing for myself what I can, but it's not reasonable for everything, currently, no.

&gt; Every network operation is unreliable. No packet is guaranteed. Thus it follows that motion is impossible.

... theoretically. In practice, the particular HTTP server hosting this blog serves web pages just fine and the blog also has &lt;a href="http://thetarpit.org/2014/maybe-i-was-wrong-about-that-commenting-thing" rel="nofollow"&gt;a comment mechanism&lt;/a&gt;, which is what it was intended for in the first place. This supposed paradox actually ties pretty well into the mathematics thread above and below, thank teh Lords for infinitesimal calculus!

&gt; Just because a man can shit himself to death doesn't mean we should build computers to do the same.

Again, this must sound really annoying: I don't disagree at all, but let's consider that Nature (if anything deserves a personification, I guess it's "nature") has spent millions of years attempting to perfect autonomous systems that nevertheless have this failure mode. The Big Question is, can any non-trivial autonomous systems be built that *don't* have this failure mode?

&gt; I don't recall Euclid's Elements having these issues

I'm not denying that *some* problems can be solved elegantly. I myself am more interested in problems along the lines of &lt;a href="http://thetarpit.org/2021/on-the-utter-death-of-musical-arts#comment-574" rel="nofollow"&gt;acoustic modelling&lt;/a&gt;, which are the subject of thick books comprising equations just as thick, full of partial derivatives that I'm not at all equipped to present here. Or more generally, take some of the fluid dynamics problems which can only be solved numerically... I haven't visited this field in a while, but to my eye it's just one of the many examples of impedance mismatch between the beautiful Platonic ideals and reality.</description>
		<content:encoded><![CDATA[<p>Let's bring this full circle then.</p>
<p>> That was explicitly labelled an experiment, so don't judge it as something in which I have more confidence.</p>
<p>There's nothing wrong with that; I'm just pointing out that I've seen very few systems that lacked essential complexity and at the same time were... useful. One of them was my old Romanian ZX Spectrum clone, which sometimes took an hour to load the larger 8-bit games off the magnetic tape; but then, it didn't even have an ethernet port, and I suspect the cost of adding ethernet alone (without the rest of the networking stack) would be non-trivial.</p>
<p>> I try to prefer implementing for myself what I can, but it's not reasonable for everything, currently, no.</p>
<p>> Every network operation is unreliable. No packet is guaranteed. Thus it follows that motion is impossible.</p>
<p>... theoretically. In practice, the particular HTTP server hosting this blog serves web pages just fine and the blog also has <a href="http://thetarpit.org/2014/maybe-i-was-wrong-about-that-commenting-thing" rel="nofollow">a comment mechanism</a>, which is what it was intended for in the first place. This supposed paradox actually ties pretty well into the mathematics thread above and below, thank teh Lords for infinitesimal calculus!</p>
<p>> Just because a man can shit himself to death doesn't mean we should build computers to do the same.</p>
<p>Again, this must sound really annoying: I don't disagree at all, but let's consider that Nature (if anything deserves a personification, I guess it's "nature") has spent millions of years attempting to perfect autonomous systems that nevertheless have this failure mode. The Big Question is, can any non-trivial autonomous systems be built that *don't* have this failure mode?</p>
<p>> I don't recall Euclid's Elements having these issues</p>
<p>I'm not denying that *some* problems can be solved elegantly. I myself am more interested in problems along the lines of <a href="http://thetarpit.org/2021/on-the-utter-death-of-musical-arts#comment-574" rel="nofollow">acoustic modelling</a>, which are the subject of thick books comprising equations just as thick, full of partial derivatives that I'm not at all equipped to present here. Or more generally, take some of the fluid dynamics problems which can only be solved numerically... I haven't visited this field in a while, but to my eye it's just one of the many examples of impedance mismatch between the beautiful Platonic ideals and reality.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Verisimilitude</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1863</link>
		<dc:creator>Verisimilitude</dc:creator>
		<pubDate>Wed, 23 Mar 2022 03:00:42 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1863</guid>
		<description>&lt;blockquote&gt;I went through your easy-peasy-tcp example&lt;/blockquote&gt;

That was explicitly labelled an experiment, so don't judge it as something in which I have more confidence.

&lt;blockquote&gt;Do you think me a fool for resorting to this, rather than someone who values his time?&lt;/blockquote&gt;

Neither of us knows the other well enough to call him a fool, so no.  Compare it to what I do, however.  I'd rather go without than get trapped.  I didn't write the HTTP server I use, but I wrote my Gopher server.  I try to prefer implementing for myself what I can, but it's not reasonable for everything, currently, no.

&lt;blockquote&gt;I'd rather use C than resort to this sort of nonsense.&lt;/blockquote&gt;

Not everything is worth doing, I agree.

&lt;blockquote&gt;Precisely this -- I don't think C is particularly the problem here, but rather the bizarre design of TCP and all the implementation issues that derive from it.&lt;/blockquote&gt;

Every network operation is unreliable.  No packet is guaranteed.  Thus it follows that motion is impossible.

&lt;blockquote&gt;The "design" of humans allows each single individual to be deadlocked without recourse, that alone does not make it (or them) garbage.&lt;/blockquote&gt;

We're stuck with our design, for now.  Just because a man can shit himself to death doesn't mean we should build computers to do the same.

&lt;blockquote&gt;mkstemp may be of some help here? I really don't know.&lt;/blockquote&gt;

The point is that it can fail, and the greater operation may otherwise have had no naturally intractable failure case.  This is a condemnation of bad design.

&lt;blockquote&gt;This is a very naive view of the history of mathematics, which went through an evolution spanning centuries before reaching the distilled form currently present in the didactic material.&lt;/blockquote&gt;

I don't recall &lt;i&gt;Euclid's Elements&lt;/i&gt; having these issues, although I'm not finished reading my copy, I'll admit.

Still and again, I suppose this is a tangent. Feel free to have the last word.</description>
		<content:encoded><![CDATA[<blockquote><p>I went through your easy-peasy-tcp example</p></blockquote>
<p>That was explicitly labelled an experiment, so don't judge it as something in which I have more confidence.</p>
<blockquote><p>Do you think me a fool for resorting to this, rather than someone who values his time?</p></blockquote>
<p>Neither of us knows the other well enough to call him a fool, so no.  Compare it to what I do, however.  I'd rather go without than get trapped.  I didn't write the HTTP server I use, but I wrote my Gopher server.  I try to prefer implementing for myself what I can, but it's not reasonable for everything, currently, no.</p>
<blockquote><p>I'd rather use C than resort to this sort of nonsense.</p></blockquote>
<p>Not everything is worth doing, I agree.</p>
<blockquote><p>Precisely this -- I don't think C is particularly the problem here, but rather the bizarre design of TCP and all the implementation issues that derive from it.</p></blockquote>
<p>Every network operation is unreliable.  No packet is guaranteed.  Thus it follows that motion is impossible.</p>
<blockquote><p>The "design" of humans allows each single individual to be deadlocked without recourse, that alone does not make it (or them) garbage.</p></blockquote>
<p>We're stuck with our design, for now.  Just because a man can shit himself to death doesn't mean we should build computers to do the same.</p>
<blockquote><p>mkstemp may be of some help here? I really don't know.</p></blockquote>
<p>The point is that it can fail, and the greater operation may otherwise have had no naturally intractable failure case.  This is a condemnation of bad design.</p>
<blockquote><p>This is a very naive view of the history of mathematics, which went through an evolution spanning centuries before reaching the distilled form currently present in the didactic material.</p></blockquote>
<p>I don't recall <i>Euclid's Elements</i> having these issues, although I'm not finished reading my copy, I'll admit.</p>
<p>Still and again, I suppose this is a tangent. Feel free to have the last word.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1840</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Thu, 17 Mar 2022 19:58:56 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1840</guid>
		<description>&gt; There are alternatives. I've written about exactly this

I went through your easy-peasy-tcp example and... well, I'll digress and instead challenge you to consider a less trivial task, that of a WWW server &lt;a href="http://thetarpit.org/2019/cl-www-summary" rel="nofollow"&gt;written in Common Lisp&lt;/a&gt;. Consider that it took me ten humongous articles (minus the summary) to fully document the code, *minus* its dependencies; and consider that that task made me realize that I was attempting to rewrite Wordpress in Common Lisp, a task I gave up, moving instead to the actual Wordpress, based upon that PHP abomination we all know. Do you think me a fool for resorting to this, rather than someone who values his time?

Similarly, consider my attempt at a much more approachable task, i.e. writing a &lt;a href="http://thetarpit.org/2020/briefly-on-programming-irc-bots-using-common-lisp" rel="nofollow"&gt;fault-tolerant IRC bot&lt;/a&gt; in the same language, and the reasons why I gave up. I wish all the luck to whoever may try this again in the future, and I don't particularly blame Common Lisp or SBCL for this failure; just don't come &lt;a href="http://thetarpit.org/2020/briefly-on-programming-irc-bots-using-common-lisp#comment-580" rel="nofollow"&gt;biting me in the ass&lt;/a&gt; about how "everything would be ok had I just used FFI". I'd rather use C than resort to this sort of nonsense.

&gt; a TCP connection can fail for many reasons and may result in failure from which there's no reasonable recovery mode

Precisely this -- I don't think C is particularly the problem here, but rather the bizarre design of TCP and all the implementation issues that derive from it. I'm not talking out of my ass here either, as I've read reasonable chunks of the Linux TCP implementation numerous times, when dealing with some of those intractable failure cases. Those failure cases did not come from C or Linux or whatever; they came from the TCP implementation itself, which is an unmaintainable horror that should have never made it into the kernel, for it fails to adhere to even the most basic "guidelines" laid out by Torvalds himself.

&gt; the design of the system allows most any program to be deadlocked without recourse, meaning it's garbage.

The "design" of humans allows each single individual to be deadlocked without recourse, that alone does not make it (or them) garbage.

&gt; Some interface requires a file, so a unique file must be created

&lt;a href="https://archive.ph/w28My" rel="nofollow"&gt;mkstemp&lt;/a&gt; may be of some help here? I really don't know.

&gt; One of the primary issues with computing is how an idiot can build a broken interface, which humans must then work around indefinitely

Verisimilitude, I'm not disagreeing with you here, merely pointing out that quite often, folks tend to attempt to solve problems using the wrong means. Maybe you don't actually *need* to create a unique file for that interface? I don't know, I'm merely considering this side of the issue.

&gt; This wasn't an issue with mathematics

This is a very naive view of the history of mathematics, which went through an evolution spanning centuries before reaching the distilled form currently present in the didactic material. Consider, for example, that Lord Newton's calculus is as much a consequence of his Lordship as of his genius, or in other words that politics can't simply be brushed aside from this history. Consider also some of the very unprincipled methods (e.g. approximations) used by some of the practitioners (e.g. physicists) and that some mathematicians had to literally invent new mathematics (e.g. the Dirac delta function) in order to model their experiments.

Bringing this back to computing, my best guess is that despite the obvious &lt;a href="http://thetarpit.org/2020/on-computers?b=second&#038;e=thing#select" rel="nofollow"&gt;amplification&lt;/a&gt; brought about by computers, the field, aged less than a century, is probably just about nearing its stage of puberty. Whether this stage will be further delayed by the incoming dark ages, or whether war will actually speed up the maturation process (as wars &lt;a href="http://thetarpit.org/2021/on-technology#comment-1747" rel="nofollow"&gt;often do&lt;/a&gt;), I guess we'll see.</description>
		<content:encoded><![CDATA[<p>> There are alternatives. I've written about exactly this</p>
<p>I went through your easy-peasy-tcp example and... well, I'll digress and instead challenge you to consider a less trivial task, that of a WWW server <a href="http://thetarpit.org/2019/cl-www-summary" rel="nofollow">written in Common Lisp</a>. Consider that it took me ten humongous articles (minus the summary) to fully document the code, *minus* its dependencies; and consider that that task made me realize that I was attempting to rewrite Wordpress in Common Lisp, a task I gave up, moving instead to the actual Wordpress, based upon that PHP abomination we all know. Do you think me a fool for resorting to this, rather than someone who values his time?</p>
<p>Similarly, consider my attempt at a much more approachable task, i.e. writing a <a href="http://thetarpit.org/2020/briefly-on-programming-irc-bots-using-common-lisp" rel="nofollow">fault-tolerant IRC bot</a> in the same language, and the reasons why I gave up. I wish all the luck to whoever may try this again in the future, and I don't particularly blame Common Lisp or SBCL for this failure; just don't come <a href="http://thetarpit.org/2020/briefly-on-programming-irc-bots-using-common-lisp#comment-580" rel="nofollow">biting me in the ass</a> about how "everything would be ok had I just used FFI". I'd rather use C than resort to this sort of nonsense.</p>
<p>> a TCP connection can fail for many reasons and may result in failure from which there's no reasonable recovery mode</p>
<p>Precisely this -- I don't think C is particularly the problem here, but rather the bizarre design of TCP and all the implementation issues that derive from it. I'm not talking out of my ass here either, as I've read reasonable chunks of the Linux TCP implementation numerous times, when dealing with some of those intractable failure cases. Those failure cases did not come from C or Linux or whatever; they came from the TCP implementation itself, which is an unmaintainable horror that should have never made it into the kernel, for it fails to adhere to even the most basic "guidelines" laid out by Torvalds himself.</p>
<p>> the design of the system allows most any program to be deadlocked without recourse, meaning it's garbage.</p>
<p>The "design" of humans allows each single individual to be deadlocked without recourse, that alone does not make it (or them) garbage.</p>
<p>> Some interface requires a file, so a unique file must be created</p>
<p><a href="https://archive.ph/w28My" rel="nofollow">mkstemp</a> may be of some help here? I really don't know.</p>
<p>> One of the primary issues with computing is how an idiot can build a broken interface, which humans must then work around indefinitely</p>
<p>Verisimilitude, I'm not disagreeing with you here, merely pointing out that quite often, folks tend to attempt to solve problems using the wrong means. Maybe you don't actually *need* to create a unique file for that interface? I don't know, I'm merely considering this side of the issue.</p>
<p>> This wasn't an issue with mathematics</p>
<p>This is a very naive view of the history of mathematics, which went through an evolution spanning centuries before reaching the distilled form currently present in the didactic material. Consider, for example, that Lord Newton's calculus is as much a consequence of his Lordship as of his genius, or in other words that politics can't simply be brushed aside from this history. Consider also some of the very unprincipled methods (e.g. approximations) used by some of the practitioners (e.g. physicists) and that some mathematicians had to literally invent new mathematics (e.g. the Dirac delta function) in order to model their experiments.</p>
<p>Bringing this back to computing, my best guess is that despite the obvious <a href="http://thetarpit.org/2020/on-computers?b=second&#038;e=thing#select" rel="nofollow">amplification</a> brought about by computers, the field, aged less than a century, is probably just about nearing its stage of puberty. Whether this stage will be further delayed by the incoming dark ages, or whether war will actually speed up the maturation process (as wars <a href="http://thetarpit.org/2021/on-technology#comment-1747" rel="nofollow">often do</a>), I guess we'll see.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Verisimilitude</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1817</link>
		<dc:creator>Verisimilitude</dc:creator>
		<pubDate>Sun, 13 Mar 2022 22:44:50 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1817</guid>
		<description>I suppose I'll continue replying then.

&lt;blockquote&gt;because all I/O (occurring between the von Neumann machine's "main memory" and "peripheral devices") was squeezed together into the abstraction called "file", which file doesn't even behave consistently&lt;/blockquote&gt;

Yes, exactly.

&lt;blockquote&gt;However, this is the cost of a self-proclaimed general purpose abstraction for I/O, the alternative being e.g. the DOS way, where each and every application came bundled with drivers for the peripherals they used, with the exception of... actual files, of course.&lt;/blockquote&gt;

There are alternatives.  I've written about exactly this &lt;a href="http://verisimilitudes.net/2020-02-02" rel="nofollow"&gt;here&lt;/a&gt;:

&lt;blockquote&gt;I firmly believe proper system design takes pain unto itself to eliminate edge cases. A proper such system would provide the option of treating wildly different I/O sources as the same, but permit the reducing of failure cases for any single one by permitting avoiding this. It may seem reasonable to give a terminal, file system, and TCP similar interfaces, yet this group poses necessarily different failure cases. The source of input for a terminal is usually a human and so a read can't fail, with a potentially indefinite wait; a file system can make a request, yet fail due to recognized hardware failure or semantic issues with the underlying model; a TCP connection can fail for many reasons and may result in failure from which there's no reasonable recovery mode. Collapsing these under a lone interface collapses their failure modes into a single model, as well, and it should be realized the addition of invariants through a lack of genericity is valuable.&lt;/blockquote&gt;

We see similar idiocy with Unicode, and pouring the edge cases of all languages and other nonsense into one vase.

&lt;blockquote&gt;Now, did they work by accident?&lt;/blockquote&gt;

No, but the design of the system allows most any program to be deadlocked without recourse, meaning it's garbage.

As for pwd, it seems I had outdated information.  Consider a different example: Some interface requires a file, so a unique file must be created, which opens it to naming conflicts, space exhaustion, and other nonsense.  To repeat myself, when this nonsense becomes an ingredient for the real recipe, it only gets in the way.

One of the primary issues with computing is how an idiot can build a broken interface, which humans must then work around indefinitely.  This wasn't an issue with mathematics, if only because the practitioners were their own computers, and so valued concision and beauty, lest they have to rewrite something to give it those qualities.</description>
		<content:encoded><![CDATA[<p>I suppose I'll continue replying then.</p>
<blockquote><p>because all I/O (occurring between the von Neumann machine's "main memory" and "peripheral devices") was squeezed together into the abstraction called "file", which file doesn't even behave consistently</p></blockquote>
<p>Yes, exactly.</p>
<blockquote><p>However, this is the cost of a self-proclaimed general purpose abstraction for I/O, the alternative being e.g. the DOS way, where each and every application came bundled with drivers for the peripherals they used, with the exception of... actual files, of course.</p></blockquote>
<p>There are alternatives.  I've written about exactly this <a href="http://verisimilitudes.net/2020-02-02" rel="nofollow">here</a>:</p>
<blockquote><p>I firmly believe proper system design takes pain unto itself to eliminate edge cases. A proper such system would provide the option of treating wildly different I/O sources as the same, but permit the reducing of failure cases for any single one by permitting avoiding this. It may seem reasonable to give a terminal, file system, and TCP similar interfaces, yet this group poses necessarily different failure cases. The source of input for a terminal is usually a human and so a read can't fail, with a potentially indefinite wait; a file system can make a request, yet fail due to recognized hardware failure or semantic issues with the underlying model; a TCP connection can fail for many reasons and may result in failure from which there's no reasonable recovery mode. Collapsing these under a lone interface collapses their failure modes into a single model, as well, and it should be realized the addition of invariants through a lack of genericity is valuable.</p></blockquote>
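<p>To put the same point in C (a sketch only; the errno values are examples, not an exhaustive account): the lone read interface makes every caller face the union of all sources' failure modes, with nothing in the call to say which subset actually applies.</p>
<pre><code>/* Sketch: one read(2) call, three different failure worlds. Which errors
   can actually occur depends on what fd refers to; the interface itself
   gives no hint, so a careful caller must handle the union of them all. */
#include &lt;errno.h&gt;
#include &lt;stdio.h&gt;
#include &lt;unistd.h&gt;

void read_once(int fd) {
    char buf[512];
    ssize_t n = read(fd, buf, sizeof buf);
    if (n &gt;= 0) {
        printf("read %zd bytes\n", n);
        return;
    }
    switch (errno) {
    case EIO:         /* hardware-level failure: a file system's worry */
        perror("device error");
        break;
    case ECONNRESET:  /* peer reset: only meaningful for a socket      */
    case ETIMEDOUT:   /* network timeout: likewise socket-specific     */
        perror("connection failure");
        break;
    default:          /* a terminal mostly just blocks instead of failing */
        perror("read");
    }
}
</code></pre>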
<p>We see similar idiocy with Unicode, and pouring the edge cases of all languages and other nonsense into one vase.</p>
<blockquote><p>Now, did they work by accident?</p></blockquote>
<p>No, but the design of the system allows most any program to be deadlocked without recourse, meaning it's garbage.</p>
<p>As for pwd, it seems I had outdated information.  Consider a different example: Some interface requires a file, so a unique file must be created, which opens it to naming conflicts, space exhaustion, and other nonsense.  To repeat myself, when this nonsense becomes an ingredient for the real recipe, it only gets in the way.</p>
<p>One of the primary issues with computing is how an idiot can build a broken interface, which humans must then work around indefinitely.  This wasn't an issue with mathematics, if only because the practitioners were their own computers, and so valued concision and beauty, lest they have to rewrite something to give it those qualities.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1811</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Sat, 12 Mar 2022 10:03:24 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1811</guid>
		<description>I guess this is not so much a last word as yet another tangent.

&gt; Consider that the write system call can fail indefinitely, with no recourse

I agree; now let's consider that all software written on top of Unix (including, but not limited to, all emulators and high-level language implementations) is doomed to use it. And let us further consider why this "indefinite failure" is the case (at least as far as I can tell): because all I/O (occurring between the von Neumann machine's "main memory" and "peripheral devices") was squeezed together into the abstraction called "file", which file doesn't even behave consistently: sometimes it's a socket, other times it's a pipe, while other times who even knows what it is... a "device", eh?

I'm not attempting to exonerate the folks who made POSIX from their idiocy. However, this is the cost of a self-proclaimed general purpose abstraction for I/O, the alternative being e.g. the DOS way, where each and every application came bundled with drivers for the peripherals they used, with the exception of... actual files, of course. This is where the perversity lies: that even though POSIX *claims* to provide a general-purpose abstraction for I/O, in practice the application will be forced to implement an ad-hoc driver for whichever "/dev/X" it attempts to talk to.

As far as I'm concerned, it's liberating to spell this out, despite the fact that, as I mentioned, I did implement sane applications using write. Now, did they work by accident?

&gt; the pwd command may conform to POSIX by returning a single period

The spec pretty clearly &lt;a href="https://archive.ph/IfELO" rel="nofollow"&gt;states otherwise&lt;/a&gt;.</description>
		<content:encoded><![CDATA[<p>I guess this is not so much a last word as yet another tangent.</p>
<p>> Consider that the write system call can fail indefinitely, with no recourse</p>
<p>I agree; now let's consider that all software written on top of Unix (including, but not limited to, all emulators and high-level language implementations) is doomed to use it. And let us further consider why this "indefinite failure" is the case (at least as far as I can tell): because all I/O (occurring between the von Neumann machine's "main memory" and "peripheral devices") was squeezed together into the abstraction called "file", which file doesn't even behave consistently: sometimes it's a socket, other times it's a pipe, while other times who even knows what it is... a "device", eh?</p>
<p>I'm not attempting to exonerate the folks who made POSIX from their idiocy. However, this is the cost of a self-proclaimed general purpose abstraction for I/O, the alternative being e.g. the DOS way, where each and every application came bundled with drivers for the peripherals they used, with the exception of... actual files, of course. This is where the perversity lies: that even though POSIX *claims* to provide a general-purpose abstraction for I/O, in practice the application will be forced to implement an ad-hoc driver for whichever "/dev/X" it attempts to talk to.</p>
<p>As far as I'm concerned, it's liberating to spell this out, despite the fact that, as I mentioned, I did implement sane applications using write. Now, did they work by accident?</p>
<p>> the pwd command may conform to POSIX by returning a single period</p>
<p>The spec pretty clearly <a href="https://archive.ph/IfELO" rel="nofollow">states otherwise</a>.</p>
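<p>For reference, a sketch of the call a pwd implementation typically wraps; the fixed PATH_MAX buffer is a simplifying assumption of mine (a real implementation must cope with longer paths and with systems that leave PATH_MAX undefined):</p>
<pre><code>/* Sketch: getcwd(3) yields an absolute pathname (never "."), or fails with
   e.g. ERANGE when the buffer is too small. PATH_MAX is assumed available
   here, which POSIX does not guarantee on every system. */
#include &lt;limits.h&gt;
#include &lt;stdio.h&gt;
#include &lt;unistd.h&gt;

int main(void) {
    char buf[PATH_MAX];
    if (getcwd(buf, sizeof buf) == NULL) {
        perror("getcwd");
        return 1;
    }
    puts(buf);
    return 0;
}
</code></pre>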
]]></content:encoded>
	</item>
	<item>
		<title>By: Verisimilitude</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1805</link>
		<dc:creator>Verisimilitude</dc:creator>
		<pubDate>Sat, 12 Mar 2022 01:26:05 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1805</guid>
		<description>An intractable failure case is a failure case for which it's either very difficult or impossible to account.  Consider that the write system call can fail indefinitely, with no recourse.  Apparently, the pwd command may conform to POSIX by returning a single period.  When this nonsense becomes an ingredient for the real recipe, it only gets in the way.

A machine code is generally better specified than a Forth.  When I write machine code, and when I read it, I evaluate it in my head first; if the true execution fails, I may have entered something incorrectly, or the implementation may be flawed.  It will be one or the other.

Still, I suppose this is a tangent.  Feel free to have the last word.</description>
		<content:encoded><![CDATA[<p>An intractable failure case is a failure case for which it's either very difficult or impossible to account.  Consider that the write system call can fail indefinitely, with no recourse.  Apparently, the pwd command may conform to POSIX by returning a single period.  When this nonsense becomes an ingredient for the real recipe, it only gets in the way.</p>
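<p>To make the claim concrete, here is the customary defensive loop, sketched with illustrative names; the recoverable cases can be retried, but the loop still bottoms out in failures with no recourse:</p>
<pre><code>/* Sketch: retrying write(2). Short writes and EINTR are recoverable and
   can be looped over; an error such as EIO ends the attempt, no recourse. */
#include &lt;errno.h&gt;
#include &lt;unistd.h&gt;

ssize_t write_all(int fd, const char *buf, size_t len) {
    size_t done = 0;
    while (done &lt; len) {
        ssize_t n = write(fd, buf + done, len - done);
        if (n == -1) {
            if (errno == EINTR)
                continue;      /* interrupted: safe to retry */
            return -1;         /* EIO, ENOSPC, EPIPE, ...: give up */
        }
        done += (size_t)n;
    }
    return (ssize_t)done;
}
</code></pre>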
<p>A machine code is generally better specified than a Forth.  When I write machine code, and when I read it, I evaluate it in my head first; if the true execution fails, I may have entered something incorrectly, or the implementation may be flawed.  It will be one or the other.</p>
<p>Still, I suppose this is a tangent.  Feel free to have the last word.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1804</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Fri, 11 Mar 2022 20:44:51 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1804</guid>
		<description>&gt; POSIX and the C language are grotesquely large

I know; and I'm not debating this aspect, as I know it first-hand, having taken the time to go through the specs. I am, however, debating the argument against size as a principal issue, as it sounds too similar to the old high school moaning of "I didn't read War and Peace because it was too long". The main argument against this problem of size is precisely my having read the specs: yes, the items in question are full of flaws, but this may only be determined by reading them and pointing out the flaws in question, not by just bitching about bloat.

&gt; and have intractable failure cases

I don't know what this means. I know what an intractable problem is and I know what a failure case is, but I have no clue how a specification/system can "have intractable failure cases". Perhaps an example or two would help here?

&gt; Both of these systems force irrelevant problems to be solved for which there's no good solution

I am not debating that said systems pose problems, but I'm not sure which of them you consider irrelevant (and more importantly, irrelevant *in what context*?) and to what extent there is no good solution available (and "good" *in what context*?). Can you give two examples of such problems?

&gt; [...] both as in CHIP-8

Then why choose CHIP-8 and why not, say, Forth?

&gt; At the lowest levels

My point, as stated above, is that the lowest level is always the physical iron. Simply because CHIP-8 ran on RCA hardware back in the '70s does not mean you can wave away the fact that you are running it, say, on a *nix machine in the 2020s, nor that you are using a *nix machine to browse the web. Sure, it's nice to get rid of complexity, but one doesn't simply get rid of it by "changing the language", but by pushing it to another layer. Otherwise you are going to hit that complexity one way or another, simply by using the system; some systems just manage to hide it better than others. In other words, *everything* in engineering (and in economics and life in general) is to be paid for some way or another; there is no such thing as "&lt;a href="http://thetarpit.org/2016/freedom-is-slavery" rel="nofollow"&gt;for free&lt;/a&gt;".

From this point of view, one example would be that C and POSIX are not evil because they "force irrelevant problems", but because they force them upon the application developers, who should not have to be concerned with pointerisms and various undefined behaviours. Still, I personally have written correct programs, using subsets of C and POSIX, which did not require all the complexity specified in the standards and which, similarly, had clear, reasonable invariants. So to my eye the underlying issue is rather that some abstractions are better at solving *certain* problems than others and, circling back to that old problem of &lt;a href="http://thetarpit.org/2018/what-is-an-os" rel="nofollow"&gt;operating systems/language environments&lt;/a&gt;, that there is no such thing as a "general-purpose" tool; or in other words, that things are hammers only inasmuch as they're good at driving nails into certain materials specified beforehand.

TL;DR: I'm well beyond discussing this stuff in terms of hyperboles and I find arguments along the lines of "X is The Right System" to be a waste of time. If you still think this makes me obtuse, then let's leave it at that.</description>
		<content:encoded><![CDATA[<p>> POSIX and the C language are grotesquely large</p>
<p>I know; and I'm not debating this aspect, as I know it first-hand, having taken the time to go through the specs. I am, however, debating the argument against size as a principal issue, as it sounds too similar to the old high school moaning of "I didn't read War and Peace because it was too long". The main argument against this problem of size is precisely my having read the specs: yes, the items in question are full of flaws, but this may only be determined by reading them and pointing out the flaws in question, not by just bitching about bloat.</p>
<p>> and have intractable failure cases</p>
<p>I don't know what this means. I know what an intractable problem is and I know what a failure case is, but I have no clue how a specification/system can "have intractable failure cases". Perhaps an example or two would help here?</p>
<p>> Both of these systems force irrelevant problems to be solved for which there's no good solution</p>
<p>I am not debating that said systems pose problems, but I'm not sure which of them you consider irrelevant (and more importantly, irrelevant *in what context*?) and to what extent there is no good solution available (and "good" *in what context*?). Can you give two examples of such problems?</p>
<p>> [...] both as in CHIP-8</p>
<p>Then why choose CHIP-8 and why not, say, Forth?</p>
<p>> At the lowest levels</p>
<p>My point, as stated above, is that the lowest level is always the physical iron. Simply because CHIP-8 ran on RCA hardware back in the '70s does not mean you can wave away the fact that you are running it, say, on a *nix machine in the 2020s, nor that you are using a *nix machine to browse the web. Sure, it's nice to get rid of complexity, but one doesn't simply get rid of it by "changing the language", but by pushing it to another layer. Otherwise you are going to hit that complexity one way or another, simply by using the system; some systems just manage to hide it better than others. In other words, *everything* in engineering (and in economics and life in general) is to be paid for some way or another; there is no such thing as "<a href="http://thetarpit.org/2016/freedom-is-slavery" rel="nofollow">for free</a>".</p>
<p>From this point of view, one example would be that C and POSIX are not evil because they "force irrelevant problems", but because they force them upon the application developers, who should not have to be concerned with pointerisms and various undefined behaviours. Still, I personally have written correct programs, using subsets of C and POSIX, which did not require all the complexity specified in the standards and which, similarly, had clear, reasonable invariants. So to my eye the underlying issue is rather that some abstractions are better at solving *certain* problems than others and, circling back to that old problem of <a href="http://thetarpit.org/2018/what-is-an-os" rel="nofollow">operating systems/language environments</a>, that there is no such thing as a "general-purpose" tool; or in other words, that things are hammers only inasmuch as they're good at driving nails into certain materials specified beforehand.</p>
<p>TL;DR: I'm well beyond discussing this stuff in terms of hyperboles and I find arguments along the lines of "X is The Right System" to be a waste of time. If you still think this makes me obtuse, then let's leave it at that.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Verisimilitude</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1793</link>
		<dc:creator>Verisimilitude</dc:creator>
		<pubDate>Thu, 10 Mar 2022 22:03:35 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1793</guid>
		<description>The issue is that POSIX and the C language are grotesquely large and have intractable failure cases.  Both of these systems force irrelevant problems to be solved for which there's no good solution.

There's a difference between that and having no instruction timing guarantees and so using the system timer to its fullest, or not assuming a certain value comprises a flag generated by an instruction, both as in CHIP-8.

The focus here is that, in such machine code hacking, I can establish invariant cases easily, more easily than in some higher-level languages, even good such languages.  At the lowest levels, I can compress a world of variation and uncertainty into a single known point.</description>
		<content:encoded><![CDATA[<p>The issue is that POSIX and the C language are grotesquely large and have intractable failure cases.  Both of these systems force irrelevant problems to be solved for which there's no good solution.</p>
<p>There's a difference between that and having no instruction timing guarantees and so using the system timer to its fullest, or not assuming a certain value comprises a flag generated by an instruction, both as in CHIP-8.</p>
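<p>For concreteness, a sketch of that arrangement; the interpreter scaffolding is my own invention, while the two opcodes are the standard CHIP-8 ones. The delay timer ticks at 60Hz no matter how fast instructions run, so a program times itself against it instead of counting instructions:</p>
<pre><code>/* Sketch: the CHIP-8 delay timer. The host decrements DT at 60Hz
   regardless of instruction speed, so programs poll it (Fx15 to set,
   Fx07 to read) rather than assume anything about instruction timing. */
#include &lt;stdint.h&gt;

static uint8_t V[16];   /* general registers V0..VF */
static uint8_t DT;      /* delay timer              */

void tick_60hz(void) {  /* called by the host at 60Hz */
    if (DT &gt; 0)
        DT--;
}

void exec_timer_op(uint16_t op) {  /* only the two timer opcodes shown */
    uint8_t x = (op &gt;&gt; 8) &amp; 0xF;
    switch (op &amp; 0xF0FF) {
    case 0xF015: DT = V[x]; break; /* Fx15: DT := Vx */
    case 0xF007: V[x] = DT; break; /* Fx07: Vx := DT */
    }
}
</code></pre>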
<p>The focus here is that, in such machine code hacking, I can establish invariant cases easily, more easily than in some higher-level languages, even good such languages.  At the lowest levels, I can compress a world of variation and uncertainty into a single known point.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2022/from-the-fabulous-world-of-fantasy-consoles-pico-8#comment-1772</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Mon, 07 Mar 2022 08:40:27 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=442#comment-1772</guid>
		<description>Maybe I'm being obtuse, or perhaps it's rather that you're trying to be too clever and I'm not falling for it.

If specification is equivalent to "knowing the full system", then by this criterion alone I deem POSIX and C to be enough. So what are we talking about, then? If size by itself, without an underlying reason, is a criterion for this criticism of yours, then I am unimpressed and not interested.

Sure, I understand that small systems entail small specifications, which may make for good didactic exercises; I also understand that the discussion of size is absolutely needed when establishing economic limitations such as memory size; hell, I may even grant that it's fun to approach the field of systems design from a minimalist perspective, for purely artistic purposes, i.e. that "distilled essence" you mentioned. But you don't really bother to put the subject of your critique in context, other than some vague mention of "machine code hacking", which isn't a subject in and of itself in the world I inhabit, but rather a small activity in an otherwise vast field.</description>
		<content:encoded><![CDATA[<p>Maybe I'm being obtuse, or perhaps it's rather that you're trying to be too clever and I'm not falling for it.</p>
<p>If specification is equivalent to "knowing the full system", then by this criterion alone I deem POSIX and C to be enough. So what are we talking about, then? If size by itself, without an underlying reason, is a criterion for this criticism of yours, then I am unimpressed and not interested.</p>
<p>Sure, I understand that small systems entail small specifications, which may make for good didactic exercises; I also understand that the discussion of size is absolutely needed when establishing economic limitations such as memory size; hell, I may even grant that it's fun to approach the field of systems design from a minimalist perspective, for purely artistic purposes, i.e. that "distilled essence" you mentioned. But you don't really bother to put the subject of your critique in context, other than some vague mention of "machine code hacking", which isn't a subject in and of itself in the world I inhabit, but rather a small activity in an otherwise vast field.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
