<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Simplicity and ease</title>
	<atom:link href="http://thetarpit.org/2024/simplicity-and-ease/feed" rel="self" type="application/rss+xml" />
	<link>http://thetarpit.org/2024/simplicity-and-ease</link>
	<description>"Now I feel like I know less about what that blog is about than I did before."</description>
	<pubDate>Mon, 20 Apr 2026 12:50:24 +0000</pubDate>
	<generator>http://thetarpit.org</generator>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
		<item>
		<title>By: spyked</title>
		<link>http://thetarpit.org/2024/simplicity-and-ease#comment-5344</link>
		<dc:creator>spyked</dc:creator>
		<pubDate>Fri, 27 Sep 2024 19:25:09 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=547#comment-5344</guid>
		<description>I'll dwell upon the following question:

&gt; do they need to re-learn the entire history of the field? 

because I think it's central to the discussion. For example, take:

&gt; Joel Spolsky's article on leaky abstractions

Heheh, the article hasn't aged all that well, has it? Kids nowadays by and large have no clue about COM, LPSTR, ASP.NET and all those other thingamajigs that were somewhat current when he wrote the piece. It's not that no one programs using these technologies anymore; I'm sure there's a fair number of web apps still written in ASP. It's just that no one teaches this stuff anymore, just like no one teaches COBOL.

So this brings me to the following point: I don't think this needs to be a debate, and no, I don't think the masses need to re-learn the entire history of the field. However, the folks who aim to be &lt;em&gt;philosophers&lt;/em&gt; in their field are fucked if they don't. Know what I'm sayin'?

The problem with technological stuff is that no matter how many abstractions one pours on top, they only end up propagating the problem to the next level -- that's what Spolsky tries to underline in his article. Take Python, for example: it's great that it doesn't require you to do explicit memory management, and I agree that's an absolute bitch. But head just a bit outside of the regular cases you're used to and, trust me, Python's garbage collection is gonna come and bite you straight back in the ass. For example, what do you do when you have an HTTP library that, instead of freeing your TCP sockets, keeps them hanging in an in-between state (TCP's CLOSE_WAIT) indefinitely? You're going to have to be careful to call close() anyway, aren't you?

So, getting back to my point: sure, a good technician doesn't need to learn all the mistakes of the people who came before. But unless he works alongside someone who is aware of those mistakes, he'll necessarily end up repeating them, just like all the other fools who keep repeating the history they don't know.

We computerfolk are somewhat lucky, though: unlike, say, civil engineering, which dates back to before the time of Apollodorus of Damascus, computer history is less than a century old. Sure, if you factor in that you can't actually know computing unless you have a grasp of electronics, electrical engineering, hydrodynamics and... well, the point is that we never run out of things to learn, regardless of the end we approach this from. It's just that some of us get to reach a deeper understanding of our field, while others don't, and either way it's no tragedy.</description>
		<content:encoded><![CDATA[<p>I'll dwell upon the following question:</p>
<p>> do they need to re-learn the entire history of the field? </p>
<p>because I think it's central to the discussion. For example, take:</p>
<p>> Joel Spolsky's article on leaky abstractions</p>
<p>Heheh, the article hasn't aged all that well, has it? Kids nowadays by and large have no clue about COM, LPSTR, ASP.NET and all those other thingamajigs that were somewhat current when he wrote the piece. It's not that no one programs using these technologies anymore; I'm sure there's a fair number of web apps still written in ASP. It's just that no one teaches this stuff anymore, just like no one teaches COBOL.</p>
<p>So this brings me to the following point: I don't think this needs to be a debate, and no, I don't think the masses need to re-learn the entire history of the field. However, the folks who aim to be <em>philosophers</em> in their field are fucked if they don't. Know what I'm sayin'?</p>
<p>The problem with technological stuff is that no matter how many abstractions one pours on top, they only end up propagating the problem to the next level -- that's what Spolsky tries to underline in his article. Take Python, for example: it's great that it doesn't require you to do explicit memory management, and I agree that's an absolute bitch. But head just a bit outside of the regular cases you're used to and, trust me, Python's garbage collection is gonna come and bite you straight back in the ass. For example, what do you do when you have an HTTP library that, instead of freeing your TCP sockets, keeps them hanging in an in-between state (TCP's CLOSE_WAIT) indefinitely? You're going to have to be careful to call close() anyway, aren't you?</p>
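<p>To make the CLOSE_WAIT point concrete, here's a minimal sketch using only Python's standard socket module; a toy localhost server stands in for the misbehaving HTTP library (the scenario is illustrative, the TCP states are real):</p>

```python
import socket

# Toy server/client pair on localhost, standing in for an HTTP
# library that forgets to release its TCP connections.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

client = socket.create_connection(("127.0.0.1", port))
conn, _ = server.accept()

conn.close()    # the peer hangs up; until WE close, the client side
                # sits in CLOSE_WAIT, and the GC makes no promise
                # about *when* it will finalize a lingering object
client.close()  # the explicit close() the paragraph above insists on

# The robust pattern: a context manager makes the close unconditional.
with socket.create_connection(("127.0.0.1", port)) as c2:
    pass        # c2 is closed here, even if an exception was raised

server.close()
print(client.fileno(), c2.fileno())  # -1 -1: both descriptors released
```

<p>Higher-level libraries paper over this with session objects and with-blocks, but the underlying rule is the same: release the socket deterministically instead of trusting the collector.</p>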
<p>So, getting back to my point: sure, a good technician doesn't need to learn all the mistakes of the people who came before. But unless he works alongside someone who is aware of those mistakes, he'll necessarily end up repeating them, just like all the other fools who keep repeating the history they don't know.</p>
<p>We computerfolk are somewhat lucky, though: unlike, say, civil engineering, which dates back to before the time of Apollodorus of Damascus, computer history is less than a century old. Sure, if you factor in that you can't actually know computing unless you have a grasp of electronics, electrical engineering, hydrodynamics and... well, the point is that we never run out of things to learn, regardless of the end we approach this from. It's just that some of us get to reach a deeper understanding of our field, while others don't, and either way it's no tragedy.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Alex</title>
		<link>http://thetarpit.org/2024/simplicity-and-ease#comment-5342</link>
		<dc:creator>Alex</dc:creator>
		<pubDate>Fri, 27 Sep 2024 15:38:48 +0000</pubDate>
		<guid isPermaLink="false">http://thetarpit.org/?p=547#comment-5342</guid>
		<description>Another useful reference for this topic is Joel Spolsky's article on leaky abstractions: https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/

Any and all fields have this issue with entry-level education: needing to find a good balance between teaching the basics and teaching present-day relevant skills. Oh, and doing it in a reasonable timeframe, ideally with costs as low as possible.

There are challenges at both ends. Regarding the present-day state of the field, students need to learn the tools of the trade to some degree, so that they are not entirely useless in the workforce if and when they decide to leave academia. Take Python, for example - sure, one starts with the easy stuff, imperative programming and OOP. But it's not a stretch to use the same language to touch upon deeper stuff, like what the MRO is (a concept useful for all object-oriented languages), how constructors work, or functions as first-class objects (paving the way towards functional programming!). Get to debugging your code and you suddenly touch upon topics such as the call stack, and the realization that both objects and code live at some address in memory, on the stack, on the heap, etc. Of course, it takes some doing, but then again... what doesn't?

At the other end of the sausage, one needs to teach the basics while trying one's best not to weigh down the process with useless cruft. Sure, teach a student how logic gates work, how C works, and how the bits and bytes are put on the wire to make computers talk to each other over networks. However, do they need to re-learn the entire history of the field? Do they need to know about every failed innovation attempt and every dead language out there?

I can definitely agree with you that failing to teach the basics, on the premise that the current tools will suffice, is a form of decay. And yet, lurking in the background is the idea that the opposite is somehow also true: that teaching just the basics will suffice, and that once the simple building blocks are known, one can build tools that provide ease of use. Except no, not really, because present-day tools have a history of trial and error behind them, of people who tried to find the good ways to use the simplicity of the building blocks. Fail to take that into account and you're bound to (inefficiently) re-invent the wheel most of the time.

All in all, until such a time when the tech field stops producing "innovation" at break-neck speed, this debate shall rage on, eternal.</description>
		<content:encoded><![CDATA[<p>Another useful reference for this topic is Joel Spolsky's article on leaky abstractions: <a href="https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/" rel="nofollow">https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/</a></p>
<p>Any and all fields have this issue with entry-level education: needing to find a good balance between teaching the basics and teaching present-day relevant skills. Oh, and doing it in a reasonable timeframe, ideally with costs as low as possible.</p>
<p>There are challenges at both ends. Regarding the present-day state of the field, students need to learn the tools of the trade to some degree, so that they are not entirely useless in the workforce if and when they decide to leave academia. Take Python, for example - sure, one starts with the easy stuff, imperative programming and OOP. But it's not a stretch to use the same language to touch upon deeper stuff, like what the MRO is (a concept useful for all object-oriented languages), how constructors work, or functions as first-class objects (paving the way towards functional programming!). Get to debugging your code and you suddenly touch upon topics such as the call stack, and the realization that both objects and code live at some address in memory, on the stack, on the heap, etc. Of course, it takes some doing, but then again... what doesn't?</p>
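<p>For instance, both of the "deeper" Python topics mentioned above fit in a few lines; a quick sketch (toy class names, obviously):</p>

```python
# The MRO: C3 linearization decides which method wins in a diamond.
class A:
    def greet(self):
        return "A"

class B(A):
    pass

class C(A):
    def greet(self):
        return "C"

class D(B, C):
    pass

print([cls.__name__ for cls in D.__mro__])  # ['D', 'B', 'C', 'A', 'object']
print(D().greet())                          # C: resolved via the MRO, not via B

# Functions as first-class objects: stored, passed around, returned.
def twice(f):
    return lambda x: f(f(x))

def inc(x):
    return x + 1

print(twice(inc)(3))  # 5
```

<p>Nothing exotic, and yet it quietly introduces linearization, dispatch and higher-order functions, all inside the "easy" language.</p>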
<p>At the other end of the sausage, one needs to teach the basics while trying one's best not to weigh down the process with useless cruft. Sure, teach a student how logic gates work, how C works, and how the bits and bytes are put on the wire to make computers talk to each other over networks. However, do they need to re-learn the entire history of the field? Do they need to know about every failed innovation attempt and every dead language out there?</p>
<p>I can definitely agree with you that failing to teach the basics, on the premise that the current tools will suffice, is a form of decay. And yet, lurking in the background is the idea that the opposite is somehow also true: that teaching just the basics will suffice, and that once the simple building blocks are known, one can build tools that provide ease of use. Except no, not really, because present-day tools have a history of trial and error behind them, of people who tried to find the good ways to use the simplicity of the building blocks. Fail to take that into account and you're bound to (inefficiently) re-invent the wheel most of the time.</p>
<p>All in all, until such a time when the tech field stops producing "innovation" at break-neck speed, this debate shall rage on, eternal.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
