Check out this guy (archived):
Let's talk about Google's newest software supply chain product. Reading the GA announcement I had many mixed feelings. [...]
I don't know about Seth Michael Larson's feelings, but let's check out these guys:
Threats to the software supply chain and open source software (OSS) security continue to be major areas of concern for organizations creating apps and their developers. [...]
[...]
Jon Meadows, managing director and Citi Tech Fellow, Cyber Security at Citi said, "Citi has been an advocate and active leader in the industry's efforts to secure enterprise software supply chains. Both Citi and Google see untrusted and unverified open source dependencies as a key risk vector. This is why we've been excited to be an early adopter of Google Cloud's new Assured OSS product. Assured OSS can help reduce risk and protect open source software components commonly used by enterprises like us."
Assured OSS guards OSS packages against attacks and risk by:
- continuously mirroring key external ecosystems to manage end-to-end security without creating forks
- managing the security and integrity of the mirrored repos and end-to-end build tool chain with tamper-evident provenance and attestations
- continuously scanning for, fuzz testing, and fixing critical vulnerabilities, which are then quickly contributed back upstream to limit the exposure time and blast radius
- operating a critical patching team to support covered packages
"As organizations increasingly utilize OSS for faster development cycles, they need trusted sources of secure open source packages," said Melinda Marks, senior analyst, ESG. "Without proper vetting and verification or metadata to help track OSS access and usage, organizations risk exposure to potential security vulnerabilities and other risks in their software supply chain. By partnering with a trusted supplier, organizations can mitigate these risks and ensure the integrity of their software supply chain to better protect their business applications."
Emphasis mine. Now let us go back in time to March 11, 2023 and do a quick recap of what I said, including commentary. I said that: a. the open source model is, for most intents and purposes [1], indistinguishable from the proprietary one; and b. the quality of (among other things) the software one uses is predicated upon one's "web of trust", i.e. one's social circle [2].
Google here isn't doing anything new, of course. In fact, they're doing for various "open source software packages" precisely what Murdock et al. have been doing for Debian ports since their inception: they're positioning themselves as a broker for trusted code to be reused by other "organizations" like themselves, code that otherwise was most likely developed by a rando coder in his spare time. Thus Google believes (and perhaps rightly so) that a third party is required to make informed assessments about the quality of said code, and that that party shall be Google. In other words, let the unpaid dudes use Microsoft's platform (i.e. Git plus bells and whistles) for their development process, but make no mistake, Google will vet what reaches the consumer's plate -- which doesn't really come as news, since that's how Google has viewed their WoT relationships ever since day one: they provide (granted, while swallowing some of the risks) and you consume.
Thus, Seth Michael Larson makes a good point when he observes that Google doesn't bother to maintain a relationship with software maintainers. However, he carefully avoids mentioning that all such brokerage services introduce the same single-server-multiple-clients model, where clients are practically powerless to decide what actually goes into "their" software. At least he somewhat admits the reason behind his avoidance, namely that his favourite supply-chain-as-a-service pays him while Google doesn't.
Unfortunately for Seth Michael Larson, I have very little sympathy for software maintainers [3]. So instead of taking his perspective, and in closing this article, let us for once look at the so-called supply chain problem from the human perspective, from the bottom up. In other words, let us say you (or I) want to do something involving computing and computing machines. What now?
Well, at the first and bottom layer, you'll require a piece of machinery that supports whatever it is you're trying to do. This layer alone comes with a pretty serious set of problems, since as yet no one knows how to bake ICs without spending a fortune on masks [4]. FPGAs are relatively cheaper, but then we'd have to discuss how to bake those, which is something I don't want to get into. However, feel free to write an article on your blog on how to achieve this feat -- I'll be the first to read it.
Anyways, at this first layer, the only practical solution is to buy something readily available on the shelves. If you're working in finance, the military or some other security-sensitive field, then you'll likely want to scrape the second-hand markets for items lacking the usual deadweights such as UEFI, ME, TPM and other baked-in modules used for imperial vetting. This approach is especially useful if you don't need to run the latest and greatest software, particularly since "open source" stakeholders are working hard on phasing out support for this sort of hardware. Bear in mind, though, that you'll have to maintain your entire stack yourself, meaning that if something on your motherboard fries or dries out, you'll need to have replacements readily available [5]. But to be honest, ThinkPads will likely last you a lifetime if you take care of them -- even some of the earlier shitty Lenovo ones.
At the second layer, you're likely going to need a bunch of pre-existing software (e.g. an operating system) to support the very same whatever-it-is-you're-trying-to-achieve. If, for example, you're doing gaming, then a Windows or a Mac might turn out to be sufficient; if you're doing video editing or orchestrating 100-track songs, then a Mac would probably turn out to be better in terms of performance, although to be honest there's probably very little difference between the two ecosystems nowadays -- the problem with Macs is that they tie you into a bunch of very expensive pieces of hardware, which may, however, be worth the money depending on what you're doing. If you're doing networking, then quite likely you'll need a Linux or a BSD running on your box. In any case, you need to find out what environment works best for your thing, whether we're discussing programming languages, operating systems, compilers or the hardware lying below. I'd avoid postmodern hypes such as Rust, TypeScript and whatever other software makes the front page of fashion tabloids nowadays, but that's just me.
In any case, regardless of source availability or any other considerations, the third layer necessarily consists of social relationships with the software provider(s). If you want to use Google's thing, then by all means do that, just make sure you understand what that means; if you're using proprietary products, make sure you can file customer support issues and that there's someone on the other end to answer; if you're using and developing open source programs, then at least try to talk to the folks involved in the software you're using. Who knows, maybe you'll find some like-minded folks who appreciate your stuff, and this way you'll actually have a chance at contributing to one community or another [6].
Finally, at the fourth layer, it's in your best interest to develop a healthy wariness towards advice coming from randos on the interwebs. Do your own research and reach your own conclusions, as this is what helps you gain a meaningful context for your work in the first place.
Long story short, the so-called supply chain problem is most likely yours to deal with, not unlike the vast majority of things in life.
[1] Yes, once upon a time I hacked through the millions of lines of code that make up, say, LLVM or QEMU. So... what? I used them as aids in solving very specific problems, but other than that, what is to be said about the purported freedoms their source code enabled? A closed source compilation infrastructure with a strong plugin system would have served me just as well, or maybe better; or, for that matter, I could have used a less modern compiler such as TCC. Instead, I used whatever was fashionable and whatever recommendations I got from my WoT at the time.
Again, I can't help but ask... so what of that? ↩
[2] This, by the way, also has very little to do with "open source". Let's take the hypothetical example where Joe maintains the proprietary software stack of Very Good Company, Inc. At a superficial glance, we may, for example, state that Joe is most likely not the only guy who maintains said software, and thus the overall quality of said software depends not only on Joe's engineering skills, but on the social and technical skills of his peers as well.
But more importantly, said quality is based upon his relationship with folks who never even touched the code (management, QA), since for example he'll have a very hard time keeping the damned thing in good condition if all he does all day is "write code for new featurez". I for one can safely say that at least 80% of my time at the day job is spent doing anything but that.
Reckless, trigger-happy development is a bad idea regardless of the code distribution model. Engineering, amirite? ↩
[3] Just so that we understand each other, all this lack of sympathy comes from a software maintainer's perspective. Maintaining this blog and its underlying LAMP stack comes with a whole buttload of costs, not to even mention all the other stuff I'm keeping around, some of it published for free for all comers to use and complain about. Regardless of any of this, it's my software, so it'd be quite fucking nonsensical to moan and whine about how you're not paying for it. Had I really cared about this, I'd have put up a paywall like the NYT or whatevers.
I know, giving away code just like that makes it much, much harder for one to get this kind of leverage -- which, by the way, is why the only code Google shares with you is some client-side JS crapolade. My point is that if you're giving away stuff under a permissive license, you then have no moral ground to complain about the takers. You can't take back a loaf of bread you gave to some poor guy only because he didn't do that dance you liked. You either give him the loaf or you don't, end of discussion.
How did that Romanian saying go? With the cock up your ass and your soul up in Heaven. Da fuck is this shit? ↩
[4] Actually... Sam Zeloof is attempting precisely that. I've got my eye on him. ↩
[5] In case you're wondering, this is precisely the approach that I'm employing for the laptop used to write this article. ↩
[6] By the way, this is how open source worked back when it used to work. Some traces of this way of doing things still remain, for example, in the BSD world -- but don't trust me, make up your own mind about it. ↩
Now that you've thought about securing your software dependency supply chain, and your hardware supply chain, I eagerly await seeing you tackle the question of securing your electrical power supply chain :)) Even the traditional Rahova/Ferentari solution might not be applicable if the folks upstream have no juice either.
This is obviously a matter of more efficiently taking advantage of unpaid volunteers. Just yesterday, I saw dipshits arguing in favour of some MicroSoft identity system over simple RSA, because all of the talking heads argue RSA to be bad. The stage is being set for volunteers to be coerced or strong-armed into obeying someone else's rules for the privilege of handing over code, all without pay, as absurd as the idea is.
A number of changes are needed for a solution to this nonsense. Firstly, programs must be finished at some point, and they must be correct when finished. Secondly, programs need to be much smaller; this can be done through less code using more data. On the hardware front, it would be good to have uniform and interchangeable components; memory chips work, and are simple compared to processors; the FPGA is the closest thing for computation. Rather than learn how to cook, learn how to not be poisoned.
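For concreteness, here is a minimal sketch of the "less code using more data" idea (the units and conversion factors below are made up purely for illustration): the logic shrinks to a couple of lines, and everything that varies sits in a table one can read, audit and extend without touching the code.

```python
# A minimal sketch of "less code using more data": the variable part of the
# program lives in a plain table instead of a pile of special-cased branches.
# The units and factors here are illustrative only.

CONVERSIONS = {
    # (from, to): multiplier
    ("in", "mm"): 25.4,
    ("lb", "kg"): 0.45359237,
    ("mi", "km"): 1.609344,
}

def convert(value, src, dst):
    """Convert value from src to dst using the table above."""
    if (src, dst) in CONVERSIONS:
        return value * CONVERSIONS[(src, dst)]
    if (dst, src) in CONVERSIONS:
        # derive the reverse direction from the same data
        return value / CONVERSIONS[(dst, src)]
    raise ValueError(f"no conversion from {src} to {dst}")

if __name__ == "__main__":
    print(convert(10, "in", "mm"))   # 254.0
    print(convert(254, "mm", "in"))  # 10.0
```

The same trick scales up: table-driven parsers, transition tables for state machines, configuration instead of hard-coded branches. The code that remains is small enough to finish and to check for correctness, which is the point being made above.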
@Cel Mihanie: Jest all you like, my good man, but there are actually straightforward, time-tested solutions to that problem, ones that go well beyond the gypsery you mention. Par exemple:
1. Buy a plot of land somewhere -- there are plenty of acres going for well under $50K apiece in this fine country we have over here in Eastern Europe
2. Stock up on diesel fuel
3. Buy a bunch of redundant generators
4. ... ???
5. Profit!
I'll also anticipate your next thought: the way this is going, you're going to need an armed militia to guard the land, a couple of iobagi (serfs) to work it and so on and so forth. In the medium term there's no alternative to rebuilding the supply chain from the ground up. The future of urbanity is quite simply sheer poverty, and that future is very, very close, if not already here.
@Verisimilitude: The debate you mention doesn't really surprise me. Just yesterday I heard some congresswoman supporting the introduction of female crash test dummies because they promote gender equality in car accidents. If this is not the mark of generalized schizophrenia, I don't know what is. Maybe life in the 1980s USSR could equal this level of nonsense, although I doubt it.
I don't really want to discuss FPGAs, since for example the software toolchains supporting even the open source Lattice items used by Stan are as good as proprietary. I'm not sure hardware production economics can be scaled down this century, so our generation is stuck with ThinkPads or whatever.
I dunno about relying on dead dinosaur fuel; that stuff is perishable and a bitterly fought-over resource. Also, refining it to usability is a very high-tech process. Lots of stuff in that chain can break with falling IQs and societal trust.
Quite interesting to think about the shortest tech path to electricity tbh. Hydro seems like the way to go methinks. Even ancient peoples could throw together a water wheel, and a dynamo you can rig together with just wire.
Reinventing basic electronic control might also not be totally hopeless, if we can write down the know-how at least. I once saw a video where some sort of Soviet hag built an amplifier tube from scratch in her own garage. A shitty one, but working nonetheless.
So anywho, were I our esteemed host, I would ensure that Chateau Lucian is built next to a raging water source. Good also for disposing of unwanted visitors, in Minecraft.
Fresh flowing water (also abundant in these parts of the world!) comes with yet another advantage: if properly recirculated, it can provide a decent cooling source for computing equipment; conversely, enough computing equipment could provide a decent heating source in the winter.
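A back-of-envelope sketch of why even a modest stream suffices, using the heat balance P = m_dot * c * dT (the 1 kW load and the 5 K temperature rise below are illustrative figures, not measurements):

```python
# Heat carried off by a water flow: P = m_dot * c * dT
C_WATER = 4186.0  # specific heat of water, J/(kg*K)

def litres_per_minute(power_w, delta_t_k):
    """Water flow needed to absorb power_w with a delta_t_k temperature rise."""
    kg_per_s = power_w / (C_WATER * delta_t_k)
    return kg_per_s * 60.0  # roughly 1 litre per kg of water

if __name__ == "__main__":
    # e.g. a 1 kW pile of computing equipment, letting the water warm up by 5 K:
    print(round(litres_per_minute(1000, 5), 2))  # ~2.87 L/min, a trickle for any stream
```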
The yet-undiscussed issue is whether we want to be able to access them internets or not. I'd give up continuous internet access any time in exchange for a self-sufficient home.
So... not sure how this blog will make it through the collapse.
Neither do I. Regardless, from uniformity comes better inspectability.
Yes. The only way to reasonably use untrusted hardware for trusted computations is to run them on several such machines of varied provenance, unlikely to be tainted in identical ways, and check that they return identical results.
In any case, we should focus on correcting the situation of piss-poor software first, since we can control that more than we can the hardware.
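For concreteness, a minimal sketch of that cross-check; the placeholder job and the stand-in "machines" are made up for illustration, and in practice each would be a separate, independently sourced box:

```python
# Run the same deterministic job on several independently sourced machines and
# trust the result only when they all agree. Here the "machines" are stand-in
# callables; in real life each would be a separate physical box.
import hashlib

def job(n):
    # the deterministic computation we actually care about (placeholder)
    return sum(i * i for i in range(n))

def digest(result):
    return hashlib.sha256(repr(result).encode()).hexdigest()

def cross_check(machines, n):
    """Return the agreed-upon digest, or raise if the machines disagree."""
    digests = {digest(run(n)) for run in machines}
    if len(digests) != 1:
        raise RuntimeError("results disagree; trust none of these machines")
    return digests.pop()

if __name__ == "__main__":
    machines = [job, job, job]  # three nominally identical boxes
    print(cross_check(machines, 10_000)[:16])
```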
The "dismal science" of "open" FPGA toolchains (not even speaking of "LSI in garage", where the same observations would apply, if someone were to actually get them off the ground) arguably reduces to Naggum's "All of this "code sharing" is an economic surplus phenomenon."
Baking a useful and reasonably general logic synthesis toolchain is actual work (and considerably harder than writing a traditional optimizing compiler for a von Neumann machine); the number of people qualified to do it likely runs in the dozens; and all of these people have day jobs. Nor is there any incentive to carry out such a project commercially -- especially considering that the available quasi-documented homogeneous FPGAs are of "toy" size in re: LUT count, and this seems unlikely to change.
The more general underlying problem is that approximately no one in the commercial world (as far as the naked eye can see) actually gives remotely enough of a shit re: trustworthy hardware (or even software) to even contemplate the "DOS-like on a hand-sewn FPGA CPU" approach to computing (whether via "open" toolchains or otherwise) for any purpose whatsoever. There is no economic underpinning for any such work, even in industries which a naive outsider may imagine are strongly concerned with security.
And so, unsurprisingly, what there is -- is of decidedly "amateur-quality", riddled with piles of Open Sores dependencies, and largely unmaintained. Quite like the familiar Linux hell, but without a 1990s golden age to "run off the fumes" of.
@Verisimilitude: A while ago, someone -- someone who uses "his" computers in a very different way than I use my own -- told me "hardware is a commodity", meaning loosely: "I don't care if my Mac crashes, as I have all my bits backed up in Apple's cloud and I can replace my iron in a few minutes without noticing much of a difference".
I guess there's something to be said for this way of doing things. My point is, the trust you put in your hardware is whatever trust you put in the folks who provide it to you.
@Stanislav Datskovskiy: Continuing along the same line as the reply to Verisimilitude above, for trustworthy computing to become an economic activity, you need buyers asking for it on the market. As long as computers are used to prop up all the various lolcat platforms and other assorted bits, scams, apps and so on and so forth; and as long as the customers on the hardware market place enough trust in the providers, whether said hardware goes into clouds, electric cars and what have you -- as long as all these hold true, things won't change. I'm also willing to bet this holds true in the Eastern hemisphere as well.
Regarding LSI in garage: I'm curious to see if they manage to pull it off, regardless of the amateurism. I certainly remember the days when I was an amateur and even if it took me about twenty years... well!
@spyked #8:
Entirely correct, and interestingly, the observation applies not only to the "lolcat", "bits, scams, apps" industries, but in fact everywhere -- in finance (both "TBTF" and otherwise), manufacturing, etc.
Re: "garage LSI": IIRC already "pulled off" -- for certain values of "garage" (some experimenters did get hold of surplus industrial gear that would otherwise cost 7-8 figures, the necessary caustics, etc.) But the software problem remains -- to this day there is not AFAIK even a reasonable (i.e. effectively auto-routing) open PCB CAD, much less a useful logic toolchain.