No
Should we tell OP that nobody understands all of any moderately large codebase, especially the sub-dependencies … or that even the thousands of developers who wrote most of that code don’t understand how their own code works anymore?
I could read the same book every year and I still won’t remember most of the minor events on my deathbed. Doesn’t mean I won’t remember the key components that make up the story — coding is like that, except the minor events and key components can be rewritten or removed by someone else whenever you go to read them next.
that even the thousands of developers who wrote most of that code don’t understand how their own code works anymore?
The bugs I have fixed that were written by that idiot “me from a few weeks/months/years ago”…
Several times I have searched for a problem and found my own Stack Overflow question with no replies.
The worst is when it happens that way and you can’t even remember asking, even though it was your own question: https://xkcd.com/979/
That guy sounds like the dude that “works” at my house too.
It’s a web of trust. If the package maintainer is doing due diligence they should at least be aware of how the upstream community runs. If it’s a one-person passion project then it’s probably possible to give the changelog and diffstat a once-over, because things don’t change that fast. Otherwise they are relying on the upstream not shipping broken stuff.
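In practice that once-over can be fairly mechanical. A minimal sketch, assuming upstream uses git and tags its releases (the tag names here are placeholders):

    # Sketch: review what changed between two upstream release tags (placeholder names).
    git log --oneline v1.2..v1.3      # the changelog at a glance
    git diff --stat v1.2..v1.3        # the diffstat: which files changed, and by how much
    git diff v1.2..v1.3 -- src/       # drill into anything that looks suspicious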
No, distros instead use a web of trust in the maintainers. New maintainers are vetted and established ones are assumed to not suddenly turn into malicious actors.
There’s an ongoing project that tries to bootstrap a Debian system from a seed that’s small enough to be read, but it’s more of a proof of concept at this stage, and even this project requires trusting a few parts up front (like xz, of all things).

I wouldn’t place too much faith in the vetting process. As of right now, there are 2,034 members of the packager group of Fedora. None of them are required to have 2FA (or any real account security beyond a password), and the minimum requirements to join the group aren’t very high (contribute a package, pick up an unmaintained one, etc.). Any of those 2,034 people can push malware to Fedora, and within a week, it’d be in stable repos.
Most of these distros are volunteer efforts. They don’t have the manpower to ensure the software supply chain remains secure.
Any of those 2,034 people can push malware to Fedora
Maybe, but that is still a significantly higher bar than letting anyone publish a package the way most language-specific package repositories work (or just pulling in any random GitHub repo, like some others do).
Why would they do this?
Also, if you wanted to do this yourself, it is technically possible. Go build LFS (Linux From Scratch) and read every single LOC.
The kernel alone has more than 30,000,000 LOC. Just reading it would take a single person forever and a day, let alone understanding it.
That’s barely the tip of the iceberg, too. Currently, popular projects sit at:
31M for KDE
25M for GNOME
41M for Chromium
42M for Mozilla Firefox
17M for LLVM
15M for GCC
(Note that this metric includes comments and blank lines, by which measure Linux would come to 46M lines. Counts with blank lines and comments removed are also in those links.)
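If anyone wants to sanity-check numbers like these on their own machine, a rough sketch with cloc (assuming you have it installed; the repo URL is just an example):

    # Rough sketch: count lines of code locally with cloc.
    git clone --depth 1 https://github.com/torvalds/linux.git
    cloc linux    # reports blank lines, comments, and code separately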
Even if a package were completely vetted, line by line, before it made it into a repo, would the maintainer need to vet every update, too? Every PR? Imagine the maintenance burden. That question of code QA and maintainer burden was the crux of one of the most popular discussions on the Fedora devel list.
Finally, presumably if anyone added some malicious code to their program, it would be sneaky and not obvious from quickly reading the code.
I’d expect them to properly comment it with “#-------Begin malicious shit--------”.
COMMENT YOUR CODE, PEOPLE!

The exploit should be written in a way that it is obvious and doesn’t need commenting!
Oh, in that case we don’t need to read either - just run a simple grep!
Those malicious coders are too sly for that. Some write “Sh1t” to throw grep off, others even do a “B3g1n”… They are always one step ahead!
Good point. I’d try to grep for something like
[Bb3][Ee3]g[Ii1][nη]\w+<and so on>
but I just know I’ll miss something
Well yeah, the recent xz vulnerability was not present in the source code at all. Any amount of code reading would not have caught that one.
Wasn’t the problem that the backdoor was not present in the source code on GitHub, but was in the source tarball? So as long as one reads the code one actually builds from, one should be fine.

A line of code that enables the backdoor was only present in the tarball. The actual code was obfuscated within an archive used for the unit tests.
OK. So simply reading what was readable wouldn’t have helped. Thanks.
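That said, the tarball-vs-git part is at least mechanically checkable by a packager. A rough sketch, with the project name, version and URLs as placeholders:

    # Sketch: compare a release tarball against the git tag it claims to match.
    git clone https://example.org/project.git && git -C project checkout v1.2.3
    curl -LO https://example.org/releases/project-1.2.3.tar.gz
    tar xf project-1.2.3.tar.gz
    diff -r project project-1.2.3
    # Expect noise from generated autotools files; the point is to spot unexpected
    # extras, which show up as "Only in project-1.2.3" lines.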
It is bonkers that a fucking Web Browser has more LOC than a Desktop Environment
It’s even more bonkers than it sounds. If you look at the code locations for that KDE count, you’ll see it also includes just about every KDE project. That’s not just Plasma, that’s hundreds of projects, including some really big ones like Krita, Kdenlive, Calligra, LabPlot, Kontact, Digikam and Plasma Mobile. Hell, it even includes KHTML/KJS, KDE’s defunct web engine and the ancestor of WebKit and Blink, as well as Angelfish and Falkon, KDE’s current web browser frontends.
Same deal with GNOME. It includes just about everything on GNOME’s GitLab, even things that are merely hosted there without strictly being GNOME projects, like GIMP and GTK.
And yet still they are both that far behind Chromium and Firefox. Modern web browsers are ludicrous.
Not really, the browser does everything these days.
OpenBSD probably comes closest.
I would argue NetBSD or plan9 is much smaller and easier to read, but of course this comes with potentially decreased usability.
Dude. I actually have sources for most of my installed packages lying around, because Gentoo. Do you know how much space that source code takes up?
Just under 70GB. And pretty much everything but maybe the 10GB of direct git pulls is compressed, one way or another.
That means that even if your distro is big and has 100 people on development, they would each have to read 1GB or more of decompressed source just to cover the subset of packages installed on my system.
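If you want to check your own box, a quick sketch, assuming Portage’s default distfiles location (adjust if your DISTDIR differs):

    # Sketch: how much source Gentoo has cached locally, and how many archives that is.
    du -sh /var/cache/distfiles
    ls /var/cache/distfiles | wc -l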
How fast do you read?
Lol read and understand it.
And have eyes good enough to look very closely and catch any small . or ` that’s out of place, stay current on all methods of sanitization, spot any and all confusing variable names doing funny things, and never get mentally overloaded doing it.
I wouldn’t be surprised at all if teams at NSA & co had game months where the teams that find the highest number of vulns or develop the most damaging 0day exploits get a prize and challenge coin. Then you have the teams that develop the malware made to stay stealthy and intercept data for decades undetected, and the teams that play mail agent and intercept packages containing core internet backbone routers to put hardware ‘implants’ inside them.
These are the things Snowden showed us a small sliver of in 2013, over a decade ago, some of which was well aged by that point.
The days of doing illegal things for funsies on the internet, like learning how to hack hands-on, are over unless you want to seriously risk prison time. Download vulnerable virtual machines and hack on those instead.
But if you’re worried about a random maintainer or packager inserting something like a password stealer or backdoor and letting it hit a major distro, one whose effect doesn’t require a PhD in quantum fuckography to understand, chances are big brother would tip someone off to blow the whistle before it hit production, as they likely did with xz-utils.
I’m looking at bringing Dillo back into Gentoo atm. I had to read 15k lines of code, and that’s just what’s different since the last release…
Man, I forgot what it was called, but once I was on the website of some batshit paranoid Linux/BSD distro, which had a list of arguments for why they removed certain packages.
It definitely seemed like the maintainers were reading the source, but some of the arguments were also really out there.
Hope somebody can remind me of what it was called.
Edit:
found it
https://wiki.hyperbola.info/doku.php?id=en:philosophy:incompatible_packages
That is, ummm, interesting. Can their installed system do anything, though? There are so many restrictions, it seems like it would be a difficult installation to daily drive.
And some of the justifications are really confusing. I realize some are probably typographical errors, but I can’t figure out what a few of them are saying at all. It reminds me of the people that invent their own lexicon and just expect everyone to understand what they are saying.
Package has different security-issues and is not oriented on the way of technical emancipation as Hyperbola is trying to adapt lightweight implementations.
It sounds like something chatGPT would hallucinate.
If you think that’s insane, you should see LibreJS.
Well, it is basically LibreJS logic applied to an entire distro, like your typical FSF-approved distro. It’s a distro by free software extremists for free software extremists and no one else. It is completely impractical for actual use. But really it’s worse than the average FSF-approved distro. It takes things several steps further by removing many things that are 100% free software but just subjectively disliked by the maintainers, including even the Linux-libre kernel, as the project is in the process of moving to an OpenBSD base. The OpenBSD people naturally want nothing to do with them, so I’m looking forward to seeing that play out.
This is unrealistic. Reading everything is simply too much work.
These days you are likely running some code nobody read closely.
The author trusted AI and didn’t fully understand it.
The maintainer trusted the author and merged because the change sounded good, the tests passed, and they were grateful anyone contributed at all.
The packager trusted the maintainer. The security team trusted the packager. The user trusted the distro.
Tim Cook reads every single LOC submitted to his OS.
He does? Highly unlikely.
TempleOS doesn’t have a repo, I think, so its dev writes and reads 100% of the code. I also heard of a TTY-only Linux distro project that didn’t have a repo, but I’m not sure.
Boy do I have news for you…
deleted by creator
Probably Guix, and GNU-endorsed distributions. Binary blobs are not allowed in free/libre distributions, or at least not in their official repos. That said, most GNU+Linux distributions don’t care about those. Most will take care, if they ever realize it, of distribution licenses, so if something has some sort of legal issue preventing distribution, it will most probably get purged from their repos…
Aaah, so that’s why it takes them so long to update packages.
I’ll bet you anything they’re not reading the code for every random package and dependency. But yeah, with free distros it’s at least possible to read everything that’s on your machine.
I’d bet you they don’t. It would be an impossible task to expect of anyone.
It might be, but Linux-libre, for example, really does remove all binary blobs. There’s some tooling around for doing that, so new cases might be missed without human inspection, but they are careful about binary blobs… So across the whole spectrum of open source stuff, if you care about binary blobs, chances are better on the libre/free software side.
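A crude sketch of the kind of check that tooling automates (this is not the actual Linux-libre deblob scripting, just a first-pass way to flag binary-looking files in a source tree):

    # Crude sketch: list files in a source tree that look binary rather than text.
    find . -type f -not -path './.git/*' -exec file --mime {} + | grep 'charset=binary'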
Explain your username, please. I have a hard time with someone using that handle making posts in here.
It was the first thing that came to mind when I had to pick a username. I hope this explanation will suffice, as there isn’t anything more to it.
I get it. Sometimes thinking of a handle is one big mental block. Thank you for responding.
But you next, please.
deleted by creator
Diffs are examined for sure. I remember that firsthand.
no