This is a technical article that points to something that’s not just about technology: researchers at the University of Minnesota have been caught purposefully submitting bad computer code for use in the Linux operating system and were able to get some of their submissions approved:
The Linux kernel developers are pretty ticked off at the researchers for wasting their time, but I contend that the scrutiny the University of Minnesota is pushing them toward is unavoidable:
The risk of malicious commits from those you thought you could trust is both real and ongoing, and the kernel team’s practices need to take that risk into account. As distressing as it is to have to be, at some level, suspicious of every commit, the need for such scrutiny is as unavoidable as it is critical.
The kernel team should be thankful to have been pushed toward this scrutiny - today it is a researcher with a proof of concept, but tomorrow it may be a sophisticated hostile government actor, and the University of Minnesota’s work helps the kernel team be ready for that. It is not a waste of their time or energy at all.
Again, please pardon the jargon. The reason I think this is of general interest is that it deals with the tension between needing a level of trust among those who work together in order to get anything done, and needing nevertheless to have eyes to see, the will to investigate when something doesn’t seem right, and the heart to take decisive action as needed.
This has application everywhere - at the job site, in our sessions, and in our families, no?