'Securing Open Source Software Act' introduced to US Senate (senate.gov)
243 points by di on Sept 23, 2022 | 182 comments



Reading the comments so far, I'm genuinely surprised that more folks haven't applied a "follow the money" lens to their analysis.

To me, it reads as a bald-faced attempt to discourage public sector entities from using OSS solutions, when in fact there are perfectly good and definitely >100% secure proprietary offerings that cost a reasonable amount when purchased from the sorts of vendors that pay lobbyists to "help" senators write OSS bills.

Do you honestly think Rob fucking Portman woke up one day with strong opinions about FOSS?

Make no mistake: this is a thinly veiled late-stage attempt to displace the growing dominance of OSS-based solutions to the sorts of problems that the government and military used to pay 8 and 9 figures a year to EDS to solve.

An actual, good-faith bill that seeks to address these issues would attempt to incentivize/punish orgs that use FOSS without making meaningful contributions to it.


They're trying to destroy FOSS ...by hiring FOSS developers?

I don't buy it. More like, log4j was an actual real big issue for government agencies because they rely on tons of open source projects and haven't previously done much to make sure that that supply chain is robust. This would help to change that.

Federal contractors don't need to sell proprietary software to make money -- they make more money selling FOSS software.


> Federal contractors don't need to sell proprietary software to make money -- they make more money selling FOSS software.

tech companies in general are making billions using FOSS.


Of course, which is why I think viewing this as a push for proprietary software is not a fair assessment. Increasing your company's development costs is not a way to make more profit.

The point of this is that you can't treat proprietary software dependencies the same way you treat FOSS. Community FOSS projects just don't have the same development process and governance model that proprietary software does. And so, the assessment of these projects is necessarily different.


Respectfully, your hunch about how decisions are made at the executive level is too reasonable, and because of that, you're reaching an incorrect conclusion.

The primary factor driving the decision making process is not cost but risk.

Many fail to remember the lengths to which companies like Microsoft, Oracle, Sun and others went to create FUD around the adoption of OSS in the public sector. It involved lobbyists, marketing campaigns and a whole certifications industry.

When a CIO selects a technology, they aren't just weighing its merits and price tag; they are actively looking to leave a glowing legacy while doing everything they can to avoid being humiliated if the technology fails. See, the CIO knows that, as far as their board of directors is concerned, it's not the software that failed but the executive.

That's why enterprise support packages exist; they're an insurance policy and a promise that if something goes terribly wrong at 2am, someone is going to pick up the phone when you call.

Companies with expensive tools have an obvious interest in convincing their best customers that the competition is inherently risky.


Government contractors' expensive tools are already heavily built on open source. The whole log4j scandal that this bill is in response to is perfect evidence of exactly that.

It is not in the interest of 'proprietary' software vendors to fearmonger about OSS libraries. They too use OSS libraries just like everyone else does today.

It isn't 1995 anymore where a proprietary piece of software can run on a full stack of code written by the people who work locked away in a single building at one company. Everyone is using open source dependencies. There is nobody left that would benefit by demonizing open source libraries.


It's not OSS libraries they'd be worried about, as competitors, but OSS applications, I'd think.


This bill is in response to a recent issue with an OSS library: log4j.


Maybe. At least that's what they want you to think


>Companies with expensive tools have an obvious interest in convincing their best customers that the competition is inherently risky

Honestly I’m embarrassed at how revelatory this line is.


"Companies with expensive tools have an obvious interest in convincing their best customers that the competition in inherently risky."

The problem is that they don't need to do this. In the open source community we're doing a great job all by ourselves. I've written open source libraries and platforms, I've also written proprietary software that uses open source. Log4j, Heartbleed, half of NPM etc weren't the work of Oracle lobbyists, they were avoidable fuckups caused by everyone relying on critical infrastructure that wasn't really maintained.

I'd actually go further and say you overestimate the power of lobbying and underestimate the possibility of lobbyists having a point. The primary argument these firms made against relying on OSS was that it's often just a bunch of random people with no incentive to do the un-fun work like security audits or patch backports. There's nobody who takes responsibility for things. That's true, and it's pretty much the core of Red Hat's business model, so it's not like anyone can really dispute this.

The software industry does need new approaches to how we use open source stuff, IMO. Sandboxing of libraries would go a long way. But we can't just pretend there's no problem here and it's all the work of shadowy corporations, it's naive ideological stuff.


> they were avoidable fuckups caused by everyone relying on critical infrastructure that wasn't really maintained

uhh rewriting history much? the reason the entire world uses openssl is because it operates correctly.

The judgement here is diluted by exaggerating the worst moments of an otherwise profoundly performant ecosystem. BTW, I turned my outward-facing servers off for three days for Heartbleed, so... how do we balance a serious incident against the growth of billions of devices working correctly each day for years to get here?

In a previous life I did work on internal proprietary C/C++ code bases and yes, there were ridiculous technical debts, bugs, and missed opportunities right along with correct code, insights reduced to code, and solving hard problems well.


> process and governance model that proprietary software does.

except that a lot of proprietary software doesn't have anything like that either

there is nothing in common proprietary dev practices which would have prevented, e.g., log4j

the main difference is that you can hold someone financially responsible in one case and not in the other (but in turn you can always fix any problem yourself; good luck fixing any proprietary software after its support runs out).


It does when you sell it to the government, which is what this is about.

The federal government has rules regarding the acquisition of software products, and they have various governance and risk management requirements for various scenarios. Many of these were designed with proprietary software in mind.

The point of this bill is to update some of those requirements to make sense with the way OSS works.


They could verify the security of the OSS, look for vulnerabilities, run scanners. This is possible because log4j is OSS.
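
A minimal sketch of that kind of scanner, assuming Python and the Log4Shell triage trick widely shared in December 2021: the vulnerable JNDI handler ships as a class file you can spot inside jars on disk (the scan root is illustrative, and nested "fat" jars are not handled):

  import pathlib
  import zipfile

  # The vulnerable JNDI lookup handler ships as this class file in log4j-core.
  MARKER = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

  def scan_for_log4j(root):
      for jar in pathlib.Path(root).rglob("*.jar"):
          try:
              if MARKER in zipfile.ZipFile(jar).namelist():
                  print(f"possibly vulnerable log4j-core: {jar}")
          except zipfile.BadZipFile:
              pass  # skip corrupt archives

  scan_for_log4j("/opt/apps")  # hypothetical scan root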


That’s assuming these individuals aim to make profit for their entity. The risk raised above was corruption, with tax-paid development cost going to their uncle's company, with a big fat kickback under the table.


Dependency management is a real security concern. There’s no corruption there.

The root comment suggested that somehow this was a smear on FOSS, intended to sell more proprietary software. However, this doesn’t hold up, because FOSS components are widely used by all of the big proprietary software vendors in this space too.


> haven't previously done much to make sure that that supply chain is robust

I'm doubtful that anything short of requiring thorough security/pen testing on everything in the dependency tree would have prevented log4shell. And if that is the goal here, who is going to pay for that? Most open source projects don't have that kind of funding.


> everything in the dependency tree

With reasonable prioritization, we can probably do a lot to reduce the threat long before we get near everything.

> who is going to pay for that?

Well, this bill does seem to involve hiring FOSS devs, maybe that's partly what they'd do?


Seriously, the government would rather burn $50k on the manpower that makes the decision to not pay $10k in software licenses. The beltway contracting space is about selling a lot of butts in a lot of seats, and a bit of hardware that goes boom. FOSS is AMAZING for govt contractors: they can bill $250m building something using FOSS that's available as a commercial-off-the-shelf (COTS) product. They put a lot of work into ensuring the govt requirements are rigid enough that the COTS alternative is deemed unacceptable during the Analysis of Alternatives (AoA) process.


More or less of a big issue than the revolving door of Microsoft bugs?


Different types of issues with different solutions. When you have a support contract with the original developer of a piece of code, you can demand the original developer fix the code.


That same solution still works for FOSS, you just actually need to pay for it. The situation for FOSS is even better because you also have the option of paying any other person to fix the code. Hopefully that's kinda what this bill does, though?


It’s “the same” in a really broad hand-wavy sort of way. It’s not the same practically speaking.

FOSS usually does not have warranties, SLAs, or developers in particular locations with particular credentials. The government has processes that they must follow for paying contractors, so “just pay them” is not something that can easily be done last minute (and sometimes not at all without a literal act of Congress). Also, the financial and obligatory relationship between them and the developer is much different, and must be managed differently.

Really, it is very important for people doing technology projects at the government to understand the difference between calling up the developer of Windows and calling up the developer of Git. Those are two very different relationships.


You seem to be reading me as implying any use of FOSS is equivalent to having a support contract; it's not. I was claiming that having a support contract for FOSS is equivalent to having a support contract for proprietary software. If you need "support contract" level support, you should probably be paying for a support contract. At least with FOSS that market can be competitive for the same piece of software (although that probably also means that some of the services on offer are quite a bit worse, never having had to clear the organizational hurdle of actually making the software).

Finding out that you have a problem when you don't have a support contract and then looking around for someone to work on the thing is not the same thing as having a support contract, although in some cases it can be a sufficient substitute and it's certainly cheaper in the best case (like any other form of skipping insurance).

Depending on context, providing the "support contract" internally is also an option.


I said the issues are different and you disagreed, so I detailed what I meant.

It’s also not reasonable to require the government to just get 10,000 support contracts just to implement a single application.

What makes the most sense is what they’re doing here:

1. come up with a strategy for managing these risks

2. Collectively work with OSS developers instead of treating every one of the government's 10 bazillion projects like it needs a separate support contract for a component that is shared


> I said the issues are different and you disagreed

I disagreed that things are all that different "if you have a support contract."

> It’s also not reasonable to require the government to just get 10,000 support contracts just to implement a single application.

I agree, but I'm not sure that's relevant? If a single support contract is sufficient for proprietary software - making them responsible for addressing (incl. possibly working around) issues in any dependency - why is that not also viable for FOSS software?

I don't disagree that what they're doing here seems likely to be a good idea, I just think you were initially selling "pay for support for the software you rely on as a big organization" a little short in its general applicability; indeed, this can probably be viewed as providing that service internally.


It's not so simple. The developers are not necessarily available for this kind of support. For example, I maintain an FOSS library, but I already have a full time job, and I'm not interested in working more hours. Unless someone hires me to work full time on this library, there is no chance that I provide paid support. I suspect that many other maintainers are in the same situation.


That's also potentially true of proprietary software, though. Not every company will offer a support contract, so if that's what you need you need to pick other software. I agree that probably a smaller fraction of FOSS projects offer first-party support contracts, but on the other hand third-party support contracts are a lot more reasonable in the FOSS context.


Ehh, I don't disagree with where you start but I do with where you end.

If it is a money thing, then it probably has more to do with setting up "standards" and "compliance" requirements that you must meet to use FOSS software in the government. Then federal contractors and other big FOSS organizations repackage their existing solution as "Government ISO-MITRE, PCI, Whatever-BS-Acronym-we-can-come-up-with" compliant and charge a premium over something they already sell.

I don't think this will hurt FOSS. At the end of the day, FOSS is nothing more than someone saying "here's something I wrote or whatever" and sharing it; anything that tries to make it more than that isn't talking about FOSS anymore.


These corporate sponsored legislators are really good at writing policy that sounds good to the public but really helps their corporate sponsor's bottom lines in practice. This bill wouldn't exist if it wasn't designed by large corporate software firms looking to taint, or profit from, FOSS in some way.


Isn't it very obvious that the US government has a vested interest in software security? I see no reason whatsoever to believe that this is some sly attempt to make FOSS more expensive as opposed to the US trying to solve a very real problem.


If it were truly about software security, it wouldn't be limited to open source. Whatever standards make a piece of software secure don't change when it's developed by a company or a community, so what's the reason they're making the distinction in law?

For that reason I agree with the conclusion of the person above, things don't happen in capitalist societies without someone getting paid for it.


> things don't happen in capitalist societies without someone getting paid for it.

Yeah? Someone paid you to post this?


well, we can't leave it to pesky kumbaya, drum circle, communist software devs that work for free to do good security. only corporate capitalist software devs care about security. duh! now help me get this encryption bill passed so we can put back doors in our encryption, you know, for the kids' safety.

/s


100% sure? Why not find out if that's true because I'm sure there are exceptions.


100% sure.


You're actually dead wrong, and this is spoken like someone with no experience in this space. What actually happened was that log4j, followed by multiple pre-auth CVEs in Atlassian suite products, effectively brought down DI2E, a collaboration and developer tooling platform that was open to the Internet and used by huge numbers of unclassified DoD and IC development projects. Since it was already sunsetting and had reduced staff, most programs still haven't gotten back data that was lost when the web fronts went offline suddenly to mitigate ongoing attacks. This wasn't catastrophic, but months of work were lost.

The reality is, the greatest vulnerability here isn't in government-procured software that uses OSS libraries directly. The DoD's own critical custom applications are largely as secure as secure gets. The much bigger problem is in proprietary software, like the Atlassian suite, that uses OSS libraries in its own supply chain.

In fact, the DoD has had mandates from the highest level for years to use FOSS everywhere it is available, and to require all new contractors to open source anything they create for government use that isn't classified. The only reason something like the Atlassian suite is still used is a combination of institutional inertia and the lack of comparably feature-complete FOSS alternatives (the vast majority of this kind of space going to SaaS companies that don't allow you to self-host).


While skepticism is fair, it should be motivation to do research to possibly find evidence of an ulterior motive.

However, just saying that all legislation is some scam, without evidence, doesn't make sense.

For example (I don't know the exact name), the health care transparency act, where pricing for treatments or whatever must be published. Who is that helping if you follow the money?

Edit: I should have been more clear; I meant an ulterior motive to benefit large companies who sell software, or something that harms OSS.


Maybe increased popularity for the legislator in question? Surely no bill comes without some ulterior motive.


Increased popularity isn't an ulterior (meaning hidden) motive, it's explicitly the foundation for democracy. The whole premise is that we exploit human selfishness for net good by setting up a system that (in theory) rewards making the largest number of people happy. There's nothing ulterior about that motive, it's the point.

Where things go wrong is when money becomes involved, because then the system rewards making the wealthiest people happy, not the largest number. That's an ulterior motive.


Why is buying votes by using government power/money good, and buying congress votes through lobbying bad?


I didn't say that buying votes with legislation was good, I said it's the premise of democracy. Whether democracy is good is an open question. For myself, I'm with Churchill:

> Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time.


I don't think the premise of democracy is using money to influence politicians. If that's true, then the rich have more power.


Please read my original comment. That's exactly what I said:

> Where things go wrong is when money becomes involved, because then the system rewards making the wealthiest people happy, not the largest number. That's an ulterior motive.


Of course one or more people benefit from every piece of legislation. If a legislator does it to get elected then that's the best case scenario for democracy, right?


> An actual, good-faith bill that seeks to address these issues would attempt to incentivize/punish orgs that use FOSS without making meaningful contributions to it.

Holy shit this is a terrible idea. I can’t think of a better way to discourage people from using open source than to punish organizations for “not making meaningful contributions” to it. As a heavy open source user, contributor, and author, I’m begging you to stay the fuck away from regulating users. That’s the fastest way to destroy open source.


If they were really trying to secure the code, shouldn't the bill be called the "Securing Software Act"?

It's not like closed source software is magically immune to vulnerabilities


This isn't about fixing anyone's code, it's about securing the software stack that the government is running. They already have well developed processes for addressing closed source software.


Can you provide a reference to the processes you are referring to? I'm only familiar with EAL and FIPS, which in theory require that there is maintenance where they apply in the lower part of the stack. They are applied to open source like OpenSSL and SELinux, probably with better quality than a lot of the niche closed source that government procures.


Why can't those same processes be used for open source code, then? If we assume those processes can't (or don't) apply to open source software, then am I to believe that OpenSSL has been running the internet without any scrutiny for the last decade?


This is about risk management process, not computer science process. The business processes are different.

For example, if your proprietary software has a bug, you call the developer and demand they come into your office and fix it under warranty.

Doesn’t work that way for some dependency downloaded from GitHub.


Practically all software is licensed/sold without any warranty. I challenge you to find even a single counterexample.


Challenge accepted, here’s one that’s publicly posted. Most are not on public websites:

https://www.vmware.com/solutions/industry/government/warrant...

Warranties aren’t common in B2C or cheap boxed software. A few hundred or a few thousand dollars is not worth anyone’s time to negotiate special terms. In big dollar B2B or B2G, software isn’t usually as-is. Contracts are negotiated that specify what will be delivered, and what remediations exist if those deliveries fall short. If you spend 7 or 8 digits on software, you can easily get a warranty.


Great example. Key quotes:

> your sole remedy will be that VMware shall, at its option,

> make a U.S. Person on U.S. soil available to provide technical support (in the case of non-conformity to the aforementioned Section (b)) or refund the license or service fees you paid

> VMware receives prompt written notice of the non-conformity following delivery (in the case of Software

So,

If the product isn't accessible: they'll write down the way it isn't. Period.

If the product doesn't do what it says it does, and you notice it quickly after purchase: Either 1. they provide technical support, OR 2. they refund your purchase price. And it's VMware who decides.

If the product doesn't do what it says it does, but you don't know until later: You're still screwed.

There is no guarantee of performance or suitability for purpose here. There is no guarantee of fixing anything. At worst, "you weren't prompt". At best, yes you get your money back. On the average, maybe you're given a phone number to call, and that person will ask if you've rebooted your computer.

This is actually still really bad for risk management; you have no guarantees that anything will keep working, just that after several months of running your platform on this stack, you might get refunded the purchase price. (Which leaves open the question, does that invalidate your software license, and do you now have to emergency migrate to a different stack?)

So, what you said earlier really isn't true:

> For example, if your proprietary software has a bug, you call the developer and demand they come into your office and fix it under warranty.

Also, your earlier

> Doesn’t work that way for some dependency downloaded from GitHub.

The above is true for every single open source dependency you download from Github! You'll be refunded your $0 purchase price, immediately!


The VMware warranty I linked was an example of one explicit warranty; it is not representative of what all software warranties look like.

Some warranties are implied, as well, and you won't find them written down anywhere.

See this article for several further citations:

https://corporate.findlaw.com/litigation-disputes/performanc...


My point is, all software warranties tend to be weasel words that don't amount to that much, as far as risk management is concerned. They especially don't usually obligate the seller to fixing anything, at most they undo the purchase.


On the contrary, a warranty failure may have associated damages that could far exceed the purchase price.

(This is why FOSS licenses almost universally disclaim both warranty and liability for damages, rather than ignoring the issue. Refunding the purchase price is not the worst that can happen.)

In very large software purchases, fixing problems is often much cheaper than undoing the purchase.

While technically, you may be able to legally refund a $10,000,000 software purchase in lieu of buying a plane ticket for a developer, in practice, that's not really an option anyone takes.


Sure, sure, now note the part where VMware is not obligated to do either, they can just give you a helpdesk number and drown you in bureaucracy. And that is still only necessary if you noticed the problem "promptly" after delivery.

I think if you want to argue that damages clauses are common in software industry contracts, it's only fair I ask you to name examples. Because they really aren't common.


What they are really after is having someone to assume risk and take blame.


They'll change the name as soon as FOSS developers spend millions to hire lobbyists.


Paraphrasing your closing paragraph: “you may not use this software if you have not already contributed to it” (because enforcing a future contribution is hard).

Seems a bit contradictory to the license terms. Gonna be a trick to get legislation to enforce something that isn’t in established contractual language.

Edit: on the positive incentive approach, getting double-funded with discretionary budget for every FOSS developer they hire might be a viable strategy.


> and definitely >100% secure proprietary offerings

how is that even possible


You can manage it if you plant your tongue firmly in your cheek


> definitely >100% secure

Did you mean <100%?


> perfectly good and definitely >100% secure proprietary offerings that cost a reasonable amount

That part confused me too - I believe it's meant to be sarcastic. There's a kind of humor that I often don't get until further reflection, which is a way of saying things that are so overboard and absurd that the speaker cannot possibly believe what they're saying. Or rather, the speaker is saying the opposite of what they mean, in order to make their point. Might be a cultural thing.

This kind of humor is even harder to recognize these days, when people honestly do believe the absurd things that they're saying.


For those curious about what it actually is:

> The Securing Open Source Software Act would direct CISA to develop a risk framework to evaluate how open source code is used by the federal government. CISA would also evaluate how the same framework could be voluntarily used by critical infrastructure owners and operators. This will identify ways to mitigate risks in systems that use open source software. The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability. Additionally, the legislation requires the Office of Management and Budget (OMB) to issue guidance to federal agencies on the secure usage of open source software and establishes a software security subcommittee on the CISA Cybersecurity Advisory Committee.

So basically just another framework to evaluate risk for use by the Federal Government. A nothing burger, as it were. Which I am on one hand glad about, because I don't like the government getting involved in Open Source, which is at its core "Here's some code I wrote or whatever", but it also isn't doing anything for security.


Not a complete nothing burger; a lot of people here work for companies that sell to the Feds or host FedRAMP-authorized SaaS solutions. There will definitely be private-sector impact from that risk framework, though I'm not saying that's necessarily a good or a bad thing.


Additionally,

“The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability.”

So we should definitely expect at least some minute changes to the open source economy, itself.


This is the worst part. "Experience developing open source software" is both entirely vague and oddly specific at the same time, likely conjuring up an image of some developer with green boxes on a GitHub repo, which is terrible. This is going to force the creation of some sort of silly criteria for what constitutes that experience. The suits in federal agencies, and the politicians and political pressure they are beholden to, will likely have no concept of less-popular open source communities, which will detract from the ethos of open source and ultimately, and more importantly, freedom.


Anyone owning at least three Hacktoberfest t-shirts qualifies.


> "Experience developing open source software" is both entirely vague and specific at the same time

Good. CISA is better equipped to be refining those specific requirements than Congress.


You are focusing on the hiring requirements, which are very vague. What would be a better way to define hiring requirements?


What criteria would you like to see here?


It sounds innocuous enough, but could the real motivation be to make open source software so expensive to use that all government agencies "choose" to use closed source software?

(This is a genuine question, I'm honestly not sure what the consequences, intended or otherwise, could be?)


Nah. You'd just need to use the old versions that had gone through a security audit and had some enterprise level Long Term Support contracts available.

IBM will offer quotes for whatever is required within 2 quarters or less.


"Oh, great, more STIGs." How about they take some millions and PAY THE OSS PROJECTS?


In my experience, the hesitancy to contribute to or pay open source projects comes more from the contractors than from the government.


That could happen. One of the easiest ways of handling the situation would be to sign an agreement with an organization like Tidelift, who pass money on to OSS projects to keep handling the dev work.


the legislation seems a little pointlessly broad. "open source" is just software at the end of the day, so it can easily be covered by existing STIG guidelines. these already work with Ubuntu and Red Hat.

https://en.wikipedia.org/wiki/Security_Technical_Implementat...

Open source doesn't need a special response process, and the only reason you'd want one is if you're old guard like Symantec, F5, VMware, or Veritas and starting to become alarmed at the amount of business you're losing to open source now that "devops" is starting to catch on and a recession is in effect.


This will result in "we can't use open source because the boss doesn't want to deal with the paperwork. We'll get John from X massive company on the phone and license their version."

Imagine a job where all you do is attend meetings and never have to deliver anything. That dream is a reality in government. There's a problem with something? Let's make a committee. Let's create paperwork. But now we're managing more things, so we get bumped up to a more prestigious position - this goes on and on.

It operates despite inefficiency because it's artificially propped up by tax dollars. No one is responsible; it's free money.


I'm really curious how this would have protected the government from log4shell. Log4j is (or at least was) one of the more reputable open source projects.

This kind of feels like doing something for the sake of doing something about log4shell, without actually solving any problems. And will undoubtedly result in the government paying more taxpayer dollars for software that complies with this new framework.


Would it be possible for the US government to create its own open source software? Has that been done before?


It sounds extremely similar to the executive order from Biden last year. For what it's worth, I think some parts of that are valuable, such as producing a bill of materials for all the software that gets shipped. That way, figuring out if some product uses a vulnerable version of log4j is very simple and independent of particular programming languages.
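
As a minimal sketch of what that lookup could look like, assuming a CycloneDX-style JSON SBOM (the file name and flagged version range are illustrative):

  import json

  # CycloneDX-style SBOMs list every shipped component with name/version.
  with open("sbom.json") as f:
      sbom = json.load(f)

  for component in sbom.get("components", []):
      name = component.get("name", "")
      version = component.get("version", "")
      # Flag any log4j-core 2.x for review against the patched 2.17.x line
      # (CVE-2021-44228 affected 2.0-beta9 through 2.14.1).
      if name == "log4j-core" and version.startswith("2."):
          print(f"review {name} {version}")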


5 paragraphs of Silver Bullet Magical Thinking nonsense to get to the actual bill, which is basically calling for a study and recommended guidelines for OSS security - of which there are several already, including one from the DoD [1].

I see nothing new or useful here, what am I missing?

"The Securing Open Source Software Act would direct CISA to develop a risk framework to evaluate how open source code is used by the federal government. CISA would also evaluate how the same framework could be voluntarily used by critical infrastructure owners and operators. This will identify ways to mitigate risks in systems that use open source software. The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability. Additionally, the legislation requires the Office of Management and Budget (OMB) to issue guidance to federal agencies on the secure usage of open source software and establishes a software security subcommittee on the CISA Cybersecurity Advisory Committee."

[1] https://insights.sei.cmu.edu/blog/taking-up-the-challenge-of...


This is not a push for proprietary software. It's a prelude to regulatory capture where a bunch of highly paid consultants will need to bless your open source solution for big money.

It's going to be like electrical contracting. You get someone cheap to do the wiring and then a union guy comes in to sign the papers and take a pound of flesh.


We already have this. You hire some idiot to sign off on your FIPS-140. They are getting paid under the table by IBM. They swear that the only way to comply with FIPS-140 is to use RHEL. The only company I ever worked at that hasn't fallen for this scam is Google, who self-certify everything including their crypto stack. But every other smaller company (i.e. all other companies) are just terrified of not getting government contracts and are in thrall to these consultants.


FIPS validation does not have a self-certify path


> Google, who self-certify everything including their crypto stack.

This probably means paying for an independent lab's FIPS validation of Google's crypto stack, rather than using one from RHEL (where RH/IBM paid for the FIPS validation).


Maybe, if electrical contracting used blockchains like sigstore, https://www.sigstore.dev/ from OpenSSF, https://openssf.org.


It seems most commenters want to interpret this as a threat, or as a way to discourage use of FOSS by government. I don't think it is either one. The US government isn't a monolith, it's thousands of semi-independent fiefdoms all doing different things, and some know what they are doing and others have horrible security practices. This seems to be one of several efforts to fix that.

Looks to me like it will wind up making more money available for developers, mainly outside government, to audit and improve important free software that the feds are currently using. Unfortunately because of the way that government contracts work, companies that are already experienced at doing government contracts might wind up with the bulk of the money. But it isn't going to make things worse and might actually make things better.


So will they help fund the projects now, or will they just express their opinions on how your unpaid work should be done?


There is this little nugget:

“The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability.”


Unless it specifies who this just means they'll throw a few billion to Teksystems, CGI or whoever and then say that it was a failure like HealthCare.gov


That spawned a lot of good things. Check out 18F and USDS.


Those are special exceptions, though, and definitely not the norm.


Except for the change in acquisition and contracting behind the scenes that was driven by that colossal failure.

Have you noticed that the IRS website is accessible and searchable? That SBA is functional? Compare that to, say, the Department of Commerce, where things are a mess (currently).

The US realized it needed to pick things up a bit and, while it's still a hard space to work in, started to shift to a better playbook.

https://playbook.cio.gov/


The text of the bill isn't out yet, but it should show up here eventually: https://www.congress.gov/bill/117th-congress/senate-bill/491...

There's no way to actually know. Everything not based on the text of the bill is pure speculation.


That’s really the question, isn’t it? The article makes it sound like hiring “open source devs” may be part of the strategy, which essentially amounts to anyone with a public GitHub repo.


No the people hired will be IBM consultants and such with long lists of meaningless qualifications and no GitHub profile.


No the people hired will be IBM consultants and such with long lists of meaningless qualifications and no MicroSoft profile.


Oh no, they’ll have all the Microsoft certifications too.


Do licenses like the GPL even apply to TLAs like the NSA and CIA? Or could they just make patches for themselves and not release them?


Yes. The US federal government has voluntarily relinquished their sovereign immunity and they can and often do get sued for copyright violations. Of course, given that the NSA/CIA have the luxury of classification, the resulting lawsuit will be even more complicated than a normal copyright lawsuit (which is already one of the most expensive and frustrating parts of law to litigate).

States are where you need to worry. Occasionally a state decides to pass a law[0] saying they can't be sued for certain copyright violations. Because of how the US constitution is set up, states (and nothing smaller than them) are allowed to just say they can't be sued, which lets them crime with impunity.

[0] https://www.npr.org/2020/03/24/820381016/in-blackbeard-pirat...


Anyone can make open source code changes and be license compliant as long as you don't release the binary (YMMV depending on specific license). However as soon as you give someone binaries they're entitled to request the source. I'd love to see a future with goofy situations like Iran suing for the stuxnet source code because it statically linked a GPL library.


where it gets really goofy is that US gov work doesn't have copyright itself, so wouldn't any work done by them be PD, regardless of the underlying license?


My gut feeling is no, because the work is still a derivative work of a privately-owned copyrighted work, and private copyright does not dissolve when the government touches it. If you could separate the government's code from the GPL code, then it would be automatically public domain, but the combination is still GPL.

However I'm not aware of any case law proving this.


If they extend an existing GPL work, the work as a whole is GPL, even if their added code is PD.


My first instinct is that sovereign immunity applies. GPL is a license designed to protect the copyrights of the creator or copyright owner, but copyright itself is a privilege granted by the Federal government and protected by law. So if you sued them for violating the GPL, I’m fairly certain they could just claim sovereign immunity in court.


There is a comment about that up-thread:

https://news.ycombinator.com/item?id=32957861


Might depend on whether copyright applies to the agencies, since the GPL depends on copyright itself.


GPL only requires you to provide source code when you provide an executable program. If their internal software isn't released outside the organization, then source code does not need to be either.


Google famously used this clause, and that was nearly twenty years ago... things have evolved in GPL land

https://en.wikipedia.org/wiki/GNU_General_Public_License


When government agencies behave within the law, it is by coincidence, if not an accident which will soon be rectified.


Don't worry about it. It's just code for 'government departments only get to use software from giant corps with well known and unpatched bugs now'.


We do B2B software in banking and this is something we've been anticipating for quite some time now. We were implicated in that log4j exploit via a (very) transitive, cross-language dependency.

We killed 100% of our Java usage over this. We simply don't have enough in-house talent to make sure things are safe in that bucket. Our customers thought this was a glorious plan as well.

I do think most of the pain should fall to the vendors of the end product, not their oss suppliers. If your shop doesn't have enough resources to validate all vendors are safe, maybe figure out how to do it with fewer vendors.

At a certain level, if you are selling deficient products to sensitive customers, you really need to be stopped. Anything impacting finance, PII, safety, infrastructure, defense, etc. Some extra regulations could go a long way in these areas.


You ditched all Java over a single bug? That seems extreme...

Did you ditch SSL over Heartbleed?


At the time, Java accounted for <1% of our codebase/product, so it made sense to do so.


Did they ditch CPUs after spectre?


Finally! I can go outside! : ' )


> You ditched all java over a single bug?

A bug in a third-party dependency, no less.


Seems like they are taking the right approach. Instead of trying to regulate OSS, they're funding CISA to help make it more secure.


What will the CISA actually do?


It turns taxpayer money into individually owned Teslas.

Joking aside, they do actually have a github: https://github.com/cisagov


Lol.. they'll pay contractors from big companies to come in and tell them why they shouldn't use open source.


get funded


LF OpenSSF "criticality score" for 100K Github repos, https://github.com/ossf/criticality_score & https://docs.google.com/spreadsheets/d/1uahUIUa82J6WetAqtxCM...

> Generate a criticality score for every open source project. Create a list of critical projects that the open source community depends on. Use this data to proactively improve the security posture of these critical projects ... A project's criticality score defines the influence and importance of a project. It is a number between 0 (least-critical) and 1 (most-critical). It is based on the following algorithm by Rob Pike..

Top 20 projects, based on "criticality score" algo output, you can run the script on your favorite OSS project:

> node, kubernetes, rust, spark, nixpkgs, cmsSW, tensorflow, symfony, DefinitelyTyped, git, azure-docs, magento2, rails, ansible, pytorch, PrestaShop, framework, ceph, php-src, linux
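
For the curious, a back-of-envelope sketch of that formula in Python. The formula matches the one published in the criticality_score repo; the signal names, weights, and thresholds below are illustrative stand-ins, not the project's tuned values:

  import math

  def criticality_score(signals):
      # Pike's formula: sum over signals i of
      #   alpha_i * log(1 + S_i) / log(1 + max(S_i, T_i)),
      # normalized by sum(alpha_i), which always lands in [0, 1].
      num = denom = 0.0
      for value, weight, threshold in signals.values():
          num += weight * math.log(1 + value) / math.log(1 + max(value, threshold))
          denom += weight
      return num / denom

  # Hypothetical signals for a mid-sized project: (value, weight, threshold).
  example = {
      "contributor_count": (400, 2.0, 5000),
      "commit_frequency": (30, 1.0, 1000),
      "dependents_count": (20000, 0.5, 500000),
  }
  print(round(criticality_score(example), 3))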


Oh fun! At my employer the "security team" misunderstands that criticality score and takes it as a "vulnerability score". Everything scoring high is a security risk, everything scoring low is secure.


OMG :) OpenSSF will love that story! Straight into the documentation Hall of Fame.

They can check out the Securing Critical Projects working group, https://github.com/ossf/wg-securing-critical-projects


They link to that page in their documentation.

I can only assume that they misunderstand this:

> 1. Identify critical open source software (OSS) projects.

> 2. Secure those projects.

Not as "those projects are widely in use, we should really make sure that they are looked at", but as "we have identified those projects as insecure and will secure them".


Why limit it to open source? You wouldn't let an engineer build a bridge with car-sized holes just because the blueprint is not open.


There are different questions that should be asked when choosing open source software vs proprietary software.

If you're using proprietary software, you might ask things like, "what's your SLA?", "can we review the source code?", or "how much is a license?"

If you're using open source software you might ask things like "is the project maintained?", "who developed it?", or "do we have anyone on payroll who knows how thing works?"

So yeah, it makes sense to treat them differently, at least in some respects.


For the sorts of purposes described in the bill's blurb, the questions would be essentially identical.


Not really. The managerial process for fixing a vulnerability or mitigating supply chain attacks in an open source project is functionally different than for a proprietary product. So much so that it’s often the subject of memes (and frustration) in the FOSS community when people don’t understand the difference.

Stuff like this:

https://daniel.haxx.se/blog/2020/12/17/curl-supports-nasa/

Or this: https://daniel.haxx.se/blog/2022/01/24/logj4-security-inquir...

… is exactly why you can’t treat them the same in these types of situations.


Why the focus on open source?

The reason log4shell had such a big impact is because of how ubiquitous it was. Sure, being free gives OSS a bit of an advantage in becoming ubiquitous, especially as a library.

But there's also plenty of proprietary software that is ubiquitous as well. And proprietary software has plenty of bad security bugs too.


So… I’m reading this from the perspective of working 100% professionally on open source software and I don’t understand at all what implications this has for my work being declared public infrastructure. I don’t think it’s being funded, which I guess isn’t surprising because not funding infrastructure is literally a meme which has been going for years. I don’t think anything is being offered to help secure the software I work on. It kind of reads like a vague threat? I don’t mean to be glib at all, but if you’re declaring something public infrastructure and you’re ostensibly a public servant, maybe making me feel scared of you isn’t a great look?


DHS (Dept. of Homeland Security) CISA (Cybersecurity and Infrastructure Security Agency) CSAC (Cybersecurity Advisory Committee) TAC (Technical Advisory Council) subcommittee report, June 2022, https://www.cisa.gov/sites/default/files/publications/June%2...

> The Technical Advisory Council Subcommittee was established to leverage the imagination, ingenuity, and talents of technical experts from diverse background and experiences for the good of the nation. The subcommittee was asked to evaluate and make recommendations tactical and strategic in nature. These Cybersecurity Advisory Committee (CSAC) recommendations for the June Quarterly Meeting focus on vulnerability discovery and disclosure.

  Mr. Jeff Moss, Subcommittee Chair, DEF CON Communications 
  Mr. Dino Dai Zovi, Security Researcher
  Mr. Luiz Eduardo, Aruba Threat Labs
  Mr. Isiah Jones, National Resilience Inc.
  Mr. Kurt Opsahl, Electronic Frontier Foundation 
  Ms. Runa Sandvik, Security Researcher
  Mr. Yan Shoshitaishvili, Arizona State University 
  Ms. Rachel Tobac, SocialProof Security
  Mr. David Weston, Microsoft
  Mr. Bill Woodcock, Packet Clearing House 
  Ms. Yan Zhu, Brave Software


one of those is a political liberal turned spy-for-hire... this does not look like a list of Fortune 500 shills from here


"securing open source software act" would rationally mean funding the NSA or similar experts to help harden open source software, right? Or, hey, telling the NSA to disclose vulnerabilities they find in open source software so they can be patched, instead of sitting on them hoping nobody else notices. Right? No? Wait, what? It's just about telling the federal government to use less open source software? How does that make open source software more secure?


You're just attacking the title. The description doesn't demonize OSS at all, it praises it.

Risk frameworks are important, but they're not something you need the NSA to help you accomplish.

It doesn't take the NSA to do things like:

"Make a list of dependencies"

"Make sure that they have active developers"

"Check the list of dependencies for vulnerabilities"

"Look in the commit history to make sure one doesn't say 'People's Liberation Army -- implementing backdoor'"


Right, but it's focused on helping the government decide what open source not to use, rather than on actually making open source more secure.


The risk framework isn’t written yet, this just directs CISA to write one. While evaluating dependencies is often a part of what you’d do in a risk framework, it’s typically just one part.


A good move would be increasing funding for defensive security teams at NSA, especially those working on open-source software, and restoring the separation of governance/leadership between defensive and offensive security teams.


> Or, hey, telling the NSA to disclose vulnerabilities

Not disclosing them is an opportunity to spy while blaming scapegoats (i.e. "North Korean hackers did that", etc.) and to legitimately sue or accuse enemies of using them (if they fix it, attackers won't attack).


https://news.ycombinator.com/item?id=32956218#32957137

> FWIW, while this specific act may not be enforcing significant regulation, software developers need to understand that there's a ticking clock.

There are several initiatives from LF's OpenSSF and startup Chainguard.

Sept 2022, "Concise Guide for Evaluating Open-Source Software", https://github.com/ossf/wg-best-practices-os-developers/blob...

Sept 2022, "Show off your Security Score: Announcing Scorecards Badges", https://openssf.org/blog/2022/09/08/show-off-your-security-s...


If they really wanted to make open source more secure they would just pay people to audit & submit fixes to widely used open source software. Of course the reality today is that the opposite occurs -- when federal intelligence agencies find out about vulnerabilities they prefer to keep the exploits for themselves.


"This led top cybersecurity experts to call it one of the most severe and widespread cybersecurity vulnerabilities ever seen."

Apparently they never changed one character in a query string in the late-90s.


I won't get into the reason behind this, but I will say I found this statement horrible, and could not disagree more.

>“This important legislation will, for the first time ever, codify open source software as public infrastructure,” said Trey Herr

Open source software is no more "public infrastructure" than the efforts of volunteer organizations. The government should have no say over this matter IMO.


There's a funny bite here that seems long-term good.

The gov has a large culture of tapping integrators who do not give back to OSS, just use it (basically middlemen), and leave behind fragile one-offs. Such abandonware should overwhelm receiving depts within 6-12mo, and bringing back integrators for the treadmill of patching superfluous npm CVEs would break their budgets.

So that means pressure to, well, not do that. Either the integrators get more involved, or part of the budgets finally goes to people who are.


Well, I've just had the weekly "why can't you come here" call with my foreign gf, failed to deliver on the couple tickets I have open at work, and posted my daily "barf into ~/stuff/*.c" to /prog/, so let's go through and read this instead of going and being with people.

TFA has no bill number, so let's see if we can find it. Actually no, I'm not seeing it. Someone send me an HR? I'll update my comment if you do.


Text not available on Congress.gov yet, but keep your eyes on this space: https://www.congress.gov/bill/117th-congress/senate-bill/491...


a radical thought: how about hiring some engineers to contribute to oss that's being used in critical infrastructure?

I believe that most of the assessment stuff is covered by many NIST recommendations anyway.


Why can't they treat this like academic / scientific work and create a funding body around grants that support OSS devs so they have more time and money to protect the software?


Any law regulating OSS is potentially a big problem... why is it even needed? Reminds me of the "PATRIOT" Act authored by Joe Biden in 1994.


FWIW, while this specific act may not be enforcing significant regulation, software developers need to understand that there's a ticking clock. Modern civil engineering went without any significant regulation, and then that changed. Software is young; it's in the phase where people aren't dying too often for the public to care. But breaches are leading to massive privacy problems, real wars and conflicts are increasingly leveraging software defects, and the impact and scrutiny will only grow.

If you want to avoid having to pass tests, having to maintain insurance, having to do a bunch of bullshit, all just to be a software engineer, get started on fixing things now.

It is absurd that anyone can anonymously provide open source code, with no assurances whatsoever, and that can end up in critical software. And you might be saying "well, it's up to people to audit their dependencies" - and maybe you're right. But I would challenge that everyone has the right to publish code for distribution purposes with zero responsibility.

Publishing code to GitHub? Sure, go for it, anyone can do it. Publishing packages to package distributors? No, that crosses a line. I don't want legal requirements, and I don't want identification requirements, just to publish and distribute code.

If we want to avoid that, we're going to need to step it up - that means, yeah, basic measures like strong 2FA to distribute packages should be a requirement. Signing packages should be a requirement. Acknowledging and triaging vulnerabilities should be a requirement. If you aren't willing to do the above, which is frankly trivial, you shouldn't be allowed to publish software for distribution purposes.
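
As a toy sketch of what the signing requirement means mechanically, using the Ed25519 API from the third-party cryptography package (real registries use richer schemes like TUF or Sigstore, and the key handling here is deliberately simplified):

  from cryptography.hazmat.primitives.asymmetric.ed25519 import (
      Ed25519PrivateKey,
  )

  # Maintainer side: sign the release artifact once, at publish time.
  private_key = Ed25519PrivateKey.generate()
  artifact = b"pretend this is the package tarball"
  signature = private_key.sign(artifact)

  # Consumer side: verify against the maintainer's published public key;
  # raises InvalidSignature if the artifact was tampered with.
  public_key = private_key.public_key()
  public_key.verify(signature, artifact)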

I think we need to start taking a bit more responsibility for the work we do. "NO WARRANTY" doesn't mean "No obligations", it just means no one has a legal right to pursue damages due to your software, you should still do some things.

edit: K I'm rate limited so I can't have this conversation with all of you, thanks again Dang


I'm going to disagree, I think. The problem isn't on the push side, it's on the pull side. People throwing random-quality code in github is fine. People deciding to amalgamate that into distributions and publish it is fine. The problem is that somewhere someone who is supposed to be held to some standard decided to pull that code in without looking at it, and that is the problem. NO WARRANTY is partially about legal issues, but not exclusively - if people share their code for free, they don't owe anyone anything. If you don't like that, you're free to offer them enough money to actually accept your standards.


I think the first step should be making a conceptual separation between developers and distributors, even when they are just different roles of the same person. A developer can share code with other developers, but once you start sharing the code to a wide audience, you are acting as a distributor.

Consumer protection laws tend to expect that consumer products should be safe by default. Products intended for professionals and businesses often have fewer requirements, but they should also be safe with reasonable precautions. Someone in the supply chain must take responsibility for that.

As is common in legal matters, this is more about intentions and reasonable expectations than exact definitions. GitHub can probably avoid responsibility by arguing that it's just a platform that allows developers to share their code. If you are hosting a popular package repository for some programming language, you must take some responsibility as the distributor, even if your users can be reasonably expected to be sophisticated. And if you are hosting a package repository for a consumer OS, you should probably take consumer protection laws into account.


I'm not arguing, but the standard response (caveat emptor, basically):

> pull that code in without looking at it

Is no longer reasonable. The dependency chains are too vast to expect the end-user to be able to audit the whole thing.

There are a couple of options:

1) Don't use open-source code, and make sure that commercial code that you use doesn't have it.

2) Have some kind of "regulated middleman" auditors, or certification authorities, that can certify (and probably hash) "approved" open-source chains.

They both suck. I worked for a company that did #1. They hired a company (can't remember the name, but it started with "P") that scanned our entire codebase, looking for open source.

#2 is likely to result in either corruption or "roadblocks," where we can't use newly fixed libraries because the chain hasn't been audited yet.
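
For what it's worth, the mechanical half of #2 is cheap; it's the auditing, governance, and funding that suck. A toy sketch of checking a vendored tree against an "approved" hash manifest - the manifest contents and file paths here are invented:

    import hashlib
    import pathlib

    # Hypothetical manifest published by a certification authority,
    # mapping vendored artifacts to their audited SHA-256 hashes.
    APPROVED = {
        "vendor/libfoo-2.3.tar.gz": "placeholder-sha256-from-the-auditor",
    }

    def sha256_of(path: pathlib.Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    for relpath, expected in APPROVED.items():
        path = pathlib.Path(relpath)
        if not path.exists() or sha256_of(path) != expected:
            raise SystemExit(f"{relpath}: not part of the audited chain")
    print("all vendored dependencies match the audited manifest")

pip already ships the verification half of this as hash-checking mode (--require-hashes); what doesn't exist is anyone trusted - and paid - to bless the hashes, which is where the corruption and roadblocks come in.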


> Is no longer reasonable. The dependency chains are too vast to expect the end-user to be able to audit the whole thing.

The end user shouldn't have to audit the whole thing. The software that includes the dependencies should audit its own dependencies.

If that burden is unworkable (and in a lot of cases, it is), that's a sign that the software needs to shed a lot of the dependencies.
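
At minimum, inventorying that surface is easy now. In Python, the standard library alone can list what's installed and what each package declares as dependencies - a sane first step before deciding what to audit or shed (this only lists; it judges nothing):

    from importlib.metadata import distributions

    # Enumerate every installed distribution and its declared dependencies:
    # the raw material for an audit, or for a diet.
    for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
        declared = dist.requires or []
        print(f"{dist.metadata['Name']} {dist.version}: {len(declared)} declared deps")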


>> pull that code in without looking at it

> Is no longer reasonable. The dependency chains are too vast to expect the end-user to be able to audit the whole thing.

Each open source project is different. For example, I'm using:

Racket: Only the main distribution, which is created by the development team, plus very few additional packages - like 1 or 2 for viewing the assembler version of the compiled functions.

Python: Also only the official distribution, NumPy and perhaps 1 or 2 more packages. The batteries are included, so it's not necessary to download too much.

LaTeX: As many packages as I can add; my coworkers hate me. Each package has a different author and chains to more and more packages. But I'm using MiKTeX and I trust the maintainer, whoever he is [1]

[1] I had to google the name of the maintainer. He is Christian Schenk; I was convinced his name was Michael or something like that.


It sounds like you want to get something for nothing. If you want software that meets some given standard, then someone has to invest the effort to make that happen. This isn't always expensive, but it's never free. So your options are indeed that, if that's going to happen, it has to be done by the author, an intermediate party, or the consumer. Trying to make the author pay when they're not getting anything out of it is a great way to kill FOSS outright. That's not to say that "open-source code" is some boogeyman that has to be kept away; it just means that you gotta pay someone to make it meet your standards. Heck, offer the original author a contract and you'll solve the problem for everyone, and other problems besides.


> It sounds like you want to get something for nothing.

Not exactly sure how you read that from what I wrote. Maybe you were replying to someone else?

#1 is definitely not free. My company paid a bundle for that audit; far more than if they had simply licensed commercial software.

#2 would require some kind of paid “middlemen.” The biggest problem would be that the money would not go to the authors, but to the auditors, instead.


I think a lot of people will disagree, which is cool and I'm fine with that, but I do hope this discussion can be had.

> The problem is that somewhere someone who is supposed to be held to some standard decided to pull that code in without looking at it

Why is it that there is no standard applied to those who publish code for distribution purposes? Why do we want that to be the case? Again, publishing to Github or some source repository is fine - that should never, ever be restricted - but publishing with the express intent for others to use it? I don't get why we insist that it shouldn't at least imply the bare minimum of assurances.

> if people share their code for free, they don't owe anyone anything

My point is that they don't legally owe anyone anything but we should impose a moral standard in lieu of a legal one. If you are saying "here's this code, I've packaged it up and sent it out for distribution" I think it should be perfectly fine for us to say "did you do the bare minimum to make this code acceptable for others to use?".

I don't get why we say "you have no ethical obligations in open source". Why do we do that? Who benefits? I get not having legal obligations, but once you're distributing code for use it seems absurd to say that you have no ethical obligations. You chose to do that, you chose to distribute it, you didn't have to do that.

And while I do think that the obligation exists regardless, I also feel that if we don't step it up here, these things are going to be forced on us. I'd rather we do it ourselves.


> Why is it that there is no standard applied to those who publish code for distribution purposes?

Because it's rude to make demands of someone who is doing you a favor.

Because a system that adds costs to profit-free work will collapse.

Because your "distribution" line-in-the-sand doesn't exist. I assume you're thinking of NPM or pypi, but e.g. Debian doesn't ask people before including their packages, and e.g. nixos pulls directly from those "non-distribution" channels.

> My point is that they don't legally owe anyone anything but we should impose a moral standard in lieu of a legal one. If you are saying "here's this code, I've packaged it up and sent it out for distribution" I think it should be perfectly fine for us to say "did you do the bare minimum to make this code acceptable for others to use?".

Okay; let's also make a moral standard of paying people when we derive value from their work. I think it should be perfectly fine for us to say "did you do the bare minimum to repay the person who gave you this code to use?".

> I don't get why we say "you have no ethical obligations in open source". Why do we do that? Who benefits? I get not having legal obligations, but once you're distributing code for use it seems absurd to say that you have no ethical obligations.

We do that because we benefit. Making things easy for the people who are giving their work away for free helps foster an ecosystem where people keep giving stuff away for free.

> You chose to do that, you chose to distribute it, you didn't have to do that.

Yes, that's the point. We'd like people to keep giving things away even though they don't have to. If you try to impose costs on them for doing that, you'll alter the incentives so that they do the rational thing and stop giving stuff away, and/or start charging for it.


> Debian doesn't ask people before including their packages, and e.g. nixos pulls directly from those "non-distribution" channels.

Then Debian (or NixOS) are publishing the code for distribution, and Debian (or NixOS) should be morally obligated to do the bare minimum to make the code acceptable for others to use.


no


> Because your "distribution" line-in-the-sand doesn't exist. I assume you're thinking of NPM or pypi, but e.g. Debian doesn't ask people before including their packages, and e.g. nixos pulls directly from those "non-distribution" channels.

Correct, I'm talking about publishing to a package index. The fact that the line is murky right now isn't important; that's exactly the sort of thing we should be clarifying and changing.

> Okay; let's also make a moral standard of paying people when we derive value from their work. I think it should be perfectly fine for us to say "did you do the bare minimum to repay the person who gave you this code to use?".

I don't think it's an "either/or" situation, the two are not exclusive. I would like to see more funding for open source software, but that doesn't solve the fundamental issue. It's also extremely hand-wavy. Do I pay you a monthly stipend to implement 2FA?

It also assumes that open source developers aren't getting "paid". They are. Github is free, PyPI is free. You are using services for free; that is a form of payment, or at least a social contract between various developers. So it should be reasonable to negotiate that contract, e.g. PyPI saying "if you use us as a distributor of your code, we need you to enable 2FA", which they now do.
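
And for scale: the 2FA being demanded here is usually just TOTP, which is genuinely cheap for both sides to support. A minimal sketch with the pyotp library - the account name and issuer are made up, and enrollment UX and secret storage are omitted:

    import pyotp

    # Enrollment: the index generates a secret; the user's authenticator
    # app stores it, typically by scanning a QR code of this URI.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)
    print(totp.provisioning_uri(name="maintainer@example.com",
                                issuer_name="SomePackageIndex"))

    # Login: the user submits the current 6-digit code and the index
    # verifies it against the shared secret.
    code = totp.now()  # stand-in for what the user would type
    print("accepted" if totp.verify(code) else "rejected")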

> We do that because we benefit.

Who's "we"? Because a lot of people don't benefit. When there's a supply chain compromise because a developer used a weak password and no 2FA, people are harmed.

> We'd like people to keep giving things away even though they don't have to. If you try to impose costs on them for doing that, you'll alter the incentives so that they do the rational thing and stop giving stuff away, and/or start charging for it.

Of course, and I'd like to keep those incentives as much as possible. I think there is a very happy medium here that isn't strictly "if I publish open source code for distribution I can do no wrong whatsoever, I have no obligations whatsoever". PyPI and other distributors now enforce that if your code gets to be very popular you must enable 2FA. That is a cost, but it's a very minimal cost targeted to a small group.

What I'm advocating for is that we standardize a set of responsibilities as an industry for those who distribute code. They can be very minimal and still have massive impact.

The alternative, in my opinion, is that we'll see increased regulation, because these problems aren't going away and they're going to get worse if we don't collectively try to improve.


Do you demand that every screwmaker make aircraft-grade screws? Aircraft makers need screws, and it would be very convenient for them to be able to go down to any hardware store and just buy whatever screw they want, since they would all be up to spec. No need to evaluate their suppliers, since everybody is required to make things up to their demanding standards.

The problem with this is that not everybody needs expensive aircraft-grade screws. Most people only need hobbyist-grade screws, or construction-grade screws. The requirements depend on their usage and it is up to the consumer to correctly identify their requirements and use the appropriate product that is fit for purpose.

The problem with software today is the rampant, careless use of hobbyist-grade dependencies in critical software. It is the entities that include dependencies which are explicitly hobbyist-grade or inadequate for purpose that pose the problem. It should not be the responsibility of makers of hobbyist-grade screws to produce aircraft-grade screws just because the aircraft makers want to go to Home Depot and pick out whatever screw is cheapest.

The solution that matters today is holding the consumers of these endless software dependencies to task for using substandard or even defective software components, whether open source or proprietary, like every other industry where you must use suppliers that are fit for purpose. To demand a change to software that is explicitly marked as unfit for purpose is to solve the problem of an aircraft manufacturer using screws from Home Depot by requiring Home Depot to stock only aircraft-grade screws while demanding they keep the prices the same.


I'm quite sure that screws are indeed something that, if made in a shoddy way, would have legal repercussions for those producing them. That's the norm - it's software that's weird for not having that. I would actually expect screws are even rated for specific work. Also, people sell screws, so the analogy really makes no sense.

I'm not suggesting that software developers be required to do anything if they're just writing code, and I'm not suggesting that they do things to some sort of extreme, as you seem to be implying, if they do distribute their code for use.

> explicitly hobbyist-grade or inadequate for purpose

Well, no, they're not explicitly hobbyist-grade. That would be fine if someone were just publishing code and saying "don't use this", but they are publishing code for distribution to package repositories. They don't have to do that; they could just leave it as open source code that isn't distributed, and explicitly note that it's not production quality, as you suggest.

Perhaps a more appropriate analogy would be if you were making dinner. You go to the farmer's market and someone with a booth there says "I'm giving away some free fruit, here you go". You would hope that someone who set up a stand at the market would be giving you fruit that's edible. If you went home and ate it, and then you got sick because it didn't meet food quality standards, you would not be the one liable; the vendor would be. "But the vendor gave it away for free!" Yes, but outside of software, the person giving you something is in fact liable for its quality.

Anyway, analogies suck, I'm sure this misses plenty of important bits. Rather than argue about analogies, let's clarify the actual argument.

1. No one forces anyone to publish their code for distribution purposes

2. When you publish code with the intent for others to use it, there should be an obligation to meet basic quality standards, so that the code doesn't do others harm


Package repositories were never intended nor designed for anything other than hobbyist-grade software. They have never offered any guarantees about the quality of their contents; in fact, the vast majority of their contents are freely accepted without review and explicitly disclaim any fitness, such as via Clause 15 of the GPLv3 [1]:

"THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION."

If the quality of a package is not positively asserted, then it must be assumed to be hobbyist-grade, regardless of how others wrongly treat it, how convenient it would be to assume otherwise, or even if the quality is high but no explicit guarantee has been made to that effect. It is ultimately the responsibility of the users of a package to verify it is fit for purpose, which demands either a positive, legally binding guarantee from the package creator or an assessment of the package itself (which would be easy if the quality is high but no explicit guarantee was provided).

To use a package while expecting guarantees about its function, when it explicitly disclaims any guarantees, is to use the package wrongly, no matter how many others do the same; that is a problem with the user of the package, not with the creator, who accurately advertised its quality. It is the recognition of this fact, and the use of components that carry actual guarantees of fitness for purpose, that distinguishes civil engineering from software development - not the quality of the underlying components. The quality of the underlying components follows naturally once liability is accepted and guarantees are required.

[1]: https://www.gnu.org/licenses/gpl-3.0.en.html


no.


> Why is it that there is no standard applied to those who publish code for distribution purposes?

What does "publish code for distribution purposes" mean? That sounds like all published code.

Does that mean that I can't put my own hobby code up in public? That's a sure-fire way to kill the community dead.

> You chose to do that, you chose to distribute it, you didn't have to do that.

I think the burden is more properly on the people who choose to download and use it, knowing what it is.


> but we should impose a moral standard in lieu of a legal one

I agree with you, but these moral obligations tend to get enshrined in law eventually (or quickly! See Covid)


They're gonna get enshrined into law eventually, one way or the other. If we do it ourselves and we're effective at limiting the damage we cause, we'll be able to maintain control over our own processes. If we don't, it will be taken out of our hands.


> If we don't, it will be taken out of our hands.

Which will take the code out of theirs. Disincentivize sharing, and sharing goes away.


> Disincentivize sharing, and sharing goes away.

And existing proprietary software companies will rejoice, as the barrier to entry in the software market will again be very high.


> It is absurd that anyone can anonymously provide open source code, with no assurances whatsoever, and that can end up in critical software.

While you're welcome to your position and your ideas on how to solve these problems, I believe that the logic you're applying punishes the provider and not the consumer. Nobody is forcing anyone to use OSS without auditing every damn line, if that's their requirement.

Telling people who share their hard work with strangers for free that they have an ethical responsibility to accept vague "obligations" - as defined by lobbyists and politicians - is not an idea with wings.


Nobody is telling them to share their work and distribute it to others as a package. As I said, I don't think there should be any restrictions on anyone to publish code.


I don't know what your background or interest in this issue is, but I'm glad that the overwhelming majority of people do not find this perspective to be reasonable or compelling.

In the meantime, if you don't like the MIT license, don't use software published under it.


Licensing is completely irrelevant to this discussion.


I don't entirely agree with you (nor do I totally disagree), but I wish this comment was not so heavily downvoted. This is a good perspective worth discussion. Importantly, this is also how a lot of people outside the industry will see it.


> Modern civil engineers went without any significant regulation, and then that changed

There is no analogy. The only reason other engineering disciplines are not adopting software practices is that those fields are not easy to iterate on. You build a bridge, and then maybe you get some funding to improve one part of it a decade later, because it is too expensive and cumbersome to do more.

When IoT, AI, nanomachines and 3D printing proliferate, you will see how that changes. Devices and buildings will be possible to iterate on, and they will have versions that get incremented as they are improved.

...

As for obligations, existing law already covers them. From GDPR to payments compliance, everything is there. And a lot of the best practices are invented and standardized by Open Source, actually.

...

What Open Source still lacks is the mindset to approach end users and consumers and get them on board. Open Source needs to take the route of 'no backwards-incompatible changes', and even 'add, never deprecate' (like the JSON project), along with the habit of hiding complexity from end users and making things easy.
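
A tiny illustration of what 'add, never deprecate' looks like in code (names invented): the old entry point keeps its frozen behavior forever, and new capability arrives beside it instead of replacing it.

    # "Add, never deprecate": never break existing callers.
    def parse(text):
        # v1: behavior frozen forever, warts and all.
        return text.split(",")

    def parse_stripped(text):
        # New capability added alongside v1, instead of changing v1.
        return [part.strip() for part in text.split(",")]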

Then we can create a truly Open Source world in which there will be infinite new possibilities.



