Very true, I was writing in terms of absolute value, not % (magnitude is where my day job is). My point still stands: it is complete nonsense that tolerance goes down.
They said it "should" go down, but that another comment saying the worst case is the same is "also correct".
I do not see any "complete nonsense" here. I suppose they should have used a different word from "tolerance" for the expected value, but that's pretty nitpicky!
I'm sorry, but it's incorrect as stated. Given the context provided, it's a false statement with no relation to reality.
Staying the same, as a percentage, is not "going down". If you add two things with error together, the absolute tolerances add. The relative tolerance (percentage) may stay the same, or even shrink if you mix in a better-tolerance part, but, as stated, it's incorrect.
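A quick sketch of that arithmetic, with hypothetical 1 kΩ parts:

```python
# Two hypothetical 1 kOhm resistors in series, each with 5% tolerance.
r1, r2 = 1000.0, 1000.0   # nominal ohms
tol = 0.05                # 5% relative tolerance

nominal = r1 + r2                          # 2000 ohms
worst = r1 * (1 + tol) + r2 * (1 + tol)    # both parts at the same extreme
abs_error = worst - nominal                # 100 ohms: absolute errors add
rel_error = abs_error / nominal            # 0.05: the percentage is unchanged
print(abs_error, rel_error)                # -> 100.0 0.05
```

Swap one part for a 1% resistor and the combined percentage drops to 3%, but it never improves past the better part's own tolerance.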
It's a common misunderstanding, and misapplication of statistics, as some of the other comments show. You can't use population statistics for low sample sizes with any meaning, which is why tolerance exists: the statistics are not useful, only the absolutes are, when selecting components in a deterministic application. In my career, I’ve seen this exact misunderstanding cause many millions of dollars in loss, in single production runs.
It only stays the same if you have the worst luck.
> You can't use population statistics for low sample sizes with any meaning
Yes you can. I can say a die roll should not be 2, but at the same time I had better not depend on that. Or more practically, I can make plans that depend on a dry day as long as I properly consider the chance of rain.
> In my career, I’ve seen this exact misunderstanding cause many millions of dollars in loss, in single production runs.
Sounds like they calculated the probabilities incorrectly. Especially because more precise electrical components are cheap. Pretending probability doesn't exist is one way to avoid that mistake, but it's not more correct like you seem to think.
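To make that concrete, here's a minimal Monte Carlo sketch, assuming (generously) independent parts spread uniformly within tolerance:

```python
import random

# Two independent 5% parts in series: how often is the combined relative
# error near the 5% worst case? Assumes a uniform spread within tolerance,
# which real binned parts may not have.
N = 100_000
tol = 0.05
near_worst = 0
for _ in range(N):
    e1 = random.uniform(-tol, tol)   # relative error of part 1
    e2 = random.uniform(-tol, tol)   # relative error of part 2
    combined = (e1 + e2) / 2         # relative error of the equal-value pair
    if abs(combined) > 0.045:
        near_worst += 1
print(near_worst / N)  # roughly 1%: the worst case is rare under this model
```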
I've repeatedly used a certain word in what I wrote, since it has incredible meaning in the manufacturing and engineering world, which is the context we're speaking within. It's a word that determines the feasibility of a design in mass production, and a metric for whether an engineer is competent: determinism. That is the goal of a good design.
> It only stays the same if you have the worst luck.
And you will get that "worst luck" thousands of times in production, so you must accommodate it. Worse still, as others have said, the distributions are not normal. Most of the << 5% devices are removed from the population and sold at a premium. There's a good chance your components will be close to +5% or -5%.
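A rough simulation of that binning effect, assuming a hypothetical normal factory spread:

```python
import random

# Hypothetical factory spread: normal, sigma = 2.5% relative error.
# Everything within 1% is pulled out and sold at a premium; the rest
# within 5% ships as the "5%" bin.
population = [random.gauss(0.0, 0.025) for _ in range(100_000)]
five_pct_bin = [e for e in population if 0.01 < abs(e) <= 0.05]

# The shipped bin is bimodal: nothing near nominal, everything pushed
# outward toward the +/-5% edges.
mean_abs = sum(abs(e) for e in five_pct_bin) / len(five_pct_bin)
print(f"mean |error| in the 5% bin: {mean_abs:.1%}")  # well above 1%
```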
> Yes you can. I can say a die roll should...
No you cannot. Not in the context we're discussing. If you make an intentional decision to rely on luck, you're intentionally deciding to burn some money by scrapping a certain percentage of your product. Which is why nobody makes that decision. It would be ridiculous, because you know the worst case and can accommodate it in your design. You don't build something beyond the failure point (population statistics), and you don't build something at the failure point (tolerance); you make the effect of the tolerance negligible in your design.
> Sounds like they calculated the probabilities incorrectly.
Or, you could look at it as a poorly engineered system that couldn't accommodate the components they selected, where changing the values of some same-priced peripheral components would have eliminated it completely.
Relying on luck for a device to operate is a compromise that is almost never made. If that is a concern, then there's IQC or early testing to filter out those parts/modules, to make sure the final device is working with a known tolerance that the design was intentionally built around.
Your perspective is very foreign to the engineering/manufacturing world, where determinism is the goal, since non-determinism is so expensive.
> If you make an intentional decision to rely on luck, you're intentionally deciding to burn some money by scrapping a certain percentage of your product. Which is why nobody makes that decision.
Now this is complete nonsense. Lots of production processes do that. It depends on the cost of better tooling and components, and the cost of testing.
And... the actual probabilities! You're right that you can't assume a normal distribution. But that wouldn't matter if this were such a strict rule, because normal distributions would be forbidden too.
Determinism is a good goal but it's not an absolute goal and it's not mandatory. You are exaggerating its importance when you declare any other analysis as "complete nonsense".
> since non-determinism is so expensive.
But your post gives off some pretty strong implications that you need a 0% defect rate, and that's not realistic either. There's a point where decreasing defects costs more than filtering them. This is true for anything, including resistors. It's just that high-quality resistors happen to be very cheap.
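A back-of-envelope version of that tradeoff, with entirely made-up numbers:

```python
# Per-unit cost comparison: cheap wide-tolerance parts plus some scrap,
# versus tighter parts with ~zero tolerance-related failures.
# All figures are hypothetical.
units = 100_000
cheap_part = 0.01      # USD per part
tight_part = 0.03      # USD per part
scrap_cost = 2.00      # cost of an assembled unit written off
p_fail = 0.004         # tolerance-related failure rate with cheap parts

cost_cheap = units * (cheap_part + p_fail * scrap_cost)  # 1800.0
cost_tight = units * tight_part                          # 3000.0
print("cheap wins" if cost_cheap < cost_tight else "tight wins")
```

Flip the scrap cost or failure rate and the answer flips with it, which is the point: it's an economic decision, not a rule.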
Please remain within the context we're speaking in: final design, not components. When manufacturing a component, like a resistor or chip, you almost always do have a normal distribution. You're making things with sand, metal, etc. Some bits of crystal will have defects, maybe you ended up with the 0.01% in your 99.99%-purity source materials, and so on. You test and bin those components so they fall within certain tolerances, so the customer sees a deterministic component. You control the distribution the customer sees as much as possible.
Someone selecting components for their design will use the tolerance of the component as the parameter of that design. You DO NOT intentionally choose a part with a tolerance wider than your design can accommodate. As I said, if you can't source a component within the tolerance you need, you force that tolerance through IQC, so that your final design is guaranteed to work, because it's always cheaper to test a component than to test something that you paid to assemble with bad parts. You design based on a tolerance, not a distribution.
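A minimal sketch of that approach, checking every tolerance corner of a (hypothetical) voltage divider against a spec window:

```python
from itertools import product

# Hypothetical 5 V divider built from 5% parts, with a required output window.
vin = 5.0
r1_nom, r2_nom, tol = 10_000.0, 10_000.0, 0.05
spec_lo, spec_hi = 2.3, 2.7

# Evaluate every tolerance corner; if all corners pass, the design works
# for ANY in-tolerance parts -- no statistics required.
outputs = []
for s1, s2 in product((-1, 1), repeat=2):
    r1 = r1_nom * (1 + s1 * tol)
    r2 = r2_nom * (1 + s2 * tol)
    outputs.append(vin * r2 / (r1 + r2))

print(min(outputs), max(outputs))  # 2.375 ... 2.625
print("OK" if spec_lo <= min(outputs) and max(outputs) <= spec_hi else "fails")
```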
> Determinism is a good goal but it's not an absolute goal and it's not mandatory.
As I said, choosing not to be deterministic, by choosing a tolerance your design can't accommodate, is rare, because it's baking malfunction and waste into the design. That is sometimes done (as I said), but it's very rare, and absofuckinglutely never done with resistor tolerance selection.
> But your post gives off some pretty strong implications that you need a 0% defect rate, and that's not realistic either.
> There's a point where decreasing defects costs more than filtering them.
No, defects are not intentional, by definition. There will always be defects. A tolerance is something you can rely on when choosing a component, because you are guaranteed to only get loss from actual defects, a defect being a bad component outside the tolerance. If you make a design that can't accommodate a tolerance you intentionally chose, that's not a defect; it's part of the design.
0% defect has nothing to do with what I'm saying. I'm saying intentionally choosing tolerances that your design can't accommodate is very very rare, and almost always followed by IQC to bin the parts, to make sure the tolerance remains within the operating parameters of the design.
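As a sketch, that IQC step is essentially just this (hypothetical lot of nominally 1 kΩ, 5% parts screened down to the 2% the design needs):

```python
def iqc_filter(measured, nominal, tol):
    """Keep only parts measured within +/-tol of nominal."""
    lo, hi = nominal * (1 - tol), nominal * (1 + tol)
    return [v for v in measured if lo <= v <= hi]

# Hypothetical incoming lot of 1 kOhm, 5% parts; the design needs 2%.
incoming = [980.0, 1012.0, 1049.0, 951.0, 1003.0, 1038.0]
usable = iqc_filter(incoming, 1000.0, 0.02)
print(usable)                       # [980.0, 1012.0, 1003.0]
print(len(usable) / len(incoming))  # yield after screening: 0.5
```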
I feel like this has led us in circles. I suggest re-reading the thread.
Maybe you could give an example where you see this being intentionally done (besides Wish.com merchandise, where I would question if it's intentional).