Hacker News | clipsy's comments

The government can mandate anything it wants; if insurance companies are allowed to perpetrate mass fraudulent denials, it simply doesn't matter.


Goodreads does.


> it’ll make murder for your cause viable and that’s a road to chaos

Murder for your cause is already viable and profitable; UHC used it to great effect by rejecting enormous numbers of legitimate claims.


Try flipping the question and asking yourself: given all of that, why did UHC have a false-denial rate more than double the industry average?

For example, perhaps the issue here is that UHC was undercutting competitors, knowing that the vast majority of their fraudulent denials would go unchallenged -- they still meet the requirement for minimum spending on payouts, but they attract more customers[0] and in so doing distort the health insurance market.

[0]: Customers in this case are primarily businesses buying policies for their employees, rather than end users of the product.


Surely we can do better -- I bet if we put our minds to it, we could drop to 66th by 2030!


Seriously: we’ve got RFK questioning vaccines, promises to roll back government overreach in areas like protections against toxic chemicals, and two fresh pandemics brewing. I bet we can get there by the end of 2025!


I heard they're going to put a vaccine skeptic in charge of the FDA. You'd just need a few more moves like that -- roll back whatever government-funded healthcare currently exists, plus a few other moves to encourage more unhealthy lifestyle habits -- and 2030 might actually be possible.

Meanwhile in Australia, we are sitting in 4th place with universal healthcare and a half-decent welfare system. There is a housing crisis at the moment, though, which is leaving more people homeless every year; if that keeps up we might fall below 4th place.


> Reusing symbols like +, *, or / to define operations that aren't the + or the / you're used to is pretty common in math. It's just notation.

Reusing symbols in a different context is pretty common; taking a symbol that is already broadly used in a specific way (in this case, `a/b` being defined for elements of a field as multiplying `a` by the multiplicative inverse of `b`) and quietly redefining it is poor form and, frankly, a disingenuous argument.


I am a professor of algebra at a research university. I make a point of teaching my students that `a/b` is NOT the same as multiplying `a` by the multiplicative inverse of `b`.

The standard example is that we have a well-defined and useful notion of division in the ring Z/nZ, for n any positive integer, even in cases where we "divide" by an element that has no multiplicative inverse. Easy example: take n=8; then you can "divide" 4+nZ by 2+nZ just fine (and in fact turn Z/nZ into a Euclidean ring), even though 2+nZ is not a unit, i.e. admits no multiplicative inverse.
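To make the example concrete, here's a quick brute-force check (in Go, since that's the language used elsewhere in this thread) that 4 is "divisible" by 2 in Z/8Z -- some c with 2c ≡ 4 (mod 8) exists -- even though 2 has no multiplicative inverse mod 8:

```go
package main

import "fmt"

func main() {
	const n = 8

	// Division works: there are quotients c with 2*c ≡ 4 (mod 8).
	for c := 0; c < n; c++ {
		if (2*c)%n == 4 {
			fmt.Printf("2 * %d ≡ 4 (mod %d)\n", c, n) // c = 2 and c = 6
		}
	}

	// ...even though 2 is not a unit: no c satisfies 2*c ≡ 1 (mod 8).
	for c := 0; c < n; c++ {
		if (2*c)%n == 1 {
			fmt.Printf("2 * %d ≡ 1 (mod %d)\n", c, n) // never reached
		}
	}
}
```

(Note the quotient isn't unique -- both 2 and 6 work -- which is part of why "division" here needs a more careful definition than multiplying by an inverse.)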


I think you will find that a large majority of people supporting both parties vote based on factors like desired policies and the current state of the nation, rather than (pardon my phrasing) trumped up personal drama.


The definition is that it is the additive identity for the field; e.g. x + a = x no matter what value x takes and what field you are considering. It must also be unique: suppose a and b are both additive identities for a field; then b + a = b and a + b = a, but commutativity gives us a + b = b + a, resulting in a = b.

The reason the additive identity cannot have a multiplicative inverse is likewise fairly straightforward: once again using `a` as our additive identity, we have y.(x+a) = y.x for all x, y in our field; distributing on the LHS gives y.x + y.a = y.x, and subtracting y.x from both sides gives y.a = 0 for all y in our field. In particular, if some b were a multiplicative inverse of a, we would have b.a = 1, contradicting b.a = 0 (the field axioms require 1 ≠ 0).

You would need to relax one or more of the field axioms to get a structure in which the additive identity can have a multiplicative inverse. I'm not aware of any algebraic structure of particular interest that allows one, but if you're interested in reading more on this sort of thing I'd recommend looking into rings, commutative rings, and division algebras.


> even in other languages a/b gets closer to it's actual value as a and b get bigger (the "limit", which is the basis of Algebra)

This is not generally true. 5/2 = 2, 50/20 = 2, 500/200 = 2, and so on no matter how big the numbers get.
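In Go, for instance, integer division truncates toward zero, so scaling both operands by the same factor never changes the result:

```go
package main

import "fmt"

func main() {
	// Integer division truncates; the quotient does not "converge"
	// toward 2.5 as the operands grow.
	fmt.Println(5 / 2)     // 2
	fmt.Println(50 / 20)   // 2
	fmt.Println(500 / 200) // 2
}
```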


Yes, I meant when the result gets bigger. You get the idea.


What's the output of this Go program, without going to the playground link?

  print(math.MinInt / -1)

https://go.dev/play/p/Vy1kj0dEsqP
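(Spoiler, if I'm reading the snippet right: `math.MinInt / -1` is an untyped constant expression, which the compiler evaluates exactly; on a 64-bit platform the result overflows `int`, so the program should be rejected at compile time rather than printing anything. Move the value into a typed variable and the run-time behavior is defined by the Go spec: dividing the most negative value by -1 wraps around instead of panicking. A minimal sketch of that variant:)

```go
package main

import "fmt"

func main() {
	// With a typed variable, the division happens at run time. Per the Go
	// spec, MinInt64 / -1 equals MinInt64 due to two's-complement overflow;
	// there is no panic.
	var x int64 = -9223372036854775808 // math.MinInt64
	fmt.Println(x / -1)                // prints -9223372036854775808
}
```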


> The alternative is to never change your view

No, I would say the alternative is to listen to and evaluate arguments and base your views on what you find convincing. The problem with the "I don't care until it happens to me" mindset is that its adherents often flatly refuse to consider arguments until those arguments affect them personally; this isn't open-mindedness but rather closed-mindedness paired with a deeply self-centered worldview.

