
They didn't actually get those damages.

To quote wikipedia:

> In addition, they awarded her $2.7 million in punitive damages. [...] The judge reduced punitive damages to $480,000, three times the compensatory amount, for a total of $640,000. The decision was appealed by both McDonald's and Liebeck in December 1994, but the parties settled out of court for an undisclosed amount.


You can find a reference by looking for "generic trademark"


There's the SKEDD from Würth Elektronik (WE); no experience with it yet, but it's on my to-do list.

https://www.we-online.com/en/components/products/REDFIT_IDC_...


A clever zero-cost alternative I've seen some manufacturers use is a 1xN PTH footprint with every other hole offset slightly (~20 mil) to the side. The misalignment is just enough to make a straight pin header get stuck in the holes.

There's an example of this on Digilent's Arty development board, at the top of the board between JB and JC:

https://digilent.com/shop/arty-a7-100t-artix-7-fpga-developm...


That's very cool. Thanks. At first I thought they were just press-fit pins (see https://www.te.com/content/dam/te-com/documents/automotive/g...) but they are not!

More here: https://www.we-online.com/components/media/o210254v410%20ANE...

Very nice alternative to Tag-Connects. It looks like the PCB footprint is a little larger per pin, but the connector is less fragile and the connection would be more robust. Great for having a debugger or logger attached during lifecycle testing, especially in the presence of vibration. I also find their other suggested use - enabling expansion connections with no added cost to the base unit - very intriguing. "Options" typically carry higher margins than base products, so a way to add them without putting a connector on every unit has a lot of potential use cases.


This is exactly what I was looking for!

(In case anyone missed it, there is a video on the page that explains how it works)


You might be interested in the compiler option -fsanitize=undefined. I think it works for both GCC and Clang. I don't think it catches all undefined behavior, but it catches some.
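
For illustration, here's a minimal sketch of the kind of bug it reports (the file name and build commands are assumptions about a typical setup):

    // ubsan_demo.cpp - a minimal sketch of undefined behavior that UBSan reports
    // Assumed builds: g++ -fsanitize=undefined ubsan_demo.cpp
    //             or: clang++ -fsanitize=undefined ubsan_demo.cpp
    #include <climits>
    #include <cstdio>

    int main() {
        int x = INT_MAX;
        int y = x + 1;  // signed integer overflow: undefined behavior,
                        // reported at runtime by -fsanitize=undefined
        std::printf("%d\n", y);
    }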


On libstdc++ std::unreachable() also reliably crashes if you define either _GLIBCXX_DEBUG or _GLIBCXX_ASSERTIONS. libc++ should have a similar macro. I expect MS STL to also reliably crash on debug builds here, as it's quite heavy on debug assertions in the standard library anyway by default (and debug and release builds are explicitly not ABI compatible there).
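
A small sketch of what that looks like in practice, assuming a libstdc++ toolchain with C++23 support (the file name and build line are illustrative):

    // unreachable_demo.cpp - illustrating the libstdc++ behavior described above
    // Assumed build: g++ -std=c++23 -D_GLIBCXX_ASSERTIONS unreachable_demo.cpp
    #include <utility>

    int classify(int v) {
        switch (v) {
            case 0: return 10;
            case 1: return 20;
            default: std::unreachable();  // with _GLIBCXX_ASSERTIONS this traps
                                          // reliably instead of being plain UB
        }
    }

    int main() {
        return classify(2);  // expected to crash deterministically here
    }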


I had an account to follow two or three guys, plus maybe 4-5 accounts that only post every few months. The timeline gets filled with all kinds of things: besides the horrible clickbait ads, it inserts random posts from accounts I don't follow. You really have to dig through the crap to find the stuff you're subscribed to.

If this is Twitter in the age of AI, then Twitter in the age of AI isn't worth the effort to me.

Especially since I can use Mastodon, and that is actually very nice. I get that not all interests are as well represented, but I like it.


I love the way Hunter S. Thompson described it. I'll trim it a bit for brevity, but the whole book (Fear and Loathing in Las Vegas) is worth reading.

San Francisco in the middle sixties was a very special time and place to be a part of. Maybe it meant something. Maybe not, in the long run . . . but no explanation, no mix of words or music or memories can touch that sense of knowing that you were there and alive in that corner of time and the world.

...

So now, less than five years later, you can go up on a steep hill in Las Vegas and look West, and with the right kind of eyes you can almost see the high-water mark—that place where the wave finally broke and rolled back.


Another relevant quote from Fear and Loathing that was posted on HN a little over a week ago by isoprophlex:

... a generation of permanent cripples, failed seekers, who never understood the essential mystic fallacy of the Acid Culture: the desperate assumption that somebody - or at least some force - is tending the light at the end of the tunnel.


This is such a great, succinct summation of my feelings on “psychedelic culture” as someone who has actually participated. It drives me crazy when people act like psychedelics open a window into deep spiritual truths and that taking them somehow makes you more enlightened.

This culture is still pervasive - probably accounting for the majority of people who actively argue for decriminalization and research - and it makes for a bad spokesman. It reminds me a lot of the people who think Cannabis can’t be addictive, cures cancer, and has no negative effects. In some ways, it fetishizes and ascribes magical properties to drugs just as much as the “drugs are bad with zero exceptions” crowd does. Except in many ways it’s worse, because it downplays risks and pressures people to partake. Normal people notice it even if they can’t articulate it, and it does a lot to discredit reasonable people.


Could not agree more. I have tripped countless times. But in the late 80s and early 90s. No hippies to be found. I did have some interesting revelations but nothing I wouldn't have come to via other means of conversation or introspection.

For me, and most people I was doing this with, it was just an affordable, long lasting drug. Sure, I had read various books about opening the doors of perception but at some point you realize it's just more "you" that you are finding.

I do think psychedelics can be used therapeutically in the right conditions, but I'm not convinced ad hoc trips will get you there unless you are already skilled at introspective techniques.

The one thing I'll say I agree with is that it does change your perspective and can give you a break from your ego. Seeing how nice it is to not be a slave to all the negativity, dwelling in the past, and obsessing over the future is very freeing, and it taught me that the pursuit of meditation was worth doing.

Ultimately, I'd say meditation can give you the benefits that people associate with psychedelics if you are consistent in the practice.


> it's just more "you" that you are finding

surely "just" more you still offers "deep spiritual truths", and surely finding them makes you more enlightened.

> I'd say meditation can give you the benefits that people associate with psychedelics if you are consistent in the practice.

I don't see how this in any way devalues the psychedelics.


Enlightenment is not something you have more or less of. :)

If you have experienced what people often refer to as enlightenment, you'll know the feeling fades. It's like trying to hold onto water. Knowing it is there can change the way you think about the world, sure, but chasing it is the ultimate ego trip. It's probably better to be open to it than to run around looking for it.

And I have to tell you, listening to Slayer on the second day of an acid binge in some dingy basement apartment while you try to figure out who you know that has some weed is not the same as hanging around in an alpine meadow with the cast of Hair. It's just not.

I never meant to devalue psychedelics; I was agreeing with the removal of that magical cachet they seem to enjoy. For a lot of people they are simply affordable party drugs that let you drink your face off and have a good time for 8 hours or whatever. Their clinical use is very different, and I love that it is being explored in a rigorous manner.

The psychedelic drugs themselves do not operate in a vacuum. They are neither good nor bad. I don't mean to devalue their therapeutic use but if anything they are a shortcut to give you a taste of what is possible with your existing hardware.

For an adult trying to lead a responsible life, regular meditation (or walking, playing an instrument, etc) is probably better than self-dosing after work on a Friday. For someone struggling with PTSD, I would imagine a dose in the right environment alleviates that burden in a way most other things cannot. Seeing that it is possible to not feel like you have been feeling could give you the motivation to try to hold on to that and find other ways to experience it. That's incredibly valuable.


> surely "just" more you still offers "deep spiritual truths", and surely finding them makes you more enlightened.

Or maybe it just gets "you" crawling further up "your" own esoteric butt?


> Sure, I had read various books about opening the doors of perception but at some point you realize it's just more "you" that you are finding.

But that's what opening the doors of perception means. Of course you're "only" discovering things about yourself, but that is still valuable and I think that is what a lot of people find 'mystical' about these drugs.

I agree that shamanistic woo-woo doesn't do any good though.


Do you know if `pip3 install --user module` still works?


openSUSE Tumbleweed made the same change, and it doesn't work there. (I use it through WSL, if that changes anything.)


I also don't understand why the backups are done the way they are. Why not just copy the current file to the backup, and not break hardlinks?


Writing to a temp file and then renaming it to the real location is a long-held tradition to avoid partial writes and filesystem corruption if power is lost.
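
In case it helps, here's a rough sketch of that pattern with POSIX calls (the function name and temp-name scheme are made up for illustration):

    // A sketch of the write-temp-then-rename pattern, not any editor's actual code
    #include <cstdio>
    #include <fcntl.h>
    #include <string>
    #include <unistd.h>

    bool atomic_save(const std::string& path, const std::string& data) {
        std::string tmp = path + ".tmp";  // hypothetical temp-name scheme
        int fd = ::open(tmp.c_str(), O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0) return false;
        bool ok = ::write(fd, data.data(), data.size())
                      == static_cast<ssize_t>(data.size());
        ok = ::fsync(fd) == 0 && ok;  // make sure the bytes actually hit the disk
        ::close(fd);
        // rename(2) atomically replaces `path`: readers see the old file or the
        // new one, never a partially written mix.
        return ok && std::rename(tmp.c_str(), path.c_str()) == 0;
    }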


Yes. This.

If you copy from the unsaved file, you risk losing both the file you are backing up and the backup.


I agree, `backup-by-copying` should be t by default.


There are good reasons for the Emacs default that relate to atomic operations on typical file systems, reliability on shady hardware, and power losses and recovery. The author's argument about hard links breaking is a matter of choice; some people want these copy-on-write semantics, others call them broken. Most people would want to avoid hard links to editable files anyways.


Can you explain some of these reasons? I feel like there is an opportunity for me to learn something here.

I don't know enough about filesystems to be sure, but I feel like there are more opportunities for stuff to fail with the move-and-copy technique. E.g. when moving the file it could end up existing twice (if creating the new link happens before removing the old one) or being lost (if removing the old link happens before creating the new one).

If you just copy the file, the copying can fail and you don't have a backup. But if the original is still fine then you never notice a problem with the backup, so this option is preferable (as I understand it).

> Most people would want to avoid hard links to editable files anyways.

Why would you want to avoid this? I thought it was something that Linux could easily handle.


Here is a breakdown of some of your questions:

1. Atomicity of Rename Operations: On many file systems, the process of renaming a file is atomic. This means that the operation either completes in full or doesn’t take effect at all. This makes it safer in cases where there might be interruptions, such as power losses. If the rename (which creates the backup in Emacs’s default behavior) is interrupted, you’re left with the original file intact.

2. Concern about Move-and-Copy Technique: Your point about the possibility of the file ending up existing twice or being lost is valid in theory. However, in practice, the renaming operation ensures that such intermediate states are avoided. A rename isn’t quite the same as creating a new link before removing an old one. Instead, it’s a reassignment of the file’s metadata, which is generally a reliable operation.

3. Drawbacks of Just Copying: While just copying the file might seem simpler, it can have issues. If there’s an interruption while writing the new copy, you can end up with a corrupted backup. With Emacs’s approach, since the original is renamed (and thus preserved in its entirety), you’re always assured of having at least one uncorrupted version.

4. Avoiding Hard Links to Editable Files: As for the avoidance of hard links for editable files, there are a few reasons:

Ambiguity: Editing a file that’s hard-linked elsewhere can lead to confusion since changes reflect in all linked locations. This can be unexpected for those unaware of the link.

Data Integrity: If there’s corruption in one location, it affects all hard-linked locations.

Backup Issues: Some backup systems might not handle hard links as expected, leading to either duplicate data or missed backups.

Linux does handle hard links well, but their usage needs careful consideration, especially when editing is involved. They’re great for static data that doesn’t change but can be problematic for editable files.

I hope this clarifies things!


3. is unclear: a copy may result in a corrupted backup, but the original remains uncorrupted, so you still have one uncorrupted version, just as if you had renamed the file but couldn't restore the original?

4. That's an argument for raising awareness via better tools, not for avoiding the useful links


> 3. is unclear: a copy may result in a corrupted backup, but the original remains uncorrupted, so you still have one uncorrupted version…

Think about what happens when your power comes back and you start editing that file again. Which is the correct version of the file? How can you tell? Suppose the backup file is newer than the original file, and 99% of the size. Is it a partial copy, or did the user delete some lines and then save?

Now consider what Emacs does by default when `backup-by-copying` is nil and the user asks Emacs to save a file:

    1. Emacs deletes the existing backup file (atomic),
    2. renames the existing file so that it becomes the backup (atomic),
    3. writes the buffer content into a new file with a temporary name (not atomic),
    4. calls fsync(2) to ensure that all written data has actually hit the disk¹,
    5. finally renames the temporary file so that it has the user’s desired filename (atomic).
If the power goes off anywhere in the middle of that process, then no corruption will occur. The state on disk will be easy to interpret: either the backup file is missing but the original exists untouched, or there is a backup file but no original, or a backup file plus a temporary file that might be incomplete, or a backup file plus the new file.

In all of these cases the editor can recover automatically without ever losing anything that was already saved on disk. Sure, in most of those cases we lose the _new_ data that wasn’t yet saved, but after all the power did go out in the middle of trying to save that very data. We’re not magicians here; this is the best we can do.
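
For concreteness, here's a rough sketch of those five steps with POSIX calls; the names are illustrative and this is not Emacs’s actual implementation:

    // A sketch of the backup-by-renaming save sequence described above
    #include <cstdio>
    #include <fcntl.h>
    #include <string>
    #include <unistd.h>

    bool save_with_backup(const std::string& path, const std::string& buffer) {
        std::string backup = path + "~";      // Emacs-style backup name
        std::string tmp    = path + ".#tmp";  // hypothetical temporary name
        std::remove(backup.c_str());                // 1. delete old backup (atomic)
        std::rename(path.c_str(), backup.c_str());  // 2. original becomes backup (atomic)
        int fd = ::open(tmp.c_str(), O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0) return false;
        bool ok = ::write(fd, buffer.data(), buffer.size())
                      == static_cast<ssize_t>(buffer.size());  // 3. not atomic
        ok = ::fsync(fd) == 0 && ok;                // 4. force the data to disk
        ::close(fd);
        return ok && std::rename(tmp.c_str(), path.c_str()) == 0;  // 5. atomic
    }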

Of course, in practice the power doesn’t go out all that often and users habitually save the document after every few words, right? So backing up by copying is safe enough that most users who prefer it never lose any data. Probably. At least as far as they know; sometimes this type of corruption goes unnoticed, or gets chalked up to other factors such as inebriation or forgetfulness.

¹ We’ll ignore the fact that some operating systems have an fsync(2) that lies. See also <https://danluu.com/file-consistency/>.


> How can you tell

By observing the status of a flag "wasBackupSuccessful", which wouldn't be set to true if there were a power loss after the file was copied. (This flag could be set via an atomic rename operation on the copied backup file, so that you could tell from the observable file system behavior whether the copy succeeded.)

But it is now clear what you meant, thanks for the explanation!


Thanks!


It means that there are bug fixes all the time, but most of the time no one sorts them into "security" and "non-security" categories.

I remember a message (I can't find it right now) where this is explained. Basically, the thinking is that a lot of bugs can be used to break security, but sometimes it takes a lot of effort to figure out how to exploit a bug.

So you have some choices:

* Research every bug to find out the security implications, which is additional work on top of fixing the bug.

* Mark only the bugs that have known security implications as security fixes, basically guaranteeing that you will miss some that you haven't researched.

* Consider all bugs as potentially having security implications. This is basically what they do now.


Someone releasing their back catalogue of already-written books on Amazon for the first time.


Then there could be a special process for those cases, which are probably rather rare compared to what is essentially the ebook equivalent of spam (nearly free to make, so on volume alone you can make money even from inadvertent clicks or things people buy while barely paying attention).


There could, and there should, be a process for that. Amazon just doesn't want to spend the money to implement one unless they are forced to do so.


Are we talking about the same thing? How does a latency of a few days affect the author in that scenario? It's books, not HFT. It should be fine to limit much further than 3 per day.

