Actually, it is a metaphor for formulating a brand new branch of mathematics that fixes the identity principle and all the problems with the square root of two. But also, it is not a metaphor because show me any physical system where an action times an action does not equal a reaction.

It's actually super easy to form a "brand new branch of mathematics". Just start with some definitions and run with them. Although you'll almost certainly end up with something inconsistent. And if you don't, it'll almost certainly be not useful. And if it is useful, it'll almost certainly turn out to be the exact same math just wearing a costume.

There are no problems with the square root of two.

> show me any physical system where an action times an action does not equal a reaction.

Show me any gazzbok where a thrushbloom minus a grimblegork does not equal a fistelblush. Haha, you can't do it, can you!? I WIN!

That is to say: you're using silly made up definitions of "action" and "times" here.


> That is to say: you're using silly made up definitions of "action" and "times" here.

I believe they’re quoting Howard’s Rogan interview, fwiw


> show me any physical system where an action times an action does not equal a reaction

Not quite sure what an action times an action is, but how about rotating a 2D shape 180 degrees? Do that twice and it's the same as not rotating it at all.
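
To make that concrete, here is a minimal numpy sketch (the sample point is arbitrary): a 180-degree rotation negates both coordinates, i.e. it acts like multiplication by -1, so applying it twice gives back the identity.

    import numpy as np

    def rotation(theta):
        """Standard 2D rotation matrix for angle theta (radians)."""
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    R = rotation(np.pi)              # a 180-degree rotation
    p = np.array([3.0, 4.0])         # arbitrary sample point

    print(R @ p)                     # ~[-3, -4]: both coordinates negated
    print(R @ R @ p)                 # ~[ 3,  4]: rotated twice, back where it started
    print(np.allclose(R @ R, np.eye(2)))  # True: R squared is the identity

So composing the rotation with itself is the geometric version of (-1) x (-1) = 1.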


You mean two reactions. Otherwise 1x1 would be 1

Are you saying you actually buy into the Terrence Howard school of mathematics? For serious?

The re-releases are not meant for people watching the movie for the first time. It is assumed that the audience coming to these one-time events already consists of fans of the movie.


Fans of a movie will often bring people who haven't seen the movie. So sure, 80+% of the audience may have seen it, but that's not everyone.


I love the fact that the ruling came from a court in a location called Sapporo (Sapphic).


UPI payments used to be a little hit-or-miss a few years ago, but they have been almost flawless for me over the last year. Apps have matured and now show a warning if the recipient's or your own bank's network is down, rather than initiating a transaction that might never complete. The only roadblock now is figuring out a way to handle high-volume days such as New Year's Eve, when the network is overwhelmed at night.
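
To illustrate the "warn instead of initiate" behavior, here is a rough sketch of the pattern. The endpoint, parameter names, and statuses below are hypothetical, invented purely for illustration; they are not any real UPI app's or NPCI's API.

    import requests

    # Hypothetical health endpoint; real apps query their PSP/NPCI-side status feeds.
    HEALTH_URL = "https://api.example-psp.example/v1/bank-health"

    def initiate_upi_payment(payer_bank: str, payee_bank: str, amount: float) -> str:
        """Check both banks' switch status first, and warn up front instead of
        firing a transaction that might hang in a pending state forever."""
        health = requests.get(
            HEALTH_URL, params={"banks": f"{payer_bank},{payee_bank}"}, timeout=3
        ).json()
        down = [b for b in (payer_bank, payee_bank) if health.get(b) != "UP"]
        if down:
            return f"warning: {', '.join(down)} unavailable right now, try again later"
        # ...proceed with the actual collect/pay request here...
        return "initiated"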


Yes, the latter


I hope Microsoft quickly finds a solution to move to RISC. I don't care about compatibility with older software. I want the battery life without having to move to MacOS and use its awful window management.


If window management is the only thing holding you back, look into the third-party UI helper app scene. There have been lots of them; Rectangle, for example. https://rectangleapp.com/


If by RISC you mean RISC-V, I think it will be quite some time away.

I suspect that most RISC-V designs that are supposed to compete with ARM application processors don't actually have lower power consumption.

...yet.


If you mean SiFive's designs, they always target (and beat) some ARM core in PPA (power, performance, area).


Remove 11 and install Windows 10 IoT Enterprise LTSC until the equivalent for 11 drops.

Make Windows shut up for good with Tinywall (https://tinywall.pados.hu/) and whitelist applications as needed.


What is so bad about MacOS window management?

I haven't used Windows for a while, but back in the day, it did not provide any extra capabilities in this regard over MacOS. Now I use an extension to tile the desktop with windows, and it works great.


Windows does have some more built-in window tools these days -- it lets you snap windows into various parts of the screen by dragging them up against the edges. You can replicate that on macOS with apps like Rectangle.

More controversial is the different behavior of cmd-tab between Mac and Windows (app-level vs window-level). Once you learn that cmd-` also exists for switching between windows inside an app, I find it's a matter of personal preference and you can make arguments either way about which one is "better"...


> Once you learn that cmd-` also exists for switching between windows inside an app, I find it's a matter of personal preference and you can make arguments either way about which one is "better"...

Here is an example of how the app-centric model of task switching is problematic beyond personal preference: it causes focus stealing when opening GUI windows from the CLI. The apps themselves have to work around the window manager's broken behavior, and only partially, because that's the best they can do.

The final answer from the initiated seems to be 'you're holding it wrong'. So here we have a Certified Unix™ where something as banal as setting your $EDITOR to a GUI program is broken by default, and even when 'fixed', causes your window order to be shuffled around.

Examples:

https://emacs.stackexchange.com/questions/24033/when-returni...

https://github.com/sublimehq/sublime_text/issues/4460

https://www.reddit.com/r/osx/comments/i5zqn5/focus_doesnt_re...

Using macOS is a barrage of papercuts like this for me, and the window manager is at the heart of many of them.


macOS window management is "application centric", while Windows and most Linux desktops are "window centric". The Mac has been my primary platform for a decade now, but I still find that particular aspect quite awkward; old-time Mac users seem to like it, though (it's not much of an issue for me since I don't use many UI apps anyway besides iTerm2, Chrome and VSCode).


> What is so bad about MacOS window management?

MacOS is so intrusive and distracting that I can't imagine doing any meaningful work on it; it feels like a toy OS.


Any specific example of this?


I highly doubt they ever will. Their insistence on baking legacy support into modern Windows is their biggest downfall. Because of that, every installation of Windows will always need to be backwards compatible and carry that burden.


They've supported ARM since Windows 10, and there are multiple Surface models that have an ARM chip.


Supported is probably the wrong term. Tolerated might be better. Whilst Windows runs on it, there's very little there beyond the usual Microsoft standard of "yep, we did it, what's next" and then ignoring the platform.

Making it work on ARM is one thing; making it work well and supporting the platform are a far more important part of the puzzle.


I think this could be handled more elegantly, as Apple has been able to make x86 compatibility better and more performant than quite a bit of actual x86 silicon… For MS, the backward compatibility of their software stack is rather important for enterprise customers.

Honestly tho, I find modern Windows infuriating to use without stripping it down quite a bit. Tiny11 is a lifesaver.


> I think this could be handled more elegantly, as Apple has been able to make x86 compatibility better and more performant than quite a bit of actual x86 silicon.

I wouldn’t count on this being the case going forward. Let’s not forget that some 4 years ago Apple decided to completely drop support for 32-bit apps. I wouldn’t be surprised if in 5-8 years time they decide to drop support for anything x86.

Apple hates supporting legacy crap. The moment the vast majority of actively updated Mac apps have ARM builds, Apple will just set a cutoff release a few years in the future that kills native x86 support, and that will be the end of it.


> I wouldn’t be surprised if in 5-8 years time they decide to drop support for anything x86.

I'd be pretty shocked if they didn't.


Which is odd, because I feel like they run a VM for older compatibility anyway?


Most software innovations seem to be made to make the developer's life easier rather than to create more performant software. I'm not complaining, but user-focused software innovations like LLMs do seem few and far between compared to hardware.


I wouldn't consider LLMs a software innovation.


I feel like if you're gonna drop a take like that you should have at least one more sentence of justification to follow it up with.


I’m not sure I agree, but I’ll do my best to argue the position.

The neural net algorithms have been around for decades. What made LLMs feasible are the hardware advances that achieved unprecedented scale, just barely providing the ability (at great cost) to train today's models. The transformer architecture might be called a software innovation, but RWKV Raven gets similar performance to transformers and is built on decades-old RNN technology. So it is the hardware that was far more instrumental than the software in achieving LLMs.

Counter to that argument: had Google not done neural net research for Google Translate and proved in the "Attention Is All You Need" paper that the transformer approach scaled and performed well, people wouldn't have spent the money to train foundation models and we would not be having this discussion; so the software really mattered more than the hardware.

In reality I think it’s a little bit of both.
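
For reference, the transformer piece of that story is strikingly small as software goes. Below is a minimal numpy sketch of scaled dot-product attention, the core operation from the "Attention Is All You Need" paper (batching, masking, and the multi-head projections are omitted); it also shows why the approach maps onto parallel hardware so much better than a step-by-step RNN.

    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for stability
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        """Scaled dot-product attention. Every position attends to every
        other position in a couple of matrix multiplies, so the whole
        sequence is processed in parallel, unlike an RNN stepping token
        by token."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)   # (seq, seq) pairwise similarities
        return softmax(scores) @ V        # weighted mix of the value vectors

    # Toy example: a sequence of 4 tokens with 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    print(attention(x, x, x).shape)       # (4, 8): self-attention output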


That's not necessarily true. The algorithm (transformers) was published by Google in 2017.

Even though they missed the opportunity, GPT-1 was released in 2018 by OpenAI.

Then they incrementally added parameters in the next versions.


Great take. It is really a mix of several factors, each one leveraging the other, and your arguments lay that out well.


I don’t think I totally agree either but your argument for the position definitely has some merit.


I kind of admire the gumption, personally. Like writing them off as fancy Markov bots.


Johnson & Johnson tried to spin up a subsidiary and shift onto it all the blame for including asbestos in talcum powder for decades. Thankfully, the courts saw through the move and made them pay billions of dollars too.


J&J ended up paying $9 billion, but they were still allowed to spin off a subsidiary to hold that money and essentially take on the liability for future lawsuits; it's unclear what happens when the money runs out and there are more claims.


What if this subsidiary has $1B in assets? Do they go bankrupt and not have to pay the rest of the $8B?


Would recommend the Matt Levine write-up: https://archive.is/44wfu

In the J&J case, the subsidiary had the right to draw at least ~$60B from the parent in order to pay off future lawsuits if its own assets ran out, so there was never any real risk that it would leave claimants unpaid. The move into bankruptcy court was a way to arbitrate and organize the lawsuits, and it was overturned because, given the right to draw money from the J&J parent company, the subsidiary wasn't actually at risk of bankruptcy.



I'm using ChatGPT to build a note-taking app just for myself. I haven't completed it yet but plan to open-source it once it is done.


What does AI add to a note-taking app?


No, I'm a game developer. The app I'm making is a web app, which I don't have much experience with. I'm using ChatGPT as a tool to cut down coding time, and it is working great so far. I should have worded my comment better.


I may be completely wrong, but having a lot of money, especially if you earned it by conventional definitions, might intrinsically motivate you to take better care of your health so you can live longer and enjoy your achievements. Kinda like an article I read that said poor people have unhealthy diets in part because the temporary relief from daily stresses offered by high-sugar, fatty foods can be more valuable to them than saving up money and health for a life that is pretty much going nowhere.

EDIT: It was a passage from a book called Hired, by James Bloodworth.

