i want to do a data dump here. did you ever wonder how it's possible that you can experience vivid, real-time, non-scripted (so to speak) events that are totally lifelike? how is it possible to do that without sensory input generating the images and sounds? i thought about this a lot and there is only one answer that fits: the human mind generates a model of the world and everything in it, and your experience of the world is actually happening inside of this model. when you are awake, your mind looks at the world through sensory organs and updates the model accordingly, which gives the illusion that you are directly experiencing the world. but in reality there is a middle man. this is why so many aspects of the dream world kind of half-work, like clock hands changing position. your mind isn't designed to operate without having information fed to it, so it can only predict so much, and you get a kind of semi-logical state of the world within this internal model. most of what you experience is not as real as you think, although the mind's model is designed to track and predict reality as well as possible.
but there's more to the story. a few years ago there were reports of people, russians mostly, using a drug called ISRIB. it's a drug that impacts the latency of cells, turning back on cells that have gone latent due to injury... or for any other reason, reasons we may not be aware of. this drug is basically used in labs on cell cultures and is very new. people aren't supposed to take it. but some people had it synthesized and of course they tried it anyway. most of them reported extremely life-like dreams. one person reported having waking dreams that were so real that, even while totally and completely lucid, she was not able to tell that she was dreaming. she was pretty shaken up by it based on the chats. i think she described it as experiencing total insanity. but this tracks. when you are asleep, your model is not firing on all cylinders. part of the reason why the dream world is "unrealistic" is because your brain is not working as hard as it could to fill in the model in a coherent way. think about the experiences that schizophrenics have, seeing in full fidelity people and things that are totally nonexistent. all of these considerations together point to the fact that we inhabit models of the world rather than experiencing sensory stimuli directly.
wow, thank you so much for linking this. the journey to this conclusion began simply because i wanted to explain how my lucid dreams could be so realistic. i also came to a satisfying conclusion on the question of free will, many years ago. there are many things that inform how i see and operate in the world, but these two tools most of all i cannot imagine living without. the one thing in my life that i still don't have any kind of satisfying answer for is pain. what is pain? do you know if someone cracked it?
not really. it's stalking. it poses a direct threat to someone's safety. and people aren't even supposed to have that information under the current rules... that's why it's only possible to get his jet's location by crowd-sourcing it. anonymity for high-profile passengers is a legitimate concern, enough for the FAA to bake it into how these flights are tracked. soon this loophole will be closed, and then what will you point to? how could you seriously compare this to being banned for saying something that is politically incorrect?
Even if true, which I'm skeptical of, social media sites don't have to be permanent. It's no different from any other service being enshittified: just use it until it sucks and then stop.
I'm thinking that it's a cynical way to put all the data on an open network rather than a gated and paywalled one. As far as I can tell, Bluesky is still entirely funded by Twitter and that will eventually be cut off. They're betting on the Google/Mozilla type relationship to be maintained for now. They need to make the indexing service easier to replicate before that happens or the network will collapse when funds dry up.
it's so frustrating that the only reason i am able to post this is because of X... because searching for this guy's name or "project veritas nudge" does not produce the result that it obviously should anywhere except on X. this is the tactic that is so often used by people like you: state something that is factually correct but completely misleading when the full context is taken into account. even if this were an actual civil case brought in the normal way, it would still be the undeniable truth that one billion is silly and that this is political.
if anyone knows the person who maintains that site, or if that person reads this: the site would be massively improved if the ISS footage playing in the background were simply slowed down a little. right now it gives this feeling of rocketing forward, which is a very different vibe from the premise of the site. the user should float slowly, to emphasize the thoughtful nature of the activity and enjoy the sensation of watching the world go by.
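for what it's worth, if the background is a plain HTML5 video element (just my assumption, i haven't looked at the page source), slowing it down should be a one-liner. a rough sketch:

```typescript
// Sketch: slow down a background <video>, assuming the site uses an HTML5 video element.
// The selector and the 0.5 rate are guesses; adjust to the site's actual markup and taste.
const bg = document.querySelector<HTMLVideoElement>('video');
if (bg) {
  bg.playbackRate = 0.5; // half speed: a slow drift instead of rocketing forward
}
```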
this is what youtube and reddit were like around 2011. it was calm, serene, accepting, warm, human. it was this perfect mixture of user friendliness, people knowing how to type and use computers, and a moment just before the internet was taken seriously by anyone and corrupted by money, before social media became a serious political consideration. i remember very clearly that even at the time it felt too good to be true. these videos capture that feeling pretty well. it was all unfiltered and it made you feel like you were connected to the world, like you had your finger on the pulse of the world, or like the entire world was inside your computer. really warm fuzzy vibes. i still miss that. but now i am too busy to spend so much time on the computer anyway.
Do you mean oils that have turned rancid before they are consumed? I don't really get the hate seed oils are getting. In studies they seem to have shown no ill effects. The oils used in studies are certainly not rancid, though, while your average supermarket oil might be (?).
Without a literature review, they'd certainly be a number one suspect.
Consider this statement elsewhere in the thread:
> before 1985 T2D was called adult onset diabetes considered an adult disease and 1983 was the first case of pediatric nonalcoholic fatty liver disease.
We've eaten sugar and saturated fats for ages. Of course, not everyone ate the same amounts that people do today - but we'd expect someone, somewhere, to have fed their kid enough bacon (which people ate in huge amounts even ~100 years ago relative to today) to give them fatty liver disease if saturated fat were the culprit, or to have fed them enough sugar if sugar were.
But what people didn't eat, almost at all, was seed oils. Canola oil was not consumed at all before the 1970s - canola is a Canadian-developed, low-acid version of rapeseed oil (CANadian Oil, Low Acid), rapeseed oil itself being too poisonous/bitter to eat. Soybean oil was practically unheard of. Cottonseed oil (the original Crisco) had only just been invented and marketed as a wonderfood, here to solve our problems. Today these oils, particularly soybean and canola, are the second highest source of calories in the average American diet, and the single highest source of fat. We're suddenly beset by major metabolic problems - heart disease, obesity, fatty liver, T2D - that did not exist, or existed in much smaller proportions, even in historical populations where people were eating tons of bacon or sugar. Meanwhile, we have a food source that went from "negligible" to "one of our main sources of calories." It's not proof, and there are almost certainly other factors involved as well, but it's really, really suspicious. Making matters worse, what you feed animals also impacts the fat composition of their meat, and we now feed cows and pigs canola and soy.
> before 1985 T2D was called adult onset diabetes considered an adult disease and 1983 was the first case of pediatric nonalcoholic fatty liver disease.
This statement is factually false, though: https://www.sciencedirect.com/science/article/pii/S258955592.... It may not have been explicitly called that, but it was clearly shown to exist. This is not some new phenomenon that first popped up after the introduction of seed oil.
>The term non-alcoholic fatty liver disease entered the hepatology lexicon in 1986, introduced by Fenton Schaffner (American physician and pathologist).
As you acknowledge, the disease didn't even have a name until 1986 - 3 years after the first diagnosis in children.
There is nothing in the study you link suggesting kids were being diagnosed and treated for nonalcoholic fatty liver disease pre-1983 under a different name - they weren’t.
This is easy enough to confirm independently on Google [1].
>This is not some new phenomenon
Yes it is. In the ~40 years since the first recorded medical diagnosis, it's become an epidemic affecting 5-10% of kids, or ~10M kids in the US. There is no way that this is not a new phenomenon, that 5-10% of kids had nonalcoholic fatty liver disease throughout history, and that we simply have no record of it.
[1] Title: Steatohepatitis in Obese Children: A Cause of Chronic Liver Dysfunction.
> Today these oils, particularly soybean and canola, are the second highest source of calories in the average American diet, and the single highest source of fats
Isn't canola oil one of the oils lowest in saturated fatty acids, typically 7-8g per 100g?
Olive oil has ~twice as much, sunflower oil as well. Palm oil has ~7x more. And coconut oil ~10x more.
Calories-wise, all these oils are pretty much the same, typically in the range of ~800kcal per 100ml. So, I am not sure I understand the arguments against it.
Canola oil is pretty much the best bet if you want to reduce your saturated fat intake.
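As a rough back-of-the-envelope comparison (the exact figures are my own approximations from typical nutrition labels, not from a study, so treat them as assumptions):

```typescript
// Approximate saturated fat per 100 g of oil, from typical nutrition labels.
// These numbers are rough assumptions, used only to illustrate the ratios above.
const saturatedFatPer100g: Record<string, number> = {
  canola: 7.5,
  sunflower: 10,
  olive: 14,
  palm: 50,
  coconut: 87,
};

for (const [oil, grams] of Object.entries(saturatedFatPer100g)) {
  const ratio = (grams / saturatedFatPer100g.canola).toFixed(1);
  console.log(`${oil}: ~${grams} g saturated fat per 100 g (~${ratio}x canola)`);
}
```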
I didn't say saturated fats, I said fats - obviously, oil is pretty much 100% fat by definition.
Of course, it's also controversial in some circles whether saturated fats are bad, but that's a separate discussion. There is more to a cooking oil than its saturated/polyunsaturated/monounsaturated fat ratios.
The problem is linoleic acid and our overconsumption of it. It seems to cause way more oxidative stress during metabolism, to which the brain is more sensitive. Plus it also seems to adversely affect metabolism of other kinds of fats. And it plasticizes during cooking.
The idea being that, in the past, the only linoleic acid we would be getting was from whatever seeds we consumed naturally; it was hard for humans to consume this much seed oil back then. With the advent of industry it's now a $20B+ business.
"Seed oils" are a nonsensical category because the main example is canola oil which doesn't actually have the problems associated with them (bad omega-3:6 ratios).
I thought the problem with seed oils is that seeds don’t want you to eat them and their chemistry may reflect that. Fruit bodies such as olives on the other hand are “designed” to be eaten and so aren’t likely to have such defenses.
Olives are a pretty weird example of something that "wants" to be eaten by us, given the insane amount of processing it takes to make anything remotely palatable from them.
Contrast with, say, sesame or sunflower seeds, which can be eaten raw straight from the plant, or pumpkin seeds, which just need a simple roast and peel. I'm not sure your categorical assertion really holds up, as intuitive as it may seem.
The defense just has to target mammals in general. If, e.g., a rape seed doesn't want to be eaten by squirrels, to take a common seed-eating mammal, there's a decent chance that as mammals we share enough in common that whatever mechanism is being targeted in squirrels would affect us as well.
If something were specifically targeting birds or reptiles, then it might not affect humans. But are the seeds in question growing in environments without mammals? I don't think so.
it's insane to me that people are working so hard on reverse engineering apple silicon. like, the diagrams are right there in cupertino. it just seems like such a waste. it's like during some kind of economic depression where people are starving and struggling while a bunch of food and resources just sit around unused. existential gridlock.
This definitely sucks. I feel similarly about e.g. the jailbreaking community: I appreciate the work they do and at the same time I very much wish it wasn't necessary.
If Apple and other companies like them were a little less greedy, we could have far more nice things for free, and Alyssa and other brilliant engineers could go work on other great projects. The same would be true if regulators were a little more vigilant and a little less corrupt.
To me what makes it suck even more is the fact that Apple has no qualms exploiting FOSS themselves. BSD, WebKit, their new “game porting toolkit”. And look what they provide in return. It’s so gross.
Agree, at least WebKit can be used outside of Apple. They still did KHTML dirty though.
Your point about working for free is right on the money. I get that Asahi is probably intellectually stimulating to work on, but I couldn't do it knowing I am essentially enriching a company that doesn't have public benefit in mind.
Clang can definitely be used outside of Apple, and can even compile Linux. Swift technically can be used anywhere, though it is largely driven by Apple and laughs at backward compatibility.
The people I know at Apple actually do have public benefit in mind. They believe in things like usability, privacy, accessibility, sustainability, etc. They don't want to replace you with intrusive AI (yet). And personally I like Apple's products and am using one right now. Unfortunately all large companies tend to turn into monsters that do their best to grind up resources - natural or human - in order to turn them into money. And the designer of the iPhone regrets turning us into zombies - that was not an intended effect.
If people wonder why some of us don't like Apple, this is the fundamental philosophy why. It's not about the M series, it's been their modus operandi since time immemorial. It's like if Microsoft owned x86 and nobody could run anything on it but Windows. And people would like it because it's a "cohesive ecosystem" or whatever.
I'm not sure that's really the same thing. Apple doesn't own ARM and the main issue here seems to be the GPU no? Is this much different from how things work with Nvidia? I guess the difference is that Nvidia provides drivers for Linux while Apple does not. As far as I know Nvidia Linux drivers aren't open source either though.
Nvidia is not much better, but they do only make one component and generally ensure compatibility. If Nvidia went full Apple, their cards would have a special power connector for the Nvidia PSU and a custom PCIe slot that only works with Nvidia motherboards, which also require Nvidia RAM sticks and only boot NvidiaOS. And most of the software that would run on it would be blocked from running on other OSes, because fuck you, that's why. Also, if you tried running NvidiaOS in a VM, they would sue you.
It's still profoundly weird to me that nobody can run Safari outside macOS, even for testing. At least the EU has strong-armed them into dropping Lightning in favor of USB-C now, so we have that minor bit of interoperability going for us, which is nice.
You left out that they would also cost ~double per unit of performance. And that when Nvidia claims to be better for graphics and video, they can back those claims (albeit unfairly, some might say), whereas Apple marketing appears to avoid any price/value comparisons. So, I guess, even when you're dressing Nvidia up to sound ugly for a hypothetical, they still sound better than Apple.
Yeah, the big difference is that Nvidia really is the top dog: any way you look at it, they simply make better hardware.
And even if you add value/cost into the mix, they still do very well, even for their highest-performing products.
Apple is better at some stuff, but it's really not total domination, and you really need to look at things in a very particular way to think they are indisputably better.
If you add value/cost, everything falls apart and it really becomes: if you have the cash, you can buy the thing that is going to be much better at this very specific use case.
It's even true for their headphones, where the only thing they are better at is integration into their own ecosystem; everything else is passably competitive, but if you look for value it's just plain bad.
I had AirPods, and I find it amazing how much better the Nothing Ears are for the cost, if you don't care about the Apple ecosystem (the "magic" gimmicks never worked that well for me anyway).
Are we living in the same world? Nvidia only recently started caring about Linux (due to profit, obviously; it turns out servers don't run anything else nowadays).
May I remind you of the famous `--my-next-gpu-wont-be-nvidia` flag in a Linux compositor? Meanwhile, Apple literally went out of their way to make secure boot for third-party OSes possible.
Conversely, Nvidia provides first-party Linux support for most of the hardware they sell, and Apple goes out of their way to make booting third-party OSes on the majority of hardware they sell (read: all non-Mac devices) all but impossible.
The point is that Apple acts as both the source of the hardware and the software. Your analogy is not applicable because you can't run Apple's OS on generic third-party ARM hardware.
But isn’t this whole thread about running Linux on Apple hardware? I haven’t seen anyone in this thread complaining that they can’t run macOS on non Apple hardware.
I think it only increases the chance that Apple opens up the architecture documentation - something it otherwise wouldn't do if you didn't have people diligently reverse engineering it.
Something similar to this happened in the early days of the iPhone, with the iPhone Dev Team. Initially, iPhone "apps" were going to be web pages, but then these reverse engineers came along and developed their own toolchain. Apple realized they had to take action and their "webpages as an app" strategy wasn't going to work.
It was not. But you got contradicted by people who actually remember what happened. It is fairly well documented, and was common knowledge even at the time. Jobs was initially sold on web apps for several reasons, and the state of iPhoneOS 1 and its internal APIs was very precarious and not stable enough for serious third-party development. Again, this was known at the time thanks to the jailbreak community, and it has been explained in detail over the years by people who have since left Apple, and by Jobs himself in Isaacson's biography.
When they pivoted towards the AppStore model, there was no predicting how big it would become, or even if it would be successful at all. The iPhone was exceeding expectations, but those were initially quite modest. It was far from world-record revenue.
I was there, writing apps with the DevTeam's toolchain before Apple ever released theirs. Were you?
Also, I assume you haven't read the Steve Jobs biography, which discusses this and contradicts your point.
One positive outcome of your comment is that it reminded me I still have the 2008 book, "iPhone Open Application Development" by Jonathan Zdziarski. That was a nice walk down memory lane.
I was there, part of a small community writing apps pre-SDK.
Neither I nor anyone else can promise you it wasn't just a simple $ calculation.
That being said, literally every signal, inside, outside, or leaked, was that the apps / public SDK effort, if it existed meaningfully before release, had to be accelerated due to the poor reaction to the infamous "sweet solution", web apps.
I agree it's logically possible, but I'd like to note for the historical record that this does not jibe with what happened at the time. Even setting that aside, it doesn't sound right in the context of that management team. That version of Apple wasn't proud of selling complements to their goods; they weren't huge on maximizing revenue from selling music or bragging about it. But they were huge on bragging about selling iPods.
Apple literally designed a new boot loader for Macs that allows the computer's owner to install and run an unsigned OS without having it degrade the security of the system when booted into the first party OS.
My guess would be that it was personally advocated for by someone who has enough influence within Apple to make it happen. Possibly someone on the hardware team, as I hear that the people developing the Apple Silicon processors run Linux on them while they're in development.
This used to be one of the best things about Apple when Steve Jobs was still running the company: you'd get a bunch of features that a purely profit-focussed "rational business" would never approve, just because Steve wanted them. And I suspect Apple still has some of that culture.
On the internet it seems antitrust law can just be used to explain everything. Antitrust actually has a pretty strict legal definition, and not a lot of things fall into it. And if antitrust did apply, it would apply far more to the iPhone.
It would take an outright legal revolution in the definition of antitrust for this to be even a remote possibility, and frankly that is not happening.
This is tiresome. They cannot lock down the Mac without losing one of its biggest markets, software development. It was mentioned at a WWDC 5 or 6 years ago I think that software developers were the largest of the professional groups using Macs. You can’t have a development environment that doesn’t allow for the downloading, writing, and installation of arbitrary code.
As long as Apple wants people to develop software for its platforms and/or sell to web developers, Android developers, scientific computing, etc. they will not lock down the Mac.
I don't buy that for one second. If they cared about software developers they would never have moved the Esc key onto the Touch Bar back when that was a thing.
Also, if they really cared, they would run and maintain brew and integrate it properly, rather than have people spend their free time making Apple's products more usable for software developers.
You think that's bad? Imagine how much churn there is because NVIDIA doesn't have open source drivers. I'll actually do you one better: part of my PhD was working around issues in Vivado. If it were open source I could've just patched it and moved on to real "science" but that's not the world we live in. And I'm by far not the only "scientist" doing this kind of "research".
I have to agree with this take. As much as I appreciate the indelible hacker spirit of jailbreaking closed hardware and making it available to freedom-loving individuals... a huge, vocal part of me also feels like the entire effort is really just empowering Apple to sell more locked-down hardware, and even the freedom-loving nerds will buy it because it's sexy.
There's no getting around the sexiness of Apple hardware. But there's also no getting around how closed it is. You implicitly support Apple's business model when you buy their hardware, regardless of whether you run Asahi or Mac OS.
For many people, the Apple Silicon GPU is an interesting problem to solve, given that the firmware is loaded by the bootloader and all, and it's actually generally easier to interact with than say NVIDIA while having decent perf. Also, GPUs are really complex beasts involving IP from tons of companies. I would not be surprised if even Apple doesn't have the full schematics...
> and it's actually generally easier to interact with than say NVIDIA while having decent perf
I’m pretty sure that Turing and newer work the same way. The drivers basically do nothing but load the firmware & do some basic memory management if I recall correctly.
Especially because Apple seems not to care much about the project, even given the progress made so far.
M3 support is still not there (let alone M4) because things broke. Which is expected from Apple; they are just doing their thing and improving their products.
If they cared they would have at least hired these people by now. It wouldn't make a dent in their budget.
You are misreading the comment. It is indicting Apple, not the Asahi team, for not caring. If Apple cared and hired the Asahi folks and provided them with help, they would probably be able to churn out drivers faster.
My wife's 2017 MBP has gotten so dog-slow since Apple dropped support for it that it can't handle more than 3 Chrome tabs at a time now. The reality of Apple products is that the manufacturer sees them as eminently disposable. As early ARM macbooks age, the ability to run something, anything that isn't MacOS will be an asset. Otherwise, they're all landfill fodder.
More than half of "recycled" e-waste just gets exported to developing countries without environmental regulations where it either gets burned or buried.
The only sustainable thing you can do with a bad laptop is fix it or part it out, but for all my years taking apart fragile electronics, is it really worth the effort to take apart a device that was intentionally designed to be difficult or impossible for the owner to repair?
The last few macOS updates have really been killing performance on Intel Macs. Your 2014 is probably safe because it'll still be running an older macOS.
I have an old Google Nexus 7 tablet that I recently installed U-Boot and postmarketOS on. I can ssh to it, run X apps over ssh, and print directly from it. It's pretty cool.
I also have a really old iPad 2. It works perfectly hardware-wise - screen, wifi, etc. But it is effectively a paperweight due to software.
I am logged into it with my old Apple account, which was only ever used for this tablet.
I have the username and password but cannot log in as I don't know the security questions, so I can't reset the device or try to install apps. I even phoned apple but they said there's nothing they can do.
It pains me to just dump a perfectly good piece of hardware.
We have an iPhone 6 in a similar boat. It's my wife's old phone, and it'd be totally fine as a test phone for my work, but there doesn't seem to be a way to factory reset it without a passcode she doesn't know.
A recurring theme you'll encounter across most of Apple's products is that any feature that forces first-party Apple software to compete on fair terms with other products is conspicuously missing.
You're just restating the problem the parent comment was upset about. It's all well and good to state the facts and say "face reality", but in this case we all apparently know that it's a fragile state of affairs.
It doesn't have to be 100% libre. This is about booting any OS you want in the first place.
If you take some random Windows laptop off the shelf, it will boot Linux (and continue to do so in the future) because they have to support UEFI. If you take a laptop from a "Linux-friendly" vendor off the shelf, you may even have coreboot or something on board.
But with this Apple situation there is no guarantee the next generation of hardware won't lock you out, or they might push out an OTA firmware update that locks you out. It's like porting Linux to the PlayStation or something.
I get what you mean. I'm glad that they're doing this; it's great that the best laptop hardware is going to run Linux before long; it's a fun endeavor -- but when you zoom way out and take the philosophical view, yeah, it seems silly that it should be necessary, in the same way that it feels absurd that impressive and courageous feats in battle should have actually needed to happen.
I’ll go back a little further: at one point before Apple purchased NeXT, Apple had its own port of Linux, called MkLinux (https://en.m.wikipedia.org/wiki/MkLinux).
Oh please. OpenDarwin lasted what, 2 years? The people running it realized their efforts were merely going towards enriching OS X; it was not a truly open source effort.
Is it really a fair comparison? Apple has a proper bootloader capable of securely booting third-party OSes. What part of the open-source ecosystem was built differently?
It just so happened that, after possibly even more painstaking reverse engineering, the responsible hardware vendor later realized that server machines are de facto Linux, and that they had better support it properly. Like, getting that Intel chip to work that well was not achieved any differently; we just like to wear rose-tinted glasses.
I'm not familiar with this. Suppose Apple released docs under an "as is" type disclaimer like is so common in the open source community: would doing so potentially come back to bite them?
Yeah, I agree. I do respect the effort itself, but it always feels like a huge waste of talent. Imagine what could have been created with all that time and energy.
I believe the best solution to proprietary hardware is not to buy it in the first place. Buy from vendors who are more open/supportive, like Framework and ThinkPad. This helps those vendors keep supporting open source.
1. Even if one loves macOS, Apple doesn't support its hardware forever. Being able to run an alternative operating system on unsupported hardware helps extend that device's useful life. My 2013 Mac Pro is now unsupported, but I could install Linux on it and thus run up-to-date software.
2. Some people want to use Apple's impressive ARM hardware, but their needs require an alternative operating system.
It is an inspirational demonstration of the hacker spirit and a way for the individuals involved to both expand their technical abilities and demonstrate them to prospective employers.
I personally consider it very inspirational though I recognize that I will probably never be able to undertake such a difficult task. I can imagine that it is very inspirational to the next generation of extremely proficient and dedicated teens who want to master software development and explore leading edge hardware.
if you want an arm laptop with incredible specs, incredible build quality, incredible battery life, and incredible performance that runs linux, what other option is there?
Yeah, the M4 is apparently the fastest CPU on single-core benchmarks. If you want a fast laptop, you have to get it. Not being forced to use Mac OS would be nice.
This is not a comparable user experience to running Linux natively for a variety of reasons, but the most obvious one is the relatively limited graphics acceleration. That's pretty important for a variety of use cases!