Hacker News
My 2020 Hackintosh Hardware Spec (infinitediaries.net)
334 points by morid1n on Jan 28, 2020 | 276 comments



> Replacing an iMac also means getting rid of a perfectly good monitor.

This is the thing that absolutely friggin kills me with the Mac lineup.

As a developer I see monitors and computers as the perfect example of loose coupling. And I've absolutely had different lifecycles for monitors and computers in the past. My Mac mini that was once my primary desktop machine is now a media center hooked up to a TV.

But if the Mac mini doesn't meet your needs (fast CPUS, storage bays, whatever), and the Mac Pro doesn't fit your budget, Apple gives you a big ol' middle finger.

I mean, I COULD buy an iMac, but then 8 years down the road when I want a new desktop, I've got this unwieldy all-in-one where that big screen feels like more of a hindrance than an asset.

I haven't built a Hackintosh myself, but every time I look at the desktop Mac lineup I think about it.


I just completed my first Hackintosh in many years. I have a perfectly serviceable iMac in the family room (2015, 32GB, 1TB Fusion), but wanted more.

I have a beautiful case (Corsair 570x mirror black tempered glass on 'all' sides). 8700K, 64GB memory, 4TB of SSDs, Vega 64. Runs just as silently as you could hope (H150 closed loop water cooling).

Like you said, Apple gives you the finger on a lot of things. I've lost track of the number of iPhones, iPads, and MBPs between my girlfriend and me (probably at least five each of the latter, and most of the iPhones). But I cannot stomach, in good faith, the world's richest company charging me $1,000 for 56GB of RAM (8->64) when I could buy high-end or even faster memory at $250 for 64GB.

And, following some guides? It "just works". Continuity, Handoff, Apple Watch unlocking, iMessage, Sidecar. Clover Configurator is awesome, and if you want to be/feel even more native, OpenCore is even better (though it requires manual work, whereas Clover can have your system booting into macOS within 40 minutes of building your USB).
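(For anyone curious what the "manual work" for OpenCore looks like: a big part of it is filling in SMBIOS identifiers in config.plist so that macOS and iMessage see a plausible Mac. The OpenCorePkg release bundles a macserial utility for generating these; the model identifier below is just an example, pick one close to your hardware.)

```shell
# Sketch: generate SMBIOS serials for an OpenCore config.plist.
# iMac19,1 is an illustrative model identifier; choose one matching your build.
./macserial --model iMac19,1 --num 1
# Copy the resulting serial/MLB pair into PlatformInfo -> Generic in config.plist.
```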

So could I spec out a new Mac Pro that outpaces this? Sure. As could I grow this machine (it's just been my Windows workhorse for about 18 months).

I still love Apple products, but I don't feel any particular remorse for doing this, versus plonking down what most likely would have been $9,000 for a similar Mac setup.


Until you try to update it and it doesn’t boot, then spend hours researching guides (that don’t cover your exact hardware) in the hope you can update.

Then inevitably something doesn’t quite work properly anymore.


Exactly. And the major version upgrades are a pain. Unless you're willing to give your weekend away to figure out how to fix the broken configs, it's better to wait a month or so to see how other, more enthusiastic people have worked around it.

IIRC my first build was with El Capitan, and I tried to follow with all updates, but it was painful every time.

When I was building my Hackintosh, the common advice was to use an Nvidia GPU, so I got myself a then brand-new GTX 1070. But soon after, Apple decided not to sign the Nvidia web driver, so we're all SOL for support on newer macOS versions.

I was even considering trying to sell my current GPU to replace it with an AMD one, but instead decided to make it work with iGPU and do the GPU-heavy work on Windows.

I would never use or recommend a Hackintosh for anything where you'd expect a reliable machine, let alone for work.

But I would also not drop the cash on the insanely priced Mac desktop lineup. I've got a MacBook and I will probably buy another one (not fully convinced, because of the keyboard fiasco, which also affected my machine).

I was not expecting it, but Windows 10 is much improved over the Windows I knew. I now use it for gaming and some personal projects where the GPU matters (game dev, computer vision, machine learning). I still need to get used to it and I sometimes miss some tools (Alfred, Preview.app, iTerm, and cmd as the main modifier key instead of the Windows key). But while macOS seems to have been dropping the ball recently, Windows is scoring goals.

I will seriously consider switching back for my next machine, especially if Apple keeps doing this stupid stuff. I was also thinking about Linux, but I'm not sure I want to spend time configuring all the little things.


I chuckled because that was the last experience I had with Hackintosh before I caved and bought a real Mac (2011). But to be perfectly honest, it is the same experience that I have been having on my 2017 MacBook Pro.


Same here. However, with a legit Mac you can always netboot to restore, so long as you haven't somehow borked the UEFI firmware. That obviously isn't the case with a hack.


The trick is to wait a few weeks after a release before updating.

It gives people time to fix the potential issues.

Also, a fun fact: I have a 2012 MacBook Pro, and when High Sierra was released the update would fail on it because I had changed the HDD. Apple was trying to counter Clover, but it took the Clover team less than a week to find a workaround, while I couldn't update my real MacBook Pro for months until Apple corrected the issue.


Does iMessage work to send and receive? I had previously seen a build where they said that not working was the only flaw.


It does, with text message forwarding too. I know it had been an issue a while back. The only thing I've heard of currently (which I haven't tested either way) is possibly with the Apple TV app.


Yeah Apple is missing a huge market opportunity there.

I've always wanted a tower mac but the new Mac Pro is not the machine for me. I just want the specs of an iMac on a tower.

A couple of years ago I ended up buying an iMac 5K, since I needed to upgrade my old MBP and the Apple laptop landscape was so desolate. I'm very happy with it for dev and design work, but I tried to use it for music production and it's not the machine for that. Cooling is bad for anything other than short bursts. Even at 30% sustained CPU load for audio, the fans spin up and are quite annoying. I ended up building a Ryzen Windows PC which cost me as much as an i7 Mini but is even more powerful than the base Mac Pro.


I feel like the problem is that it isn't a huge market opportunity (at least by Apple scale).

Checking out these graphs about Apple's revenues, apparently Mac revenues are pretty flat, while iPhone revenues continue to skyrocket (currently 5X Mac revenues!): https://sixcolors.com/post/2020/01/fun-with-charts-a-decade-...

So sure, a hobbyist-approved Mac desktop would please you and me and plenty of other geeks, but they'd also cannibalize their Mac Pro sales somewhat, and when it's all said and done, their Mac revenues would still probably be in the same ballpark. So, why do that when they can pour effort into new phones or TV services or AirPods or whatever?

The thing that worries me is that the hobbyist/geek is often the one who turns lots of others onto a technology. I'm probably where I am right now because my geek uncle passed down his used Mac to me and showed me HyperCard. And I'm pretty sure there are people who own Macs today who wouldn't if not for me. I think appeasing the geeks is worth more than the direct profit or revenue such a model would bring in, but in ways that aren't as immediately evident on the quarterly report.


I feel like I exist in the "enthusiast" segment which Apple completely ignores.


What is the cost (for Apple) in straight up R&D dollars of building a new model of Mac? And how much does it cost them to tool up for manufacturing and create the logistic chain to put it in customers' hands?

Remember that Apple aims to sell a vertically integrated stack, all the way from their own cases, motherboards, and PSUs up through the OS and core apps. The point of this is to deliver a specific user experience that they control, and also to keep competitors out of their pool (they allowed licensed Mac clones circa 1994-96 and it didn't end well). While they still buy in CPUs from Intel and other components (RAM, video chipsets, SSDs, hard drives) from other suppliers, these are generally low-level components -- as low level as Apple can get -- and even then they're trying to build self-owned stacks so they're not dependent on anyone else (e.g. the A-series processors in the iPhone and iPad) -- a lesson they learned the hard way thanks to Motorola with the 68K and subsequently PowerPC architecture. (I'm guessing Tim Cook personally has bad memories of the failure to deliver a mobile G5 back in the day.)

Anyway ...

Developing a new machine from the ground up has got to cost in the tens of millions, at least. (Look at the sunk costs that went into the new Mac Pro, for example.)

Are there enough "enthusiast" customers with system-building interests out there to make such a beast net-profitable, after taking into account the brand dilution implication of stacking up too many SKUs (as they did in the Bad Old Days of the first half of the 1990s, before the Return of Jobs)?

I'm guessing they've done the calculation and concluded the answer is "no", at least for the time being.


I think this is backwards. Who is buying underpowered $6000 computers? An iMac in tower form factor would be popular because it's something people actually want.

The only people excited about the new Mac Pro are people already locked into the Apple ecosystem. It is a fundamentally backwards looking product that attempts to squeeze more money out of already captured customers. It is not likely to attract new customers to the Apple ecosystem.


> I'm guessing they've done the calculation and concluded the answer is "no", at least for the time being.

It's possible but I doubt it. Does Apple know how many of its iMac customers would prefer a tower? Or how many sales they've lost to hackintoshes?


Yes, they do, in part because they opened up back in the heyday of people building their own systems, and they continue to engage deeply with enthusiast asks today, weighed against their broader market understanding. Roughly, they figure out which enthusiast things will appeal to the larger market, and make it all work, fully integrated.

Per the curve shown in Geoffrey Moore’s “Crossing the Chasm”, HN exists largely on the left of the chasm; Apple is a trillion-dollar company because it understood how to shift to the right of it.

Visual explanation (.png):

https://smithhousedesign.com/wp-content/uploads/2018/02/smit...


> in part because they opened up previously in the heyday of people building their own systems

20+ years ago?

> continue to engage deeply on the enthusiast asks right now contra-posed against their broader market understanding

Is this speculation or you have concrete evidence of this claim?

> HN exists largely on the left of the chasm, Apple is a trillion dollar company because it understood how to shift to the right of the chasm.

I think it would be the opposite. Having a tower with macOS and consumer specs (no Xeon or ECC) would be pretty boring and conservative. I think that is one of the reasons Apple won't do it.


> 20+ years ago?

Yeah. Back then lots of folks moonlighted as, or ran, local small-business PC builders and gaming PC builders. Before Gateway, eMachines, and the like.

> speculation or evidence?

first hand knowledge

> think it would be the opposite

The curve is within a market. There are boring computers and exciting computers, workaday computers and toy computers, etc. Each has its curve.

Compare to automobiles, what gets shown at shows or on the track, what is limited edition, and finally what is high end then mainstream then outdated.

The high end buyers want them some of that enthusiast kit — without the high maintenance and visits to the mechanic.

That’s even true within a brand line, buyers want stuff they see getting played with over on the left of the chasm by competing enthusiasts, and they want their brand to adopt it too.


> first hand knowledge

Please do go on :)


The more I think about it, the more I conclude they just cannot afford to support tower or modular designs for the enthusiast market. Not only would they "lose" money from costly upgrades, but the lack of driver support Macs have would also become apparent, especially as it pertains to video cards.

Maybe the Mac Pro will change the support issue and they will eventually be able to enter this market. But I believe their biggest fear is being compared to Windows or Linux which both have tremendous hardware support.


It's not just the enthusiast segment. It's the pro segment. Apple can't conceive of a professional who does anything other than edit videos or audio.


Their marketing collateral for their XDR display shows it with some code on the screen. The notion that either the display or the Mac Pro itself makes any sense for a developer is laughable.


You can test run an awfully large K8s platform in there ...


If you're running Linux anyway, you might as well buy a 3990X, get 64 cores instead of 28, and use the money you saved to go on vacation.


You don’t buy a $6K base machine to swap the OS to Linux; the entire reason to buy a Mac is macOS. Also, there are a lot of us who still don’t trust AMD procs. I personally don’t, as the only CPU I’ve ever had burn up was a Bulldozer while transcoding. I’ve never had the slightest problem with Intel procs, so until AMD somehow re-proves themselves to me without me spending a dime (which obviously will never happen), I don’t even consider them when speccing out a new machine.


K8s is kubernetes, Linux containers. If you want to run it on macOS or Windows it runs inside a Linux VM. You might as well buy a Linux computer and SSH into it from your Mac, is the point.


Not really; see xhyve.

https://github.com/machyve/xhyve

Meaning, it’s not a heavy Virtualbox or Vmware type VM, no ‘vagrant up’ needed.

Instead:

https://minikube.sigs.k8s.io/docs/reference/drivers/hyperkit...
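For illustration, a minimal minikube-on-HyperKit setup might look like this (assuming Homebrew is installed; flag names as of minikube ~1.9, so treat this as a sketch rather than gospel):

```shell
# Run a local Kubernetes cluster in a lightweight xhyve/HyperKit VM
# instead of a full VirtualBox/VMware guest.
brew install minikube hyperkit
minikube start --driver=hyperkit --cpus=4 --memory=8192
kubectl get nodes   # the single-node cluster should report Ready
```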


There's still overhead in running containers on macOS. File I/O in particular is much slower in containers on Mac than natively on Linux.
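For example, with Docker for Mac, bind mounts go through a file-sharing layer (osxfs) rather than native I/O; relaxing the consistency guarantee recovers some speed. A hedged sketch:

```shell
# Default (fully consistent) bind mount -- slow on macOS:
docker run --rm -v "$PWD:/code" alpine ls /code

# "delegated" lets the container's view lag the host's, trading
# consistency for noticeably faster file I/O on Mac:
docker run --rm -v "$PWD:/code:delegated" alpine ls /code
```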


Yes, I’m fully aware of what Kubernetes is. And your point is exactly mine: don’t buy a Mac with the intent to run Linux on it. So thanks for confirming that for me.


Nerds: Give us the xMac! Example from 2001 / 2005: https://arstechnica.com/staff/2005/10/1676/

Apple: NEVER!


Apple enthusiasts buy Macbook Pros or iMacs. There’s nothing wrong with integrated displays. In fact, it saves on a ton of cable mess, and they are usually worlds better than what you would plug a computer into anyway.


> There’s nothing wrong with integrated displays

What? There is a lot wrong with AIOs.

1) Cooling is atrocious

2) Dust problems

3) Very difficult repairs and almost impossible upgrades (with some exceptions such as RAM)

4) No internal expansion. You save a cable with the monitor but you add cables for everything you want to add.

5) Once some component dies you have a very expensive monitor that can't be used for anything.

6) Much like laptops, tablets, and smartphones it's an environmental disaster.


Why don’t people complain that toasters aren’t expandable?

Apple builds appliances made from hardware and software.


There's everything wrong with integrated displays. You can't use them for dual displays. You can't upgrade the display without upgrading the computer. And if you get a bad case of stuck pixels, you need to carry the whole thing to a mall and pay out the behind for a replacement panel, which requires complex surgery on the device.


Apple will send you a box to ship it back in.


I don’t think the iMac is that much of a step up from the mini performance-wise. Only the i9 in the iMac significantly beats the top i7 in the mini, and the cooling of the iMac can’t handle the i9, so the set of cases where people need more power than the mini but are satisfied by the iMac should be quite small. The iMac has a dedicated GPU, but it is outdated and has to drive that massive 5K display, so that’s a wash as well. And the iMac has the same lack of extensibility as the mini, so it doesn’t win out there either. I basically see the iMac as a mini with a screen. I know the specs don’t quite match up, but they’re close enough. Realistically, if top performance matters you want one of the pro Macs anyway.

I’d like Apple to make the xMac (a cheap expandable tower) but I wouldn’t buy it myself. I suspect not many people would.


> > Replacing an iMac also means getting rid of a perfectly good monitor.

> This is the thing that absolutely friggin kills me with the Mac lineup.

You can still use your iMac as a monitor -- just plug your computer into the Mini DisplayPort connector.

> But if the Mac mini doesn't meet your needs (fast CPUS, storage bays, whatever)

The new minis have Thunderbolt -- basically just a PCIe connection -- for storage bays etc. I understand it's not as aesthetic as an integrated bay, but who moves a desktop machine around anyway?

I'm not saying "take these options or shut up" -- I'm merely saying Apple is providing some options that will satisfy some chunk of the market that shares needs with you, and out at the long tail there are always people who aren't happy. I also buy Apple hardware and know I'm buying features that don't matter to me and miss a few features I wish they'd included.

This isn't to defend Apple (why should I? Plus the list of things that annoy me about their offerings is long... like your point about their disappointing choices in Intel CPUs). I'm simply addressing two points you made.


>You can still use your iMac as a monitor -- just plug your computer into the Mini DisplayPort connector.

I believe that this only works on iMacs made before 2014. New ones don't support Target Display mode (https://support.apple.com/guide/mac-help/use-an-imac-as-a-di...)


You don't use target display mode on the more recent ones with thunderbolt (usb type c) input -- continue reading that confusingly-written page.

However, in either case, if your iMac is kaput you don't have access to the display -- the iMac has to be able to at least start its boot ROM, though not boot macOS, to make this work. Then again, modern external displays also have to be able to boot their firmware, so perhaps I shouldn't consider this a limitation.


Are you talking about using Target Disk Mode to boot your iMac off the drive in a Mac mini or MBP? Because that will still use your iMac's processor and RAM, etc.


Would you say the same thing about a laptop screen?

That said, I do agree that I wish it could be used as a monitor.. even to power a laptop would be great. There used to be a way to do it but it died once we hit 5k.


Say you want dual screens of the same pixel density, size and shape, what do you do? Hide the iMac under your desk? Buy another iMac?


I'll do you one better: say you want three 5k screens, you:

- Connect one to either left port, and one to either right port, of the iMac Pro.

https://support.apple.com/guide/imac-pro/connect-a-display-a...

Now, personally, I want my computers either portable, or loosely coupled to the monitors. But the iMac Pro is quite a capable machine.


Unfortunately it’s not really made for this purpose. You also can’t add screens to your iPhone or iPad.

The best Apple solution at the moment is to use MBPs in this way, unless you want the ultra expensive Mac Pro.


The reason "it's not made for that purpose" is the whole basis of many people's complaint and there's no practical reason for it.


You buy the LG 5k display. Different bezel though - which is somewhat frustrating.


Do any AIO units work as a monitor for external input? It seems like a systematic blindness to extending the usefulness of the product. The MS Surface Studio, for example (at least at launch; I haven't checked whether they've added that functionality since), had the exact same issue: it's an amazing screen and drawing-tablet setup, but the computer inside is somewhat middling, so after a few years it will need to be replaced wholesale instead of just being turned into a drawing tablet.

Is it surprisingly hard to do this for some reason I'm missing? Or just slightly expensive and a minor feature so no one bothers?


iMacs actually used to (models from 2009 to 2014) support being used as a monitor for external input:

“Use your iMac as a display

You can view the desktop of your Mac on the display of some iMac models using Target Display Mode. In some cases, you can also use Target Display Mode to play the sound from your Mac (called the primary Mac) on the speakers of the external iMac. For example, if you have a MacBook Pro you could use an iMac as the display and for playing audio.

Note: Target Display Mode isn’t supported on iMac models with Retina display. Only iMac (27-inch, Late 2009), iMac (27-inch, Mid 2010), and iMac (Mid 2011 to Mid 2014) support Target Display Mode.”¹

――――――

¹ — https://support.apple.com/guide/mac-help/mh30822/


Neat, wonder why they stopped. Sounds like the Retina models might have too high of a resolution to go over a single Thunderbolt connection but it should still be able to do a scaled up resolution...


I don't have the answer to why they stopped supporting this, but I don't think it's because the resolution is too high for a single Thunderbolt connection. I have an LG 5K that works over a single Thunderbolt 3 connection -- and I believe even the new Pro Display XDR connects with a single TB3 cable.


Lots of Windows based AIOs do this. I know a number of models of Lenovo do and some Dells too.


Wouldn't this same argument apply to laptops?


Yes, however a laptop justifies this by unlocking a whole new set of working patterns, whereas an AIO desktop is almost entirely aesthetic.

Furthermore, even though it is all-in-one in nature, a laptop's form factor is small enough that "putting it out to pasture" to live its life as a server with the screen always closed in your media center, or in a closet next to your cable modem is totally feasible.


I’m still not seeing the difference. I like my iMac because it is a lot easier to transport than a desktop, monitor, and cords. I can also use my old iMacs as servers with the screen in sleep mode, as you suggest with laptops.

Regarding monitors, I used to use self-assembled PCs all the time, and while there was a time where I would use the same monitor across different PCs, I just didn’t do that anymore by the time I bought my first iMac. Because monitor sizes and resolutions get better over the years, I’d typically give my entire computer to someone else and just buy a new monitor to go with my new PC. Maybe others don’t mind looking at an old monitor. To each their own, I guess.


I started building my own "budget" Hackintosh just yesterday, actually! It is pretty straightforward if you are purchasing components specifically for a Hackintosh (instead of trying to salvage your current PC).

Here is my Geekbench 4 benchmark: https://browser.geekbench.com/v4/cpu/15176970 (slightly better than the new 16'' MacBook Pro with the i7)

And these are components that I've bought and for which prices:

  Motherboard: ASUS TUF Z390M-PRO GAMING - 144 euro
  CPU: i5-9600K - 208 euro
  RAM: Predator 2x16gb RGB - 184 euro
  NVMe: SILICON POWER 512GB P34A80 M.2 PCIe 2280 SP512GBP34A80M28 - 81 euro
  PSU: THERMALTAKE Smart Pro RGB 750W Bronze - 89 euro
  GPU: ASUS Radeon RX 580 4gb - 166 euro
I still have to see how stable this will be, and if it can replace my 2015 iMac.

Here are useful links I used: https://github.com/alienator88/ASUS-TUF-Z390M-Pro-Gaming-Hac... Also Google: hackintosh asus tuf z390m
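(For reference, the install-USB step uses Apple's own documented installer command; "MyUSB" below is a placeholder for whatever you named your formatted stick:)

```shell
# Write a bootable Catalina installer to the USB stick; the Clover/OpenCore
# files then go onto its EFI partition afterwards.
sudo /Applications/Install\ macOS\ Catalina.app/Contents/Resources/createinstallmedia \
  --volume /Volumes/MyUSB
```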


> It is pretty straight forward if you are purchasing components specifically for hackintosh

Definitely this. There are "golden builds" where people have ironed out components and steps exactly so there's barely any more pain than installing Windows.


I have the same, 9900k with a 5700XT. Catalina works with SIP enabled, FileVault 2 works perfectly, etc. Been doing hackintosh since 2016.

With that said, I’m pretty much done with it now that they’ve made the 16-inch MBP. I bought one with the i9 and 64GB of RAM and the experience is way better. I don’t need to mess around with Clover or update random kexts anymore. It gets old after a while!


Do you not ever miss the added power of your hackintosh though? I have a 2017 MBP and it's a great machine but I still love having my big ridiculously powerful desktop with three monitors for when I want that.


The new MacBook Pro is pretty powerful. I have four monitors hooked to mine and the machine is much faster than my five-year-old Hackintosh. I could build a new Hackintosh that would have somewhat better performance than the MacBook, but for me the difference is no longer worth it, so like the previous commenter I'm probably done with building Hackintoshes for now.


I totally get that. In all honesty, in a rational world a MacBook is more than enough performance for me (and it's actually what I use at work), but there's still very much an enthusiast alive in me that loves my over the top desktop.


The desktop is undeniably faster, but because of what I mainly use it for these days (digital art) I only really get benefit from single core performance and lots of memory. The difference between the 9900K and the 9980HK in this regard is imperceptible.

I'm just going to turn the desktop into a Windows gaming rig now :V


The last Hackintosh I made was in 2017 and it's still a beast. Maintenance was relatively easy.

However, the future of pain-free Hackintoshing doesn't seem very likely. It's rumored (or confirmed?) that Apple will move to its own proprietary ARM silicon, with macOS eventually moving onto that [1] (maybe not this year, but in the near future). And then there's the T2 chip.

I have been slowly moving away from Mac in general as I have no need for it (not a Mac/iOS developer). The alternatives are just as good or better for my taste.

[1] https://www.tomshardware.com/news/apple-mac-arm-cpus-2020-in...


We know that Apple runs x86 macOS servers internally on commodity hardware (from previous discussion on HN [0]). Presumably they'll need to keep some secure way of booting on machines missing the special Apple security hardware, so Hackintoshes will always be able to find a way: macOS will have to be able to boot on non-T2 systems for quite a while [1] (some machines they sell don't even have the chip yet, and presuming they keep up their 8-10 years of software support, there's nothing to worry about).

[0]: https://news.ycombinator.com/item?id=18114712 [1]: https://support.apple.com/guide/security/mac-computers-apple...


That thread mentions Apple using a combination of linux and macOS on commodity hardware for workloads that weren't suitable to their current hardware. If the Mac Pro satisfies those workloads now Apple is free to make the T2 chip mandatory.

Their own security makes a mandatory T2 chip inevitable, imho. There were dubious allegations last year that they were using compromised motherboards in servers; it wasn't true, but they'll want to make it impossible.


> If the Mac Pro satisfies those workloads now Apple is free to make the T2 chip mandatory.

I don't think even Apple can afford to buy too many of the Mac Pro. :)

In all seriousness, the Mac Pro doesn't make a good server (for most purposes). Most servers don't need a GPU, and no server needs a way, way overdesigned and expensive case. They don't need the expandability. Etc.


Thing is, I'm not a Mac or iOS developer either, but I find that even if alternatives exist, they're rarely as polished as what's available for macOS.

Consider Paste - Ditto exists for Windows, but it's just not as nice in my honest opinion. Or BetterTouchTool, I'm not aware of a Windows alternative.


CopyQ, Touchégg, and elementary OS (Ubuntu LTS based). Give them a try.


Then you lose the ability to run things like DAWs and Adobe products. I've used Linux since 1994, but it just doesn't have the commercial software support necessary to make it a serious desktop contender, which is unfortunate.


This is a big part of it for me. I love Logic Pro / Cubase / Ableton, and having access to those is pretty important.

Admittedly part of the reason I prefer macOS to Windows is that I find pro audio software tends to get more love on macOS.


> And then there's the T2 chip.

I wouldn't be too concerned about T2 as a blocker for the next few years, as the latest iMacs continue to ship without it. Presumably they'll give those a minimum 5 year support window from the day they leave the store lineup.


What exactly would Apple lose if they offered a subscription service ($199/year? $19/month?) that allowed registered developers the right to install macOS either on bare hardware or in a VM? I can’t speak for everyone else, but the only reason I bought a MacBook Pro was for iOS development; I have to imagine there is a number at which Apple would be fine with dispensing with making/shipping/selling/supporting the hardware and would just sell the software. Heck, I’d be fine if it was a developer-only stripped-down macOS that only ran Xcode and tooling that requires Xcode (Xamarin, Flutter, and Ionic development, and the IPA compile step). Seriously: what is the harm to Apple in allowing/selling this?


They would lose the ability to sell hardware.

They would have to specify which hardware was supported, which would automatically give those suppliers a boost and Apple would lose the markup.

If they didn't specify supported systems, they'd have to support more hardware, which is a pain.


They would lose branding (the "best" Mac would be this build + subscription, not their top-of-the-line model), would have to support lots of additional hardware, would have to support the setup, offer tech support, train their support staff, etc. In short, not likely.

Instead they may consider offering cloud instances for running Xcode etc. that they can rent out.


It would encourage the proliferation of low effort/lowest common denominator/“checkbox” ports to their platforms, which directly opposes what they want. By requiring a hardware purchase, they select for devs who are serious about their presence on Apple platforms and are more likely to invest in the quality of the iOS/macOS versions of their apps.

Of course a lot of low-effort ports find their way into the App Store anyway, but if you compare how many of those iOS has vs. how many Android has, the difference is staggering. On macOS the advantage is less clear, since desktop platforms have been thoroughly colonized by Electron, but it still holds an advantage thanks to a bevy of small-shop Mac exclusives.


It's actually not too bad to run macOS in a VM -- this is an option Apple doesn't explicitly support, but they also realize how useful it can be for automated testing of apps and browser compatibility. The company line is that they only support macOS on a VM on a "legitimate" Mac, but realistically it's not hard to find guides on how to virtualize it on a Windows host.


With the ever notable exception that graphics acceleration straight up does not exist at all in macOS in a VM. The only way to get it is to use virtIO on a Linux box to pass through a secondary AMD card, I believe.


That seems to be the case.

Note that you'd have to use VT-d, in combination with VFIO, not VirtIO. VT-d allows for PCIe passthrough; VirtIO is a family of paravirtualized devices like disk controllers and network interfaces.

Besides the name-switchup, you're completely right.

https://wiki.libvirt.org/page/Virtio

https://software.intel.com/en-us/articles/intel-virtualizati...
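For the curious, the passthrough setup sketched above looks roughly like this on an Intel host (the PCI IDs below are for an RX 580 and its HDMI audio function; substitute your own from lspci):

```shell
# 1. Enable the IOMMU via the kernel command line (in /etc/default/grub):
#    GRUB_CMDLINE_LINUX_DEFAULT="... intel_iommu=on iommu=pt"
# 2. Find the GPU's vendor:device IDs:
lspci -nn | grep -i vga
# 3. Tell vfio-pci to claim the card (and its audio function) at boot,
#    so the host driver never binds it and the VM can have it:
echo "options vfio-pci ids=1002:67df,1002:aaf0" | sudo tee /etc/modprobe.d/vfio.conf
```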


I have a 16-core Ryzen Threadripper. Built a KVM install of Mojave and gave it 8 cores and 32GB of RAM. It wouldn't do so well for someone who wanted to actually use it as a Mac, but for me, as a cross-platform C++ developer, it's totally ideal. No separate keyboard, no KVM (the other KVM) switch; just boot the VM and there's a faster, more powerful macOS waiting for my compile/edit/limited-run workflow than I could reasonably ever get from Apple.

Note: I own a couple of Mac Minis too, just for the license of course.


> Seriously: what is the harm to Apple is allowing/selling this?

The harm is they will make less money than currently.


One of the best features of macOS is its tight integration with the hardware. If they officially supported custom hardware, not only would they create tons more work for themselves and likely slow down the progress of the OS overall, they would also dilute their brand.


They'd lose a bit of money, which is more than enough harm to justify, really.


Check out macincloud.com


Upgraded my 2011 Hackintosh last year: i7-950 -> x5680 overclocked to 4.25GHz (good enough, can probably go higher). Upgraded from 12GB RAM to 24GB triple channel RAM (can upgrade to 48GB). Also upgraded from an old 1.5GB video card to the Sapphire RX 580. It was too much of a pain to get macOS to run with this setup, so I gave Windows 10 a try. It works great. I only use it for photography and video work. This setup smokes my 2016 MacBook Pro. I was considering tossing the system, but a couple hundred dollars' worth of upgrades gave new life to an old system.


I have heard good things about Hackintosh on Ryzen - does anyone have more info?

How viable is it? I'm using a 15-inch MacBook Pro but it struggles with three 1440p monitors, while my PC with a Ryzen 3800X, RTX 2080, and 32GB RAM runs like a beast on Windows.


Support for Ryzentosh is improving every day, and more and more Hackintoshers are successfully going the Ryzen route. Even the latest Catalina works.

However, there are some caveats. Because AMD is not officially supported, some things don't work, like Adobe apps, the Android emulator, and VMs. Fortunately, iOS development in Xcode works.


I'm running a hackintosh with a Ryzen 9 3900X. The OS itself runs great. The only thing to be aware of is whether your software will play nicely with a non-Intel CPU. Adobe software requires patching for example, and VMware / Parallels explicitly require VT-x and won't work with SVM.


Do you have tutorials/examples for both the installation on a Ryzen system and the patching of Adobe apps?


The canonical guide for installation is amd-osx.com ; plenty of helpful folks in the forums and discord.


It's getting better and better every month. The AMD community has really made it easier over the last few years to use AMD CPUs in Hackintoshes. You can find plenty of builds on the dedicated subreddit.

If you plan to Hackintosh your PC though, your RTX 2080 won't work.


Oh, it won't work at all? I thought Nvidia released drivers for macOS.


> We've spoken with several dozen developers inside Apple, obviously not authorized to speak on behalf of the company, who feel that support for Nvidia's higher-end cards would be welcome, but disallowed quietly at higher levels of the company.

https://appleinsider.com/articles/19/01/18/apples-management...


Nvidia released drivers for macOS up to High Sierra indeed. After that, we don't know exactly what happened behind the scenes, but apparently Apple would not approve the Nvidia drivers for later macOS releases and switched completely to AMD GPUs.


IIRC it's a little more tricky than on Intel because the processors aren't officially supported. It may take some sort of kernel hackery to get going, and then you need to be careful about updates.

I have a similar build (64GB and a 2080 Ti) but the lack of Nvidia support is keeping me from giving it a go.


> The iMac has a build-in screen which likes go bad from dust, as do the components inside.

First I've heard of this, but I guess it's a thing?

(I've got a 2015 iMac 5k with no such problem, but, ironically, my screen is blue-taped in as I'm in the middle of adding a larger internal SSD)

https://forums.macrumors.com/threads/did-you-know-about-dust...

https://hothardware.com/news/apple-class-action-lawsuit-dirt...


It's not a thing. It can happen, but it's not a thing in the same way that disappointing macOS updates are a thing.


I had 5 or 6 screen replacements on mine. Most of my friends had this issue too.


Interesting list of hardware and beautiful photos, but this blog post is missing one key item -- cost!

HOW MUCH $$$ TO BUILD THIS MONSTER of a "mac", and how does that compare to a similarly-specified Apple product?

Without this, it feels like a collection of thinly-disguised product placements.


I paid around 1600 USD for everything. Counting the parts I reused from my old hackintosh, around 2000. A similarly specced Mac Pro 8-core is around 7200 USD.


Thank you. A very useful contribution that neatly answers the question of why anyone would go through so much trouble. I would have guessed the difference in price was "only" a few hundred dollars, and obviously would have been quite wrong! I never would have guessed 3.5 times!
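The rough math, using the figures from the parent comment:

```python
# Prices from the parent comment (USD)
hackintosh_new_parts = 1600   # newly bought parts only
hackintosh_total     = 2000   # including parts reused from the old build
mac_pro_8core        = 7200   # similarly specced Mac Pro

ratio = mac_pro_8core / hackintosh_total
print(round(ratio, 1))  # 3.6, i.e. the "3.5 times" guess is about right
```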


In a way this isn't a "similarly specced Mac" though - it's using Xeon CPUs, ECC RAM, a server motherboard with 6 memory channels, a professional GPU, etc.

The problem is if you want a Mac tower desktop they don’t offer anything with consumer components.


Well, for one, the memory is far cheaper and faster. 56GB of additional memory (going from 8GB to 64GB) will set you back $1,000 from Apple, or $252 from Amazon (and before anyone complains that "the Apple memory will be higher quality": unlikely. Same manufacturers. In fact, you can get far faster memory yourself, versus the 2666MHz on some Apple machines).


But for the upper iMacs (not the base models) you can put in third-party RAM. That is what I have always done to avoid Apple's high prices (except on my laptop, where it is soldered in). I would caution you about trying to mix in faster RAM, though; many systems only take one speed of RAM. Buying your own makes the system not only cheaper to build, but often you can tweak some more speed (really latency) out of it.


I think AMD offers the better bang for the buck right now and thanks to amd-osx.com it's not that hard to build a Ryzen (including Threadripper) based Hackintosh.


Until you need to use the Adobe Suite.


Someone else in this thread mentioned there are patches available for Adobe products.


So what's the problem with Adobe exactly? Is there any fix on the horizon?


In short, some software (Adobe included) makes use of features that are available on Intel processors but not on AMD. Whilst both AMD and Intel are x86-64, they each have their own unique features. For example, on Intel you've got fast memset, which doesn't exist on AMD.


But Adobe apps run on Windows with AMD, which means they custom-build/compile their apps for each OS (which is perfectly normal), and since Apple only deals with Intel CPUs, it's understandable.

Maybe it would be better to check for CPU capabilities at runtime and have a more portable app, but what do I know; obviously Adobe PMs will not pay that cost just so some mutts can run a hackintosh.


My understanding is that it's actually a limitation of the Intel C++ compiler they use, but I can't confirm this. On Windows it bakes in some conditionals to check that the CPU is GenuineIntel before doing some things; on macOS it doesn't bother.
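A minimal sketch of that kind of vendor check - here parsing a /proc/cpuinfo-style dump rather than executing CPUID directly. The helper and sample string are hypothetical illustrations, not Adobe's or Intel's actual dispatch code:

```python
def vendor_from_cpuinfo(text: str) -> str:
    """Extract the CPU vendor string ("GenuineIntel" / "AuthenticAMD")
    from /proc/cpuinfo-style output."""
    for line in text.splitlines():
        if line.startswith("vendor_id"):
            return line.split(":", 1)[1].strip()
    return "unknown"

# Hypothetical sample of what /proc/cpuinfo looks like on an Intel box
sample = "processor\t: 0\nvendor_id\t: GenuineIntel\nmodel name\t: ..."
print(vendor_from_cpuinfo(sample))  # GenuineIntel
```

Compiler-generated dispatch does the equivalent with the CPUID instruction, which is why an AMD CPU (reporting "AuthenticAMD") can end up on a slow or broken code path.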


Adobe products were not written to take full advantage of high thread counts or GPU-bound tasks. This is why their products run much better on Intel chips, which have higher single-core speeds.


I am pretty sure that bang/buck is completely irrelevant when it comes to Apple


This isn't Apple. This is for people who want more bang for the buck and/or options


It would be nice if Apple opened up OS X a bit. I _know_ that goes against brand principles etc., but they have also lost a lot of ground back to Windows in the past five years.

I used OS X from launch but switched to Windows three years ago because it was simply a better package for me. Cloud computing has pretty much eliminated the need/preference for OS X.


> Cloud computing has pretty much eliminated the need/preference for OS X.

Could you expand on this bit? What need does cloud computing fulfil that used to be provided by OS X?


Had it not been for the reverse takeover by NeXT, Apple would have lost all that ground to Windows a couple of decades ago.

Copland didn't make it, Mac compatibles were eating Apple's remaining profits, and BeOS, also an interesting what-if, would probably not have saved the company.


They should release an ATX "Mac Board", one version with an Intel socket and one with an AMD socket for like $499-$599 USD. It would sell so well and they would still earn so much money.


They make their money selling you overpriced hardware. This won't happen. Well, maybe it would, but then every part of the ATX Mac Board would also be sold by Apple, and third-party RAM etc. would never be allowed.


Right. Plus, it would only take Apple starting to use custom firmware to allow only hardware that is exclusive to Apple's computers.


> they also lost a lot of ground back to Windows in the past five years

Is this just you and some people you've seen online, or is there some study showing them genuinely losing significant market share to Windows?


Not OP; this is my hypothesis.

In 2014 Apple Announced they had around 80M Active Mac Users.

In 2017 They had over 90M Active Mac Users.

In 2018 They Reached 100M Active Mac Users.

Apple has a fairly stable Mac business, shipping 18-20 million units per year over the past few years. They are also very proud that a consistent ~50% of Macs sold go to first-time Mac users.

If no one was leaving the Mac ecosystem, you would have expected an increase of 9 to 10 million active users per year. Except it didn't happen. The growth rate has slowed down dramatically: they are gaining ~5M users per year.
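A back-of-the-envelope version of that argument, using the figures above (all approximations from Apple's public statements):

```python
units_per_year   = 19e6   # ~18-20M Macs shipped annually (midpoint)
first_time_share = 0.5    # "~50% of Macs sold go to first-time users"
gross_new_users  = units_per_year * first_time_share  # ~9.5M/year expected

active_2014, active_2018 = 80e6, 100e6
net_growth = (active_2018 - active_2014) / 4          # ~5M/year observed

# The gap between gross adds and net growth is users leaving the platform
implied_churn = gross_new_users - net_growth
print(implied_churn / 1e6)  # ~4.5M users/year
```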

There is another stat Apple gives out: nearly all of the new Mac users were from China. That is not surprising, given that the whole iOS ecosystem and games development require a Mac, along with the rise of the middle class there. But it also tells you most of the new Mac users are from China.

This hypothesis doesn't take into account how Apple measures its "Active Mac Users", but it suggests Apple is losing millions of active users outside China. And since the US has the largest Mac user base, I would not be surprised if "lost ground to Windows" is an observation from a US user's perspective.


That presumes constant growth of desktop computers (/computers running desktop OSes.) What if it's just mobile growing at the expense of desktop generally? I know that in e.g. education, computer labs have been replaced with carts full of tablets.


Well, first, they are selling that many units, so either their users are just replacing old units with no user growth, or their "50% of Macs go to first-time users" stat is bogus.

Second, there are still over a billion Windows PCs on the market. And Intel's latest quarterly results suggest the PC market is still in a very healthy replacement cycle.

I know a lot of education is going all-in on tablets. I do wonder how big that market is; I don't have any data on it.


This.


Would have been curious to see a price comparison between the build and a Mac Pro with the same specs.


This build cost me around 1600 USD (I moved over a few parts from the old hackintosh: 2x HDD, WiFi card, GPU). So around 2000 with those. I already had the two monitors (for many years now) too.


A Mac Pro with the 8-core CPU, 48GB of RAM, 1TB of storage, the Promise bay for 2x HDDs runs around 7200 USD.


Perhaps the closest comparison is to throw in a 5K screen, making it (say) $2500 for the Hackintosh, and compare with a (non-Pro) iMac with 64GB of non-ECC RAM. It's around half the cost, plus you get to keep the screen.


Thanks a lot!


Haven't made a Hackintosh in a while, so OpenCore is new to me. Does anyone know if Messages, the App Store, etc. work, or do you still have to hack it with a real serial?


You still need a real serial but you can generate one pretty easily. Just need to check on Apple's website if it validates.

I have my PC with an AMD Ryzen 3900X running with macOS with working iMessage / App Store etc.


OpenCore works in the same way as Clover in this respect.

You still need to provide a serial, MLB, etc.

If you configure everything correctly, you should have no issues using Apple services.


In my case I'm running macOS in a KVM VM on my home server, based on a SuperMicro X10SDV-6C-TLN4F. It seems to work OK, with some exceptions. First is the graphics card which I pass through to the VM: it is a GeForce GTX 1060, and unfortunately it limits me to High Sierra, which seems to be the last version of macOS for which Nvidia web drivers are available. Both video and sound go through an HDMI cable connected to my TV. The other is the network card: if you are OK using a KVM-emulated NIC like the e1000-82545em, then you are set; if you are trying to pass through a physical interface or an SR-IOV VF, you may have to deal with a bunch of kexts, or flash the NIC EEPROM so it is seen as a card supported by drivers like the one provided by SmallTree.

If someone wants to try, here is what I started with: https://github.com/foxlet/macOS-Simple-KVM


Surprised. I'll try it with my Intel NUC running Ubuntu tomorrow.


Interesting. I actually started buying components to build a new hackintosh to replace my old El Capitan one from 2013. Man, I can't believe how far SSDs have come. My current SSD is only 250 GB, and now you can get a 1TB SSD for like 150 USD.


Kind of off-topic, but can anyone weigh in on how a 2019 (2020?) MacBook Air fares for programming? Specifically frontend programming: running VS Code, maybe a couple of frontend build watch tasks, and a node server. Currently using a 2017 13" MBP, and I assume it'd have similar performance?

At this point in my career I spend most of my days running around in meetings so if I'm doing programming I'll usually do it on my desktop. My laptop ends up being a glorified note-taking machine with only light frontend programming so I'm really optimizing for portability. Interested to hear some thoughts from fellow engineers with the newest Macbook Air.


1. I’d wait for the new keyboards first. The Airs and 13-inch Pros still have the faulty ones.

2. I’d get a low-spec MBP13 or MBP14, whatever comes out next.


The keyboard is a fair point. I had mistakenly assumed they fixed them with the recent update but there they are on the list of models with free keyboard replacements.

As for 2, I already have an MBP13 2017 and I want something even more portable. That is why I was interested to hear people's experiences programming with its lower-spec CPU.


I haven't used an Air for development, but I do know that its power supply is 30w instead of the MBP's 100w. That's a big difference. Beyond the CPU's official specs, I'd bet some throttling happens there.

That said, I recently upgraded to a new MBP because my 2013 MBP had gotten really laggy for certain things, particularly the web (and the inspector, etc.). This was weird, because on pure-compute tasks it still did pretty okay. But basic sites like Twitter were visibly laggy, and the Chrome debugger was like mud.

I did some research and ended up concluding that it came down to the integrated GPU. The processor itself was still significantly faster than a brand-new Air, based on benchmarks, but integrated GPUs have made huge strides in the past 5 years (and web pages have gotten significantly heavier) and I guess that has a more noticeable impact on basic tasks than the CPU does.

Anyway. If you're only doing the occasional, light web dev then I would guess you'll be fine. It'll be noticeably slower than your 2017 MBP, but probably still passable.


Oh, I did expect a higher Geekbench score, TBH: my dual Xeon 2687W (v1) with 128GB DDR3 (2x4-channel) gives me 830 SC and 6000/10000 MC (1/2 CPUs). And that's with thermal limitation due to the 1U cooling solution. The OS is Arch Linux on NVMe with Spectre and other mitigations disabled (compute node).

Omitting disks, that system cost me just a little bit more than your mainboard ;-) Of course two dozen 40mm/7k rpm fans and 480W peak power draw (no GPU!) are quite a tradeoff.

https://browser.geekbench.com/v5/cpu/1046367


I've been working on a Hackintosh build for the last 2 years, but since I don't need a video rig and/or a compiling workhorse, I scaled myself back to an Intel NUC with Clear Linux OS. I'm getting familiar with the RawTherapee and Darktable workflow (instead of Capture One), since I'm a weekend photographer. Clear Linux covers most of my needs, so I'm happy with it at the moment.


RawTherapee is great. I also use digiKam. Linux is great, but I needed Photoshop (GIMP is not there yet). There are also a handful of macOS apps without replacements. I upgraded my Hackintosh build and for a while had Arch Linux running on it. It runs great, but I ended up running Windows 10 to get my Photoshop fix.


> this is probably the last time I’m building this sort of computer, before MacOS is locked down forever

Any info on how they will lock it down?


Switch from Intel to their own ARM CPUs will hit the Hackintosh community really hard.


They could have done this, if they wanted to, with their own motherboard.


With the T2 chip which is already present in other Macs. It’s optional at the moment but I assume at some point it will be closed off. And then there’s the ARM CPU argument - we could get those soon.


According to a Linus Tech Tips video[1] that I saw a couple of days ago, you can use PCIe NVMe drives on the new Mac Pro, but it won't even boot unless one of the proprietary Apple SSDs is connected (because of the T2).

[1] https://youtu.be/mIB389tqzCI?t=180


You should be able to turn off the T2's Secure Boot function, which blocks booting from other drives: https://support.apple.com/en-us/HT208330


According to this newer Linus video [1], one of the T2 SSD slots still needs to be populated with an Apple drive in order to boot from other drives.

[1] https://www.youtube.com/watch?v=zcLbSCinX3U


Which makes sense: it needs to load firmware/EFI from somewhere; and in a setup where you expect your own SSD to be a fixed resource in the system, there's little reason to have separate BIOS flash.


New Macintoshes come with an Apple-produced "security chip", an ARM SoC like those found in iPhones and iPads. It communicates with the main CPU and provides it the boot code via SPI for security, but the security chip itself has an entirely cryptographically signed boot chain and will only boot Apple-provided code.

The presumption is that when enough time has passed, Apple will obsolete the non-security-chip-equipped Macs, and macOS will only boot when able to communicate with a security chip running Apple-signed code. It's quite plausible. The security chip does provide a ton of other benefits, though.


Is it worth it building a Hackintosh? Are there any downsides compared to Apple hardware (in terms of using macOS)?


I spent a lot of time setting up Hackintoshes and, in the end, I always ended up asking myself: why did I bother doing all this?

The time you have to spend maintaining it can amount to a lot. In my case, the machine would always feel sort of unstable with random freezes, etc. You just never really know 100% what's going on behind the scenes, it could be the wrong kext, or the wrong graphics card, or the wrong os patch, the wrong iso used, etc.

If you have plenty of time to play with it, I would say go for it. But if you just need your machine to do work and can't afford to have a computer that may suddenly stop working (and require a lot of time to get going again), just save your time/money and buy an actual Apple device.

The idea of having a fast and custom made computer running macOS is pretty enticing, but it's a lot more time consuming/intricate than one would like to admit.


You need to buy hardware specifically for hackintoshing. Don't just buy whatever's in fashion and try to make macOS fit.

If you do that it's mostly hassle free. Never had freezes; had stuff like sound not working when coming back from sleep (which I solved by switching to USB audio).

Of course, then something will come along and fuck you up, like Apple dropping nvidia drivers completely. Which is solved by getting an AMD video card. Admit it, it was time you upgraded the video card anyway :)


I bought specific hardware,

HP 6300 SFF: $35
Nvidia GT 710: $30
16GB of DDR3: $60
Cheap 256GB SSD: $30

More than fast enough for me. Everything works: iMessage, iCloud, etc. No crashes, no freezes, no audio issues. It started on High Sierra and is on Mojave now. Installation straightforward, though it requires concentration as you read the instructions - and no cheating.


I was gifted an HP 6300 and was astonished by how easy they are to Hackintosh. The 6300 & 8300 can be had on ebay for $75-150.

https://www.tonymacx86.com/threads/guide-hp-6300-pro-hp-8300...

I added this graphics card:

https://www.amazon.com/gp/product/B07B7YMBFC/

I even get HDMI audio support.


Does that graphics card work with the little PSU in the 6300? I have the SFF. Not sure if it even has a power cable to feed it. And is it noisy?


It's a low profile card and I don't remember it needing any power being plugged in. To be fair, I think this card is overpriced for whatever performance you would get out of it. That is especially considering I only use it for 2d acceleration. But I needed the card because I could not get displayport from the motherboard working - I suspect it is the particular CPU. Regardless, my entire outlay for this machine was the price of the graphics card.

It is fairly quiet, but I've never heard it really ramp up.

I do vaguely remember the card NOT coming with the shorter edge connector (metal L-bracket thingy). Need to look back there and see what I have going on.


I have been using Hackintoshes for the past 5+ years and they run smooth and stable. I always make sure to buy only hardware that is supported, and I have never had any problems.

Updates were a pain in the past, but recently they are being applied smoothly.

My Hackintosh is my main device (typing from it right now) and I am working on it 10h+ a day for years without interruptions, so I think it's more than worth it for me.

Apple simply does not understand what we want.


Don't forget the random times it doesn't just reboot into the OS and you're sitting there debugging for hours to log back in.


I had my previous hackintosh since 2014, and only once was there an issue. It all depends on how good your bootloader config is. If it was configured properly, it was rock solid.


Depends on a person's needs. Worth it for me. It's faster than an 8-core 6000 USD Mac Pro and a 10-core iMac Pro. Downsides are maintenance, knowing what you're doing, and checking whether it's safe to upgrade to a newer version of macOS.


It's all about your trade-off between what you want, time, and money. Do you want the desktop equivalent of a fully-loaded 16" MBP or a low-mid-range Mac Pro? Is $3000 a significant amount of money to you? If the answer to both of these is yes, it's worth it.


If you are going to spend the money, just build yourself a desktop and install Elementary OS on it. Modern Linux is fantastic. I moved to Ubuntu earlier this year and had to tweak like 2 things for seamless use.

Then if you really want macOS, look into VFIO with GPU passthrough. People are getting native-like performance with both Windows for gaming and macOS.


The main downside is you have to be super careful when applying OS updates - and recent macOS versions tend to be updated more frequently.


The problem is that a lot of the tools you need to run a Hackintosh may not be open-source or only provided in binary form (or you don't have a platform to compile the code even if it was open-source, so you'd still need to somehow bootstrap it first, either with a real Mac or with the untrusted binaries).

Running untrusted code from anonymous people on a forum, some of which runs at kernel level (like hardware drivers) is a no-no for me and should be for anyone.


Do you have an example of a tool that is not open-source and mandatory for running a Hackintosh?

I followed the "vanilla" guide to build my Hackintosh. It uses only open-source kexts available on GitLab or GitHub.

Just avoid TonyMac and their apps.


OpenCore is completely open source.


And you can build everything on Windows or Linux. Mac not needed (though I still prefer it).


I'd argue that PirateBay downloads can be a lot more secure than stuff from many commercial vendors.

In our case, the hackintosh community is highly technical and would notice any suspicious behaviour immediately.

Also, I think all the drivers are available on github? The TonyMac stuff is just a frontend that installs the same source available drivers...


I thought about it myself, but OS updates/upgrades are risky.


Upsides: price, upgradeability, repairability.


I want to test Safari in a CI pipeline because a lot of our customers are using it, and the only way to do so seems to be by running a Mac. We can test Firefox and Chrome perfectly via docker containers. Should we build a hackintosh and connect it to our AWS VPC? Any tips?


> Should we build a hackintosh and connect it to our AWS VPC?

Are you in a country with a laissez-faire legal system? What you are asking, in a commercial context, is whether or not you should violate your MacOS license agreement. At least by USA legal rules, anyway.

I've never heard of Apple litigating over this, but it's something that businesses try to stay away from.

IMO hackintoshes are for hobbyists.


Hm, definitely not in a loose legal system, so hackintosh is not an option. Then the most affordable way is to either run a (used) Mac on-prem or use a SaaS solution for browser testing.

I'm frustrated by how much more difficult this is from running Linux or even Windows on ec2.


There are SaaS tools out there like browserstack.com that allow Selenium testing on Safari and other browsers.


We're trying that, but the latency and proxying are making it slow and error-prone. We're taking thousands of screenshots per CI run and want it to be done within 5 minutes for our developer experience. We run about 20 to 40 of these CI jobs per day. The SaaS tools I've seen so far are also very expensive for this use case.
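Rough numbers behind that 5-minute budget (the screenshot count is an assumed lower bound of "thousands"):

```python
screenshots = 2000        # assumed lower bound of "thousands" per CI run
budget_s    = 5 * 60      # the stated 5-minute developer-experience budget

rate_needed = screenshots / budget_s
print(round(rate_needed, 1))  # ~6.7 screenshots/second, sustained
```

At that sustained rate, per-screenshot network round-trips to a remote SaaS browser add up quickly, which is consistent with the latency complaint above.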


Is it a dual-boot box with macOS and Windows? Or do you use two boxes? Then how do you switch between them with one keyboard/mouse? What kind of KVM do you use?


Dual boot. Two SSDs.


With all the mention of "getting the right components", what's a reference website where I can learn how to build my own system?


https://www.tonymacx86.com/buyersguide/building-a-customac-h...

tonymacx86 has a list of supported components and has suggested builds as well. Been around for years.


thank you


Quote: "The SSD is more than fast enought for my needs. Funny thing is that it’s even faster under Windows (3.5 GBps read and 3.3 GBps write)." Also OP is showing ~2.5 GBps in an image.

Funny thing: I run my Macs in virtual machines (VMware), and the speed is identical to Windows (the host OS), while all other OSes (Linux / BSD / other Windows versions) show slower speeds. I wonder why.


I thought you could only run Mac VMs on a Mac host. Has that changed, or did you hack VMware somehow?

Our automation group has a couple of maxed out trash cans just for Mac VMs. The IT department probably can't wait to replace them with the new rackmounts.


As others said, there is an unlocker for VMware which gives you macOS as an option when creating a new virtual machine.

As for hacking VMware: I got my hands on an official Mountain Lion image years ago, directly from Apple, which I have used to copy/upgrade into a new VM each time a new macOS was released. I have around 15 VMs lying around, each with a different flavor of macOS, all test machines for various client projects.



Legally you can only run them on Mac hardware, even if the host is running a Windows/Linux OS.

As for technically, it's possible. I believe you either have to edit some file or run a patcher to get it to run macOS, though.


Look for "unlocker" on GitHub.


The screenshot shows 3 GBps for both read and write under macOS; Windows has 3.5/3.3.


Correct. Also, out of curiosity, I did a test on my host and Mac VM. The program reported the same speed of ~500 MB/s on both Windows and the Mac VM (which translates into 4 Gbps in the OP's style). On Linux I got ~450 MB/s. I used Total Commander in all cases (my normal file manager) and my Witcher 3 .iso file as the test (66 GB in size).


> which translates into 4.8 Gbps for OP style

I think you are misreading/misunderstanding the performance of modern NVMe SSDs.

SATA-based SSDs, which it sounds like you have, top out at around 550 MByte/sec.

However, a PCIe 3.0 NVMe drive can easily achieve 3500 MByte/sec throughput. That's saturating the PCIe 3.0 x4 link. PCIe 4.0 based SSDs are even faster, with advertised 5000 MByte/sec throughput (but in testing may top out slightly slower than that).

Yes, modern SSDs are stupid fast.

[0] https://www.techspot.com/review/1893-pcie-4-vs-pcie-3-ssd/
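For anyone tripping over the MB/s-vs-Gbps mixup in this subthread, the conversion is just a factor of 8 (decimal prefixes assumed):

```python
def mbyte_s_to_gbit_s(mb_per_s: float) -> float:
    """Convert MByte/sec to Gbit/sec (decimal prefixes)."""
    return mb_per_s * 8 / 1000

print(mbyte_s_to_gbit_s(550))   # SATA SSD ceiling  -> 4.4 Gbit/s
print(mbyte_s_to_gbit_s(3500))  # PCIe 3.0 x4 NVMe  -> 28.0 Gbit/s
```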


Aye, it's SATA. The PC is from 2013 and aging badly; I'm planning to build a new one this year. But it has covered me quite nicely for the past 7 years. I did some quick math with my wife when she complained about why I need a new one since this one is still good: in the past 7 years she paid 20% more on her cell phone subscription than I did buying/upgrading the PC all these years. And it was the workhorse, running at least 10 hours daily, 16 on average, and plenty of times it stayed on for days.


Tangential: I am _seriously_ considering getting an Eizo monitor for personal (non-gaming) use, and was wondering if anyone here can chime in with some personal experience. A bit hesitant given the price tag. Thanks!


and the price?


See one of my other comments, please.


Can someone explain why you would want to run an operating system that actively tries to prevent you from running it?

Apple clearly doesn't want to support third party hardware and from what I've heard, you have to live in constant fear of updates.


I ran a hackintosh setup for most of my university days.

I loved OS X on my MacBook but I wanted more power and a larger screen, and I couldn't afford a Mac Pro.

Compared to both Windows and Linux, I preferred OS X's behavior in:

- the systemwide emacs keybindings in every textbox

- the consistent keyboard shortcuts (Cmd+C was always Copy, Ctrl+C was always Interrupt)

- the dedicated keyboard key (Option) for typing non-ASCII characters

- the general higher quality of apps that were available (Transmit was a better FTP client, TotalTerminal a better terminal, Preview a better PDF viewer, iBooks a better ebook reader, MplayerX a better video player, iTunes a better music player, Adium a better chat app, Transmission a better torrent app, Parallels a better VM, than anything for Windows or Linux)

- the consistent look-and-feel of apps

- the centralized notification system, Growl (before any OS had built-in notification support, the third-party Growl library was the de-facto standard on Mac, at a time when in other OSes, every app used their own notifications that constantly got in each other's way)

- QuickLook

Compared to Windows, I liked:

- the presence of a bash terminal underlying everything

- multiple desktops (Windows has this now, but didn't back then)

- Exposé (the thing that showed all open windows so you could click the one you wanted to easily switch to it)

Compared to Linux, I liked:

- Homebrew, a package manager which actually installed up-to-date packages instead of years-old packages (this was before Linuxbrew)

- official support for various paid apps like Photoshop and MS Word

- the ability to do many important things in a GUI (and therefore, the ability to do new things without needing to Google them)

Aside from the complications updates posed, I ran into no major problems with the hackintosh setup, other than I think the microphone didn't work? Everything else worked fine. Overall I had far fewer problems than I have with Linux.


What made you stop using one?


This is one of my favorite stories.

When I updated to OS X 10.7 Lion on my Hackintosh, I got a weird issue where the hard drive would wipe itself every few days – like, I would turn on the computer, and it would tell me the hard drive was blank and had no operating system on it, and I had to reinstall everything all over again.

(To be clear, this was the first issue of any kind my Hackintosh had ever gotten.)

At that point, my business had gotten successful, so I could afford to buy a 2013 Mac Pro and stop worrying about what went wrong.

So I bought the Mac Pro... and its SSD still wiped itself after a few days. Turns out, the problem was in Lion's exFAT driver. I'd partitioned it into HFS+ for Mac, NTFS for Windows, and a shared exFAT partition, like on my Hackintosh, and Lion just couldn't handle that.

I gave up on having an exFAT partition after that, but kept the Mac Pro because it was so nice (it was also much quieter and smaller, which let me put it in my backpack for hackathons).


Did you apply security updates when they came out? If not, how did you justify connecting such an outdated machine to the internet?


> Can someone explain why you would want to run an operating system that actively tries to prevent you from running it?

If you're a developer, OS X is great.

Windows is a second-class citizen if you aren't using Visual Studio.

Linux is great - it's basically your deployment environment - but you often have to mess around with it a bit.

OS X is 90-95% of the benefit of Linux, but it just works™. If you want to get work done, that's a pretty good feature.

The trouble is that, if you want to run a desktop, there are basically no good options. Mac Mini is very limited, same with the iMac. The Mac Pro is not limited, but it supports such enormous expansion that the base config is extremely expensive, while not being very performant.


> Linux is great - it's basically your deployment environment - but you often have to mess around with it a bit.

This doesn't make sense. One definitely, in general terms, has to mess around more with Hackintosh than with a Linux installation, as the hardware compatibility is much higher with the latter than the former.

I did use a Hackintosh, for some time. I still need Mac occasionally, but nowadays I just run it in a VM.


Building a _new_ hackintosh is easy, and requires very little messing around if you follow sensible guidelines. It generally gets messy after a few years and several OS updates. There are generally fewer users on forums with the same hardware specs, asking fewer questions. Apple drops support for things, and recommended methods to install/update change enough to become painful.

I just upgraded my hackintosh from El Capitan to Catalina. It wasn't all fun, but wasn't too bad either. I'll probably get another couple of years trouble free with this OS, so it was worth it. After that I'll switch to Linux or Windows for my main system, though.


Building a new Linux box is easy, too. If you can pick and choose amongst compatible hardware, there's no messing around at all to get a performant and updateable system.

I can see how someone in a particular field, such as video editing, might prefer certain macOS apps. For general software development, though, a Linux machine feels like a much safer bet.


>has to mess around more with Hackintosh than with a Linux installation

As someone who lived with an Ubuntu installation for 5 years up until 10.10, and then with Arch for around 4 years: your statement about Linux being an easy thing to handle is correct as long as you are fine with a distro like Ubuntu and don't want to change much.

As soon as you embrace something more hacky, or at least more bleeding edge, you are in a world full of unexpected surprises. Any major update can cause you hours of problems because the distro devs decided to switch from somethingX to somethingY and you had to prepare your installation before updating; but alas, you rarely read the front page of https://www.archlinux.org and now you have to roll back and do everything right.

Not an everyday scenario, but not an uncommon one either.

Not to mention that for the most part Linux remains a second-class citizen to software devs, so you won't see much general-purpose software of the same quality macOS has to offer. Which is a problem if you are going to use the machine for something other than software development.


Windows with WSL is pretty nice. I still prefer MacOS but mostly for convenience with interfacing with my other personal devices.


My experience with Linux is that if your hardware setup deviates even slightly from the majority you’re going to have trouble getting things working right. This is especially true when it comes to GPUs (dual screens on dual GPUs for instance causes trouble with both Nvidia and AMD). There’s also distro specific weirdness like how the Ubuntu will screw with your boot partitions on every disk in your machine (fixable but annoying).

With a hackintosh, as long as you're using supported chipsets, cards, etc., the larger overall configuration of your machine doesn't really matter much: it'll just work. Case in point, macOS handles the aforementioned dual-monitor, dual-GPU setup without so much as a hiccup, on AMD and (if you drop back to High Sierra) Nvidia. It's paradoxical, but hackintoshed macOS is often better at smoothing over differences between machines than Linux is.


A Hackintosh does not just work. It requires far more upkeep than a Linux system.

I certainly wouldn't do it again!


As long as you choose your hardware wisely, all you need is UniBeast and, post-install, MultiBeast. All I've ever had to do in over 10 years of Hackintosh use is re-install my MultiBeast config after an OS update.


You have to think of a hackintosh like an Android phone. You might get OS updates or not, but mostly whatever OS version it starts with will stay on it for life. You will certainly not be able to update immediately. For some people that's fine; for others who are dependent on Xcode, which needs constant OS updates, it's not fine. Really depends on your use case.


> ...like an android phone. You might get OS updates or not but mostly whatever OS version it starts with will stay on it for life.

Not all Android phones are like that -- what are you buying which doesn't get any OS updates?


Obviously android phones get OS updates. Hackintoshes also get OS updates. But in either case there's never a guarantee unless you buy a current gen flagship. If you buy an iPhone you will consistently get updates for up to 5 years. Android is more of a "best effort" kind of deal. Maybe they will update your phone this year. Maybe not. Who knows?


Not true. I’ve had my 2014 hackintosh updated regularly without issues.


Agree 1000%.

For native macs that’s probably the case, but I spent so much time just getting basic features of my hackintosh to work. If I wanted to change audio source from headphones to HDMI I needed to add/remove kexts and restart!


That just means your config wasn’t properly made to support your hardware. Mine was rock solid for 6 years.


Can you elaborate more on 'config'? Especially if someone plans to follow your build and this will be their first hackintosh. Any more links to hackintosh resources/communities? (technical ones :)

Also: could you write which models of components you bought:

a. the 580 gfx card? /nitro? pulse? others :)/

b. in terms of the 5700 you're planning to buy, do you have an exact model?

c. which exact memory (size of each stick), and was 3600 the stock speed or overclocked?


I've been using Windows 10 Pro for about a year now. I've got it running VSCode and docker with Ubuntu on WSL2 running VSCode server and the docker client. There are some minor annoyances with file permissions, but overall it is a great environment. I just couldn't justify paying 3x for half the hardware again.


I am a developer and have no problems developing Linux software on Windows desktop (namely C++ rest servers). It just works™ for me


With JetBrains/VS Code and WSL, modern Windows is actually a pleasure to develop on (IMHO, of course).

The only reason I bust out OSX is for iOS development.


Yes, but then you get Candy Crush installing itself, new KB updates every day, system updates that you cannot prevent (as if it's not your own system to choose for), optimisation processes that start consuming CPU without you asking for them, and a clunky UI. I prefer to use a Mac just for the peace of mind.


Because it's better to run an unpatched OS (considering installing updates on a hackintosh tends to break it) than to run an OS with some pre-installed software that you won't even notice?


I meant that I run on a Macbook. I was comparing Windows with MacOS. Btw, software that I won't even notice? I paid 2.5k on a laptop that I keep docked most of the time, so I guess I do notice it, along with all the other things I've mentioned.


It's definitely better than its reputation, and I find it manageable at work.

Also, while PowerShell is great, I find it just too wordy. Many commands that are 4 characters on Unix systems are multiple words, with hyphens etc.
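For a sense of the verbosity gap, a few typical pairs (illustrative only; PowerShell does ship short aliases like `gci` and `sls` that narrow it somewhat):

```
# Unix                     # PowerShell
ls -la                     Get-ChildItem -Force
cat notes.txt              Get-Content notes.txt
grep -r "error" .          Get-ChildItem -Recurse | Select-String -Pattern "error"
kill 1234                  Stop-Process -Id 1234
```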


I think there are plenty of reasons. You might like the OS a lot but not care about Apple's feelings.

Spite is another one. Or fun. It's a neat thing to do. Hacking-wise.

You might want the high performance hardware without the premium price.

I've considered it but will probably go Linux for my next desktop work machine.


> I think there are plenty of reasons. You might like the OS a lot but not care about Apple's feelings.

Apple is a company. Companies do not have feelings; they have interests. Who are you (financially) hurting by running Hackintosh?


> Spite is another one.

This is a bad reason to spend any of your life on something.


If in earnest, yes. But if more like in jest, than actual hurt feelings spite, why the heck not. It can be fun.

My somewhat humorous take back in the day - the terms said the OS must only run on Apple labeled hardware. So I labelled the hardware with a couple of Apple stickers. That should do it, right?! :-D

(If I recall correctly the stickers even came with the Snow Leopard DVD...)


> Can someone explain why you would want to run an operating system that actively tries to prevent you from running it?

Because for a variety of reasons they don't want to pay for a "real" Mac I'd assume.

Personally I'd not consider it as the value I get from a Mac is that I don't have to worry about updates, hardware, drivers and spending hours reading up on Hackintosh forums what's safe to do.


It’s that good. And the software on it is even better.


Massive savings and modability. A Hackintosh can be significantly cheaper and you're free to upgrade your disk, RAM and cpu.


I still struggle to see the point these days -- swapping CPUs and RAM makes little sense even on Windows; RAM sizes have remained stagnant for years and the small generational speed increases on CPUs mean you'll likely need a new motherboard anyway by the time the cost/benefit is there. Disk is a non-issue as there are external enclosures that aren't expensive, and performance over Thunderbolt is as good as PCIe (because it is PCIe).

Performance-wise a MacBook Pro is and has been a great machine and not incredibly expensive, plus you get the portability aspect. For the "semi-professional" market that a Hackintosh is appealing to, most folks would be better off with a MacBook Pro and a Thunderbolt dock. You also miss out on a number of core features of the OS because the hardware that supports them doesn't exist.

Combined with the headache of keeping the machine current, it's not really a good tradeoff for a couple hundred bucks. You'd be better off buying a used MacBook Pro.


Try a couple of thousand dollars. My build is 1K cheaper than a similarly specced Mini without an eGPU. A comparable Mac Pro (yes, Xeons and ECC) is 7.2K USD.


I was thinking more about what happens when parts fail, especially if you're nowhere near an Apple retailer. I used to be able to swap out disks and RAM with my MacBook Pro. Not any more.


I just can't stand Windows, I boot there to play games only.


Anything in particular?


* Consistent keyboard shortcuts. One can get by with a keyboard for way longer on macOS. It makes general usage very efficient.

* Native bash terminal.

* Just works and the defaults are fine.

* homebrew package manager is quite nice.

* iPhone integration is pretty awesome (file air drops, calls, messages all available from macOS)


Agree with you on native bash and general fit and polish of the OS. However, when it comes to keyboard shortcuts, I strongly disagree. I used windows for about ten years, then moved over to Mac for the past ten. I still am not as comfortable with the keyboard under macOS as I was windows. Main culprits are tabbing between controls (many just don’t seem accessible) and different interpretations, across apps, of things like home, page up, etc.

Best example is Finder vs. Windows Explorer: in Explorer, I could basically navigate by keyboard alone. Finder pretty much requires the mouse.
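For what it's worth, the tabbing complaint has a partial fix: by default macOS only moves keyboard focus between text boxes and lists, and full keyboard access is an opt-in setting. A sketch (the `defaults` key is the long-standing one, but verify the exact values on your macOS version):

```
# GUI: System Preferences > Keyboard > Shortcuts > "All controls"
# Equivalent defaults write; relaunch apps for it to take effect:
defaults write NSGlobalDomain AppleKeyboardUIMode -int 3   # 3 = all controls
```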


I think he meant what games you play.


I run mine for audio production, exclusively. I have some Mac-only software that is critical to my workflow, so I keep a hackintosh (which works perfectly, btw) running.

Updating an audio rig, in my experience, is done only when absolutely necessary. So I don't bother (still on 10.12.6).


You can't develop and deploy iOS or hybrid apps on any other system, and OS X is a pain to run in a VM...


I also don't get it.

Anyone doing this should instead support one of the OEMs selling BSD- or Linux-based hardware, like System76, Tuxedo, Dell, or Asus.

Either one buys into the full Apple experience, or not.


With a Hackintosh you have the full experience, since you have macOS installed. The underlying hardware is basically the same.


Try to take the Hackintosh to an Apple store to see how the Apple Care repair experience plays out.


You don’t need to. If anything fails you just install a new part 1 hour later instead of waiting a week for Apple’s support to sort it out.


Except you have to consult a compatibility list on a forum post that may be 2 years old, and if that part isn't available you roll the dice and buy something you think might work. If it doesn't, you're on your own with no support.

Speaking from experience here -- if the price of a Mac is your barrier, forego the Hackintosh and buy a used Mac.


Not really. The biggest point of failure will be the drives and they’re universal. The only truly Mac-specific part is the motherboard and there’s plenty to choose from.

There is no Mac that supports my needs, used or new.


Video cards as well; anything newer than High Sierra only supports AMD cards.

I'm genuinely curious -- what needs do you have that a used Mac would not support?


Why on Earth would anyone do that.


The point is that Hackintosh is not the Apple experience; rather, it's being too much of a cheapskate to buy Apple's hardware and Care plans.

That being the case, better to contribute to BSD and Linux OEMs.


As someone who has spent, as best I can tell, $50,000+ on Apple gear at home over the last decade, tell me again why I'm a cheapskate because I take issue with the world's richest company charging me nearly five times market rates for memory?

Oh, for the "Apple experience"? Please.

The last two Apple experiences I had involved them wanting to charge me $900 to repair a charging circuit in an otherwise perfectly functioning MacBook, and another where they repeatedly denied me warranty service on a maxed out MBP which could repeatedly be made to crash, oftentimes multiple times a day (due to what Apple begrudgingly admitted nearly six months later was in fact a design issue).

To be clear I'm perfectly happy with Apple hardware and software and realize that my experiences are outliers. But "because I'm a cheapskate" isn't really a valid argument here.


> Anyone doing this should

That's backwards. Rich megacorps are the ones that should be pressured to change their behavior, not individuals.

Plus the full Apple experience is just about getting your money, it's not actually better than the full Linux experience or the full Windows experience.


> it's not actually better than the full Linux experience or the full Windows experience.

But it is?

- Can I view my iMessage texts / make and receive calls using FaceTime on Windows/Linux? No? There you go, a feature I'd miss on day 1.

- Can I run the Affinity suite on either of those? Nope. For that matter, you still can't use Adobe products on Linux, can you?

- Can I develop for iOS/macOS? Nope.

- Is there a consistent UI for native applications on Linux yet? Nope. For that matter, is Gnome still the bare bones GUI of choice for the most widely used distros?

- Do either of those two platforms integrate nicely with the rest of my Apple devices? Nope.

- Is the X Window System still a thing? How about Wayland, is it ready yet for a retina grade multi monitor setup? Am I able to take screenshots now, for the love of all that's holy? Let me guess, probably not.

- Is Windows still the insane mess that contains Win95/2000/XP UI elements?

- Is Windows still the telemetry infested product it was a few years ago?

I could go on, and on, and on, frankly, but what would the point be? I have slightly different preferences than you have, and that's totally fine. I'm sure you get by well with Linux/Windows, just as I get by well with using macOS.

Of course Apple created macOS and Mac hardware to make money! I assume correctly that Microsoft didn't create Windows to lose money on it, did they?


Full experience presumes you buy into the platform and get used to it, drop all the alien stuff and alien ideas from other platforms, especially all the exclusive lock-iny ones that you listed.


Ok, that answers maybe half of the points listed.

What do you have in defence of the other half other than empty platitudes about "buy in", which is of course somehow different from the "lock in" Apple sheep do?


Affinity runs on Windows. I have Photo and Designer.


Even fewer reasons to use Hackintoshes, then.


Yeah, I don't understand it either.

For me, I like the hardware of Macs but hate the software (macOS), as it assumes the user is a dummy; it is particularly hostile to power users.

Heck, even my wife (who is not a power user) thinks that macOS on her MacBook Air is too limiting and tries to hide "too complicated stuff" from the user, and she's considering getting a Windows laptop after using the Air for 6 years. The only benefit of macOS from her perspective is the integration with her iPhone and AirPods.

Similar feelings from my coworkers who chose a MacBook over a Lenovo at my work (particularly, using fullscreen on 2 monitors is a PITA).


The Terminal is always there at your disposal. OS X is a real UNIX so I don't get how you're limited in any way. Why do so many developers swear by their MacBooks if it's so dumbed-down?


Maybe not every developer is a power user? There are those that are happy with UI and there are those that prefer CLI. Maybe it is a frontend/backend preference (I prefer backend BTW)?

Some of the issues that my wife had (ordinary user):

- the directory structure in Finder is strange; you don't know where your Documents folder is located

- photos, after transferring from an iPhone using the default app, land in some strange format; she would expect jpeg/heic/mov files, but what she sees is a binary DB file (how do you send photos to relatives from e.g. Gmail when you can't select them from the filesystem?)

- closing apps: when I click the red button I expect the app to quit, not just one of its windows to close, so the only way on macOS is to go to the app menu and click Quit (too many clicks, and it requires scanning for text), while I think most would prefer it the other way around (my wife does)

And from myself: why are the default Unix tools such ancient versions? One has to install brew to do anything non-trivial in the Terminal.
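The ancient versions are largely a licensing artifact: Apple stopped shipping GPLv3 software, so several bundled GNU tools are frozen at their last GPLv2 releases. A sketch of the usual workaround (version numbers are from memory of that era; verify on your install):

```
# Stock macOS, circa this thread:
#   /bin/bash       -> GNU bash 3.2 (2007, last GPLv2 release)
#   /usr/bin/rsync  -> 2.6.9 (2006)
# Homebrew installs current versions alongside, without touching /bin:
brew install bash coreutils gnu-sed grep
export PATH="/usr/local/bin:$PATH"
# note: the GNU packages' commands get a "g" prefix (gls, gsed, ggrep)
# unless you also put their "gnubin" directories first in PATH
```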


I have almost the exact same setup as you: same CPU cooler, same CPU, and the pro WiFi version of the Z390. However, when I’m doing a long-running clean build that uses the 16 logical cores, the temperature for each core gets up to 80 C. Any thoughts why that would be?


Is your environment 10C higher than OP's? Most reviews compare coolers in terms of the temperature increase over background temperature to control for this.

80C for a 9900k under sustained high load would not concern or surprise me. It's a high end power hungry CPU. 100C is the cutoff temperature so maybe more cooling might help it turbo more but I doubt you even get throttling at 80C.
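The normalization those reviews use is just the delta over ambient; a tiny illustration with made-up numbers:

```python
# Comparing coolers by delta-over-ambient: subtract room temperature
# from the core reading so tests from different rooms are comparable.
def delta_over_ambient(core_temp_c: float, ambient_c: float) -> float:
    return core_temp_c - ambient_c

# Same cooler in a 30 C room vs a 20 C room: the absolute readings differ
# by 10 C, but the deltas match, implicating the environment, not the cooler.
print(delta_over_ambient(80, 30))  # 50
print(delta_over_ambient(70, 20))  # 50
```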


you are only prolonging their foul empire, lol...

I mean, I get you (pity and understand you) if you are stuck in Logic or Final Cut Pro land with no painless way out. Welcome to the state/reality of proprietary media-production workstation software lock-in, which is arguably nearly as tight a lock-in as obscure scientific instrumentation and machine-control driver lock-in, etc... "yeah, you'll need a working install of NT 4.0"


Mac mini too expensive? I'm not sure. Also, I've run some benchmarks against my friend's hackintosh and smoked it.

I mean, read his specs, it's not cheap at all. Fractal cases are not cheap, and the RX 580 that he will upgrade once he gets to Catalina? Right, that's cheap.

Those little mac minis are beasts if you ask me. Mine is called minibeef ;)

I've added an eGPU to it and it's a powerhouse for all of my work, in spades.

Edit: my geekbench score is 5547 single and 23621 multi, that seems to be much higher than the posted specs.


A comparable Mac Mini spec, without an eGPU, HDDs, a Core i7 instead of an i9, costs 2700, which is 1000 USD more expensive than what I paid for this build. The CPU is also slower. Add an eGPU and we’re talking another 500-600 USD and still slower than an i9.


Something appears to be wrong with your benchmarking. You should be in the vicinity of 1100 ST, 5600 MT


Older version of Geekbench.


You’re using an older version of Geekbench. Use the same version as me to compare (Geekbench 5).


The Minis wouldn't be bad if they didn't absolutely gouge you for RAM and HD upgrades. A 1TB drive and 16GB of RAM add $600 to the total. What? That's at least double what you'd pay at Newegg, if not more.


The Mac Mini has no integrated GPU and is dead for this reason. Why would I buy such an elegant device if I have to hook up an ugly eGPU case to it?


> The Mac Mini has no integrated GPU

They do - they use the Intel UHD Graphics 630, which is an Intel GT integrated GPU: https://en.wikipedia.org/wiki/Intel_Graphics_Technology

How did you think they were running the display? They can't squeeze a tiny discrete graphics card in there!


Not a discrete one, but there are much better mobile Radeons available. Like the ones they put in the Mac Books. Would be nice if they offered that on the higher end models.


Yes, I know, and the iGPU is complete dogshit. Why would I spend 2k+ on a device that is completely limited by its GPU? It is not even able to run scaled 4k screens properly.

And regarding the size: Who cares? It's not a portable device, so make it a bit bigger and no one cares but is happy that the hardware is better.


> Yes, I know

Lol you said it didn't have one a second ago!


No, he said that it does not have a (worthwhile) GPU inside the case and made the unfortunate word choice "integrated". Hence the need for an eGPU.

Your interpretation of what he/she wrote makes no sense for any non-server hardware. If it can show something on the screen it obviously has a GPU (it just might be garbage).


An 'integrated' GPU is one on the processor die, not one inside the case.


Hence why I said unfortunate word choice.


It should be thicker, with a real GPU. Now it's unusable on a 4K display. Shame.



