If you are a designer, this is absolutely the best thing you can do to skyrocket your market value overnight:
Create a concept design from a popular product and put it on a slick landing page. It shows that you, as a designer, are proactive and think beyond designing standard stuff (like webpages or mobile apps).
Moreover, you are not limited by any client restrictions[1] that hurt your work (and portfolio); you learn 3D modelling if you haven't already (it's not hard, just time-consuming); if you're lucky with social news sites you get plenty of free promotion; and finally, it's an eye-catcher on any CV.
> Moreover, you are not limited by any client restrictions
But you must still limit yourself to sanity and physics. Too often, designers don't have a clue what's actually going on behind the scenes, and embarrass themselves.
This example puts 16 TB3 ports on a computer, because "consumers want it to be very expandable" and "2 columns of 8 looks pretty". It also adds two graphics cards but buries the connectors in the floor of the case rather than making them externally accessible.
Concept cars by designers may have faults such as zero visibility from the driver's seat, aggressively high front bumpers and low hood lines that look designed to kill pedestrians, the utter lack of necessary things like exhaust pipes, crumple zones and spare tires, or absurd specifications ("500 miles from the 2 cubic foot battery pack!" "600 HP V12 under the rear seat!").
Your target (other marketing departments) may not care. But it's also very possible that they encounter these limits as a part of their daily work, and will care, judging you for your lack of domain knowledge. Definitely create some concept work for your portfolio. But don't stray too far from the realm of the possible.
Ugh - did anyone take a look at his portfolio? He designs for video games so of course the specs are exaggerated.
He did something fun and most of HN is railing on him. The same criticism ought to apply here: if you're going to criticize, first take a look at the whole picture and don't jump to conclusions.
It's weird that they presented the cards as standard PCIe cards when the existing Mac Pro uses custom cards that don't have their own cooling manifold, but I don't think their intention was that there'd be connectors on the bottom you'd use.
Just like the current Mac Pro I'm sure they expect you'd use Thunderbolt or HDMI for display out. Those connectors are the same as on the existing Mac Pro, except this one has TB3 as well.
I hadn't even noticed the floor-facing video card connectors... I imagine this could be addressed by having Mini DisplayPort cables that connect to external ports on the back. I agree that VGA and DVI would likely not fit. From what I recall, most modern video cards have multiple output types though.
Just reverse them (so the GPUs are venting up), then have right-angle dongles (which are very Apple now) under a magic magnetic lid compartment. Or recess them (increase the case height) enough that you can plug in normal DP / HDMI cables.
I like it, but it's not proprietary enough for Apple to make. They didn't come out with a single upgrade GPU for the Mac Pro dustbin design, so I doubt they care much about selling modularity or upgradability these days. Their message is clear: "upgrade" means new device, which means all the products they sell are disposable appliances.
IMO, they could open up macOS to work on some variations of hardware and sell it for $300-$500. They would still make the nicest hardware, but since desktops and laptops obviously aren't their main focus anymore, they could still offer power users good tools to build all these amazing mobile apps on without limiting them to their current slim pickings. Hell, even just make Xcode cross-platform. Even just let it run on X, and use the Windows Subsystem for Linux to support it.
Using two standard full-length cards is just lazy design (even if he made it a point to use standard components). The GPUs should be on boards like they are on the current Mac Pro and attached to the monumental heatsink. This adds two fans to a design that absolutely doesn't need them. The SATA disks also seem misplaced - it should instead use PCIe storage or, even better, DIMM flash modules: big Xeons have 4 memory channels per socket, enough to let a couple of those go to storage. Having one single bus for everything makes the machine much neater.
> The GPUs should be on boards like they are on the current MacPro and attached to the monumental heatsink
This is the exact problem with the trashcan: it's a computer made for GPU computation (one of the trashcan's GPUs isn't even hooked up to the display; it's purely for computation), yet you are stuck with proprietary versions of GPUs that run the wrong sort of code.
Almost the entire GPU computing community works on CUDA; the trashcan cards don't run CUDA, and because they're proprietary they can't be upgraded.
The right solutions are either this design or one where there is a heatsink + liquid cooling bracket system that you can attach a GPU to after taking the stock cooler off. That is also a very common thing for people in GPU computing to do. The latter invalidates your warranty, so this design actually makes sense.
Just like the designers of the current Mac Pro, you don't get it.
This design (even though it doesn't work in reality) is about creating something that is functionally like the earlier Mac Pros. That means commodity off-the-shelf hardware can be used, not just specialized or outlandish components that are going to be very expensive, if anybody bothered to produce them at all. The current Mac Pro still ships with the outdated GPUs it originally was introduced with, with no upgrade path foreseeable.
And all of the performance improvements are in the GPU space. A 2 or 3 year old CPU is not that big a deal but a 2 or 3 year old GPU? I believe it's an order of magnitude slower (or more).
Heh, I work in a design studio and would temper your statement. Yes, having something like this in your portfolio certainly can't hurt - and having it on the front page of HN is most certainly a good thing, exposure-wise.
But when we interview candidates and do portfolio reviews, a project like this would be one of the things I'd be the least interested in. It does showcase the designer's creativity and skills at using 3D software, but it doesn't give you any sense of how they can work in a team, with clients, with real specs/requirements to follow, etc. It's similar to the designs that architecture students do in school: sure, the end designs might look very cool, but they don't represent capacity to work with the real-world architectural process in any way.
One of the skills I value the most in a designer is the ability to really think through the various sub-aspects of a problem from different angles, come up with many alternatives, etc. A concept design like the one posted does not show this at all.
If you are a designer, your portfolio should only have one conceptual piece of this sort at the very most, and it shouldn't be its main focus. A designer's main skill should be providing solutions or insights into real world problems, and this doesn't solve any meaningful problem other than "make something that looks cool and will never exist".
It's pretty clear you and he are at opposite ends of the design spectrum. He's an experienced conceptual designer perfect for video games, i.e. Art Center. You're more academic and instructional, i.e. MIT.
Both are legit.
PS - it's pretty clear he's just having fun with the Mac specs. Only kneejerk HNers are taking it as a serious product pitch. He's designed beautiful guns, vehicles... why not a Mac that makes Ives look seriously plugged-up?
The thread is discussing the value of this type of conceptual presentation to a design portfolio. 'Conceptual designer' by your definition would basically mean 3D visual artist. That's not quite 'design' as it's being discussed here, though I wouldn't call it academic or instructional either. For design work, the final product rarely demonstrates the skills needed to be a good designer in real business settings. Hiring managers want to see the process that led to the final product.
You might be right. But if you're a consumer, these kinds of concepts do nothing but frustrate, because they:
1) set totally unrealistic expectations
2) are total fantasy
And less informed people often think these are real if they see them. I recall having had multiple people tell me about the "iPhone holographic keyboard" from some silly concept porn a few years ago...
Yes, but suspension of disbelief is hard (and even harder for us here). The objects you design need to be able to do their jobs or have convincing explanations for why they look like what they do.
Moebius got away with a spaceship shaped like Dumbo from Disneyland's Dumbo the Flying Elephant ride because the technology that surrounded it was so advanced that a spaceship shaped like that was no big deal. Seemed perfectly doable.
I might argue that you're right on most counts, but there's one project type that'd be a step above that: creating a side project that stands entirely on its own.
It has the effect, like you said, of not being limited by clients and of learning new skills. But on top of that, you'd show yourself as an independent thinker who started closer to zero, rather than standing on the shoulders of other companies (in this case, Apple) and making spec work.
Again, not to say it's wrong, because I think you're very correct in thinking this is great for designers, but I do think this could be one step higher.
(And, for the sake of argument, I'm completely ignoring what neither of us stated: that probably more than half the battle as a professional in this industry is navigating politics, the skills for which obviously cannot be conveyed through a spec or side project.)
I liked AA employee's response: ...the thing I most wanted to get across—simply doing a home page redesign is a piece of cake. You want a redesign? I’ve got six of them in my archives. It only takes a few hours to put together a really good-looking one, as you demonstrated in your post. But doing the design isn’t the hard part...
The guy who did minimallyminimal.com did this several times, and it (probably) got him hired at Microsoft. It was interesting to see the progression in quality and scope of his work, since I had been following the blog for quite a long time before it took off.
Depends on if you want it to just look good or if you want it to be technically feasible from electrical, thermal, and manufacturing standpoints. Taking all of that into consideration would easily take 10-100x more time than just a novel visual prototype. Even look at Razer's triple-screen laptop from CES; they had one which worked, and one which looked pretty. Getting that last 10% of fit, finish, and feasibility takes a long time too.
If I used Fusion 360 (free for personal use) to design this, it would probably take me tens of hours, if not a couple hundred. Maybe you can get pre-made parts, like the memory chips, heat sinks and stuff.
Like another commenter said, fairly easy to do, but it takes serious time and creativity.
Why do people say this stuff is easy? It takes time and a lot of practice to understand the nuances of materials, lighting, how light interacts with various materials, cinematography, and not to mention the ins and outs of 3d modeling software.
This kind of design is WAY more likely to come from a PC vendor than Apple. The 16 TB ports was definitely kind of a LOL moment.
I dig it. They are useful, but that can't be done with "standard components".
The whole idea of a flexible and upgradeable PC is kind of against the Apple "tightly coupled" software and hardware story. The reason their user experience has historically (I haven't used a Mac in about a decade) been so good is that they limit the available hardware for testing purposes. They don't offer a lot of choice in hardware, but what they do support works every time.
I don't know...I got off the Apple train a long time ago. I loved my Mac Pro, but it just wasn't for me.
The PC ecosystem shipped something similar years ago (with different aesthetic choices). Combine a silverstone ft03 with a noctua d14 dual tower cooler, and you get a similar convection cooling design.
The main problem is that tool-less cases were in style back then. If you pick it up in the obvious way, you'll end up with two side panels in your hands and a broken heap of computer on the ground. Other than that, it is a totally solid and very quiet case for high tdp systems.
Actually, the main problem is that you can't put 2x full length video cards in it.
Cooling a mini-ITX board with a 14nm processor under 90W TDP is not an unsolved problem. But the standard ATX motherboard layout assumes most of the heat will be generated by the processor, and that's simply not true today. A dual-GPU setup for gaming or ML might generate 5 to 10 times the heat generated by the processor, and there's literally a couple of centimeters for airflow between the two cards.
When I see an innovative design that maximizes airflow across the GPU(s), preferably one that involves removing the stock "windtunnel" cooler and replacing it with massive "tower" passive radiators and 3x 120mm temperature-controlled case fans blowing cool air across them, then I'll be impressed. (Note that this would require either a new motherboard design or using PCIe ribbon extenders.)
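To put rough numbers on that heat ratio, here's a quick back-of-the-envelope sketch; the TDP figures are assumptions for illustration, not any specific parts:

```python
# Rough sanity check of the "dual GPUs generate several times the CPU's heat" claim.
# All TDP figures are illustrative assumptions, not specific products.
cpu_tdp_w = 65        # a typical sub-90W desktop CPU
gpu_tdp_w = 250       # a typical high-end gaming/ML GPU
num_gpus = 2

total_gpu_heat_w = num_gpus * gpu_tdp_w        # 500 W
ratio = total_gpu_heat_w / cpu_tdp_w           # ~7.7x

print(f"GPUs: {total_gpu_heat_w} W, CPU: {cpu_tdp_w} W, ratio: {ratio:.1f}x")
```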
Actually, that case fits two full length cards, or at least full-length the year it was built. The power supply is on the bottom, on the "CPU side" of the motherboard (the motherboard ports stick out the top), so you get the full motherboard length, plus the power supply height, plus a bit extra. It leads to a pretty dense, but well laid out install. There is a diagonally mounted 120mm fan that does nothing and has to be removed, if I remember right.
Crucially, convection (with help from fans) means the cards are pulling cooler air than in normal cases (at least in theory -- in practice, it is very quiet with one GPU).
I can't recommend it due to the lack of screws / latches on the side panels, but the general layout and airflow is great, and could easily be adapted for a different model with support for longer cards or even three way GPUs.
Current generation AMD cards have stock water coolers that are similar to what you asked for, but apparently third party air coolers are cooler/quieter. (I don't remember if the case I linked has water cooler mounts. I doubt it.)
Eyeballing that case, it looks like it might have the length to accommodate a modern GPU, but it doesn't have the width for a double-slot GPU, much less any room for airflow across it.
> the general layout and airflow is great, and could easily be adapted for a different model with support for longer cards or even three way GPUs.
My problem with current ATX/ITX is that it was designed before PCIe cards consumed more power and generated more heat than the CPU. An ideal solution would allow 160+mm of clearance on the obverse side of the GPU card (allowing the "wind tunnel" cooling system to be replaced with a tower cooler with 120mm fans), plus a dedicated exhaust out the back of the case for each GPU.
It would require a redesign of motherboards, cases, and GPU cards, so I'm not holding my breath. I think it's more likely that at some point we'll see cases with integrated watercooling loops, and the user would simply plug in the CPU and GPU blocks.
But then you're back to the same problem of needing custom coolers which work with the modules AND the case, where we've seen Apple never release an upgrade part since launch.
I am quite sure that Apple is more constrained by appearances than by the coupling of hardware and software. The Mac Pro design, while very unique, is so constraining that they may have simply painted themselves into a corner with no graceful way out.
If they had kept the previous incarnation, then updates could have proceeded yearly or even once every two years.
I do agree this proposal lacks a lot; it certainly has no Apple vibe to it.
Well, except for the ridiculous magazine design, which looks to be aggressively anti-ergonomic and couldn't actually work as rendered without bullets passing through one another as they're fed into the chamber.
The FN P90 has a similar angled entrance to the chamber. The rounds are perpendicular to the chamber while in the magazine, but are directed into place. I never bothered to buy a PS90 (the U.S. semi-auto version) to find out whether this design results in higher-than-typical stoppages.
Calling him a game designer based on [1] seems too limited. Yes, those renders you can find are about gaming. But he's done more than that, see his resume.
Kaby Lake desktop parts have upped the number of PCIe lanes by 20%; if the Xeons do the same then you just about have enough -- your list above is 112 lanes, and 48 * 1.2 = 57.6, so 56 lanes per socket (112 across a dual-socket system) is not unreasonable.
The chipset can add more also, but they will have higher latency. It's common on lower end systems where the CPU only has 16 or so. Those'll go to the graphics card slot (the only 16x on the small board) and all other peripherals will come off the chipset's PCIe lanes.
Definitely. They'd have to develop something entirely new to do it, which I bet would cost more than just doing a quad socketed system to get all the lanes.
I get that this is a concept not to be nitpicked to death for feasibility, and I love it. I'm also struck by the fact that a fucking computer has produced such heartache in people that somebody spent an ungodly amount of time on this labor of love.
I don't care how little of their revenue comes directly from selling Mac Pros, it's the feeling that could produce this response that they should be optimizing for, not small-ness, thin-ness, or port-deletion.
>it's the feeling that could produce this response that they should be optimizing for
As near as I can tell, this is exactly how Apple got to be Apple and they already know how it's done. When people I know who have owned non-recent Macs describe them, it's like they're having some kind of profound emotional experience. They don't even talk about it like it's a computer. I've never heard them say things like "having all the applications go to the blue 'Applications' icon on the dock is great," even though that's a useful feature they seem to understand and enjoy using; they say "it's so easy to find things" or "it just works". I know a longtime OSX user who is particularly detail-oriented and I asked him why he liked working with Sprinter vans[1] as opposed to trucks. He explained that they handled more like big cars than trucks, which is handy for maneuvering in cities and backing into cramped loading docks. He had no problem discussing his preference in detail without any prompting, the same way I'll tell anyone who asks that I prefer headset microphones to desktop ones because it's one fewer peripheral to manage and I have my desk laid out just right. When he talks about his Apple products being good, he just says they "work."
So clearly they've succeeded in the past at giving people things that they really really like. They like them so much that they don't even dwell on the specific things that make them so likeable. I've never made anyone like anything that much in my life, and if I had a process for it I can't imagine why I'd stop.
I'm a pretty die-hard Apple guy starting from the mid 80s, and I find a lot resonates with your friend's "they work" example. Me, I love the fact that the machines are beautiful, but also functional in these subtle ways. Like the fact that you can open the laptops with a finger -- they're balanced correctly, and the hinges and the closing mechanism are calibrated such that you can lift the lid without the base coming up. Such a pointless bit of design, really, but it's so satisfying that somebody spent a bunch of time working on that. To say nothing of the trackpads, which were just an order of magnitude better than any trackpad on any other brand of laptop I've ever used. Maybe high end PCs have caught up by now, I don't know. Yet.
There's something about things made with such passion, such attention. They stir up the religious impulse. I really feel that, for the first time since the Jobs comeback, I'm on the verge of losing that religion, at least wrt the Mac line. (The iPhones still seem obsessed over in all the right ways, as far as I can see.)
That's why this is the work of an aspiring designer, not an (Apple) product designer: can someone point to a motherboard with 16 dedicated Thunderbolt ports and that many lanes of PCIe, and explain why 850 EVOs should be used instead of modern M.2 SSDs? The coolers/fans are off: they're not positioned above the GPUs but above the SSDs (which produce almost no heat at all), and the SSDs themselves are located around the thermal core triangles. Why?
And the GPUs are facing opposite directions, so the air streams conflict.
I know I shouldn't be this pissed off by a stupid render, but this person could have dedicated his time to making something meaningful and smart. Instead, he's just pushing the dribbblisation of design forward. My call: this is stupid, meaningless work.
> 50% of the site was black background + scrolling for me, but I think I can get the gist of it. Nice design + expansion capabilities, right?
Apple's Industrial Design group needs to get it through their skulls that folks doing REAL pro work still need traditional expansion capabilities. At the very least, pro users need to:
1) Have the ability to expand RAM
2) Have space for two video cards (ThunderBolt 3 + video card enclosures is not a desirable solution)
3) Have space for at least a couple internal hard drives
For some reason, I don't think this will ever happen because the end result would probably be bigger, noisier, and uglier than what the ID group would allow. But, man.. wouldn't it be nice to be able to purchase a base config Mac Pro 2 with one stick of RAM, shipped with integrated graphics and the user could drop in any graphics card(s) they wish?
Hackintoshes can work fine for some, but oftentimes we just want to be able to run software update without the fear that a patch will break our bread-and-butter making machines.
I wonder what OSX-specific software keeps professionals on machines that they feel are underpowered, even if it's a rather mighty Mac Pro?
All the graphics and video editing suites, AFAICT, are now available for Windows too. For audio editing one probably does not need two video cards.
I write software for iOS / Mac as my day job and it's really gotten to a point where it feels like writing C# on Windows used to feel (which is not a bad thing). The big difference is that back then I could build a powerhouse PC for $1500 and have a handful of big monitors and really feel ahead of the power curve.
I don't feel like I have the option to do that these days, and I'm pretty disappointed by it. $4,000 on a laptop is my best attempt and it's still not.. quite. there.
Apple's vendor lock-in strategy geared towards developers appears to really be bearing fruit at the moment.
They've shifted most of their hardware lineup towards consumers, and it is the weakest it's been for developers in a long time, but developers are still hanging around because they have to if they want to target the Apple ecosystem.
People who've been in the industry for longer than 15 years would remember the last time this happened.
Perhaps this isn't the right place for this, but I'd be interested in a broader discussion about this. What do we do about it?
Even if we see it coming a mile away, or if we wake up and look around and see it happening.. what can we do?
Go back to the web is always an option, but I don't like it. I still genuinely believe that the experience I can deliver writing software for iOS is higher quality than what I can deliver on the web.
As an Android/iOS developer, I personally wouldn't need all of that power. However, I could see how professional video editors and the like, working with multiple 4K streams and wishing to stay within the familiarity of the ecosystem, would want something a bit more expandable/flexible.
The current Mac Pro and iMac do have expandable / user upgradable RAM.
> 2) Have space for two video cards (ThunderBolt 3 + video card enclosures is not a desirable solution)
It's still early days for external GPUs. I suspect they will become quite popular and maybe even the preferred solution for many people. They offer a lot of practical benefits such as interoperability between desktops and laptops.
> 3) Have space for at least a couple internal hard drives
IMO low cost / high storage density SSDs and affordable NAS devices pretty much solve this problem.
Exactly. The trashcan Mac Pro had the same idea. But if I recall correctly there were some licensing issues that prevented many manufacturers from adopting the concept, and the cost did it in.
That the idea is not commercially feasible today (as in, my client wants their project done in 2 weeks) is precisely why pros need basic expandability.
I appreciate it's just an industrial design concept, but the problem with off-the-shelf GPUs is that you need to route the DisplayPort connectors back to the motherboard in order to mux them with the Thunderbolt ports.
The Thunderbolt add-in cards have DisplayPort inputs for this purpose.
The GPU & motherboard vendors should agree on some extra headers to allow you to route these DP signals without ugly jumper cables on the outside of the case.
> but the problem with off-the-shelf GPUs is that you need to route the DisplayPort connectors back to the motherboard in order to mux them with the Thunderbolt ports
If the Intel CPU supports integrated graphics (which I realize a Xeon may not) there is actually support for sending display streams over PCIe, so it wouldn't be necessary to route DisplayPort cables from the card output to the motherboard.
The eGPU guys are doing this to display the output of an external GPU on the internal LCD of the laptop. [0] It does slightly reduce the PCIe bandwidth available as you need to transfer the rendered data back via the PCIe, which is then output from the IGP to the display. [1]
However this is still limiting because it means you're stuck with the display outputs supported by the IGP, so if you want to upgrade the video card to support a newer standard (e.g. HDMI 2.0) you'd be stuck.
A single 4k/60p/24bit display requires 1.5GB/sec, about 10-15% of your PCIe readback bandwidth.
Since it has to be both written to and read back out of main memory, that's 3GB/sec, which is between 5-10% of your CPU's memory bandwidth.
It's OK for 1080p res but falls down at higher res.
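For what it's worth, here's the arithmetic behind those figures; the frame format is as stated above, and the PCIe 3.0 x16 number is an approximate usable bandwidth, so treat the percentages as ballpark:

```python
# Bandwidth needed to ship a rendered 4K frame back over PCIe for the IGP to scan out.
width, height = 3840, 2160
bytes_per_pixel = 3        # 24-bit colour
fps = 60

frame_bytes = width * height * bytes_per_pixel     # ~24.9 MB per frame
stream_gb_s = frame_bytes * fps / 1e9              # ~1.49 GB/s one way
pcie3_x16_gb_s = 15.75                             # approx. usable PCIe 3.0 x16 bandwidth

print(f"One-way stream: {stream_gb_s:.2f} GB/s "
      f"(~{stream_gb_s / pcie3_x16_gb_s:.0%} of PCIe 3.0 x16)")
print(f"Written to and read back from main memory: {2 * stream_gb_s:.2f} GB/s")
```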
Apple don't do this on their laptops either, and still employ a gmux to switch between integrated and discrete graphics for the laptop display (although that's mainly for power-saving).
Have you tried this? The amount of bandwidth consumed is non-trivial and especially impacts compute workloads or anything that does read-back from the GPU.
I have not used this feature myself because I'm not aware that it's supported in Linux, and I don't use Windows.
> The amount of bandwidth consumed is non-trivial and especially impacts compute workloads or anything that does read-back from the GPU.
Given that the most typical use case of this technique is for people gaming, I'd say that the amount of data being read back from the GPU in a gaming situation is quite small, so this doesn't have a significant performance impact for its intended audience.
Since I don't use my dGPU for anything but gaming, do you have any more information on the performance impact on compute workloads?
I'd like to see the industry move away from having connectors on the back of the GPU at all and just provide a standard routing interface of some sort. I hate whenever I buy a new GPU having to figure out what adapters I'll need for my multi-monitor setup because my new GPU has different numbers of HDMI and DP outputs than my old one.
This seems unlikely to happen for the foreseeable future - if ever. As pixel counts grow the port standards are evolving quickly (both HDMI and DisplayPort).
A machine where the display interface is on the motherboard will have a hard time keeping up with these developments, whereas keeping the interface on the GPU will allow the user to add support for new standards over time (see: HDR, 8K, etc).
But then you are limited by the number of connectors on your motherboard. I know people who have multiple cards to support more monitors (not in SLI, just side by side).
A real Mac Pro would be an aluminum "cheese grater": a large quiet box that sits under the desk that I never see or hear unless I want to upgrade something.
The trashcan is at most a Mac Mini Pro. Having a thousand cables and external boxes sitting on your desk to expand it is neither practical nor elegant, and it gets out of date pretty fast. And quite frankly, it's ugly as well.
I have a G5 from the last generation of the cheese graters under my desk right now. The case has seen better days, so I might have to source a better preserved one soon, but I can say it runs Linux just fine.
The mechanical design is quite sound (forgetting about the HDD cage for a second), though, and they look fetching, even today. (imho much better than the trash cans)
There are some really nice PC/Hackintosh builds with hacked up G5 cases. You can even get custom-built motherboard trays that are meant to go in those cases:
I still use my 2009 8x Xeon Mac Pro as my primary workstation. Upgraded RAM, GPU, SSD, video, 4K monitor, and a couple of other things.
I'm trying to get 10 years out of it, which looks doable, maybe. It refuses to install the latest OS X releases though, so it may finish its life as a Linux box.
Will 2019 finally be the year of Linux on the desktop???
I managed to install the latest OS on mine. Basically you have 4 options:
1. Hack Sierra installer to bypass restrictions.
2. Use Target Disk Mode on the Mac Pro and another Mac that supports FireWire 800 to install Sierra.
3. Clone internal disk to external. Do upgrade to Sierra on it, then clone back to internal.
4. Remove the internal disk, plug it into an external USB case, install Sierra on it, then plug it back into the Mac Pro.
In all 4 cases you must hack /System/Library/CoreServices/PlatformSupport.plist to make it bootable on the Mac Pro.
You need to do it for recovery partition as well, if you ever want to boot to recovery partition.
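For anyone curious what that hack looks like, here's a minimal sketch using Python's plistlib. The mount path, the "SupportedBoardIds" key name, and the placeholder board ID are assumptions on my part; inspect and back up the real file first, and you'll need SIP disabled to write to a system volume:

```python
#!/usr/bin/env python3
# Hypothetical sketch: append this machine's board ID to PlatformSupport.plist
# so the installed system treats the old Mac Pro as supported.
# Path, key name, and board ID below are assumptions; check the actual file.
import plistlib

PLIST = "/Volumes/TargetDisk/System/Library/CoreServices/PlatformSupport.plist"
BOARD_ID = "Mac-XXXXXXXXXXXXXXXX"  # find yours with: ioreg -l | grep board-id

with open(PLIST, "rb") as f:
    data = plistlib.load(f)

board_ids = data.setdefault("SupportedBoardIds", [])
if BOARD_ID not in board_ids:
    board_ids.append(BOARD_ID)
    with open(PLIST, "wb") as f:
        plistlib.dump(data, f)
```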
I have the early 2009 model but I primarily run Windows 10 on it as a file server/plex server. I was having some file system corruptions on Mac OSX and decided to go the NTFS route on Windows.
But where is all the bandwidth for 16 TB3 ports supposed to come from? That's 64 PCIe 3.0 lanes, mind you.
Adding to that, 32 PCIe 3.0 lanes for the graphics cards, 16 PCIe 2.0 lanes for the TB 2 ports, 2 PCIe lanes for the Ethernet ports, 2 PCIe lanes for the USB ports.
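A rough tally of those lanes, for reference; the per-port counts are the ones listed above, and the exact total shifts a bit depending on whether you count the PCIe 2.0 and chipset-attached lanes, which is presumably why the ~112 figure quoted elsewhere in the thread differs slightly:

```python
# Back-of-the-envelope PCIe lane count for the concept's I/O list (illustrative only).
lanes = {
    "16x Thunderbolt 3 (x4 PCIe 3.0 each)": 16 * 4,   # 64
    "2x GPU (x16 PCIe 3.0 each)":           2 * 16,   # 32
    "4x Thunderbolt 2 (PCIe 2.0)":          16,
    "Ethernet":                             2,
    "USB":                                  2,
}
total = sum(lanes.values())
print(f"Total: {total} lanes")  # well over 100, roughly triple a single 40-lane CPU
```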
Absolutely doable. The modern i7's have 40 PCIe lanes. If you built a quad processor system you would have 160 PCIe lanes... Have a look at the obscene number of PCIe lanes in modern server architectures.
The bigger issue would be memory bandwidth, but I think that could also be engineered around.
Edit: SuperMicro sells a system with 8 PCIE 3.0 x16 + 7 PCIE 3.0 x8 ... 184 lanes not counting the other onboard peripherals. So it basically exists minus the Apple form factor.
The point was less about the form factor and more about the number of PCIe lanes that can conceivably be available and the associated memory/bus bandwidth that is possible.
Haha, I see. That's a blade center with 8 "PCs", each supporting one PCIe x16 card and some supporting an additional x16 card (with only 8 lanes connected).
Of course not; this isn't even possible with a dual socket system, which would also lack the memory bandwidth to do anything except idling with that many PCIe lanes. (Plus, if your IO bandwidth is pretty much the same as your memory bandwidth ... what exactly are you gonna do in terms of data processing? 1+2?)
But this is just one of many things that just don't make any sense at all about this design. It might look nice, to some, though.
If you want to feed dozens of GPUs with compute tasks then there is more specialized hardware for that with way better density. (That goes into a rack of course).
Just spent £2000 on a PC after 17 years of purely Apple because I wanted CUDA cores for my creative work.
If Apple had something like this as an option I'd have easily gone upwards of £4000 to foolishly stay within their ecosystem. Guess my wallet is better off in the universe where Apple doesn't want my custom.
I feel very much like doing something similar, but I'm hesitant to move away from macOS because of privacy concerns.
If I moved to Linux I would still need some commercial software available on macOS but not on Linux. The obvious thing to do would be to supplement Linux with Windows 10 but I worry about privacy in Windows 10.
This may be obvious to long-term Windows users, but I can only find examples from Microsoft about what they collect/track, not a definitive list[1].
Also, if anyone knows, I'd like to find out if it's possible to stop Microsoft Support from remotely accessing files on a Windows 10 machine, and whether you need a particular version of Windows 10 to do that. E.g., is Windows 10 Home more or less secure than Windows 10 Enterprise?
So for the moment, bearing in mind I don't do CUDA, I'm sticking with my slow old Mac. I should probably build a dual-booting hackintosh, but I think it would be a lot of trouble to keep going (and it also breaks the licensing terms).
Look into Windows 10 LTSB (Long Term Servicing Branch). It's rock solid, doesn't auto-update (only critical security patches), as part of the installation process you can switch off everything, and it doesn't have Edge or Cortana or any other guff. It was made for ATMs and single-purpose machines. I use one for my Plex Media Server and XProtect / Sighthound Server etc., all rock solid. The "apparent" disadvantages listed against LTSB I actually believe are advantages. Like, want new Windows 10 features? Sorry, you'll have to reinstall the latest full release of Windows 10.1 LTSB.
Take that privacy statement with a grain of salt. That's for microsoft.com accounts. When you install Windows 10 they allow you to create a local-only account and opt out of any data collection.
Yep, when we needed to buy a beefy GPU machine we did consider Apple. But no nVidia (so no CUDA) and no update in years makes it a bad value proposition. So we dropped the money on a Dell (which, besides the Tesla GPU, has far more modern cores and allows more memory).
I do understand the reluctance from Apple to built upgradable computers. They make their money on hardware sales, and an upgradable system would hurt those sales. At the same time their "Pro" gear simply isn't iterating fast enough, perhaps because not using standard components slows them down.
It's not Apple's style, but it wouldn't hurt if they gave their professional customers a three-year roadmap, just so people would know that they plan to move forward, and in which direction.
I smiled at "Standard components ... exceptionally futureproof". The way Apple are going, the next Mac Pro will probably run an ARM CPU and have the RAM soldered on.
This type of design isn't useful. Round things don't fit neatly anywhere on a desk. Air cooling makes no sense when your design goal is high performance in a small space. What would really be useful is something 2.5" thick and as long and wide as necessary. Think about it: you could lay it flat on your desk and put your screen on top of it, stand it up behind your screen, hang it on the wall behind your screen, bolt it to the underside of your desk, bolt it to the back of your desk, or bolt it to the side of your desk.
It doesn't need to be that thick or that wide even. Consider the Magnus EN1080 [1]. It's a bit taller than a mac mini, has an i7 and a GTX 1080. It's got the liquid cooling you seem keen on too.
I believe this is the future of pre-built PCs. It's so much more convenient than a big beige box or the monitor stand you're talking about.
For a while, most non-Macbook Air 'ultrabooks' were hampered by the fact that they had to have an ethernet port, or I guess enterprises wouldn't buy them? It was weird. The whole entire computer ended up being designed around its biggest single constraint.
This is exactly what's going on here. "Bigger is better" is an attempted rationalization for the fact that graphics cards are determining the design of this computer. But then look at the Mac Pro: they solved a core problem of performance machines (cooling) with its weird-looking design. There was a functional reason for its looking like a turbine. Here, the rounded ends are pointless (haha design joke).
Edit: there's an interesting problem at the intersection of industrial design and manufacturing process that this does solve for. The Mac Pro design process was obviously very involved, and it required lots of folks' attention. They obviously aren't paying attention in the same way, so the care that is needed to make something performant and beautifully designed isn't happening, and the releases aren't happening. This guy's design does an end-run around Apple-like industrial design, and in choosing compatibility with off-the-shelf stuff, probably makes the product more likely to be relevant to folks in the future than Apple's Mac Pro, which is just languishing in long product update cycles.
That's a strange definition of "hamper" you have there. What's being optimized for a "pro" computer or "ultrabook"? It's computing power, and expandability. Looks & portability play second fiddle there, although it certainly helps to have something as sleek as can be managed. But not at the expense of technical features.
Are 16x TB3, 4x TB2 AND 4x USB3 actually achievable with current/near-future hardware?
If Apple were to implement this, I'd imagine it would be N x TB3/USB3 USB-C format ports, an ethernet port and maybe HDMI (though a dongle would possibly negate that – if 2.1 can be achieved that way)
But the sacrifice is that you actually only have 80 lanes of PCIe, and it all has to be coordinated. Dual-socket computing does not quite double your processing power, and a switch adds latency. Your overall bandwidth will still be limited to what you have coming from the processor.
Instead of fancy design, I would rather see some fairly boring box that would enable Apple to use almost off-the-shelf hardware. Kind of hackintosh, but made by Apple. This way it would be much easier for them to bring regular (yearly) hardware refreshes.
Yeah I noticed that also. Seems strange to make such a mistake with so much attention to detail.
Pretty design but by my count that machine would need to be a quad CPU (not core) system to support all that I/O. Where are those 4 CPUs gonna fit and how are you gonna keep them cool?!
I like the size of this. The expandability of the machine is nice considering the current lack of refreshes to Apple's desktop hardware. I'm not holding my breath though.
I have the old Mac Pro (cheese grater), and it was remarkably expandable (and easy to expand). It's also remarkably heavy (theft deterrent).
You can go see what the hackintosh people are building with commodity hardware:
Reading comprehension: "bigger trashcan without a proper..." means that this is a bigger trash can that does not have a proper cooling system. The regular trash can, by implication, does.
Somehow the grey metal Mac Pro case shown beside the current Mac Pro and this Mac Pro concept design looks much better and more functional to me than either of the other two.
It makes a much more professional, high-quality, clean impression and definitely has the potential to house at least the same hardware as the cylindrical ones, if not more.
Looks nice but I don't think that's practical (or realistic).
That's way too many USB ports and SSDs for one machine.
BTW, for devs out there that need that much horsepower, what do you do? (I understand needing 32 GB of RAM, which was not available in the latest MacBook Pros, but when do you need that much storage?)
At my last job, we had servers with 64 cores, 1TB ram, and 100+ TB of storage that were taxed to their full extent by one or two people at a time. The job was essentially taking multi terabyte datasets and matching, filtering, and mutating them based on some customer requirements.
Those servers cost a ton of cash, and we had a handful of them. I could easily see my workflow being improved and maybe even money being saved if that much horsepower was given locally to the few who used it most. But the server model also worked fine.
Personally, I have a monstrous box at home in terms of GPU, ram, and processing power, and it's mostly useful for running automated tests in multiple VMs at a time, kicking ass with ocl-hashcat, and doing hardware accelerated ML training. I don't do those things often, but I'm glad to have a beast when I do. Also, it's fun to eyefinity dual 4k on full settings ;)
Funnily enough, the thing that had me thinking the most was the double Ethernet ports. What would that be for? Two different networks? When one cuts out you just switch to the other one?
The current real Mac Pro actually has two ethernet ports too. There are a number of ways this could be useful.
Number one would obviously be to connect to two different networks. It could also be useful in a scenario where the machine is used as a server and you want a semi-redundant backup network line. It could also be used to directly connect to a NAS box via Ethernet.
You can bond network cards to double your connection speed. You can connect to 2 different networks (load balancing or 2 different Internet Service Providers). You can run multiple virtual machines on your Mac and separate traffic instead of putting it all through one adapter. Multiple LANs are the norm now, especially if you're in IT.
The dual Ethernet ports in the current Mac Pro are only 1 GigE and get around 100MB/s (real world)[0]. I aggregate the two cards to get better speed. Even 10 GigE couldn't max out the SSDs in these machines, which are 2GB+/s r/w.
[0] by real world I mean on a properly configured network. Many companies don't have that though so you often see "gigabit" speeds in the 30-50MB/s range.
1 Gbps is pretty slow nowadays, and if you do any work that uses huge files that need to be copied to a NAS or a file server, then moving up to 2 Gbps via bonding cuts transfer times in half.
10 Gbps is expensive, uses a lot of power, requires Cat 6, and is rare outside of servers and SANs currently. Interestingly enough, there's a recent 5 Gbps spec that works with Cat 5e that might be picked up by OEMs to replace 1 Gbps. Still no chips or cards out yet, so it will probably be a while. In the meantime, bonding two 1 Gbps interfaces gives you about 200 megabytes/sec of write and read speed off an SSD-based server/NAS.
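A quick sanity check on that bonded figure; the protocol-overhead fraction below is an assumed value:

```python
# Approximate usable throughput of two bonded gigabit Ethernet links.
link_gbps = 1.0
num_links = 2
protocol_overhead = 0.94   # assumed fraction left after Ethernet/IP/TCP framing

raw_mb_s = num_links * link_gbps * 1000 / 8   # 250 MB/s on the wire
usable_mb_s = raw_mb_s * protocol_overhead    # ~235 MB/s ceiling; ~200 MB/s real world
print(f"Raw: {raw_mb_s:.0f} MB/s, usable ceiling: ~{usable_mb_s:.0f} MB/s")
```

Also worth noting: with most bonding modes a single TCP stream still rides one physical link, so the doubling mainly shows up when there are multiple simultaneous transfers.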
Should build in dual Solarflare or Mellanox 10GbE interfaces with SFP+ cages. Then you can use a TP transceiver if you want GbE, or DAC/fiber for 10Gbps.
Looks great... just needs 10GbE instead of 1Gb. Add an SFP port and an RJ45 if all you've got is 1Gb or want to try your hand at getting 10GBASE-T working.
While I really like the aesthetics of this, I think it still suffers the same problems as the current Mac Pro.
It needs more PCI card slots. It needs to support more than two GPUs. To support that it's also going to need one helluva power supply... it needs its workstation credibility back more than it needs a beautiful design.
I still think the previous generation of Mac Pro look great. They'd look even better in Space Gray.
The Mac Pro design was fun to look at, but the concept firearms on this site were the most impressive to me. Well thought out and engineered from a concept perspective, addressing pain points for the customer that requires some pretty serious knowledge. I'm surprised somebody hasn't tried to manufacture the Thor A1 at this point, as they seem to fit a pretty sizable PPW market.
I think the next Mac Pro will need to fit within a rack.
Basically merging the Xserve and Mac Pro together. It will need Dual CPU and Dual GPU space.
But maybe the consumer Mac and iPhone are simply too large a market, to the point where even smaller but profitable businesses like the server and Pro markets are being ignored. But sometimes it isn't about profits, it's about the ecosystem.
I don't want a system that "anticipates" what I may need. I want a system that allows me to update and upgrade easily based on what my needs are, and how they may change. In that way, the cheese-grater Mac Pro was ideal. Apple needs to get back to that mode for the desktop Power user.
Not bad. Reminded me of the internal arrangements in the MSI Nightblade MI2, a pleasant small desktop gaming machine. You can buy some of them for under 1k USD.
I also had one big black page. I thought it was a joke and it meant the author thought there was no Mac Pro 2 coming. Then I remembered this is not Reddit and there are no funny links on HN.
I don't know where you get this impression from unless you're only looking at what's currently (or recently) shipping. They've always used both and played them off against each other. Which vendor is used for a given model changes from one year to the next, although sometimes there will be long stretches of seemingly only using one or the other. There've been plenty of times over the last 15 years where one or the other has been predominant in shipping products, but it never lasts more than a couple years. There've also been plenty of times when the currently shipping product line featured both vendors.
This looks cool...but the inside of the machine is so unlike Apple. It is scary and futuristic in an Alienware way. Apple's design (even inside) is always more human and approachable.
Nice work, but it looks like it's transitioning from a small circular garbage can to a rectangular garbage can. The original aluminum Mac Pro shrunken down would have been the ideal design IMO.
But it's just a concept, and not a great one at that. They want dual PCIe-x16 graphics cards, which requires 32 lanes, and also SIXTEEN Thunderbolt 3 ports? For full speed, those require 4 lanes each.
Not to mention multiple other PCIe lanes allocated to USB3, SATAIII, M.2, TB2, network, and so on? By my count, the specs are pushing 100 lanes of PCIe. Even a big Extreme edition processor only has 40 lanes of PCIe, and a PLX PEX multiplexing chip only goes so far. This is basically calling for dual-socket 135W $4,000 each Xeon Broadwell E5's PLUS a PCIe mux. And thermally, even doubling the current Mac Pro, you're not going to fit 500W of graphics and 300W of processors in a 30-liter case.
It's a pretty graphic, and I like the concept of lots of ports and expandability, but it's not based on anything realistic.
It loads for me, but there is no content. There is a header/menu at the top and a long, black page I can scroll through; at the bottom there are links to other concepts, but the main content of the page is just missing.
Is your car an entry-level bicycle? What kind of servers are you running that cost 30x an actual car? We run some expensive servers but none that cost anywhere near 30x the cost of my car.
I'm including the support contracts in both numbers. I work for an ISP, so our storage systems are both large and very sensitive to downtime. You don't want x0,000 customers calling about their email, HBO Go, and billing logins failing. So the support contract is a bit extreme to match. The exact price is under some kind of NDA but let's say my car was a bit over $30k.
That must be some impressive support contract. For close to a million dollars per server I hope they come with an indentured engineer who lives in your colo ready to do a hot swap at any moment.
It's just ssh. My local library did have a dial-up line for the longest time but I think they finally retired it, that's the last one I might have actually used.
I think the last one I used was in 2003. My university offered it for internet access. It was actually just dial-in shell access, but you could tunnel an internet connection using Slirp. Nice and slow.
Looks great but will never happen -- the Mac Pro is not a priority for Apple -- sometimes I wonder what, if anything outside of the iPhone, really is a priority for them.
Cool, but if you can upgrade the GPU and storage, then you should also be able to upgrade the CPU, RAM, and PSU? At that point, it's just the old ATX Mac Pro design.
The next Mac Pro will just be really, really thin, have no ports, and get its power wirelessly from a mat underneath it, but no batteries. It'll have 8 cores and max out at 13GB of RAM.
Oooh, an artist spent 20 minutes with Photoshop and now Apple can just take his pretty pictures and start pumping out new computers. It's not like anyone needs to actually create a prototype, source components, set up manufacturing, market, and deliver anything.
[1] A classic and recommended post if we talk about clients restricting designers: http://theoatmeal.com/comics/design_hell