My 15-year-old daughter just recently got her first desktop. Why? Well, because instead of just consuming (which she does on her phone) or doing homework (Chromebook), she wanted to do video editing, and also drawing with a Wacom tablet. She found that the non-desktop options were just not that great.
Had it been a passing fancy, no doubt she would have done without the desktop, but it wasn't, so she got a tricked out desktop (intended for gamers, I think) and now has the most powerful computer in the house. She still uses her phone and (when in school) her Chromebook, but when she's in creator (not consumer) mode, she uses her desktop.
> My 15-year-old daughter just recently got her first desktop. Why? Well, because instead of just consuming (which she does on her phone) or doing homework (Chromebook), she wanted to do video editing, and also drawing with a Wacom tablet. She found that the non-desktop options were just not that great.
My about-to-be-a-college-junior daughter has been creating for years on an iPad, and she's pretty good at it. She's transferring to VCU this fall for their graphic arts program and we just set her up with a new MBP and a Wacom Intuos. She started to practice on it but finds the experience of the external tablet and viewing an image on screen much less appealing than drawing directly on her iPad.
For certain tasks, the iPad is a superior input device.
Maybe someday the iPad won't be hobbled by iOS and it will be fully suitable for desktop tasks too.
(Currently typing this on an iPad Air with Smart Keyboard.)
A tangent, but I've found the iPad Pro to be fabulous for video editing. I haven't hooked it up to a keyboard/mouse combo, or plugged it into a monitor, but it's quite capable of both of those things.
I can tell you it's simply superior to a Wacom tablet for drawing, there I have a direct comparison. A PC/desktop will certainly exceed an iPad at some level: but now that the Pro model has an M1, it's unclear at what level that is. I certainly don't edit at a level where the difference between Logic and LumaFusion becomes apparent, and I expect that Apple releasing Logic for the iPad is a matter of when, not if.
The iPad is also as expensive as a good laptop, so in more than one sense those two categories are converging.
I'm sure your daughter made the right decision for her, I'm just taking the opportunity to build on your comment with a bit more evidence that the home computer is a vanishing breed.
This is honestly an absurd proposition. A 13-inch iPad Pro with a keyboard and 2TB of storage is $2,500. Video editing is a task where you want power, a lot of screen real estate, and a good form factor. Even with the GPU shortage, you can buy an extremely powerful, capable video editing desktop with two large 27" monitors for $1k. Even before you consider the limitations in sandboxing, multi-application usage, and freedom, the Apple Tax makes this a non-starter. If you go with the lowest-end specs, that is still $800 (a price at which you could buy a significantly more capable desktop for video editing), and then you will be editing video on an 11" screen, with no hotkeys or keyboard, and only 128GB of storage (good luck).
It depends on what you're doing and what your budget is. Shoot & edit on one device, plus an excellent drawing experience, is pretty compelling when you're talking about a device for a kid to learn on, and the main things it'll be used for are video & drawing. You don't need 2TB of built-in storage for that, either.
Only way I can see that not being the option to beat, is if your goal is to learn the UI (not the concepts or techniques) for pro-tier desktop editing software. Then, yes, of course you have to have a desktop. I think "non-starter" is overstating the case against the iPad, for the particular tasks and user-profile stated.
> video editing desktop with two large 27" monitors for $1k
Most devices these days, including the iPad and things like GoPros, film in at least 720p, which fills up storage space remarkably fast. I started filming my kids' sporting events in HD and editing them on my laptop, and I completely filled up 1 terabyte of storage in no time at all. Then I had to start playing the game of moving older stuff to external storage, which totally sucks if you want to go back to it quickly. One minute of 1080p video at 60fps is around 200MB depending on the format; an hour is like 12 gigs. The only reasonable way to manage almost any amount of HD video is on a desktop system with loads of cheap(er) storage.
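The back-of-the-envelope math above is easy to reproduce; here's a quick sketch, where the ~27 Mbps average bitrate is an assumption typical of consumer 1080p60 footage (actual rates vary a lot by codec and device):

```python
# Rough storage estimate for recorded video.
# bitrate_mbps is an assumed average; real footage varies by codec/device.

def video_storage_gb(minutes: float, bitrate_mbps: float = 27.0) -> float:
    """Approximate on-disk size in GB for `minutes` of footage."""
    megabits = bitrate_mbps * 60 * minutes
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes (decimal)

print(video_storage_gb(1))   # ≈ 0.2 GB per minute (~200 MB)
print(video_storage_gb(60))  # ≈ 12 GB per hour
```

At that rate a 1TB drive holds only about 80 hours of raw footage, which is why it disappears so quickly once you start filming regularly.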
Well yeah, for bulk video storage you could eat 2TB easily. For the active files of an editing project at 1080p, especially if you're weeding through footage after each day of shooting (if you don't, you'll go nuts later), 512GB should be OK.
There's nothing absurd about it, you're just looking for something to get mad about.
Your reference to the so-called Apple Tax makes that perfectly clear.
What I actually said is that I, personally, find it fabulous for video editing. I made it clear that I'm an amateur who doesn't make heavy use of any advanced features, and furthermore said that I'm sure that the daughter made the right choice for her.
You blew into my mentions getting mad about price tags, which I never mentioned. Your assertion that someone can get better performance than the M1 for $1K including two large monitors (high DPI?) is something I'd need to see backed up, because quite frankly I don't believe you.
The one thing I'm quite confident of is that the iPad is superior for drawing to a PC and Wacom tablet, because I've used both. I'm quite confident from your pugnacious and unpleasant reply that you haven't used an iPad for either video editing or drawing.
If you are suggesting that desktop computers will be replaced by tablets, I say NO. They are supplemented by them.
I have an earnest question, as I have had to resort to mounting an SMB share in the Files app because iTunes is an insufferable POS:
How can you access files on the device and transfer them to a computer, for example?
I think that proprietary, platform-limited software on the iPad is typically not mature enough to serve as a main video editor or DAW. I record audio and produce music, and while my iPhone and iPad are great for making MIDI sequences, and even though I CAN connect a USB audio interface, it's clunky and the opposite of why I would use the device, i.e., on the train or away from home. I typically record my drum set or guitar or trumpet with a rather large setup involving a rackmount audio interface and hardware mic preamps of good quality, and let's face it: drum sets are large.
I know many folks who use a tablet connected to their desktop DAW over the LAN to control parameters on a synth or effects processor. In live sound setups it's a great remote. It's not a replacement for a larger computer, though; the size and scale of the peripherals and tasks involved are just too large. The form factors have legitimately distinct purposes.
> A 13-inch iPad Pro with a keyboard and 2TB of storage is $2,500.
The 2TB of storage accounts for about half the cost there. You can get a model with less built-in storage and just use an external drive, and the price drops to around $1,400.
I haven't found 256GB to be much of a limit in editing video, and I do shoot 4K. But I've only really worked with a couple hours of footage at a time, and I'm comfortable with keeping my "b roll" on my NAS. 512GB would probably be a better buy.
Or Apple could stop ripping off their customers and we could just be comfortable paying them a reasonable amount of money for a reasonable amount of storage.
iPads and Apple's processors have purpose-built silicon for video encoding and decoding. My M1 Air is faster than my 2017 15-inch hexa-core and respectably close to my 2070 Super in Handbrake. You can probably get away with an M1 iPad Pro for a LOT of tasks. Even the T2 chip in my 2017 Touch Bar model is faster than its Intel processor in some encoding tasks I've done.
As someone with formal training and experience in video production with Final Cut Pro and Premiere...I find editing on iPad to be an utterly infuriating experience.
I gave it a sincere try, but the UX is so utterly subpar that it's simply not worth the time or effort anymore. The file management on iOS is atrocious: support by various editing apps is questionable at best, and even if it wasn't, it's incredibly slow. Waiting while importing high-res videos off SD media to the file system, then turning around to wait an equally long time for the media to get imported into your editor is extremely irritating. There's also not a lot of serious competition in terms of quality editing apps. It seems like everything other than Luma Fusion is gimped to the point of near-uselessness for anything but the most basic of cut-and-crop type workflows.
iPad did have a slight edge in render performance and efficiency versus ultraportable notebooks like the MacBook Air for a brief period, between the A12X-based Pro release in 2018 and the release of the M1 MacBook Air in 2020, but now that the M1 Air is on the table, it's well worth the few extra ounces to go with an M1 MBA for mobile video editing.
Editing video on the iPad is, it pains me to say, "disruptive".
I believe you when you say it's infuriating to someone with formal training, used to the professional tools. I wouldn't know, this is how I've taught myself to edit video.
It's inferior and the software is cheaper. The hardware isn't necessarily, but it sure is if you own an iPad and don't have a dedicated machine for editing video.
And the experience is steadily improving, and at some point it might well "worse is better" the specialized hardware and expensive software which is the industry standard.
Or it might not. Time alone will tell.
It's worth pointing out that the current iPad Pro also has an M1, so the choice between an MBA and an iPad Pro is much less clear along that dimension.
I generally agree with you and recognize that I'm coming at this from a somewhat niche perspective.
I think the thing that frustrates me the most is that Apple has repeatedly failed to fix the performance and usability issues around file management that have plagued iOS/iPadOS since the feature was introduced. Fixing the poor performance and increasing the versatility of the file manager (or allowing third-party file managers, as Android does) would massively improve the editing workflow on iPads, and for me would probably push it into the 'good enough' tier that might lead me to leave my MacBook at home when editing video on the road.
There's definitely some latent potential for video editing to be redefined along broader task-and-genre lines. That is, in essence, the difference between "consumer" and "pro": a pro is burdened by having to reshape the medium in utmost detail to fit a structure that they planned, while a consumer expects readymade media that simply "filters" their efforts through a predefined strategy.
Since video editing is no longer governed by literal machinery, it can deprofessionalize into a set of smaller consumer languages, though if you imagine all the forms of editing that exist, it's unlikely that we'll have total coverage anytime soon.
>> I can tell you it's simply superior to a Wacom tablet for drawing
IMO, yes & no. For certain types of work, an iPad Pro is amazing. For direct sketching and drawing, for sure. It's easily my favorite drawing tablet for those.
For certain other types of work, for example a colorist's work of filling large areas precisely within black lines (like comic-book coloring), or maneuvering around a huge document, a commercial-grade Intuos tablet (or the like) is a better match.
One of the biggest reasons is that the Intuos has physical, raised "quick action" buttons which can be programmed to switch between tools, so you can zoom, rotate, brush, erase without ever looking at a keyboard or even looking away from the screen. This is crucial in certain workflows. Edit: also, the fact that your hand is never in the way on an Intuos is really key.
Admittedly, this is a very specific use case, but I figured I'd mention it for completeness' sake.
She's doing storyboarding-type work, and we did check into the iPad option first before getting the Wacom (which I think might be Intuos). It did not measure up. Of course, the iPad does all kinds of other stuff, but I think that might be the problem. The Wacom is made for drawing, and that's about it, and thus it is very good at it, apparently.
iPad Pro, Procreate, and if you really need it, a paper-like sheet over the screen (various manufacturers make them)... it's 98% of the Wacom experience (the 2% being the physical buttons). On the flip side, taking your iPad Pro out and drawing outside is a lot better than taking a portable Wacom outside.
I now have decades (shit, I'm old) of experience with these types of devices in professional settings. There are two classes: one with direct drawing on the screen, the other indirect. I have my brain and muscles trained for both, and I'd take the former over the latter any day. It's more natural, and in some cases necessary (hand-drawn animation is way slower by the indirect method, if not impossible).
Having said that, I'd only consider and compare three devices (there are more). A Cintiq attached to a computer: a fantastic experience, but highly non-portable and it requires a lot of space. It allows for Photoshop, which is still No. 1, as well as pro 2D animation tools. A portable Wacom/Cintiq, which is a PC in a fat tablet form: not that great (heats up, battery, fat), but it runs all the Windows tools you need. And then there's the iPad Pro with its Pencil.

Procreate is fantastic and saves the show on the app front. The Apple Pencil and screen surface are a subpar experience compared to Wacom nibs and surfaces (not the Cintiq one!!), but it takes like half an hour to get used to and then you never give it a second thought. There are screen overlays that mimic the Wacom surface, though; I just don't need one. Also, the iPad Pro has the advantage of less space between the tip of the pencil and the image being shown (less of a gap between glass and drawing), and that amazing refresh rate is, well, amazing. I have an "unlimited" budget for these things, and my first tool to grab for drawing and painting is the iPad Pro. For animation and matte work (where you need different tools), it's a Cintiq on a Windows machine (why not a Mac is a different topic altogether).
That's the difference in a nutshell: I simply can't get comfortable with indirect drawing. Maybe with enough practice, but there's never been a forcing function to get that practice. It almost feels painful, to either look at my hand or look at the drawing.
Which leaves the Cintiq, and when I used one, I thought the screen quality couldn't touch the iPad's; plus it's at least as expensive for a single-purpose tool. It wouldn't surprise me if Cintiqs have caught up with the iPad on that dimension by now; my experience is about seven years out of date.
>> The Wacom is made for drawing, and that's about it, and thus it is very good at it, apparently.
They truly are. One thing I might also mention: in my experience, the wireless connectivity on Wacom tablets is less than great. Way too much lag. I recommend using a wired tablet if at all possible. This may be old advice, but I couldn't deal with any lag whatsoever when using one for hours at a time, and I disabled the wireless mode of my Intuos 4 (yes, it's quite old, but it still works fine).
> A tangent, but I've found the iPad Pro to be fabulous for video editing
Doesn't the sandboxing and limited access to the file system get in the way?
Can you export video to non-Apple formats, collaborate with other people via a version control system, easily import footage from non-Apple devices, and so on?
Can you even cleanly separate your video editing project from your personal holiday photos/videos? (Does iPadOS expose a directory structure to the end user yet?)
> Doesn't the sandboxing and limited access to the file system get in the way?
Why would it get in the way? Any video editing software on it is going to be designed with that in mind, and will share with other video-related software just fine. Lots of desktop video editing software (Premiere, to pick one) starts with "OK, step 1: import all the clips/videos for this project into this program," so even in the worst case it's going to look a lot like the workflow video editors are used to, and those workflows existed and became the norm on desktop before iPads even existed.
> Can you export video to non-Apple formats
Of course. Why wouldn't you be able to?
> collaborate with other people via a version control system
Never seen it for video editing, including when I worked at a video production company, though that was years ago, so maybe version control is normal even for amateurs and small shops now? If it's around and popular, I'd be surprised if there's not "an app for that," or if it's not built into some of the higher-end editing software on iPad. If things still operate like they used to, then for collaboration you're talking a NAS shared over SMB for largish projects, or Dropbox for smaller ones, and both of those are fine with iPads.
Not sure that they do. They could, but I don't know of any significant video editing plugin ecosystems on iOS. That stuff's too rich for my blood: $200 to make overlay text look a certain way? $400 for a noise-reduction algo? Not for me and my casual stuff. Again, if you want actual pro software, you're looking at one of a couple of desktop application vendors, that's true. If you're looking for pro-tier software for home use, and learning that pro-tier software is your main goal and/or you actually need what it can provide, then you're probably talking Adobe Premiere, specifically, and yeah, that's desktop only (and about $250/yr). If you just want to learn the process and techniques of non-linear video editing and put together (potentially quite good) edited videos or films, the iPad's capable enough, with the benefit that, in the original use case posted here, it's also a damn good drawing tablet and you can use it to shoot your videos too, again assuming you don't have actual pro-level needs. But I wouldn't argue it replaces a desktop with Premiere, for a bunch of reasons, including that everyone assumes you've got Premiere (and often After Effects) available when it comes to plugins.
I haven't used one since the iOS/iPadOS split, so I honestly don't know how much more flexible they've become.
I'm more familiar with iOS on iPhones, though, and I can't see how I'd do 'serious creative work' (which usually ends up involving large numbers of files that need keeping organised) on an OS that has traditionally tried so hard to hide the underlying file system from the end user.
And I've never tried connecting a camera or video capture device directly to an iOS device, so not sure what is or isn't possible in that area.
> Doesn't the sandboxing and limited access to the file system get in the way?
No.
> Can you export video to non-Apple formats, collaborate with other people via a version control system, easily import footage from non-Apple devices, and so on?
An iPad Pro is great, but not superior to the Cintiq Pros imo. Especially the 24 inch and above. Once you’ve had the extra space you wouldn’t be willing to give it up for a 13 inch screen. I keep hoping for an iMac that you can use the Apple Pencil on for that reason ;P
The iPad Pro is a device that can definitely edit video, but I really only use it as a last resort. For me it has never been a question of power but of comfort: the M1 could be the fastest CPU in the world, but what do I care when I can't run 32-bit software on it! And then there's the ergonomics... unless you're using a mouse and keyboard with the iPad, it's an insufferable experience, and even then it's not a given that your NLE of choice will support keyboard shortcuts, and then we're back to square one again...
Arguably, a lot of the same things that are wrong with 32-bit software. If we're being pedantic, both of them should still be supported: but they aren't.
> So, I talked to Steve on the phone [about adding a standard pen and penholder]. I said, “Look Steve. You know, you’ve made something that is perfect for 2-year-olds and perfect for 92-year-olds. But everybody in-between learns to use tools.”
-- Alan Kay
Non-slave adults are the vanishing breed, the onslaught against general and self-determined computing is just one of the symptoms of that.
> So, I talked to Steve on the phone [about adding a standard pen and penholder]. I said, “Look Steve. You know, you’ve made something that is perfect for 2-year-olds and perfect for 92-year-olds. But everybody in-between learns to use tools.”
-- Alan Kay
Steve Jobs was right since most modern smartphones don't use styli. (Nothing against the Samsung S pen though.)
Alan Kay was right because the Apple Pencil is pretty great.
It's superior in the short term, more convenient, but in the long run it will lead to those people going away, and to their having no means to fight back against being made to go away.
> tricked out desktop (intended for gamers, I think)
Generally this is just a marketing/SEO thing. A huge percentage of desktop hardware and accessories get the word "Gaming" slapped on the front even though they have nothing to do with gaming. Keyboards, mice, mousepads, headsets, speakers, monitors, cables, etc.
Principally, I think "gaming" now means "graphically high-powered". If you are going to be word processing, internet browsing, etc. then this is not as important and may not be worth the cost. So the desktops with a beefy GPU, etc. get called "gaming" as a shorthand for that.
Indeed, I know a few people who have been holding out for months and months to buy a GPU on its own.. and they have settled for just purchasing a whole prebuilt system because they're sick of waiting. Pretty frustrating all around :(
For the average person this might be kind of true; however, I can think of at least three places where desktop computers are very much cultural and symbolic:
* “Gamers” or other enthusiasts with their battlestations [1]
* Finance workstations with many monitors and Bloomberg terminals [2]
All of those musicians' workstations have the actual computer hidden somewhere out of sight. There's a monitor or two and a computer keyboard lying around (all wireless, it looks like?), but the desks are dominated by musical keyboards, mixing boards, and speakers. Sometimes there's a laptop sitting on one. All the cables are hidden, out of the way.
Take away the monitors and the computer keyboard, substitute a couple of multi-track reel-to-reel tape decks, and it looks exactly like what I'd see when I'd visit my dad at his audio engineer job in the 1970s.
The computer itself is just an appliance, stick it under the desk next to the mini-fridge. Maximize the available surface used for getting the work done; as long as the computer's got enough airflow to not overheat when you push it hard, who cares where it is?
Well, gamers care I guess, they all have those crazy light-up cases with clear sides to show off how much they spent on their custom-built rig. The computer's become an object of desire there.
I'm a gamer and my rig is a mini itx tucked away. I like it quiet and performant. Yes the market slaps RGB lighting on anything and everything gamer but it's up to gamers to buy them.
I like having my case light up with the temperature of the components.
I've never succeeded in making it anything but a deep green, and it gives me a nice and fuzzy feeling, knowing that everything is at an optimal temperature while the fans lazily spin behind the tempered glass.
Ha! That's such a fantastic idea. I always keep my lights either off, or on white because I kind of dislike RGB lighting. I might have to look into this, did you use any specific software for this?
It is a setting in the NZXT control panel for me (I only used NZXT cooling components in my current build), but other manufacturers often have a plugin architecture for LED control, so it's usually not that hard to set up with others as well.
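The temperature-to-color idea is simple to sketch in software, too. A minimal, hypothetical version follows; the 40°C/85°C thresholds are my own assumptions (vendor tools use their own curves), and reading the actual sensor value (e.g. via psutil's `sensors_temperatures()` on Linux) is left out:

```python
# Map a component temperature to an RGB color: deep green when cool,
# shifting through yellow to red as it heats up.
# The thresholds (40C "cool", 85C "hot") are illustrative assumptions.

def temp_to_rgb(celsius: float, cool: float = 40.0, hot: float = 85.0) -> tuple:
    """Return an (r, g, b) tuple with 0-255 channels for a temperature."""
    # Normalize into [0, 1]: 0 = fully cool, 1 = fully hot, clamped.
    t = max(0.0, min(1.0, (celsius - cool) / (hot - cool)))
    # Cool -> pure green, hot -> pure red, linear blend in between.
    return (int(255 * t), int(255 * (1 - t)), 0)

print(temp_to_rgb(35))    # idle CPU: stays pure green
print(temp_to_rgb(85))    # pegged at full load: pure red
```

Feeding the result to whatever LED SDK your case fans use is the vendor-specific part.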
SpaceMouse works just fine on laptops; it's just kind of pointless when your software is pushing 15fps because of the shitty graphics card. You can't get Quadro cards on laptops, as far as I know. So anyone using CAD software regularly is stuck using desktops.
If the topic is home computing, I'd say audio/music workstations are a bit outside of that. However, if you allow those, then video workstations would also follow.
Eh, it depends on what kind of work you're doing within the music industry.
Solo studio tracking artist? Yeah, an iPad is probably good enough for you to practice with and do some rough scratch recordings for demo purposes.
If you're running a studio, you need the I/O to handle at least a few physical devices with dedicated PCIe lanes, and a high resolution input device like a mouse or trackpad to make precise edits in a DAW. Plus a keyboard. Audio/video editors love their keyboard shortcuts just as much as developers!
While it may be possible to connect any one of those devices to an iPad, connecting all of them while keeping the iPad easy to disconnect is impractical in reality. The iPad in its current format is not a great device for professional audio recording and mixing, and if you changed the format to better fit that workflow, you'd have described a Mac Mini, and you should buy one of those instead.
These aren't cultural or symbolic uses, though. People who do these things need powerful chips and a cooling solution that keeps them from throttling under load. Most modern laptops are now ultrabooks with poorly designed cooling (due to the thinness of the devices) that throttles under load, making them unsuitable for these uses. Even M1 MacBooks have issues driving multiple monitors.
I'm a filmmaker and visual effects artist who works with ray tracing rendering a lot, I'd say we're definitely a significant niche! Recently bought myself a new rig so I could render things 10x faster, totally worth it.
If I weren't a software dev, I would ditch the laptop/computer and go with a mobile phone alone. For day-to-day reading/banking/mailing, everything I need is there.
I think I could go full mobile phone if I had a cloud server plus a cloud IDE, because I can connect a big screen to the phone with a mouse and keyboard anyway. It's only dev tools that are not really accessible on Android/iOS.
Having everything I need to work on a single device that lets me jump on a plane and go anywhere: that is a great idea. Well, maybe I would need to carry a keyboard and mouse, but I hope that screens will become available in every hotel room/office space, with the option to connect a phone.
> Finance workstations with many monitors and Bloomberg terminals
The only person I know who has a Bloomberg terminal uses a remote machine and just a laptop at home, and he works from home. However, he is not involved in day trading directly, so maybe those folks need desktop computers just for the graphics... although maybe even that can be done via an external graphics card, since the processing power is already there on powerful laptops... now I'm curious.
There's nothing better than a huge display with high resolution, keyboard and mouse. The computer might be in a separate chassis or a laptop or whatever but a huge display and a mouse are indispensable.
Honestly, I personally love the aesthetic of big whirring fans and blinking lights. If you offered me an M1 MacBook Pro but it would be buried in a 7 foot tall cupboard plastered in lights and fans that occasionally, randomly, sounded like a jet engine taking off... I would be in heaven.
I’m 36 right now and have always made sure to have a desktop computer with accounts for each family member and it’s used all the time.
I agree it’s a feature that’s disappearing, but if anything it’s because people are getting more personal computers, like a laptop or modern smart phone, than ever before.
IMO this is an underrated setup. Laptops are nice, but having an 'official', stationary, desktop computer builds up associations and mindsets that help with whatever it is you do on it. I didn't have a desktop until I was 26 and it was like going from studying in random chairs and tables to having a proper desk.
Fair point. I guess I watch (my) kids use them enough (tossing iPads/tablets into this group too) to make me feel like they are a personal computer. But you are right. They are not.
I have about a 6 year old iMac and MacBook Pro. I'm guessing, depending on specs, when I upgrade, it will be to just (the rumored) 14" Apple Silicon MacBook Pro.
My desktop computer, which doubles as my workstation, is going to be 10 years old soon. The upgrades over the years have kept it a beast to this day; it beats a $5,000 MacBook Pro into a pulp for a fraction of the price. I'm not considering ditching it for now.
The place desktops will always win is sustained performance. Sure, a $5000 Mac can run very fast in sprints, but as soon as it's loaded up and running at 100% CPU for more than a couple minutes it's going to start throttling. That's pretty much unavoidable for a mobile computer - there's only so much space to move air around. Processors are just fancy ways to convert electricity to heat when it comes down to it, and desktops have lots of surface area to dissipate that heat.
The only laptop I've ever owned that had good sustained performance was a Dell Precision workstation machine[1], which could run a quad-core i7 at full load at 3.8 GHz for hours thanks to its pair of massive internal blower fans.
iMacs are usually laptops in all but name. To hit the same thinness with a giant heat-producing screen, most of the parts are their mobile equivalents.
Edit: on slightly closer inspection, this isn't true for the 27" models!
The 27" models have traditionally used off the shelf socketed CPUs though, which is unusual in AIOs. Regular 27" models use desktop i9s and the Pro uses a desktop Xeon. Both can technically have their CPUs upgraded even if opening them is a pain.
Thank god for them being thin, I actually had to throw away my 2013 iMac because it was just unbearable reading my emails with x-ray vision and noticing just how thick it is.
What is hilarious is that the thin iMac weighs the same 20 lbs as the one that was a few inches thicker and had better cooling and user-replaceable parts.
I don’t see how that is possible. Desktop PCs have almost unlimited space — cases can be measured in cubic feet. MacBooks are always space constrained because of their case size and weight minimizing, so there is an inherent disadvantage in the size of components like graphics boards. And a size disadvantage means fewer transistors and less cooling.
To be fair the mac pro went through its own horrible shrinkage phase where it was not much larger than my $50 air filter. The old aluminum case used to be pretty large and pretty spacious internally, with many extra bays waiting for you to give them purpose. Not sure how the newest one stacks up to the first mac pro/mac g5 case.
The M2 isn't out yet, so we don't know what it can beat. But core for core, Apple Silicon has the best performance by a wide margin, with much lower power consumption than any x86 part (because while we weren't looking, RISC architecture really did change everything). Plus, its on-package memory will equate, in a Mac Pro-like system, to gigabytes of L3 cache. Add to that that Big Sur and later are specially optimized for Apple's chips, meaning that performance cores won't be burdened with the background tasks the OS performs, leaving 100% of their throughput available for user applications. So yeah, core for core the Threadripper will get smoked, and I wouldn't be surprised if an M2 with half as many performance cores as a Threadripper remains competitive with the TR for representative tasks (such as video editing, rendering, and encoding).
I think a lot of the hype around the M-series chips is unwarranted, and you will be able to match or exceed their performance for serious loads.
However... they're QUIET. So unless they manage to fuck up the desktops completely, I'll be getting one. Not with the laughable current 16 GB of RAM, though.
My M1 Air renders and edits 4K as well as my Ryzen 3700X with 64 GB of RAM and a Samsung NVMe SSD.
It loses out in encoding, though: I can't get the same quality/size ratio with VideoToolbox as I can with NVENC.
Upgrading a 10-year-old computer (maxing the RAM, swapping the hard drive for an SSD, adding a graphics card) is a lot cheaper than buying a basic new one, and the result is a lot better.
For the price of a MacBook Pro you can literally get a more powerful desktop and a more powerful laptop. Or an even more powerful desktop plus a laptop, and just use a VPN with remote desktop like a normal person.
I have a MacBook Pro (though a 15") and a PC, but also a ridiculously underpowered Chromebook with a crappy 11" 1366x768 screen. Lately, when I'm out and about in the city, I find myself grabbing the Chromebook and SSHing to my desktop (I've got ZeroTier set up). This is my ultimate portable duo. It could be better, but it's lightweight, the battery runs a full day, and I have all the power I need. Then when I get back home it's even better, with less latency.
That's a setup I'd wanted for quite some time, and I know it's been possible for a while, but ubiquitous, fast, cheap wireless Internet is relatively new, I guess, as are all-day batteries in a lightweight package.
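For anyone curious what a setup like this looks like, here's a minimal sketch of the client side, assuming ZeroTier has already assigned the desktop a virtual-network address (the address 10.147.17.5, the host alias, and the username are all made up for illustration):

```shell
# Add an ~/.ssh/config entry so "ssh home-desktop" works from anywhere
# on the ZeroTier network. All values below are hypothetical.
mkdir -p "$HOME/.ssh"
cat >> "$HOME/.ssh/config" <<'EOF'
Host home-desktop
    HostName 10.147.17.5
    User me
    Compression yes
    ServerAliveInterval 30
EOF
# Show the entry we just added:
grep -A4 'Host home-desktop' "$HOME/.ssh/config"
```

`Compression` and `ServerAliveInterval` help on flaky mobile links; after this, `ssh home-desktop` from the Chromebook reaches the desktop no matter what network either machine is on.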
No. In the heyday of mainframes, people would remote in with a terminal. That's where the terminal protocols many people still use on a daily basis actually come from: text was sent over the wire to display on your terminal screen, you typed into a keyboard that transmitted your input across the wire to the mainframe, and the mainframe responded with new screen updates.
Using your desktop remotely from a cheaper but more portable device is pretty much the same model (not the same thing, but model) as the way many people interacted with mainframes last century.
Portability is not a problem; my employer provides me with a powerful laptop that I keep permanently. They are different tools that serve me in different situations.
I can only imagine how the OP would react to a reality where I'd estimate the majority of people use a thin candybar touchscreen brick with maybe 3 buttons as their primary computing device outside of work.
I have more "home computers" now than ever before. It all started when I figured out you can network boot the Raspberry Pi. Now I have one connected to a cheap monitor in nearly every room in the house. Because you can switch what OS they boot into just by renaming a symlink on the server, sometimes they play music, sometimes they show movies, sometimes they show live TV, sometimes they browse the web.
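The symlink trick described above can be sketched like this (the directory layout and the Pi serial number are made up; in a real netboot setup the server exposes these per-serial directories to the Pis over TFTP/NFS):

```shell
# Hypothetical boot-server layout: one directory per OS image, and a
# symlink named after each Pi's serial number pointing at its current OS.
SRV=/tmp/netboot-demo
mkdir -p "$SRV/os-kodi" "$SRV/os-raspios"

# Pi with serial dc1a2b3c currently boots the media-player image:
ln -sfn "$SRV/os-kodi" "$SRV/dc1a2b3c"

# Switch what it boots next time by repointing the symlink:
ln -sfn "$SRV/os-raspios" "$SRV/dc1a2b3c"
readlink "$SRV/dc1a2b3c"
```

On the next reboot the Pi fetches its kernel and rootfs from whatever directory the symlink points at, so "changing the OS" of any machine in the house is one `ln -sfn` on the server.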
I'm probably a little eccentric but I really enjoy this a lot more than simply carrying a laptop around the house.
I've just bought my first desktop in about 20 years. The main reason: the bloody clutter.
I have two (work-issued) laptops, one running Windows and one running NixOS, both with SSDs too small for me to dual boot effectively. Then there are my two external monitors, my mechanical keyboard, and my mouse. I need a keyboard-and-mouse switch, since the laptops have too few USB ports and I want to share peripherals, plus USB-C hubs on the laptops to get video out to more than one monitor while also providing power and reading/writing SD cards.
My desk is an absolute cluttered mess.
Minimalist laptops/tablets just end up producing more clutter for me, are hard if not impossible to upgrade, and tend to suffer from planned obsolescence. Now that I don't need to travel to work every day, I'm happy to have a desktop at home and use a co-working space with a Raspberry Pi when I do need to go into the office.
The biggest desk/workstation upgrade I have had in years was putting my monitor onto a wall-mounted arm. The desk space under the monitor magically opened up into a vast empty plain that I could then fill up with... more cables. But at least I can do more now :-)
Along the same lines, have somewhere to put your keyboard and mouse.
Bam! You now have a full normal desk to draw, paint, glue stuff, solder etc.
I live in a 2 bed apartment with a 3 year old. My desk is an Ikea Ivar shelf with the computer sitting on the highest shelf to keep it all away from my toddler...
I'm of mixed feelings on it. On the one hand, it sure is hard to read. On the other hand, in small doses I kind of enjoy it after the super sanitized corporate nothingness most of the web has become now. One glance at this page and there is zero doubt in my mind that it represents the genuine expression of an actual human without layers and layers of forced censorship and editing by some "platform." The web was never supposed to have "platforms" in the first place: YouTube/Facebook/Twitter etc. are all fundamentally rent-seeking parasites in a way most people don't even appreciate today.
The web was supposed to just be a place anyone could put up a document with some links to other documents. Cheap, simple, easy, fast. Then some services cropped up to help make some of the frustrating parts easier and quicker, and eventually the "services" ended up eating the web alive.
The correlation between creation and consumption based on the device form factor is something to consider.
A desktop computer often provided a good balance between the two.
Now with the corporate web, perhaps it’s no accident consumption devices have become so widespread.
I mean, this page is not simple, easy, or fast to develop. Instead of just writing a simple article, the author had to source or create at least two dozen images of no real value, only to scatter them through the middle of paragraphs, making the article as hard to read as they could muster.
Enjoy the nostalgia and break from modern clean websites, but anything else justifying the look of this page is just crazy. And connecting it to modern services and the downfall of the internet or whatever just sounds like you're trying to be edgy.
We are in a massively better place now than a decade or two ago as far as the internet being accessible to everyone, both on the publisher and consumer side, despite the hate for big corps.
This particular page design can be read as an intentional response to Web2.0, something the authors of this article considered to be part of a broader campaign to remove "amateur" production from the web. They published a book, Digital Folklore, expanding on this idea a few years later: https://digitalfolklore.org/
Design is both a linear progression and an infinitely looping circle/pendulum. Things were really free and organic at first. People got frustrated with this, and rising tech companies gave them "clean, modern" design instead. It was embraced because it was easier on the eyes, and faster to use in general. But now I think there's a growing resentment toward the "clean, modern" look, because it's simplified every drop of soul and humanity out of digital interfaces and expression. The pendulum will probably start to swing in the other direction soon.
I recently rebuilt my gaming PC with a Ryzen 5000-series CPU. Surprisingly, it has become my main work PC, and I haven't used it much for gaming. It's faster than my i9 MacBook Pro, and another benefit is that I can only work when I'm sitting at my desk. With a laptop, I can do work pretty much anywhere, which is a double-edged sword.
Regarding the physical presence of a small computer, something I actually miss is people hitting the monitor, when in despair, rather than the true culprit (as in "the box"). There's actually some kind of magic in this misdirection.
With flat screens, this isn't a thing anymore. (Also, the kind of despair that would urge someone to attack the machine, has become increasingly more abstract, as has the underlying architecture causing the despair. What would you do? Take a plane to hit "the cloud"?)
Right before the pandemic, I was shopping at Best Buy and saw HP 32" monitors on sale, cheap (the older 2K kind, not 4K).
Phoned home to check with the missus, and she said no, get two of them: one for yourself, and one for sproutlet.
Having a nice big screen really helped during the quarantine after getting sick, and the time after.
It doesn't really matter what your "computer" is, be it an old Mac Pro tower or an HP laptop; the monitor, a real keyboard, and a mouse are what make a desktop these days. The old beige box is optional now; the rest isn't.
I detest mobile: I find the interface very difficult to use. I would give up mobile way before giving up my desktop. I realize, though, that I’m in the minority.
A friend's family doesn't have a single desktop, laptop, tablet, or printer in their house. They all have decently high-end cell phones. If the children need to type up a book report, one of the adults brings home their work laptop.
Meanwhile, I've moved back to desktops. If I ever need a laptop again it will be the lightest weight one I can get, and will use it to remote access my main machine.
Per usual, clickbait polemics that propose absolutes cause controversy.
I realize that the comments here will largely be filled with people variously proposing that one do EVERYTHING with the iPad, and folks opposed to that. As this article is from 2007 and desktops are not extinct may I humbly suggest that a tablet device SUPPLEMENTS and COMPLEMENTS desktop and laptop computers running “full featured” operating systems (ones that allow you to manage files conveniently, for example, or configure the system as you wish it to be).
The question is not so much one of processing power as of form factor. A PCI(e)-based audio interface still beats a USB one on latency and jitter; commodity RAM in standardized slots is an easy consumer upgrade; and there's the physical space required for proper USB ports, if OTG and a zillion fragile dongles aren't going to work for you.
The vast majority of content creators that I know, (including myself) use a tablet in conjunction with a traditional computer form factor. They use the tablet variously as a remote control, as an input device, and as a portable idea capturing device. It doesn’t replace a sound or video studio, it’s a welcome addition to one.
I remember being able to borrow the big old Apple II from school (my mom was a teacher so in the summer I could borrow them for a time... otherwise they just sat dormant in school).
It sat on the kitchen counter like an appliance / shrine.
The home computer has been replaced with the iPhone.
What boggles my mind is that the industry accepts Apple's restrictions on what kind of computing can happen, how their software can be distributed, and how their gross margins are taxed.
These devices have done a perverse thing to our field.
Consumers love them, but fuck are they awful for freedom, business, and small time success.
We used to build platforms in the open that we all benefited from (ie. the Web). Even Microsoft Windows was and is more open than these goddamn monopoly devices.
The DOJ needs to rip Apple's control of the iPhone away. It's not their device anymore. It's the industry.
Computing belongs to the people. It'd make for a healthier playing field, and Apple would still be making bank.
Personally I'm in love with the fact that Apple found a business model selling privacy and safety, while I'm also torn on the idea that Apple has been using their weight to bludgeon the competition into giving up a massive share of revenue.
At the same time, the weighing of leverage is precisely how the market has always worked and still works to this day. So if you send me out there to negotiate with Popeye's (which really wants you to create an account), or Walmart, Facebook, or even a small company making todo apps, I will simply not win, and neither will my family members. I love that cross-app tracking is now opt-in, because that's not something I'd ever imagine negotiating with any company, ever. I love that all subscription-based services purchased through the iOS platform can be managed in a centralized place.
And I will of course resent in my tiny, futile, pathetic way, those voices which say "Well, you as a free agent made these concessions in the marketplace. Didn't you get what you deserve?"
Why can't there be a consumer choice to throw your weight behind a mega-giant who intercedes forcefully on your behalf, in a marketplace that is all about the weighing of leverage?
There are dozens of hardware platforms running hundreds (if not thousands) of software stacks that a user can play with that aren't Apple's ecosystem. Everything from a Raspberry Pi to a Lenovo ThinkPad. It's a color-wheel of hardware and software out there.
All 300 million American consumers doing all of their business, emails, dating, instagram, phone calls, photography, etc. on Raspberry Pi?
No. The solution is to regulate iPhone and make it a common carrier / open platform. It won, and now Apple has to bear the consequences. If their device was in niche usage, you might have a case.
300 million American consumers aren't using iPhones. Why this obsession? Seriously, 99% of your comments on anything remotely connected to Apple end up being this same thing. "A New Apple Variety has been Developed"
What is stopping 300 million American consumers from doing their business, emails, dating, instagram, phone calls, photography, etc. on Raspberry Pi?
... and if it's "They don't want to; Apple's user experience is better for their use case," why should the government intercede to take that away from them?
> why should the government intercede to take that away from them
Because Apple is the new Standard Oil. We've decided that too much power and control in one vertically and horizontally integrated company is bad for consumers, bad for business, and bad for the health of our capitalist enterprise.
Under Apple's thumb it's much harder to launch a new product and be profitable. Apple takes too much and demands unreasonable things.
Apple is the face of computing today. They've installed themselves as the troll on the bridge. They need to be reminded that this is not the correct or healthy role for them to play and that there are plenty of other opportunities for them to pursue.
Microsoft post-DOJ emerged a healthier company. Apple will too.
In contrast to the EU, where antitrust law protects the market itself from a lack of competition, the US standard centers on harm to consumers.
... And that's the hardest part to prove in bringing an antitrust case against Apple. Millions of users aren't switching to another computing platform even though the barriers to transition are extremely low, which suggests they're happy with Apple's products.
"Satisfies customers' needs" isn't harm to consumers.
Turns out the PC was an accident of history. Not an endpoint like some of us saw it, but simply an unfortunate stepping stone to the ultimate goal of making a really smart toaster. The end goal was never really to empower people, it was just to create a new market for rent-seeking via lock-in on a shiny new black box. No repairs allowed. No modding allowed. And you can go fuck yourself if you want the source code for anything. "Micro-computing," as it used to be called, was never intended as a way for ordinary people to gain freedom and agency. It was just supposed to be another tool for the rich and powerful to extract wealth from everyone else. And now, thanks to always-online requirements, inescapable updates and aggressive planned obsolescence, that dream is finally a reality. Welcome back to the farm, serf. You never really left.