I understand the argument that forcing content through a proprietary jack opens the door for controlling said content and has huge implications for hardware manufacturers and the way people use their stuff.
What I am having a hard time understanding is why that is any different from the OS influencing what gets sent to the 3.5mm jack. It isn't like apps running on iOS had to go through APIs for everything else (e.g. the camera) but not the jack. Is it?
>why is that any different than the OS influencing what gets sent to the 3.5mm jack?
Their issue seems to be that it is not analog. The Lightning port and Bluetooth connection are two-way data connections, and the phone will be able to identify what is attached (rather than just outputting an analog signal). Here are the quotes that make that clear:
> But intentionally or not, by removing the analog port, Apple is giving itself more control than ever over what people can do with music or other audio content on an iPhone.
>With Bluetooth, the phone can distinguish between different types of devices and treat them differently. Apple can choose which manufacturers get to create Lightning-compatible audio devices.
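To make the distinction concrete, here is a toy sketch of why a two-way digital link changes things: once the host can read a device identifier during a handshake, it can apply per-device policy, which is impossible with a one-way analog signal. All device IDs and the policy table below are hypothetical, invented purely for illustration.

```python
# Toy model: a digital audio port can identify the attached device
# and apply per-device policy; an analog jack cannot.
# All device IDs and policies below are hypothetical.

POLICY = {
    "vendor_a_headphones": {"allow": True,  "max_quality": "lossless"},
    "generic_adapter":     {"allow": True,  "max_quality": "256kbps"},
    "unlicensed_device":   {"allow": False, "max_quality": None},
}

def negotiate(device_id: str) -> dict:
    """Return the playback policy for an identified device.

    Unknown devices get the most restrictive treatment by default.
    """
    return POLICY.get(device_id, {"allow": False, "max_quality": None})

def analog_output(_samples) -> dict:
    """An analog jack carries only the signal: no ID, no policy hook."""
    return {"allow": True, "max_quality": "whatever the DAC produces"}

print(negotiate("vendor_a_headphones"))  # licensed: full quality
print(negotiate("unlicensed_device"))    # identified and blocked
print(negotiate("never_seen_before"))    # unknown: blocked by default
```

The point of the sketch is the asymmetry: the analog path has nowhere to hang a policy check, while the digital path makes one trivial.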
Yes, they published "Thoughts on Music" on apple.com arguing against DRM on music. Then they took the essay down, and now use DRM on streamed music (and they never stopped using DRM on video or ebooks).
Define "suits them" then. Microsoft locks down everything, even the computer itself with Secure Boot. Apple does no such thing.
If you want to run OS X/macOS on your own hardware it's doable. The only trick is finding drivers, not cracking DRM or license protection.
If you want to run another OS on your Mac you might need to fiddle with the EFI settings and/or update that with an EFI mod tool to make it more compatible, but there's no real impediment to installing anything you want.
Apples to oranges. OS X only runs on Apple computers - you 'pay' for OS X when you buy the hardware. However, MS has to sell the software itself. You also downplay the magnitude of the "finding drivers" problem.
Yes, the parent post is oddly off topic: continued invocation of the unrelated Windows platform (despite an explicit reminder that this is a tu quoque fallacy), as well as a strange focus on OS X, as if iOS doesn't even exist.
Obviously Apple will gladly lock down, control, restrict, and regulate their users (and even their devs) when it serves their purpose. Just look at iOS. The entire platform is restricted from top to bottom.
That's a phone, and they've taken a different approach with security for those. The product they're offering is one where you're more restricted in what you can do, but you're given more security as a trade-off.
For their iOS products, more locked down equals more safe. They're treating them more as appliances than as general-purpose computers. For consumers this has some appeal: the risk of malware and trojan/virus-like applications is effectively zero on iOS.
If there were a way to offer security without locking things down they'd probably do it, but I think that's a logical impossibility.
If you don't like that model you have a ridiculous number of alternatives, more so than in the PC space.
So this is less a case of Apple using DRM when it suits them and more a case of Apple using DRM when it suits the consumer. Running only trusted, signed applications is a limitation, but it's one that is not without benefits.
It would be easy to beat this, though, if you want to go the analog-hole route: the audio ultimately gets decoded and sent to the speakers in the headphones, so just rip the headphones apart.
Unless they start doing serious hardware anti-tamper in headphones I don't think it's a real risk.
Bingo. We already see this with HDMI and HDCP where you can get a diminished experience, or none at all, because the playback device and the TV can't get their ducks in a row.
Amazon will serve me HD video on my smart TV using its Amazon app, but not on my computer using the same TV as a second monitor. And lest you think the problem is the computer, I can get it on my PC just fine using a regular computer monitor. It's a bummer to think about this sort of thing happening with simple audio as well.
Huh. I often use my non-smart TV as a second monitor, but I'd never tried playing an Amazon video on it. Sure enough, the same video that plays in Firefox on a VGA monitor gives me a "we're experiencing a problem playing this video" message if I move the browser to the TV.
Yeah, I'm imagining some scenario where, say, Spotify partners with Bose, and if you plug in Bose headphones you get higher-quality audio than you can get elsewhere. Or say Tidal (Jay Z's streaming service) says you can only listen to some songs via Kanye's new headphones.
That's actually really interesting. Especially as everything is moving toward personalization. If you have a set of headphones that have a unique ID tied to you, the artist could even intro a song with your name - or in the future procedurally generated music could substitute a reference to your city into a song.
That's a cool idea!
Arcade Fire did a really cool music video in a similar vein that was interactive/personalized: http://www.thewildernessdowntown.com/
This is totally off topic, but just checked out your website and damn, you're a hell of a runner. I'm thinking about signing up for my first ultra real soon!
It's not that society doesn't care, it's that people don't spend time figuring out all the distant-future implications of their choices. While you can't boil a literal live frog by turning up the heat slowly, this principle works extremely well on the free market.
But if a manufacturer wanted to only play audio to their device, they could already have made it a Lightning-connected device.
If they do gain the ability to distinguish the Lightning-to-3.5mm adapter from other Lightning headphones, that isn't any greater than the ability to distinguish between the 3.5mm jack and the Lightning port, is it?
The issue is not likely to arise with hardware manufacturers. However, content license holders have been eager for the power to restrict playback based on the device doing the playback; see region-encoded discs and HDMI's HDCP.
Apple has been very good at predicting (or perhaps directly causing) the demise of certain technologies: the floppy drive, the CD drive, the Ethernet port. They removed them, much to the chagrin of many a loyal customer, and a few years later they nearly ceased to exist across the entire industry (we're still waiting on the Ethernet port to go away, but give it time).
How many times will it take you forgetting your dongle before you just pony up and buy the Apple-approved headphones? How long will it take, using the non-default, non-Apple, non-sleek, clunky dongle as you show off your fancy new iPhone 7 to your friends, before you decide it's got to go? Especially if they're using theirs the Apple-intended way?
How long will it take before nobody buys headphones anymore? A few generations until they deprecate the dongle since nobody is using it? Where you can no longer buy a $10 set of headphones at a drugstore and plug them in? Where 'every audio device must provide analog audio output in a universal format that every device has been able to read since audio was invented or it is functionally useless' is no longer a maxim?
Long-term, we are giving up a universal and open protocol that all devices work with for a proprietary one. If you don't expect companies to abuse this to make money and to stop you from doing what you want with your equipment, well, I've got a bridge to sell you.
Floppy disks, optical disks, and Ethernet were all replaced by unencumbered standards, and they all led to a superior experience (well, maybe not Ethernet, but there are definitely advantages to wireless). If you're old enough to have used a floppy disk drive, you do not miss them.
Any time you introduce two-way communication you introduce the possibility of DRM, but that does not guarantee it. I can still use HDMI with my Linux PC. DRM is independent of the technology, and I see no reason to hold back the technology because we are worried about DRM.
There are lots of compelling reasons to dislike this change. Among them: I have a large investment in traditional headphones and in devices that work well with them (my iPod, my HTC phone, my stereo, TV, piano, guitar amp, etc.), and wireless headsets lose out on quality, convenience, and weight (how long do they run? How much do they weigh? Can I swim or work out with them?). But DRM is not automatic.
>If you're old enough to have used a floppy disk drive you do not miss them.
I am, and I do. Floppies were ubiquitous, durable, reusable, and cheap enough to give away. There's no replacement today - flash drives are the closest thing but when was the last time someone handed you one with no expectation of getting it back? Or bought a box of 50?
Sneakernet became much less vibrant with the death of the floppy.
Flash drives cost almost nothing. I'm not going to lose any sleep over a $4 flash drive being given to someone and never getting it back. In terms of inflation that's cheaper than giving someone a floppy that cost $1.
Floppy disks were always terrible. Slow, unreliable, prone to failure at the worst possible time. A simple magnet could trash them beyond repair. A bit of water could render them unreadable. Leave it loose in your bag and it gets bent? The thing was toast.
In the dying days of the floppy disk, around the time Apple introduced the iMac with no floppy drive, they were already obsolete. 1.44MB could barely hold anything useful at that time, most people doing any serious exchange had already moved on to Zip drives because they held a more reasonable 100MB, or CD-R since you could burn six times more than that onto them. If you had tiny WordPerfect files then floppies were adequate, barely, but what kind of a market is that?
I'll give you 3 of those points, but definitely not durable. Some of them lasted a long time, and some of them stopped working before I could finish writing to them. And since writing to them was painfully slow, something frequently went wrong.
Of course with flash drives we can wait the short time it typically takes someone to copy the contents off to get it back. And since they can hold significantly more data, they are also far more reusable.
You can buy USB flash drives for $2. People don't refrain from handing over flash drives because they're expensive, they do it because they have a reasonable expectation that you're able to immediately copy the data, or that you can receive it over a network.
The death of the floppy didn't kill sneakernet, the rise of portable networked computing did.
Floppies were NOT durable, and I just bought a box of 10 8GB USB sticks for under $20 - that's similar to the cost of the 1.44MB floppies we used to buy.
> Small enough to fit in your pocket but large enough not to get lost from your pocket
You can get keyring flash drives, I'd rather that than something larger. Anyway a floppy barely fit in a pocket.
> Cheap enough not to care about
Flash drives are cheap enough not to care about... but my data isn't!
I would definitely give a flash drive to someone if they needed it. They're like $4 each or something. And on a $/Gb comparison they obviously blow away a floppy.
> Disk could be ejected but still sitting in the drive not sticking out far enough to be a bother
You can safely remove a flash drive and leave it plugged in if you like.
The headphone jack was already destined to be replaced by USB Type-C on many (not all) portable devices. Apple's move may change the timeline, but does not change this fact.
"ports older than USB" meant serial and, later, Ethernet.
My new-issue Dell work laptop lacks an optical drive and Ethernet. My dad just bought an even beefier Dell laptop and the first thing he asked was "where's the DVD drive?" Dell offered to send him a free external USB optical drive, but told him that internal optical drives were simply not an option on that make of laptop.
The question is, who is going to win: the legion of teenagers and young adults who listen to audio via the headphone jack, or device makers?
(It's the legion.)
If you had a replacement that was better - cheaper, less of an annoyance than $5 headphones plugged into a cheap phone - then you might win. But the proposed replacement is superexpensive, battery-powered wireless headphones. When the no-strain-relief Apple cables break, probably within weeks, you're supposed to buy another superexpensive set. That's not going anywhere with the horde of teenagers. It's a non-starter.
> we're still waiting on the ethernet port to go away but give it time
You have it precisely backwards.
Far from going away, everything became Ethernet.
Take a look at the modern interconnection standards: HDMI, USB 3.0, Thunderbolt, etc. They are all packetized transfer across impedance-controlled differential pairs.
I think parent was probably implying wireless would take over.
I don't agree, however. Call me a Luddite, but I'm partial to the simplicity and speed of wires. Not for my phone, of course, but for my main desktop PC I'm happy it's not wireless.
How long will it take me to buy the clunky charging dongle - one for my bag, one for the car - and just leave them there? I'm an outlier: I rarely use the headphone jack to listen to music - I use it to make phone calls - because both the audio quality and the volume are much superior to any reasonably priced wireless solution I've seen, with the added advantage of not needing to charge it.
I suspect that Apple has enough usage data reported from customers about how many of them use Bluetooth (which is a universal and open protocol) to justify this from a business perspective. While it'll be inconvenient for me, I understand that in many ways I'm not the typical Apple customer.
The EFF article? It's FUD, pure and simple. A couple of the early Android devices had no headphone jack, and there was no great outcry about the encroaching DRMed world - just some minor grousing about the annoyance of not having a headphone jack. Apple will likely support the adapter as long as most everyone else does; as in the past, they were never the first to get there, just the most notable.
My understanding is that it is about what the output is. The 3.5mm jack outputs regular audio data. You could make it output encrypted data instead, but that is the same as removing it: all existing headphones would stop working. If you replace it with a proprietary system you can add in DRM, like not outputting anything if the connected device (which can be an adapter) is not certified. Everything has to get transformed into regular data at some point, but the content mafia has been trying for years to control as many parts of the system as possible. This is one further step.
>The 3.5mm jack outputs the regular audio data. You could just make it output encrypted data, but that is the same as removing it, all the headphones won't work.
The 3.5mm jack outputs an analog signal. It's already been converted from digital data to an analog signal by the phone's DAC chip.
Exactly; Apple has always controlled what can be sent to the DAC on the audio jack. (I don't know if it has a dedicated DAC, but even if it shares with the speaker, the device knows when something is plugged into the jack).
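For reference, here is a minimal sketch of what the DAC step amounts to: signed PCM samples are mapped to analog voltage levels, after which nothing about the source device or content rights travels with the signal. The full-scale voltage here is an arbitrary example value, not a real iPhone spec.

```python
def pcm16_to_voltage(samples, full_scale=1.0):
    """Map signed 16-bit PCM samples to analog voltage levels.

    After this step the signal is just voltages on a wire; no metadata,
    device identity, or usage rules survive the conversion, which is
    why an analog jack cannot enforce DRM.
    full_scale is an arbitrary example value (volts).
    """
    return [full_scale * s / 32768.0 for s in samples]

print(pcm16_to_voltage([0, 16384, -32768]))  # [0.0, 0.5, -1.0]
```

This is the "analog hole" in miniature: once the bits become voltages, any downstream device can use them.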
How secure are these dongles? Will we see third party ones, or do you need key material from Apple before they will light up?
The Xbox is full of this kind of thing, btw (most peripherals require a "security chip" that just needs to handshake; purely revenue protection, and nothing to do with security).
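The kind of handshake described - a chip that merely proves it holds a vendor secret, gating accessories rather than protecting the user - can be sketched roughly as a challenge-response check. The shared key and the protocol shape here are invented for illustration, not any console's actual scheme.

```python
import hashlib
import hmac
import os

# Hypothetical vendor secret burned into licensed accessories.
VENDOR_KEY = b"example-vendor-key"

def accessory_respond(challenge: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """The 'security chip' in the accessory: HMAC the host's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def host_verify(respond) -> bool:
    """Host side: issue a random challenge and check the response.

    Passing proves only possession of the vendor key - revenue
    protection, not user security.
    """
    challenge = os.urandom(16)
    expected = hmac.new(VENDOR_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(respond(challenge), expected)

print(host_verify(accessory_respond))       # licensed accessory passes
print(host_verify(lambda c: b"\x00" * 32))  # unlicensed clone fails
```

Note that nothing about the audio path gets safer; the check only decides which accessories the host will talk to.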
Apple products have always been built with a walled-garden approach, so nothing new here. The iOS software/hardware combo has always been fully controlled by Apple.
I'd think that the OS is inherently "blind" to devices connected to the 3.5mm jack: unable to identify them, unable to discriminate among them, and unable to send audio output to some of them and not to others. A proprietary audio standard might include a way to identify devices (since manufacturers would likely have to embed some kind of ID in the devices they make with Apple's blessing).