Intel driven MacBook Pros have secondary ARM processor for Touch ID and security (techcrunch.com)
292 points by eth0up on Oct 29, 2016 | 134 comments



Not really that surprising. Your average laptop nowadays probably has more ARM cores than x86 cores. They're in all kinds of peripherals like wifi controllers, power supplies, hard disk controllers, and GPU job control; heck, even AMD's recent APUs come with an integrated Cortex-A5.[1] ARM cores are everywhere, people just don't know about it.

[1] http://www.anandtech.com/show/6007/amd-2013-apus-to-include-...


Laptops have what is known as an "embedded controller" to handle peripherals like keyboard/touchpad, power management, and fan control. Most of them seem to be low-power 8/16-bit MCUs like 8051 and H8S, but I wouldn't be surprised if newer ones use ARM too.

...and don't forget Intel ME, which is not ARM but another processor in the system.


Systems-on-chip (most big chips these days) are so complex that they usually have the same thing one level deeper: on-chip embedded controllers, often an MCU.

You might find that 'one ARM chip' has many independent ARM CPU cores carrying out different functions, mostly hidden from the end user behind firmware or ROM.


Amusingly enough, the embedded controller on some of the newer Allwinner ARM SoCs is apparently OpenRISC of all things. Guess it saved them some licensing fees.


Most Chromebooks have an embedded controller running on ARM specifically.

https://www.chromium.org/chromium-os/ec-development


New ME is apparently a SPARCv8 core. Older ones were ARC.


Intel platforms also had an ARC processor (they cater to approximately the same markets as ARM, minus mobile) for years in the ME.

AMD does the same thing with an ARM core.

Practically all peripherals contain at least one programmable processor.

Remote management / IPMI also usually works with an entirely separate ARM computer that's integrated with the main computer's graphics unit -- basically a shared framebuffer, just in hardware.


Fun fact: ARC derives originally from the SuperFX inside some SNES cartridges like Star Fox.


It blows my mind that the Super FX not only survived the collapse of Argonaut but went on to become so widely adopted by companies that presumably could make their own tiny microprocessors.

So cool.


ARC had been split out into Argonaut Technology Ltd. long before Argonaut Software hit the buffers.

The success of the SuperFX chip opened doors for future work, but nothing from the SuperFX was carried over - it was 16-bit, with a compact instruction encoding and a gate-level design using ECAD.

A string of graphics hardware projects followed with a wide range of requirements, targeting 'interesting' (ie: cheapest) processes/standard cell libraries. These were all VHDL, and a flexible 32-bit embedded RISC control processor emerged as a common theme.

The configurability and robustness in the face of sketchy ECAD toolchains made it a very saleable thing in its own right.


Any source for this? It should be an interesting read.



> Intel platforms also had an ARC processor

That is true for older designs. But didn't Intel switch from ARC to SPARC for their ME core not that long ago? Discussed here a while back:

https://news.ycombinator.com/item?id=8813029


They're ARC cores, not SPARC. That thread is full of misunderstandings of a PDF version of a presentation that incorrectly concluded it was looking at a SPARC core when it was looking at a newer-model ARCompact.

SPARC would be a terrible mismatch for what Intel wants out of its ME


AMD CPUs and GPUs both have an integrated ARC processor that mainly deals with power management.


Whenever this topic comes up, I send in this: http://spritesmods.com/?art=hddhack&page=1

Ubiquitous com...fusion.


I love that.

I had a hard drive that had a firmware fault, and I repaired it using a USB-serial converter, taking the PCB off the drive to temporarily mask off a connector, and then issuing a couple of commands. http://superuser.com/questions/365999/how-do-i-recover-data-...


You must have had a full-blown god complex for a few days after that. I'm happy when I succeed in creating a bootable live ISO USB key with GRUB2...


I call that an ego boner.


Thank you. That is one of the best hacks I've read. Very enjoyable.


Don't forget that many (if not most) SSDs have dual-core ARM controllers.


Often more than just two cores. Samsung's latest PCIe controller increased from three to five ARM cores, and most of their SATA SSD controllers have also been 3-core designs. Phison's mainstream SATA SSD controller has four cores.


What is the AMD equivalent of Intel's open-source tboot for trusted execution, to take advantage of ARM TrustZone on an AMD processor?


Hell, some (all?) MacBooks even have FPGAs.


I was surprised about that and looked for a source. Here it is: https://www.ifixit.com/Teardown/MacBook-Pro-15-Inch-Unibody-...


What is the FPGA used for?


To allow for fixes in the field. You can't really repair hardware once it has shipped, but you can reprogram an FPGA to fix a problem in a bunch of logic, so it is better to have that logic in an FPGA than to have it hardwired.

Don't be surprised if an upgrade to your OS also reprograms that FPGA.


I'm not sure about the FPGAs the parent comment is specifically talking about, but often FPGAs in general purpose devices are used just for glue logic - that is, buffering and interfacing between different types of busses.


The one thing I will say is that even though Apple's software quality has taken a hit lately, their commitment to security has only gotten stronger.

The Secure Enclave, hardware-level security - all of the things that came up with the FBI request have become a self-fulfilling prophecy for them.

I applaud them for this and it looks like the MacBook Pro is going to be one of the most secure laptops around. Nothing is perfect of course.

Even though I applaud them for this I am still pissed about the headphone jack.


I have a nagging suspicion that Apple will eventually lock the hardware and OS down to the point where their computers will only run software delivered through the app store.


It certainly seems like they want to move in that direction, but it's critical that they retain the developer segment. It'll be game over for them if developers are unable to be productive on a Mac.

There's likely a way to make both of these things happen, to some degree. For instance, they could start by locking Macs to the App Store as a changeable default (similar to how Android is locked down).


This is already in place - by default macOS does not allow apps to be installed which do not include a signed Apple developer ID. The setting can be turned off in System Preferences.


As of Sierra this can not be disabled in System Preferences, though there are workarounds.


I can't believe what I just read.


What are the workarounds? I upgraded, noticed the option 'anywhere' was gone and wished I hadn't...


I meant workarounds as in there are still ways to launch non-signed apps, like right-clicking (or ctrl-clicking, or using the Finder widget menu) the app and selecting Open.


Just execute "sudo spctl --master-disable" in Terminal, and the option "Anywhere" will be available again.


Do you think they still care about developers or other creative types? Judging by what they did to Final Cut and then to the Mac Pro line, I'm not so sure.

There are several good alternatives to the MacBooks now, and I'll be moving to Linux for my development efforts the next time I buy a machine.


I wouldn't lump "developers" together with "other creative types." They depend on developers to make the apps that make their devices desirable. If devs stopped buying Macs, there would be nobody to make apps for iOS. They don't depend on film editors in anywhere near the same way.


Exactly. I suppose if they did lose the developer segment they could open up iOS development on other platforms to remain relevant, but that whole path just seems very undesirable and a great thing to avoid.


Just a note - though FCPX had a bumpy start, there are now a lot of professional editors back using it. After the 10.1 update, most editors' biggest complaints were fixed - and that was years ago.


Not to be argumentative, however :), I'm primarily a programmer but have been involved in producing two feature films in Hollywood, and do not know of a single professional editor who uses FCPX. Admittedly, the majority seem to be using Avid and may not have been using Final Cut Pro previously.

I would like to add that I don't see the Apple garbage can as a suitable hardware offering for serious editors.


What are some decent alternatives to the MBP? I was hoping to replace my mid-2014 MBP Retina with the ones that just came out, but honestly they seem really undesirable, especially with the lack of a physical ESC key. The Surface Book 2 seems interesting, but I don't want Windows 10 to spy on me. How is Linux on the Surface Book series?


You may want to give the Razer Blade series a try. The XPS 15 is also expected to get an update soon.


I doubt the big players like Avid and Adobe will allow them to do that. Either that, or they'll no longer develop for MacOS.


You mean the headphone jack that is still there?


He's probably referring to the iPhone 7's lack of one.


...and on the wrong side. ;-)


As a lefty, I'm looking forward to not having the headphone cable dangling across my papers area. Suck it, righties! It's your turn to be slightly inconvenienced!


But...lots of headphones only connect on the left side, and run the connection for the right earphone along the headband. I'm pretty sure it's always left or both, never right. Putting the jack on the right really screws that up, even for lefties!


That's fine, it can loop around the back and run along the left edge of the computer. Better than having to stick straight out there.


Just as a counterpoint: Sennheiser HD25 headphones connect from the right headphone in the default configuration, though there are plenty of replacement cables of various types, some of which may change this. I agree that it's quite unusual though.


Wired has an interesting and in-depth article on this. Also, the camera seems to have more security than before.

> In the MacBook, the Secure Enclave is part of Apple’s new T1 processor, meaning it’s tied explicitly to the touch bar and Touch ID. It’s also, though, in charge of your webcam, a small but important difference.

> “In previous generations of Macbook the webcam light was software controlled—which meant that an attacker who compromised your OS could potentially activate the camera without turning on the light,” says Johns Hopkins University cryptography expert Matthew Green. “Adding a separate secure processor could make this much harder to do.”

https://www.wired.com/2016/10/macbook-pro-touch-id-secure-en...


Or, you know, spare one of the trillion transistors to switch power to 1) camera and 2) LED.

There, solved. No full-powered ARM needed to control an indicator light.


I'm guessing it's not that simple because there may be situations when the camera has power and is capturing an image (think hot-standby) but is not transmitting image data to the world outside the sensor package.


I don't want my camera to capture pictures without the LED being on, ever.


And what if I turn on the camera for a tenth of a second, take a picture, and turn it off? You won't notice the LED being on for a tenth of a second. Software makes it easier to prevent such things.


You definitely don't need software to add a little delay for turning off the LED, but sure, it would be slightly more complicated than just a dang wire.


Just wrote a test app on macOS that initializes the camera, gets the first video frame, and turns it off. You definitely notice the bright green LED being on for a split second.
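
For reference, a minimal sketch of what such a test could look like (this is not the parent's actual code; it assumes macOS's AVFoundation framework, and the class name here is made up):

    import AVFoundation

    // Minimal sketch: start the built-in camera, wait for the first video frame,
    // then stop the session. The green LED is lit only between startRunning() and
    // stopRunning(). (Newer macOS versions also require camera permission.)
    final class OneFrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let session = AVCaptureSession()
        private let gotFrame = DispatchSemaphore(value: 0)

        func grabOneFrame() {
            guard let camera = AVCaptureDevice.default(for: .video),
                  let input = try? AVCaptureDeviceInput(device: camera) else { return }
            session.addInput(input)

            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(output)

            session.startRunning()      // LED turns on here
            gotFrame.wait()             // block until the delegate sees a frame
            session.stopRunning()       // LED turns off here
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            gotFrame.signal()           // first frame captured
        }
    }

    OneFrameGrabber().grabOneFrame()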


You only notice it immediately if you're looking for it. I've gone minutes with the camera on before noticing the LED while I was distracted by other stuff on the screen.


Source code?


Why not both? It is trivial to make a circuit such that the LED is on if either there is power to the camera, or the software tells you to.


I would guess you could do the same with a circuit, even if that means you lose some flexibility (but given the concerns of many users, I'd say it's worth it).


A small piece of black tape or a physical shutter is likely a better choice. Who cares if the camera is on if it can't pick up anything?

The microphone, on the other hand, is kind of a bigger deal and still has that issue.


Yeah, exactly, since the camera in the previous-generation MBP actually had 4 GB of SD storage on it.


The ARM was not brought in to power the LED; it was brought in for Touch ID. The camera LED is the icing, not the cake.


They have a point though. Why would there be any software that controls the LED? Is it not more secure to have it hooked up directly to the camera power line and turn them both on and off inseparably?


Turning off the camera power would mean it wouldn't show up on the bus, so the computer wouldn't know it existed.

It would make sense for the camera itself to drive the LED, instead of the computer/drivers doing it. I don't know if the camera ICs support this or not, though -- if not, it's not really something Apple can add if they're integrating a standard camera component. On the other hand, they control the T1 entirely, so they can easily do this with the new architecture.


From what I recall when it was in the news, the camera does control the LED, but it's just a USB camera on the USB bus, so the demo attack worked by reflashing the software on the camera.


I write a piece of malware that pops the camera on long enough to snap a single frame and then turn back off. The LED blinks, quite possibly too fast for you to notice.

OTOH, you can delay the LED and have it very noticeably on for a second or two when it's software-controlled.


How 'bout some pulse shaping. Rising edge of the camera power triggers a latch, whose output is pulse-shaped so that you get a little attention-catching "LED flash" when the camera turns on (brightness tapers off quickly) and is just dimly / visibly lit beyond that. This could be done with 1970s hardware with maybe two chips and some passives. Maybe one chip if you're getting ingenious.


An attacker could piggyback on user initiated sessions of video camera use. After the user closes the connection, it could immediately trigger an extra 0.1s session that would go unnoticed because of the proximity with the user intended session.


Good point, but then why not implement the delay in immutable (and trivial and cheap) hardware? Is this not both simpler and more secure?


I guess, yeah, you could do it in hardware. But I don't think, then, it ends up being directly connected to the power feed for the camera itself, yeah? You start creating false-status cases. (This is speculation; I am not an electronics wizard.)


One word: capacitor. Seriously, this has been a solved problem for many, many decades.


Power->delay off circuit->LED. Seems pretty direct to me. The delay off circuit would probably have its own connection to power as well.


Post-2013? That's exactly how it was hooked up.


Why not just have a hard switch to cut the power to the camera? It's pretty low-tech and would be hard to hack without flipping the switch. There's no way the camera could operate without power.


A physical power switch would take more volume in the computer than the camera itself and would have a failure rate at least an order of magnitude higher than the camera. Things which people physically operate need to be physically more robust than electronic components on a circuit board.


Yo dawg. I heard you like ARMs, so we put an ARM on your ARM.

https://github.com/patjak/bcwc_pcie/wiki/Specification---Fea...

^^ Did everyone miss that the webcam was already an entire SoC and not just a sensor hooked up to UVC?


Hence why you power-cycle the thing.


What? How is that supposed to help anything? It's not like you can physically disconnect the webcam without disassembling the panel.

The crazy part is you don't even know if it's still recording with the entire laptop off. Could it not, in theory, draw power off the main battery independent of the system being on? It certainly should be able to while in sleep mode.


Before we shrank transistors to a mere few atoms wide, their main purpose was switching power.

And no, it's not connected to the main battery, because presumably it runs on 3.3V and maybe even some more exotic voltages for the image sensor. That needs to be regulated.


I think it's a pity that the Touch Bar was not designed with a dedicated area that can only be rendered by the T1 processor directly. If that were the case, then network services could communicate directly with the user via the Touch Bar, authenticating themselves with cryptography that cannot be broken by anything running on the main system processor.

This would have provided a truly trusted mechanism for apps to communicate with the user. For instance, your bank could authenticate itself with you via the Touch Bar. If the Touch Bar display is controlled from the host OS, then malware could pretend to be your bank, on the Touch Bar.


Apple's demo showed payee name being rendered on the right side of the Touch Bar, https://www.wired.com/wp-content/uploads/2016/10/Gallery_Tou...

Ars claims that the rightmost 608 pixels ("Control Strip") of the Touch Bar are reserved for Apple, http://arstechnica.com/apple/2016/10/15-hours-with-the-13-ma...


But that would complicate using it as a screen for displaying ads.


AdBlock problem solved! Just get everyone to use keyboards that have dedicated ad screens embedded inside.


> Though transmission of data is handled by the main processor, Apple Pay dialogs on screen are completely rendered by the T1 to take advantage of the Secure Enclave, a portion of the chip set aside for personal information just as it is in iPhones and Apple Watch devices.

That sounds similar in concept to Intel ME, another "secure" coprocessor that can do a lot of other things that the more paranoid are freaking out over...

It also makes me wonder if these MacBook Pros also have Intel ME.


> It also makes me wonder if these MacBook Pros also have Intel ME.

They almost certainly do. Intel won't sell you a CPU without Intel ME.


My impression is that this refers to the Touch Bar screen, not the main display.



Any idea whether this secure enclave will work if you are running Linux? (I like the hardware, but I am more used to the Linux/GNU/gnome software)


Given that iSight and Facetime HD cameras required drivers and firmware to get running under Linux, I would be positively surprised if the new camera is easier to set up, meaning I don't think it works right now. Even the existing camera support isn't 100%, with features like suspend/resume being flaky, all due to it being reverse engineered.

That said, the options for developer laptops have shifted in favor of general x86 computers, including very lightweight machines and ones with mechanical keyboards. If I wanted to use Linux, my first choice wouldn't be a MacBook. So, unless you require macOS for work, you have a more diverse variety of laptops to find one that suits you best.


Sure, but the laptops you describe do not have a secure enclave. If I want that level of security on Linux, what can I do?


What would the enclave do for you if you're running an OS that doesn't support Apple Pay or Touch ID?


It is used for the camera too.


On ThinkPads you can disable the routing of signals from the mic and camera in the firmware; afterwards, they're not available to the kernel. Whether that's somehow hackable via a UEFI bug I don't know, but there are also laptops with physical switches for the mic and wifi and a lid for the camera. Your safest bet is to open it up and disconnect the device, which is less likely to raise alarms on Linux and BSD than it is under macOS, I assume.


Build your own? It's an ARM chip running a relative of seL4. It would be an interesting project.


Based on a talk at the recent blackhat 2016 conference, bootstrapping and communicating with a secure-enclave processor can be a huge task. If it is ever possible, it will require some fairly large engineering (and reverse engineering) efforts.


I've never really understood why people want to run Linux directly on (most) laptops. These computers usually have all sorts of one-off proprietary chips embedded in them that didn't come from some manufacturer that cares about standards, but rather were developed under contract for the OEM-integrator itself. In this regard, Apple is no different from Dell or Acer or Toshiba. (The only modern exceptions are HP and Lenovo's respective enterprise departments, who both try to ship hardware that Linux already has upstream support for.)

Both Windows and macOS have pretty cleanly-separable userlands. Either OS can be set to a single-app full-screen "kiosk" mode, where you boot straight into a specified app. And that app can be, say, VMWare. Running Linux.

Computers used to have BIOSes: a chip that is loaded with firmware, and exposes a standard abstract-device protocol for each device class the computer supports. It's not a far leap to imagine a stripped-down copy of Windows or macOS, running on one core of the CPU, as a modern kind of "BIOS."


The T1 is running firmware from a ramdisk initialised by the host, like many embedded devices. So at the very least, you'd need to rip the necessary firmware images from macOS.


You mean that they are using a tried and proven architecture that helped the 4.77 MHz IBM PC find success in 1982?

Yawn....

The IBM PC keyboard had its own Z-80 CPU way back then and likely every single PC of any sort since then has done this. And then there are the disk controller CPUs. Wasn't that introduced around 1984?


Do you have any further information on this? Having opened up an IBM PC keyboard, and knowing that Z80 CPUs were still fairly expensive in 1982, I'm a little surprised.



Quick question -- what's so good about ARM specifically? Aren't there other minimal architectures, maybe even one with lower/no licensing costs?


In short: it's not Intel.

Longer: Intel has continually sucked at mobile (likely due to the innovator's dilemma and its love-hate relationship with Microsoft, which also sucks at mobile), and this may mean a stronger shift by Apple toward integration with its priorities (security, iOS/watchOS friendliness, better battery life).


I believe MIPS is cheaper for the average Joe to license, but Apple is already an ARM licensee with high volume so they probably get a hot deal. Also they already have ARM IP and >20 years experience integrating it.

Besides, ARM is a fine architecture and it's popular these days, so why not?


They already have a lot of in-house expertise in ARM. It could even just run their existing watch OS, enabling everything that it already has, such as display drivers, tools, firmware updates, and the boot system.


ARM processors are the best at low power and this allows longer battery life [1].

[1] http://www.androidauthority.com/arms-secret-recipe-for-power...


The most likely answer is that Apple has one of the best engineering groups in the industry at designing and implementing ARM processors. The A-series chips, for example, have been absolute power/performance leaders for many generations, and the things they can do with the tiny S-series chips in the watch are really remarkable. It would be shocking if they were to consider anything other than ARM.


Developer tools, the operating system and software ecosystem, and probably other IP (like GPUs) integrate well with ARM, which might not be the case for a brand-new processor. There's also a history of success (selling processor IP is enterprise sales).


Touch ID, security + Apple pay.

To keep the cash flowing and replace banks.


Tbh I can really see this happening. They have internationally widespread devices (unlike banks, which don't usually operate in that many countries), they have superior technology that is much easier for clients to use, and they have a better reputation than plenty of banks. Then the remaining issue is regulation - can they solve it? Would a European country back up < 100k deposits for Apple? :)


They want to be a monopoly by entering our homes as IoT devices, our pockets as a smartphone, our desks as a laptop, our offices as a desktop, our roads as a car, our wallets as a digital credit card, etc. They still have not entered our toilets.

I guess most people can see the problem and their potential as our next dictator.


People who claim that every computer is full of microcontrollers are missing the point. This is not a microcontroller; it's a full SoC running (more or less) the same kernel used on the main CPU.

It's more similar to the LOM and IPMI systems found on server computers rather than anything else.


"The T1 also sends pixels to the Touch Bar though the MacBook’s main processor is what actually renders that content which is then sent over."

I am a bit worried about the author. Is Matthew having a stroke?


That's a valid (albeit awkward) sentence.

The T1 also sends pixels to the Touch Bar, although the MacBook's main processor is what actually does the rendering for said pixels (which are sent).


And the battery has a MCU for charge management, and the trackpad, and the keyboard, and ...


[flagged]


We detached this flagged subthread from https://news.ycombinator.com/item?id=12824767 and marked it off-topic.


Why is it off-topic? According to the article, the T1 secures the camera, and according to other commenters, one way it does it is to control the LED. Many other comments talk about the LED.


That was my mistake in the above comment. We've not marked it off-topic since it was killed by user flags, which is why we detached it.


Earlier, it was software controlled. Now, it is not. Isn't that a good thing?


It is still software controlled; by software on a separate processor, and which is far harder to access than before. I'm not sure if that's a good thing.


Why not?


The basic idea is that this is an area of the system that even you as the owner cannot inspect, but I'll let others do the bulk of the explaining:

https://www.gnu.org/philosophy/right-to-read.en.html

http://boingboing.net/2012/08/23/civilwar.html

http://boingboing.net/2012/01/10/lockdown.html


It's interesting to me that you feel this way. Do you also object to the Secure Enclave architecture on the iPhone?


I object to the iPhone entirely, and to a (only slightly) lesser extent, Android devices. Securing against non-owners attacks is one thing, but securing against the owner is definitely repulsive for me. It is sad that the single adjective "secure" has been confusingly used for both.


That makes sense, and is a totally coherent point of view. You're more than typically clueful about these subjects, and it's interesting to get the "against" take on the SEP. Like a lot of systems-y security people, I tend to see it as an unalloyed good, and a big part of what makes the iPhone the most secure computing platform in the industry. But nobody can reasonably argue that it's the free-est.


I certainly do. You must, by definition of Apple's iOS security, trust them that it is actually secure. We cannot know how secure it is, because we cannot know how it is implemented in hardware or software - it is all proprietary.

Meanwhile, I can reasonably trust, say, crypto++ and its component libraries, because I can (and when I use it, do) audit its source code. I may not be able to trust the processor that eventually compiled binary runs on, but that is another problem that several fronts (POWER8, RISC-V) are moving towards solving.

And no, at the end of the day, you have to trust someone untrustworthy with your computers. We don't have access to fabrication hardware to make them ourselves, and even if we did... the rabbit hole continues and how can you trust the fab? Instead, you minimize your risk and have the smallest surface area of exploitation possible. It is harder to make a whole CPU, whose instructions you have, operate against your will (because you can see the output and compare results) than a black box of silicon you just get cryptographic data out of.

But without source and design documentation on products I own I have no reason to believe they are anything but malicious with planted backdoors to circumvent any of my own attempts at privacy. Which makes using computers for private reasons borderline impossible without a full libreboot system.


[flagged]


> acce$$orie$. Apple, in the computer industry,

I'm sure this will be completely true, unlike every single other unqualified 'prediction' in their entire history, made every single time they release a new product.

Substituting dollars for 's' also makes the writer even more respectable and objective, just like writing 'Microsoft' with dollars.


that's nice dear. insert grandma gif here

how about more RAM, tho?


Can we expect Error 53 when you spill something on the keyboard, destroying the Touch strip, and the computer stops booting - but you have important data on an SSD that is now SOLDERED permanently to the motherboard, and Apple doesn't offer data recovery (only a logic board swap at the low, low price of 2/3 of a whole computer), so the only option is an independent repair shop?


The new non-Touch Bar laptop has been confirmed to have a removable/replaceable SSD. The 2013-2015 MacBook Pros also had removable SSDs, so there's little reason to believe the Touch Bar MBPs won't either.


>SSD that is now SOLDERED permanently to the motherboard

Citation needed.



