Western Digital Demos SD Card with PCIe X1 Interface, 880 MB/s Read Speed (anandtech.com)
156 points by rbanffy on Feb 27, 2018 | 96 comments



Is there a security story for this interface? Internal NVMe devices are generally trusted and not passed around, but plugging untrusted things directly into the PCIe bus, with coherent access to memory, is a potentially huge security hole. This was an issue with early Thunderbolt.

And this isn't very hypothetical: aside from passing cards around, counterfeit SD cards are extremely common on Amazon.


IOMMUs can protect RAM if the OS configures them to do so, and udev rules could let you opt in to the kernel using such devices - you wouldn't want your network connections hijacked through some plugged-in and auto-detected network device!

But I don't know what the PCIe security model is w.r.t. accessing other devices, such as the HDDs.
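For the opt-in part, here's a rough sketch of how that could look with stock Linux sysfs knobs (run as root; drivers_autoprobe and the per-driver bind file are the standard driver-binding interface, but the policy around them and the PCI address below are just illustrative):

    # Sketch only: disable automatic driver binding for PCI devices, then
    # bind trusted ones by hand once you've decided to accept them.
    from pathlib import Path

    # Stop the kernel from auto-binding drivers to newly enumerated PCI devices.
    Path("/sys/bus/pci/drivers_autoprobe").write_text("0")

    # Later, for a device you trust, bind the driver manually - e.g. the NVMe
    # driver to a hypothetical device address.
    Path("/sys/bus/pci/drivers/nvme/bind").write_text("0000:05:00.0")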


> But I don't know what the PCIe security model is w.r.t. accessing other devices, such as the HDDs.

Should also be protected by the IOMMU for the most part. You might be able to forge interrupts from other devices, though. PCIe message-signaled interrupts are just another memory write to a special part of the address space that, unfortunately, isn't separated by device at page granularity, AFAIR.


> This was an issue with early Thunderbolt.

Still is, right? It's an issue with any interface that has DMA, like FireWire and Thunderbolt.


This seems to be just the prototype. Ideally, the production version would have the PCIe interface reader hardware on the system itself, and the SD Card would merely expose some bus pins.

Another solution might be keeping the PCIe interface on the SD card but adding a small proxy in the reader to restrict the PCIe protocol - but this might incur a non-negligible performance hit.


What's wrong with IOMMUs?


Except every peripheral in the Apple ecosystem uses Thunderbolt/PCIe. This is as much of a security threat as any peripheral available today.


Ideally, one day videomakers will join the audio world in always just recording uncompressed. No longer would you have to think about whether you used the right codec for your editor. It's taking a long time to get cheap because the bandwidth of video is so many times greater than that of other uses. Most people are already way overserved by the standards in place.

For what it's worth, the CompactFlash Association recently ratified CFexpress (https://en.wikipedia.org/wiki/CFexpress), which uses NVMe and two lanes of PCIe 3.0, for 1.97 GB/sec. If my calculations are correct, that's enough for 10-bit 4K RAW at 178 frames per second.
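A quick back-of-the-envelope check on that figure, assuming DCI 4K (4096x2160) frames at 10 bits per pixel and reading 1.97 GB/s as 1.97 x 10^9 bytes per second:

    # Uncompressed 10-bit DCI 4K frames vs. CFexpress's 1.97 GB/s.
    frame_bits = 4096 * 2160 * 10      # bits per raw frame
    frame_bytes = frame_bits / 8       # ~11.06 MB per frame
    fps = 1.97e9 / frame_bytes
    print(f"{fps:.0f} fps")            # ~178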


Or wouldn't it be better if the compression were lossless?

A lossless, fast codec that is open source and patent-free. We have FLAC for audio (I don't know why I don't like the name FLAC, though); hopefully we will have something similar for video.


This would be great if integrated into DSLRs, especially for those shooting wildlife at high speed. Contrary to my previous belief, I thought I would need less hard drive storage once I had a larger or faster SD card, but the opposite is true: the faster and bigger the SD card, the more photos (and duplicates) I end up shooting, and the greater the need for a home NAS or a bigger hard drive. I think the issue is that I never considered the SD card permanent storage media, so I would always create a copy of it somewhere else.


What you need is more aggressive culling. Eventually more storage space becomes a black hole for a photographer. Your photos go in- and never come out, because the sheer quantity is oppressive. Who wants to spend a Sunday flipping through a thousand tiny variants of the same picture?

If you are a talented professional who takes terabytes of unique & engaging photos, disregard- but otherwise, as another photographer, let me suggest you focus on culling instead of storage.

(Photo Mechanic was a godsend for culling; it's fast as blazes even on weak hardware, and cheaper than a new processor or more DRAM)


Yes, cull aggressively. Even the talented amateur photogs I know don't have more than a 10% keeper rate at most, and that probably includes a bunch of meh stuff too. No reason to keep the other 90% with half-nailed focus or unflattering facial expressions. For many years I kept mostly everything, thinking that I might one day go back and re-evaluate a photo. That day has so far never come in 16 years of shooting digital.


Yup, if I'm hitting 5/100 I'm having a pretty good day.


> If you are a talented professional who takes terabytes of unique & engaging photos, disregard- but otherwise, as another photographer, let me suggest you focus on culling instead of storage.

Even among talented professionals, you are lucky if your "hit" rate creeps over 10-20%, especially outside of controlled studio lighting. I've always felt the amateur dream of "taking better pictures first time" ignores the reality that many pros have long shot in bulk and culled too.

I'd argue that the extremely un-glamorous secret to better photography for _most_ of us, once a few basic composition rules (rule of thirds, golden ratio, etc.) are understood, is frankly to take many more photos and, yup, cull, cull, cull. This is precisely why tools like Adobe Lightroom come out of the box with such a heavy focus on keyboard shortcut support for zipping through an import and marking the winning images in a quick pass.


This got me thinking: is there a photography solution that uses ML to train photo selection based on what you think is good/bad, so it can be automated to your preference? I.e., an offline version of something like Google Photos that can quickly scan through all the photos while you keep improving the model with your selections.


I can't agree more. I got my first "real" camera and it takes great pictures - when I frame a shot right, which is not too often. Otherwise I get a ton of data and don't have time to sift through it all in DarkTable.

p.s. I know DarkTable doesn't want to be a file manager, but it really needs to have Delete for this very use case.


Should be as simple as photo organisers/editors detecting and categorising bracketed or burst-shot photos and displaying them as a batch of the same photograph, i.e. showing them in the photo list with one thumbnail and a visual indicator that multiple photos exist behind it.


I stopped spending time culling a couple of years ago on the bet that in a few years machine learning will be pretty good at it. I’d rather cull when I’m generating training data.


Can I contact you on the training data idea?


Sorry, I don’t ever make off-forum contacts from forums unless with an account I’ve specifically created for that purpose.


> What you need is more aggressive culling

Or a good AI to help you categorize the pictures and look for features you may not have noticed.


Google Photos actually has some nice features in this regard: it flags the "best" picture in a burst, with remarkable accuracy in ignoring blurry photos, closed eyes, subject not centered, etc. And it generates nice collages and animations too!

But I wish it had a setting to clear out the ones that it's most confident are bad. If it's dark or I'm moving, there's a fair chance the picture will be blurry, so I (and everyone else) just take a few. Recognize that and let me cull automatically, please!


I noticed myself "making a panorama" the other day by just waving my camera around, taking a series of pictures, and assuming Google Photos would choose to stitch them together (and it dutifully did without needing to be told). It's nice not having to explicitly choose what you want done up front and being able to trust it to recognize the most logical processing to do.

I'm sure it will also gain more ability to cull - I mean, it already picks out things that look like documents (e.g. from using the camera instead of a scanner) and offers to archive them for you. I'm sure they'll add more categorization as they gain sufficient confidence.


Can I reach out to you somehow @gascan ? Want to learn more.


See my profile, @gmail.com


This is huge! I have no affiliation, but I'm really excited about this. Looking through the comments, I don't think everyone appreciates the implications:

- It would mean a sane standard for SD cards; none of the ever-evolving, always-limited SD card protocol

- NVMe drivers are already there in any modern OS

- NVMe is a pretty fantastically wonderful protocol compared to all other mainstream storage protocols (SCSI, ATA, FC, etc.). It has low overhead and supports ridiculous amounts of parallelism (up to 64K queues with up to 64K commands each, versus AHCI/SATA's single 32-command queue).

- This would open up uses of SD cards that aren't really possible today (basically allowing the SD card to act as a general SSD, faster than most (all?) existing SATA SSDs).


What about IOPS? Assuming IOPS are decent, you could probably replace a SATA SSD with this. This seems like it's in the "middle ground" between a full-blown PCIe Gen3 x4 NVMe drive and a cheap SATA SSD.


A SATA SSD consumes 1 to 5 watts during heavy write activity. That's hardly worth mentioning for a 2.5" drive in a desktop enclosure, but it's a big problem for an SD card with no way to get rid of the heat!


Phone LEDs, which put out 1 W in a much smaller package, somehow survive.


Dissipating 5 W from something the size of an SD card is a much greater challenge than 1 W. A 1 W LED in a phone body also has much more matter and additional nearby components to sink heat into (e.g. feel how warm the front of the screen gets on a modern 6" smartphone while rapid charging).

These sorts of things have to be designed with the challenge that people will be using them in ambient air temperatures of 44C in some parts of the world.


A 1W LED is soldered to big heatsinking planes on the PCB it's mounted to, which is itself designed to sink heat away into the phone body. An SD card is wrapped in plastic, slides into a plastic slot, and touches the phone with only a few small pins.

Plus an LED behind a lens and cover on a phone or inside a flashlight is perfectly fine to run at 60C, but eject an SD card at 60C into your hand and you'll be throwing it across the room.


> but eject an SD card at 60C into your hand and you'll be throwing it across the room.

Probably not, given the very low heat capacity of an SD card's plastic casing. But I can say from experience that ejecting a 2.5" SATA drive with a metal case that's well over 60°C will cause you to drop it quickly. It's a good thing SSDs can handle falls without trouble.


They should start using F2FS instead of patent-encumbered exFAT and co.


>> The company is not disclosing the type of memory or the controller that power the SD PCIe card, but it is clear that we are dealing with a custom solution.

Just itching to know if it's got a RISC-V core in there.


Nope. Their RISC-V stuff is still at least a year away.


Hey, this might be good to replace the silly SATA DOM [1] that SuperMicro uses.

1: https://www.supermicro.com/products/nfo/SATADOM.cfm


Too bad there isn't a single good USB-C SD card reader.


This one has decent reviews: https://www.amazon.com/Cable-Matters-USB-C-Reader-Memory/dp/...

(Linking to amazon as an example, am not affiliated with the seller)


https://www.aliexpress.com/item/New-Portable-Mini-Design-Cha... works well for me, but I must admit I mainly bought it for the hilariously out of spec USB micro connector (look at the fourth photo)


Are things still that seriously screwed up in the USB-C ecosystem? I was hoping they would have figured things out by now.


Up until USB 2.0, and to a degree 3.0, USB was pretty straightforward in that EVERY USB device you plugged in worked.

Now you have USB-C with a laundry list of supported features and protocols that may or may not be present on your device, and most consumers are completely clueless. Does that USB-C port on your device support DisplayPort? Does it do Thunderbolt? Can it be used for charging? What does it do, exactly? Too many technical features in a consumer plug.


> most consumers are completely clueless

In my experience even many techies are clueless, myself included. I recently purchased a Lenovo* Thunderbolt laptop dock for my new Lenovo laptop that has a USB-C port, but quickly found out that this laptop does not support Thunderbolt: the dock plugs in fine, but doesn't actually work. Apparently there is a separate "USB-C dock" in addition to their Thunderbolt dock, but I wasn't aware this was something I had to take into consideration.

I'm also left wondering what would happen if I took the USB-C charger for the laptop and plugged it into the USB-C port on my phone, or vice versa. How "universal" is USB-C anymore?

* I'm not a fan of Lenovo in general but that's what's available through work.


The charging should work if they've adhered to the USB Power Delivery standard. Laptops should be pretty good about this. Basically, there are a handful of voltage/current pairs, and the charger and device will negotiate the highest rate that both are capable of. It tops out at 100W (5A at 20V).

Qualcomm has their own "Quick Charge" system which is not generally compatible. A lot of phones (Samsung) are doing this, but a quick search says it might also work with USB-PD chargers.

In other words, it sounds nice but it's actually a bit of a mess. FWIW I use an Anker 5-port + USB-C charger at my desk with a Macbook Pro and it works fine (though it's 29W instead of 61W so it doesn't charge the battery during use).

http://www.usb.org/developers/powerdelivery/

https://www.reddit.com/r/GalaxyS8/comments/69xox4/the_galaxy...
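If it helps, here's a toy sketch of the "negotiate the highest rate both sides can handle" idea - not the actual PD state machine, and the advertised profiles and sink limits below are made-up examples:

    # Toy model: the source advertises fixed (volts, amps) profiles; the sink
    # picks an advertised voltage it can take, requests current up to its own
    # limit, then chooses the highest-wattage combination.
    source_profiles = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 5.0)]  # (volts, amps)
    sink = {"max_voltage": 20, "max_current": 3.25}               # hypothetical laptop

    usable = [(v, min(a, sink["max_current"])) for v, a in source_profiles
              if v <= sink["max_voltage"]]
    volts, amps = max(usable, key=lambda p: p[0] * p[1])
    print(f"negotiated {volts} V @ {amps} A = {volts * amps:.0f} W")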


That's one of the reasons I bought the specific dock offered by Dell for my XPS 15. The laptop and dock support Thunderbolt, but it's carried across USB-C. Getting this all right and knowing it would work as expected was enough for me to pay an extra $30-$40 to make sure I got a well-tested docking system.

The article I just read to make sure I had my facts straight before posting [1] said it best, I think: USB-C is a connector, which supports many protocols (including USB 3.0/3.1 and Thunderbolt 3).

In other words, it's the equivalent to the Ethernet in your network, over which you run a protocol like TCP/IP, or IPX/SPX (not that I think anyone actually runs this much anymore). It's confusing because for USB 1.0 and USB 2.0 the protocol and the connectors were fairly closely linked (although they did downgrade as needed).

1: https://www.quora.com/What-is-the-difference-between-a-Thund...


> In other words, it's the equivalent to the Ethernet in your network, over which you run a protocol like TCP/IP, or IPX/SPX

Not a good comparison as those are layer 3 protocols. A better comparison would be two layer 2 links such as USB and ethernet sharing the same plug along with devices offering varying levels of support for one, the other, or both.


I considered that, and while it's ultimately more accurate in specifics, I think it's much more confusing in general, considering I'm not aware of much competition in the data link layer (or at least I'm unfamiliar enough with it to be unsure of any argument I might make). I don't think there's much difference in the important points by going up the OSI stack slightly; it's still a physical connection and a protocol over it (even if there may be additional protocols between those).


And yet, I plug an Ethernet cable into a switch manufactured 20 years ago, and it works great, albeit at slower speeds.


Twenty years from now plug a new USB-C cable into something manufactured twenty years ago and it may work too. You know what might not work too great now over Ethernet even if the connector works? An IPX/SPX device, since there's nothing to connect to on the other end. Your Ethernet network will be functioning perfectly, but you'll find yourself having trouble getting internet access because your router/firewall/etc are all running IP.


> and it may work too

It may work. It doesn't even work now.

And IPX/SPX work just fine. I was actually playing a DOS game over multiplayer the other night.


Multiplayer on the same LAN (or with translation software to TCP/IP), right? Meaning you connected two IPX/SPX devices over it and they worked together. That's expected. But you wouldn't expect to plug in an IPX/SPX device, DHCP an address, and use the internet (which would be possible with an IPX/SPX-aware NAT firewall), just as you shouldn't expect to plug a laptop that supports Thunderbolt into a dock that only supports USB 3.1 over a USB-C cable and have it work. The medium works in both devices, but there's a protocol mismatch.


USB-C's situation is so bad that you have to know which of the identical ports have certain features. Your bridging argument is just a red herring.

I can plug two devices together locally over Ethernet and expect them to work. I can't expect that of USB-C. Do you not see the difference?


it's definitely a pain to have a single connector that may or may not support several different transport technologies and/or power delivery. on the other hand, it's way less bad than having to carry around a different cable for each of those and having to cram all those different ports on the laptop.

my guess is that, in a few years, a standard set of the most useful features will stabilize and it will be cheapest to just offer that all the time. in the meantime we just have to deal with the growing pains and read benson leung reviews so we don't burn our houses down.


Same with a USB mouse from 2003.


I had a friend try to connect his Google Pixel to his TV using a USB-C -> HDMI adapter and complain to me when it didn't work. I then explained that USB-C has Alt Mode, which supports other protocols like PCIe and DisplayPort, but only if the manufacturer has included such functionality. He was both confused and annoyed that his MacBook could display its screen on the TV but his Android couldn't, even though they have the SAME PLUG. Dumb Dumb Dumb.

And don't defend your choice to buy/use Lenovo. Screw what anyone thinks. I still have my 2011 Thinkbook classic running Linux to this day.


Isn't the Pixel just using USB 2 over the C connector?


Yeah, I don't actually know of any smartphones implementing USB 3.x. Would be kinda pointless considering the transfer speeds of the actual flash.


Most smartphones sport 200-500 MB/s flash speeds, so nearly enough to saturate USB 3. Perhaps some already could do so.
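For reference, a rough ceiling for USB 3.0: SuperSpeed runs at 5 Gb/s on the wire with 8b/10b encoding, so payload bandwidth tops out around 500 MB/s before protocol overhead:

    # USB 3.0 SuperSpeed: 5 Gb/s line rate; 8b/10b encoding leaves 4 Gb/s of payload.
    line_rate_bps = 5e9
    payload_bps = line_rate_bps * 8 / 10
    print(f"{payload_bps / 8 / 1e6:.0f} MB/s")   # 500 MB/s, before protocol overhead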



isn't Thunderbolt for Apple devices?


Thunderbolt was developed by Intel. [0]

[0] https://en.m.wikipedia.org/wiki/Thunderbolt_(interface)


> developed by Intel

and Apple...[1]

Usually whatever Apple touches becomes very proprietary...

1.: https://en.wikipedia.org/wiki/Thunderbolt_(interface)


They ruined FireWire with the idiotic trademarking of the name FireWire, causing other companies to brand their products IEEE 1394, 1394, i.LINK (Sony), and Lynx (Texas Instruments). That led to problems similar to what USB-C is having, with confused people asking questions such as "Is this 1394 camera compatible with my iMac?" or "Oh, i.LINK is the same thing as FireWire AND 1394? So I was able to use all of those things this whole time?" Things got a little better, but combined with the per-port royalty vs. $0 for USB, it was bound to fail. And that's a shame, because it was a nice bus with very low latency, low overhead, and memory mapping. It would be perfect for creating small Linux clusters or distributed systems with just a few 1394 cables between machines. Superior to USB in every single way save for cost.


It's been around for PCs for a while too. Correct me if I'm wrong, but it started as an Apple/Intel extension to DP.

I've yet to see it in an AMD PC.


Thunderbolt wasn't really an extension to DisplayPort, it just initially re-used the mini-DP connector and has provisions to carry a DisplayPort data stream alongside the PCI Express data stream. When attaching something like a storage device, none of the DisplayPort-related stuff is active.


I wonder if they'll ever start giving the plugs different shapes to differentiate them :/


I might have to wait for the plug wars to end before I buy a new laptop.

I remember when I thought FireWire was going to be the future... then Thunderbolt... DisplayPort... USB-C... USB n+1.


Apple-encumbered tech will never be the tech that wins the future.


Only one of the standards listed was developed by Apple (FireWire). Thunderbolt and USB were both developed by Intel, and DisplayPort was developed by VESA. In all three of those cases Apple was just an early adopter.


To be clear, Apple is a member of VESA, so they may have had some input into the design of DisplayPort.

https://en.m.wikipedia.org/wiki/Video_Electronics_Standards_...


Forgot FireWire was an Apple push. Nice to see they didn't push Lightning cables to Macs.

I can name others I adopted on PC including eSATA and Zip Drives.


Expensive or crap. Anker and a couple of other companies are slowly chipping away at the problem, but the laptop makers have made the problem worse (see below). One of the newest Anker hubs finally has charge passthrough, a couple of USB ports, a video output, and an Ethernet jack for $70. This is the first I've seen at this price/performance ratio. To get a hub with more connectors than that you quickly get to $120, and with twice that many you're talking $200.

Apple should have retained a few more ports for a couple of years. The SD card reader and one or two other ports should have stayed on for a while, and they put the USB-C jacks so close together that you have to use tethers (dongles with 3-inch cables) to plug anything in. The 'small' USB-to-USB-C adapters you see will overlap the other USB-C port, so without a dongle you run out of ports quickly. And you know, some of us still put our laptop on a riser when working, and now you have all of this shear stress because the dongles aren't long enough to reach the desktop. Dongles floating everywhere.


It's just markup; you can get the same hubs (the exact same hubs - I verified this by comparing the internals) from AliExpress for $20-50.

(Charge pass through, Ethernet, SD, microSD, USB Ports, HDMI 2.0 all in one adapters.)


Example links to a couple of the good ones?


The ones I bought were from "WarmSea" and "Dodocool" but I think buying whatever's cheapest and has the ports you need makes sense. It's just a USB hub after all.


The main issue is that a lot of USB-C stuff, even now several years in, is still flaky junk that doesn't work as advertised or lists the wrong features.


If it's not as advertised and you pay with PayPal then you can get a refund fairly easily :)


Do they have the same shielding?


Any replacement of the utterly ubiquitous USB 2.0 & micro-USB 2.0 was always going to take many years. Give it a few more. The Nexus 5X was one of the earliest harbingers, and that was only two years ago.


The MacBook 12" was introduced in April 2015, nearly three years ago, with just one USB-C port and a stereo jack. Nonetheless, USB-C devices have been introduced at a glacial pace (I bought a 12" at the end of 2015 and there were still almost no useful USB-C accessories available).

We'll eventually get there and I agree that things will be messy for a while, with USB2-only USB-C charging cables, MacBook Pros that support Thunderbolt over USB-C, MacBooks that don't support Thunderbolt over USB-C, etc.

(I do like the plug though, it's reversible and compact.)


The standard is, let's say, not so well thought out. You can still kill your computer by plugging in the wrong USB-C cable.


I get the point you are trying to make, but there are a lot of things besides just USB that are susceptible to frying your hardware. The root cause is shitty manufacturing; there's almost nothing practical the laptop can do to prevent a badly built device from doing at least some kind of damage.


sure, but manufacturers seem to be uncommonly bad at implementing the new USB 3.x specs, power delivery in particular.

the majority of USB-C cables on sale violate the power delivery spec in important ways.


I haven’t seen or heard of a USB-PD cable issue outside of cheap Chinese knockoff crap. Have you?


just going off my memory of this spreadsheet [1].

[1] https://docs.google.com/spreadsheets/d/1wJwqv3rTNmORXz-XJsQa...


You can kill almost any computer if you plug in a power cable that has hot and ground switched. That's not a fault of USB-C. It's completely unrelated to the subtleties of resistor marking.

Edit: Why the downvotes? Was there any issue with killing computers outside that one batch of backwards cables? Seriously, if a power cable is 100% backwards that's got nothing to do with any standard.


the downvotes are probably because it's not usually a polarity issue. the most common issues are with current/voltage negotiations between the charger and device. when this isn't implemented correctly, either the cable or the charger itself (maybe even the device, idk) can erroneously negotiate a power level that is too great for the circuitry to withstand.

you're right that it's not exactly the fault of the USB-C spec, but at the same time, manufacturers would probably not mess it up so much if it were not so complex.


Chargers already need to be able to handle phones that pull arbitrary amounts of current, or they will fry on non-C cables too. Other than test-and-see, the negotiation is done with data packets, and a bad cable can't cause anything to go wrong. There is nothing that both fries devices and is the fault of USB-C.

The million cables that have the wrong resistor don't make things worse than they already were pre-USB-C.


Noticed that too, but thankfully USB-C is versatile and I can use almost literally any reader I've owned before (I still have some FW800 readers) by just buying a new cable (or sticking an adapter on the end of an existing cable).


There are some with USB-MicroB sockets, you can just use a C-to-µB cable with those, no ugly A-C adapter needed. (For example: Lindy UHC-300) I have about a gazillion other devices with Micro-B sockets anyway…


This could be a great addition for laptops that keep SD card slots.


I wonder if an ExpressCard adaptor could be made for these; after all, it's also PCIe internally...

This would be a way to give old laptops a real boost.


Let's say I wanted to watch a large video from this SD card; wouldn't the transfer from SD card to display be slowed down by the bus between the board and the display?

Similarly, if I wanted to transfer the same video directly from the SD card through the bus, then through a network interface over a network to another computer, wouldn't the write speed be slowed by all those components in the middle?

I don't understand the write speed at 880 MB/s if it'll inevitably be slowed by many other slower components.


> I don't understand the write speed at 880 MB/s if it'll inevitably be slowed by many other slower components.

This is very useful for high-resolution cameras, and even more so for high frame rate video shooting. No need for compression, just dump raw frames on the card.


You could levy the same criticism against any external component. It should go without saying that you won’t realize speed gains by upgrading your SD card if the SD card is not the bottleneck.

That said, 880 MB/s is plenty fast for watching video, as long as each second of the video is less than 880 MB.

Or maybe I’m misunderstanding you?


Those "parts in the middle" are the PCIE lanes, which is the fastest bus for peripherals straight to the CPU.

Yes, it will technically slow things down - although it's rarely the bottleneck.

To my knowledge, there isn't a faster bus on the whole system.


Western Digital moves pins around and gets a press release out of it...



