Hacker News

Tablets should have an HDMI/DisplayPort input so that you can directly use them as displays.



I'd even go one step further: we should have had a standard communications protocol like TCP for all devices. So a display would show up as just another device that we could use to read/write bytes. All devices would have a standard queryable HTTP/HATEOAS self-documenting interface. And HDMI/DisplayPort or USB A/B/C/.../Z would all use the same protocol as gigabit ethernet or Thunderbolt or anything else, so the bandwidth would determine maximum frame rate at an arbitrary resolution. We could query a device's interface metadata and get/send an array of bytes to a display or a printer or a storage device, the only difference would be the header's front matter. And we could download image and video files directly from cameras and scanners as if they were a folder of documents on a web server, no vendor drivers needed.
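A minimal sketch of what such a self-describing interface might look like. Every endpoint, field name, and function here is hypothetical, invented purely to illustrate the idea, not taken from any real standard:

```python
import json

# Hypothetical reply to "GET /devices/display0" on an imaginary unified
# device bus. All endpoint paths and field names are invented for
# illustration.
display_metadata = json.loads("""
{
  "class": "display",
  "formats": ["rgb24"],
  "max_resolution": [3840, 2160],
  "links": {
    "framebuffer": {"href": "/devices/display0/framebuffer", "methods": ["PUT"]},
    "status":      {"href": "/devices/display0/status", "methods": ["GET"]}
  }
}
""")

def max_frame_rate(link_gbps: float, width: int, height: int,
                   bytes_per_pixel: int = 3) -> float:
    """If every link speaks the same byte protocol, only bandwidth caps
    the frame rate: rate = link throughput / uncompressed frame size."""
    frame_bytes = width * height * bytes_per_pixel
    return (link_gbps * 1e9 / 8) / frame_bytes

# A generic client follows the advertised link and writes raw bytes;
# no display-specific driver is involved.
w, h = display_metadata["max_resolution"]
print(round(max_frame_rate(10.0, w, h), 1))  # prints 50.2 (4K RGB over 10 Gb/s)
```

The point of the sketch: once the device advertises its own capabilities and endpoints, the client needs no vendor driver, and the achievable frame rate falls out of link bandwidth rather than connector type.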

There was never a technical reason why we couldn't have this. Mostly Microsoft and Apple blocked this sort of generalization at every turn. And web standards fell to design-by-committee so there was never any hope of unifying these things.

Is it a conspiracy theory when we live under these unfortunate eventualities? I don't know, but I see it everywhere. Nearly every device in existence irks the engineer in me. Smartphones and tablets are just the ultimate expression of commodified proprietary consumerist thinking.


In fairness, there are standardised protocols for a lot of these things already, even if they're not all part of one giant meta-protocol. Cameras in particular have mostly appeared as a folder full of files (via USB mass storage or PTP/MTP), with no need for special drivers, for something like 20 years.

There's definitely no need to invoke a conspiracy for the lack of 'one protocol to rule them all'. It's often hard to agree on a standard even for a relatively limited topic; trying to agree on one for all electronic communications across all devices is probably impossible.


The meta protocol exists! Sort of. Check out the USB-C specs, which tried to answer a ton of this. It’s taken years for power delivery to reach the point where I don’t feel compelled to carry a USB-C power meter to check cables and chargers in the wild. My Switch still requires some out of spec signaling to charge/dock properly.

Meanwhile, half of the stuff I get off AliExpress only charges from A-to-C cables because a required resistor was omitted.

I don’t think the markets (yet) incentivize complete implementations. Like how, when my mortgage gets resold, autopay only transfers over if it’s once a month; anything more complex and I have to endure a new account setup and a ton of phone trees. Same with paperless settings. The result? I just live with the MVP.


> There was never a technical reason why we couldn't have this. Mostly Microsoft and Apple blocked this sort of generalization at every turn.

On the contrary, Microsoft tried really hard with UPnP/PnP-X/DPWS/Rally/Miracast*/etc but nobody was interested.

*BTW any Windows 10+ device can act as a Miracast sink (screen) so you can link Windows laptops/tablets as extra screens without any additional software.


Extending your analogy, some devices need the equivalent of UDP in order to function within the size/power envelopes that make them useful: think Bluetooth vs. the nRF24L01+.

There are standards like this in highly interoperable systems, but there’s a cost. USB-C power-delivery negotiation (beyond the very basic 5V/3A resistor that people omit) is roughly as complicated as gigabit Ethernet. That compute has to come from somewhere, and it turns out customers won’t even pay for that 5V/3A resistor: they’ll just use A-to-C cables and replace the device when it “won’t charge” from a compliant charger. :) The average person probably only cares that USB-C can be flipped and that the connector feels less brittle than micro-USB.
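For a sense of that complexity, here is a small sketch decoding one fixed-supply Power Data Object (PDO), the 32-bit word a USB PD source uses to advertise a single voltage/current combination. The bit layout follows the published PD spec, but this covers only one message type out of a protocol that also involves CRCs, timers, retries, and role swaps:

```python
def decode_fixed_pdo(pdo: int) -> tuple[float, float]:
    """Decode a fixed-supply PDO into (volts, amps).

    Fixed-supply layout per the USB PD spec: bits 31:30 = 0b00,
    bits 19:10 = voltage in 50 mV units, bits 9:0 = max current
    in 10 mA units. Real negotiation is far larger than this.
    """
    if (pdo >> 30) & 0b11 != 0b00:
        raise ValueError("not a fixed-supply PDO")
    voltage_mv = ((pdo >> 10) & 0x3FF) * 50
    current_ma = (pdo & 0x3FF) * 10
    return voltage_mv / 1000, current_ma / 1000

# A 5 V / 3 A advertisement: voltage field 100 (x50 mV), current field 300 (x10 mA)
print(decode_fixed_pdo((100 << 10) | 300))  # (5.0, 3.0)
```

Every compliant source and sink has to carry firmware doing this kind of parsing (and much more), which is exactly the compute cost that a dumb barrel jack never paid.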

UPnP exists. Lots of what you describe exists. Between bugs in implementations becoming canon and a lack of consumer interest, no real conspiracy is required. At least smartphones and tablets are trending in a good direction: Apple’s latest supports basic off-the-shelf USB-C Ethernet, displays, hubs, and so on.


Agreed in general. I wouldn't stop anyone, but having my monitor traffic go over the network would cause a lot of congestion, especially wireless. I'd prefer a separate cable, as the grandparent alluded to.


You can plug a USB HDMI capture dongle into tablets and do this.

Any webcam viewer would probably work to view it, though there are dedicated apps intended for this, like https://orion.tube/ on iPad. I know there are options on Android but don't have a modern Android tablet to test them.


Do you know why that app doesn’t work on the iPhone 15 Pro?

I don’t have the iPad, but I just recently got the 15 Pro, and it’s able to do a bunch of things via the USB-C port (wired Ethernet, SD card reading, driving a Pro Display XDR, etc.), yet I wasn’t able to do anything like what the Orion app is showing.

I was thinking of pretty much the same use case as shown in the app, where I could plug in an external camera and use the phone as a high-resolution / high-nit viewer display. Are these APIs only for iPadOS because the iPhones are missing some required hardware?


I know; I'd love to use my phone as a display via a capture card so I don't have to carry a portable monitor to troubleshoot headless boxes.

The developer says the 15 and 15 Pro are only missing software, the hardware is capable:

> I’m sad to say that we’ve confirmed with Apple that it will not be working with the iPhone 15. But this can be fixed in software, so feel free to file a feedback request for UVC support on iOS!

https://old.reddit.com/r/apple/comments/16qzdtx/hi_reddit_we...


Ah, that sucks. Hopefully a future iOS version will also have UVC support.



