
It’s more about network outages than latency specifically... although in several remote locations latency is permanently high due to slow providers.



Credit card transactions, mobile orders, timer synchronization, order receiving (tablets etc.), IoT devices (cameras, cooking devices), and other things planned for the future.


OK, let's go through that list:

- Credit-card transactions don't have low-latency requirements; it's 'nice' if they're quick, same as everything else. The bottleneck here depends entirely on your ISP, though. No 'edge-computing' here.

- Mobile orders. These will necessarily go through to a central server. This is traditional client-server stuff. Again, no 'edge-compute'.

- Order receiving. Simple local data entry. Again, no 'edge-compute'.

- IoT devices. Hopefully these have local control systems without the server being in the loop (rough sketch of what I mean right below this list). Control systems are not 'edge-compute' either.
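
To be concrete, here's a minimal sketch of the kind of local control loop I mean, written against a hypothetical fryer with a simulated temperature sensor and heater relay (all names and thresholds are invented for illustration). The point is that the whole loop runs on the device itself, with no server round-trip, so it keeps working whether the network is up or down:

    import random
    import time

    # Hypothetical on-device hysteresis loop for a cooking device.
    # Sensor and relay are simulated here; a real device would talk to
    # its own hardware. No server is involved anywhere in the loop.

    TARGET_C = 175.0     # desired oil temperature
    HYSTERESIS_C = 2.0   # switching band to avoid relay chatter

    def read_temperature(current, heater_on):
        # crude simulation of a probe: warm up or cool down a little
        drift = 0.8 if heater_on else -0.5
        return current + drift + random.uniform(-0.2, 0.2)

    def set_heater(on):
        # stand-in for a real relay driver
        print("heater", "ON" if on else "OFF")

    temp = 160.0
    heater_on = False
    for _ in range(120):              # ~2 simulated minutes at 1 Hz
        temp = read_temperature(temp, heater_on)
        if temp < TARGET_C - HYSTERESIS_C and not heater_on:
            heater_on = True
            set_heater(True)
        elif temp > TARGET_C + HYSTERESIS_C and heater_on:
            heater_on = False
            set_heater(False)
        time.sleep(0.01)              # compressed time step for the demo
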

'Edge-compute' is the generation of knowledge at the edge rather than shipping raw data to a central server. This reduces the required bandwidth.

What in your system takes in a high rate of data and generates a low rate of data for transfer to a server for further use? I see no analysis of raw data into a more processed form; this is simply traditional data entry and CRUD activities.
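
For contrast, here's roughly the shape I'd expect actual edge-compute to have in that setting, assuming something like a temperature probe or camera as the high-rate source (the sampling rates, field names, and upload function are made up for illustration): the device consumes data at a high rate locally and ships only a small summary upstream, which is what cuts bandwidth.

    import random
    import statistics
    import time

    # Illustrative edge aggregation: sample a (simulated) probe at 10 Hz,
    # but ship only one small summary record per minute to the central
    # server, instead of 600 raw samples per minute.

    SAMPLE_HZ = 10
    WINDOW_S = 60

    def read_probe():
        # stand-in for a real high-rate sensor read
        return 170.0 + random.uniform(-5.0, 5.0)

    def ship_to_server(summary):
        # stand-in for an upload to the central server
        print("upload:", summary)

    while True:
        window = []
        for _ in range(SAMPLE_HZ * WINDOW_S):
            window.append(read_probe())
            time.sleep(1 / SAMPLE_HZ)   # 10 Hz sampling
        ship_to_server({
            "mean_c": round(statistics.fmean(window), 2),
            "min_c": round(min(window), 2),
            "max_c": round(max(window), 2),
            "samples": len(window),
        })
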


But again, latency of what?

Requirement for low-latency implies a use of data that is time-sensitive.

What time-sensitive data is there in a single chicken restaurant??



