I suppose I'm struggling to see how much of a requirement for low-latency data processing exists in a restaurant.
There just isn't that much data there unless they're attempting to do real-time control of equipment (in which case, just use a regular control system).
I can imagine a requirement for logging, potentially, but that's not a low-latency requirement, and the use case for edge computing is exactly that: low latency.
Still not seeing anything here but a dramatic overuse of technology I'm afraid.
I think you are right that this blog post doesn't lay out a specific use case to justify this setup in a restaurant; that doesn't appear to be the goal of the post. An exploration of problems and requirements is probably what you're looking for.
They did drop a few hints, though. They are just hints, and going by what's in the blog post, a bit of imagination is required to fill in some of the gaps. However, if you do that, I can envision a few possible needs for low-latency edge computing that are interesting and forward-looking:
- Kitchen automation. Sensors that monitor food being cooked and help implement a pipeline to ensure quality is consistent. (The example here is the neural net monitoring the staleness of fries.)
- Inventory automation. Monitor and automatically restock items. Track purchases and ingredient usage in real time.
- Data analytics. Collect detailed time-series data about food quality, facilities maintenance, foot traffic, noise levels, security, etc.
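The fry-staleness example above hints at the general pattern behind all three: run inference at the edge and ship only the verdicts, never the raw feed. A minimal sketch of that shape, where the camera feed and the `classify` stub are hypothetical stand-ins (nothing here is taken from the article):

```python
def classify(frame):
    """Hypothetical stand-in for an on-premise vision model:
    maps a raw camera frame to a small label."""
    return "stale" if frame["age_s"] > 420 else "fresh"

def edge_pipeline(frames):
    """Process raw frames locally; only compact exception events
    ever leave the store, not the video itself."""
    events = []
    for frame in frames:
        if classify(frame) == "stale":
            events.append({"station": frame["station"], "label": "stale"})
    return events

# Simulated camera feed: four frames stay on-site, at most a few
# small event records go upstream.
feed = [{"station": "fryer-1", "age_s": t} for t in (60, 300, 500, 700)]
print(edge_pipeline(feed))
```

The point of the sketch is the ratio: megabytes of frames in, a handful of labeled events out, which is what makes the edge node worth having at all.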
These aren't standard restaurant problems. However, it's an interesting approach to scaling data analysis and automation in the physical world. All of these use cases assume the need for "real-time" data, and they assume that such a thing has business value (which is where I think your critique is coming from).
I'm not affiliated with Chick-fil-A, so I don't know how correct any of my speculation is, but that's my takeaway from reading the article.
Credit card transactions, mobile orders, timer synchronization, order receiving (tablets etc), iot devices (cameras, cooking devices) and other things planned for the future.
- Credit-card transactions don't have low-latency requirements; it's 'nice' if they're quick, same as everything else. The bottleneck here depends entirely on your ISP, though. No 'edge compute' here.
- Mobile orders. These will necessarily go through a central server. This is traditional client-server stuff. Again, no 'edge compute'.
- Order receiving. Simple local data entry. Again, no 'edge-compute'.
- IoT devices. Hopefully these have local control systems without the server being in the loop. Control systems are not 'edge-compute' either.
'Edge-compute' is generation of knowledge at the edge rather than shipping raw data to a central server. This reduces required bandwidth.
What in your system takes a high rate of data and generates a low rate of data for transfer to a server for further use? I see no analysis of raw data into a more processed form; this is simply traditional data entry and CRUD activities.
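To make that distinction concrete, here's a toy sketch of the reduction step being asked about, assuming a hypothetical 1 Hz temperature sensor (the numbers are invented for illustration): the edge node collapses many raw readings into one summary record per window, and only the summaries would be shipped upstream.

```python
from statistics import mean

def summarize(readings, window=60):
    """Collapse high-rate raw readings into one record per window:
    the 'knowledge' that leaves the edge, instead of the raw stream."""
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": round(mean(chunk), 2),
        })
    return summaries

# 120 simulated 1 Hz readings -> 2 summary records instead of 120 raw points.
raw = [20 + (i % 7) * 0.5 for i in range(120)]
print(summarize(raw))
```

If nothing in the system does this kind of many-in, few-out reduction, the 'edge compute' label is doing no work.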