It is a trivial problem. Track how anyone with a smartphone and your app on it moves around the city. Aggregate which paths they choose and which turn out to be faster / better. Then propose those paths to other users asking for routing instructions.
I work on software for routing deliveries (https://www.fleetnavi.com), and although we don't have this functionality built into the platform yet, it's the first thing we'll build once we have the volume.
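For what it's worth, the aggregation step can start out very simple: keep a running average of observed travel time per road segment and fall back to a static estimate where there's no data yet. This is only a sketch under my own assumptions (it presumes traces are already map-matched to segment IDs, and the type and function names are made up):

    // Running per-segment travel-time averages learned from user traces.
    #include <cstdint>
    #include <unordered_map>

    struct SegmentStats {
        double total_seconds = 0.0;
        uint64_t samples = 0;
        double average() const { return samples ? total_seconds / samples : 0.0; }
    };

    class TravelTimeModel {
    public:
        // Called once per observed traversal of a segment.
        void record(uint64_t segment_id, double seconds) {
            auto& s = stats_[segment_id];
            s.total_seconds += seconds;
            s.samples += 1;
        }

        // Cost used by the router; falls back to a static estimate when we
        // have no observations for the segment yet.
        double cost(uint64_t segment_id, double static_estimate) const {
            auto it = stats_.find(segment_id);
            return (it != stats_.end() && it->second.samples > 0)
                       ? it->second.average()
                       : static_estimate;
        }

    private:
        std::unordered_map<uint64_t, SegmentStats> stats_;
    };

A real system would obviously weight by recency and time of day, but the core bookkeeping is roughly this shape.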
Aren't they all? Until you actually try to do it...
[In seriousness: we are remarkably bad at estimating the true difficulty of problems, and making them sound 'easy' when they aren't only devalues the work of the people who actually put in the time and effort to figure them out.]
Tracking, aggregating and analyzing thousands of users with slightly varied destinations, preferences (less traffic, faster, scenic) and signals is not exactly what I'd consider trivial...
Tracking, aggregating and analyzing thousands of packets with varied destinations, priorities and signals isn't trivial either, but it's something that some people and businesses have become very good at.
It's not a terribly hard job, actually. I worked for a startup in 2004 that was doing this with haulage companies. We had 8,000 data points moving around the UK, coming in via GPRS and third parties, plus aggregated traffic information, routing and mapping data, pick-up corridors and event data.
It was (bar the web front end) a relatively modest sub-30kloc chunk of C++.
The scale is different now, but in the 10 years since, the problem hasn't necessarily become much more complicated, just larger.
In fact I think a derivative of it now runs some of Yodel's operations.
Well, the dataset is about the same size now, perhaps 10% larger. The individual problem to solve is easily distributed across a large cluster of systems.
That's 30 kloc including the COM/IDL, mind. The actual routing engine was about 12 kloc; the rest was the object model and deserializers. The routing engine was built during a two-week binge of Red Bull, TAOCP and a stack of printed-out papers.
It really wasn't rocket science even though we charged them like it was.
Reproducing it from scratch, with a non-Red-Bull-enhanced team using C# instead, would probably be about three months' work for two people.
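For anyone wondering what the core of such an engine amounts to, the classic starting point is plain Dijkstra over the road graph with observed travel times as edge weights. To be clear, this is not the engine described above, just a hedged sketch of that textbook core (graph layout and field names are illustrative):

    // Shortest expected travel time over a road graph with learned edge weights.
    #include <cstdint>
    #include <functional>
    #include <limits>
    #include <queue>
    #include <utility>
    #include <vector>

    struct Edge {
        uint32_t to;
        double seconds;  // expected traversal time for this segment
    };

    // Returns the shortest expected travel time from src to dst, or infinity
    // if dst is unreachable.
    double shortest_time(const std::vector<std::vector<Edge>>& graph,
                         uint32_t src, uint32_t dst) {
        const double inf = std::numeric_limits<double>::infinity();
        std::vector<double> dist(graph.size(), inf);
        using Item = std::pair<double, uint32_t>;  // (time so far, node)
        std::priority_queue<Item, std::vector<Item>, std::greater<Item>> pq;

        dist[src] = 0.0;
        pq.push({0.0, src});
        while (!pq.empty()) {
            auto [d, u] = pq.top();
            pq.pop();
            if (d > dist[u]) continue;   // stale queue entry
            if (u == dst) return d;      // settled: shortest time found
            for (const Edge& e : graph[u]) {
                double nd = d + e.seconds;
                if (nd < dist[e.to]) {
                    dist[e.to] = nd;
                    pq.push({nd, e.to});
                }
            }
        }
        return inf;
    }

The bulk of a real engine presumably goes into everything around this: data ingestion, constraints, corridors and the like.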
Waze claims to do something like this, and it produces the most absurdly slow routes ever. I just hate it when my Lyft driver uses Waze, because I know the trip will take 10 minutes longer.
I don't think it's that trivial, because there are lots of things to consider. Some drivers are just slow, and that has nothing to do with traffic. Maybe all the slow/terrible drivers use your app, so all the route-speed information is skewed by patterns specific to your own user base.
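One cheap way to guard against that bias, sketched below under my own assumptions (the names are illustrative), is to learn a per-driver speed factor relative to the fleet and divide it out of each driver's observations before they feed the segment averages:

    // Per-driver speed bias relative to the fleet, used to normalise observations.
    #include <cstdint>
    #include <unordered_map>

    class DriverBias {
    public:
        // ratio = this driver's time on a segment / fleet average on that segment.
        void record_ratio(uint64_t driver_id, double ratio) {
            auto& b = bias_[driver_id];
            b.sum += ratio;
            b.count += 1;
        }

        // Factor > 1 means the driver is slower than the fleet; divide their
        // raw travel times by this factor before feeding the aggregate model.
        double factor(uint64_t driver_id) const {
            auto it = bias_.find(driver_id);
            if (it == bias_.end() || it->second.count == 0) return 1.0;
            return it->second.sum / it->second.count;
        }

    private:
        struct Acc { double sum = 0.0; uint64_t count = 0; };
        std::unordered_map<uint64_t, Acc> bias_;
    };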
That's at odds with my experience. Perhaps it has to do with my locale.
I am in a major metropolis and I drive throughout the city, suburbs and adjacent cities regularly. I know the area and I have been driving for decades.
I have been using Waze for about a year now, and the navigation routes it selects appear to be very good, and based on relevant factors. I can only guess which variables go into the calculation: weighted capacity, current traffic speed and so on for the relevant route elements.
Google appears to already take care of the 'fast routes become slow routes due to popularity' issue. Frequently in London, Google Maps will pop up a "Faster Route Detected" alert and will divert onto a quicker path. Sometimes three separate "Faster Route Detected" alerts will pop up during a single commute.
Google Maps would also change my London commute route on a daily basis given the changes in traffic patterns.
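The mechanism behind that kind of alert can be as simple as periodically re-running the router from the current position and comparing ETAs. A hedged sketch, with an arbitrary threshold I picked rather than anything Google actually uses:

    // Decide whether a freshly computed route is worth interrupting the driver for.
    struct RouteCheck {
        double remaining_eta_seconds;   // ETA left on the route currently followed
        double recomputed_eta_seconds;  // ETA of the best route found just now
    };

    // Returns true when the new route saves at least min_saving_seconds.
    bool should_divert(const RouteCheck& check, double min_saving_seconds = 60.0) {
        return check.recomputed_eta_seconds + min_saving_seconds
               < check.remaining_eta_seconds;
    }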
I have often thought that applications like Maps should randomise the suggested routes (within reason) to spread the traffic load around and improve things for everyone.
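A hedged sketch of how that "within reason" could look: pick uniformly at random among candidate routes whose ETA is within some tolerance of the best one (the 10% figure is my own assumption):

    // Spread load by choosing randomly among near-optimal candidate routes.
    #include <algorithm>
    #include <cstdint>
    #include <random>
    #include <vector>

    struct Route {
        std::vector<uint32_t> segments;
        double eta_seconds;
    };

    // Picks a route index among candidates no more than `tolerance` (e.g. 0.10)
    // slower than the fastest candidate. Assumes candidates is non-empty.
    size_t pick_route(const std::vector<Route>& candidates, double tolerance,
                      std::mt19937& rng) {
        double best = candidates.front().eta_seconds;
        for (const Route& r : candidates) best = std::min(best, r.eta_seconds);

        std::vector<size_t> acceptable;
        for (size_t i = 0; i < candidates.size(); ++i) {
            if (candidates[i].eta_seconds <= best * (1.0 + tolerance)) {
                acceptable.push_back(i);
            }
        }
        std::uniform_int_distribution<size_t> dist(0, acceptable.size() - 1);
        return acceptable[dist(rng)];
    }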