Hacker News
Uber ordered to stop self-driving vehicle service in San Francisco (techcrunch.com)
207 points by coloneltcb on Dec 14, 2016 | 233 comments



I want self-driving cars to happen as quickly as possible, but Uber seems like they were extremely reckless and negligent here. Why not apply for a permit and stay on the safe side?

And with at least two reported incidents of these cars running red lights, it seems like driver training and attentiveness was lacking. If the software wasn't ready, fine, but you have to be aware of that and train your drivers to be 100% attentive to their surroundings, especially since it's the first day of real, commercial trials.


Uber has enough money and power to be negligent and reckless unfortunately. I was a big Uber user but the way they treated me after an accident (where I was the hurt passenger) was truly terrifying. I thought the whole Uber is "evil" thing was overblown but it is not. Be very, very careful with Uber. They have a lot of resources to fight with, and most people don't have the emotional or financial resources to go to trial with them. They're like a playground bully who gets away with anything because even the teachers can't win against them.


Are you able to speak more about your specific experience?


I can't, I'm sorry. I could get some media around my story but I am a private person. Plus Uber would probably just discredit me and lie. I wish I didn't have to deal with them over the accident stuff because they fight really dirty.


It's too late now since you used your normal account here, but next time, you might want to use a throwaway account to speak more freely.


> Why not apply for a permit and stay on the safe side?

When did Uber obey the law? Why would they start now?

Edit: Apologies for my tone, but Uber has demonstrated quite a few times in the past that they don't have any respect for the law.


> When did Uber obey the law?

In absolute numbers, their company probably follows more laws than most people on this forum, since they operate in many different jurisdictions. It wouldn't surprise me if the average HN commenter committed more felonies than the average Uber executive, or than Uber as a whole.


It doesn't matter how many laws you follow. It matters how many you break.


What kind of bizarre argument are you making?


That if you follow enough laws, it doesn't matter if you break some.


That is such an apples to oranges comparison.


Then we are better off driving for real, because `not driving, meaning not being in control, but being ready to take the helm at any time` must be more exhausting (you have to focus on everything, none of it under your control, instead of everything minus what you actually control) and must increase our reaction time (some time will pass between the moment you notice you have to take the helm and the moment you are actually in control) compared to just driving.


Being ready to take the helm at any time is required by law, at this point. It applies to Google's test cars, to Tesla Autopilot, etc.


To be clear (late posting, sorry): I believe that even if you have your hands on the wheel while the car is driving itself, it is more exhausting and dangerous than just driving the car yourself.


Are Google's / Waymo's vehicles not steering-wheel and pedal free? E.g., go to Waymo's own site here[1], scroll to "2015 World’s first fully self-driving ride on public roads", and watch the video: they give, if I'm not mistaken, a blind man a ride in a steering-wheel-free, pedals-free autonomous vehicle on public roads, in Texas. Now, granted, the original article is about the CA DMV, but I've seen vehicles that are indistinguishable from the prototype car in the video driving in Mountain View. Unless there's a model with steering wheels and pedals (it is hard to say from the outside), that would seem to imply that Google is testing fully autonomous cars in California. (Note that I'm not trying to imply that this is illegal: I'm not well-versed enough in the DMV's rules around autonomous vehicles to know if it requires "being ready to take the helm at any time" or not. I also did not realize, until Waymo's announcement recently, that they were so far along.)

[1]: https://waymo.com/journey/


Even though those cars are steering-wheel and pedal free, a Google employee rides inside with a laptop attached and a big red stop button. Here's an image:

http://www.mycoolbin.com/wp-content/uploads/2015/07/Googles-...


What happens after they hit the big red button and the car stops in the middle of the street? Do they wait for a tow truck? Or do they rely on restarting the self-driving mode after the danger has passed?


"The prototype vehicles in particular are equipped with removable steering wheels, accelerator pedals and brake pedals that allow test drivers to take over driving if desired." [1]

[1] https://webcache.googleusercontent.com/search?q=cache:0RESYe...


> these cars running red lights

I watched quite the opposite problem the other day. Uber self-driving car pulled up to a red light on a busy intersection, in the left turn lane. The light turned green. I could tell that the car was in self-driving mode because it didn't step into the intersection like humans would do - not even slightly. The opposing traffic never let up enough to make a left turn, so the car stayed put, even through the yellow. Poor thing skipped a complete green light cycle. I think I heard a horn from some of the drivers backed up behind it.

Self-driving cars might be a little too polite.


Humans do this also. It makes me wonder what drivers did before left-turn arrows were common. In driver's ed, I was taught to pull into the intersection, blocking the right lane of perpendicular traffic (to my left), the lane just after the crosswalk. A surprising number of people don't pull up at all.


As a new driver, I would much, much rather play it safe if I'm unfamiliar with the intersection & traffic flow than pull out. Does it seem silly to you? Probably. To me? No. I don't want to be caught in an intersection that I'm not supposed to be in when the light turns red.


But a green light allows you, even as you're turning left, to enter the intersection. The light lets you in to the intersection so that you can leave it when conditions are favorable to do so.


Assuming there is space for you in your destination lane.


In Toronto where I live, you'd better pack a lunch if you have any left turns on your route. Probably supper, too.


Pardon me if this sounds rude, but where did you learn to drive? I was taught in driving school that you are supposed to pull up when turning left at an intersection with lights but no left turn arrow.


I think a lot of people don't like doing it because it feels wrong to be in the intersection and then complete the turn on a red, assuming you get stuck that long.

I got told by the instructor on my test to pull out into the intersection.


It's what you are supposed to do. Pull into the intersection and wait until you have a gap to turn. Other cars have the right of way, but they're supposed to (yeah, right) yield to you when the light turns yellow. Even if they don't, you are in the intersection, so you just go when it's safe (even if you have to wait for the red).


The Goog mentioned in one of their progress reports that this was a thing, and that they'd be tweaking their code to nose forward the way humans do.


Imagine the poor Google koala car in the streets around Union Square...


> I could tell that the car was in self-driving mode because it didn't step into the intersection like humans would do - not even slightly.

So it's a bad driver.


> Why not apply for a permit and stay on the safe side?

Because in our world, 'do it first and ask for forgiveness later' works better. Whether that's an unethical/bad tactic probably depends on the situation, though.


> Uber seems like they were extremely reckless

YES. I live 10 feet away from the garage. I've noticed over the past few months that they've been ramping up in typical startup fashion: hockey-stick growth.

After reading about Lyft's CEO, a narrative of "nice guys finishing last", I started realizing how unscrupulous Uber is. Sadly, cut-throat businesses win.

I recently took an Uber ride where the driver was an un-retired man in his 70s. He had recently abandoned his failing dry-cleaning business. This one hits close to home for me, because my parents are approaching their 70s, holding on to a failing dry-cleaning business, too. For me, hardworking people like this, with arguably bad business acumen, are a counterexample to the notion that "if you're poor, it's because you didn't work hard enough." Before the dry-cleaning business, the Uber driver ran a deli, but that was a failure, too. No doubt he possesses an entrepreneurial spirit, though he probably shouldn't. I can't fault him for not being smarter. He proceeded to tell me about his idea to create a website where people upload images of their face at different angles and get a custom 3D-printed figurine, which he would hand-paint. I didn't have the heart to tell him that as someone in the top 1% income bracket of 20-something-year-olds, who consistently blows money on expensive toys / gadgets / dinners, I would never spend $300 on a custom painted figurine. See, he was an artist before coming to America. He said, "I want to be an artist before I die. I was too busy making a living when I came to America."

Uber gave this man a job. That's good. But he's being paid near minimum wage. To achieve that, he has to work 12+ hour days to collect the bonuses Uber uses to keep its drivers hooked. These are the kinds of risk Uber induces on its drivers, customers, and other people on the road. The sad reality is, he's going to be displaced from this minimum-wage job by automation, much earlier than people expect. Uber doesn't care. I'm not saying Uber or any private company MUST care. I'm just illustrating how a company's values permeate outward and precipitate into individuals' livelihoods.

I recently read an article about the unit economics of Uber. It doesn't work out. Uber, in its current form, is based on leveraging free VC money to subsidize and incentivize somewhat artificial demand. These are predatory, monopolistic tactics. They won't be able to live up to their unicorn IPO valuation hype. They NEED to expand into other domains. Recently, Uber acquired an A.I. startup, which I think is an indicator of this. Alternatively, they can fix the unit-economics equation to be profitable. Autonomous vehicles are the purported answer. This VC and unicorn-valuation environment, I believe, is resulting in this kind of hastiness, at the increased risk of human lives.


Agreed, I think that driving only makes sense as a minimum wage job.

The notion that Lyft is significantly better is untrue. Lyft actually created the "ride sharing" portion of both apps (UberX emulated Lyft's model). Lyft didn't get regulatory approval either. Both have been ejected from Austin and other cities.

I think the unit economics work out, but perhaps not to the level their current valuation suggests. I think Lyft could have been successful if it had not been outspent/out-VC-raised by Uber. Lyft was the original innovator of the true ride sharing segment. Uber at the time was a broker/dispatcher for commercial limousines.


> I think the unit economics work out, but perhaps not to the level their current valuation suggests.

Yes. I agree. This is the qualification that was needed as to not make an overstatement.


Given your assessment, at least take Lyft instead.

If I have a choice, I don't take Uber.


I have both apps installed. I use the one that offers me the significantly better rate.

Edit: Not just better rate, but better economic advantage. Money + Time. I'm not too rigorous about the precise trade off, but ideally, I will act as to maximize my utility frontier.


If you evaluate Uber like that, why do you take Uber rides?


I'm an opportunist, not an idealist. I will gladly take subsidized rides while VCs continue to cover my fare. Uber is consistently cheaper for me than Lyft. If pricing were equal, I would choose Lyft without hesitation. If I knew with confidence that the probability of dying in an Uber is greater than in a Lyft, I might still take the Uber if the price is attractive enough. But I can't possibly know, so my null hypothesis about the probability of dying in either is zero, which, intuitively, is not the case. Then again, I don't put money into my 401k worrying about the United States going bankrupt, either.

There is a difference between social / moral value judgments and economic value judgments. For me, I commented on the social value judgments, but act on the economic one, unless of course the economic one is immoral. There is nothing immoral in me taking an Uber ride. I said "I'm not saying Uber or any private company MUST care," because to suggest otherwise is to say Uber HAS to act according to some social value, rather than to its economic advantage.

Edit: I'll take a ride in Uber's self driving car, too. As long as it's "subsidized" at some attractive enough price. By that, I mean they pay me. What that threshold price is, I don't know, but I know it exists.

Edit 2: I love eating McDonald's. I will not boycott and refrain from eating chicken McNuggets, even though I know their workers are being paid minimum wage. These are macroeconomic issues that I, cynically, lack the influence to affect. However, at a micro level, I give tips to employees at fast-food restaurants. I gave a $20 cash tip to this particular Uber driver, too. That's against Uber rules. That's also against what I just claimed about being economically maximizing. I don't consider the tip part of my expense for some service/good, but rather just helping people out in general.


> I would never spend $300 bucks on a custom painted figurine

Not for yourself. But there are many people who love to have portraits of themselves on the walls, especially if the artist has a knack for that business ...

(This 'selfie' thing didn't come out of nowhere at all, to think of it ...)


> I recently read an article about the unit economics of Uber. It doesn't work out. Uber, in it's current form is based on leveraging free VC money to subsidize and incentivize somewhat artificial demand.

I don't buy that argument. By that logic, no taxi service could ever work out. It might not work out at the current price levels, but once they've squashed all competitors, they're free to raise their prices (similar to what Amazon does).


And that's supposedly a good thing?

Maybe regular taxis aren't that bad after all...


It's not that hard to set up a business like Uber if you only start with one city. You don't need a global business. Taxi businesses also always worked per city. As soon as it'd be lucrative, there will be some competition in the city where it's worth most.

Sure, you can't just copy Uber's app. But building a simple app with basic functionality isn't that expensive. And people will mainly decide by price.

Their only chance to ever make back the losses is the self driving business. If it works out they could be very profitable. If it doesn't, I don't believe that they'll ever make back the investments.

Amazon is actually a good example. They also haven't really made back the investments. They're still not very profitable. Maybe they never will be, who knows. As long as enough investors believe in them they can comfortably afford this style.


> people will mainly decide by price.

They have, they are, and they will.


Can you explain how you arrived at the deduction that NO taxi service could exist?

I agree:

(1) SOME taxi service has to exist. By "work out", we mean profitable, or at least non-negative.

(2) the unit economics of Uber at the current prices are not profitable / sustainable

(3) The deca-unicorn valuation is based on Uber's revenue as a monopoly.

To ask Sam Altman's question: "If you win, will you stay the winner?"

In the case of Amazon, I can imagine a world where a retailer simultaneously satisfying (1) (profitability) and (3) (monopoly) can continue to exist. Amazon has logistical advantages and the largest catalog of products. It will be hard for a startup to dethrone a hypothetical monopoly Amazon. Hypothetical monopoly Amazon will reap monopoly profits, but demand for retail goods is not suddenly going to disappear.

Hypothetical Uber monopoly will reap monopoly profits per transaction. But will demand for taxi services remain as high as in the current environment of subsidized taxi prices? I'm sometimes able to land a 3-4 mile Uber pool ride in San Francisco for just $3. I have a feeling Uber is not making money here, and these prices are not sustainable, unless of course, someone invents Unicorn fuel * ahem *. This is comparable to public transit prices. If Uber monopoly starts charging unsubsidized and profitable rates, I will reduce my usage by an order of magnitude, if not two. Particularly in situations where public transit is a substitute good. 3 years ago, before moving to San Francisco, I lived just fine without ever using a Taxi. I used public transit to the airport, or got a lift from a friend. Being in San Francisco where subsidized Uber is so cheap, why WOULDN'T I use it? Taxi service is a price sensitive good, and I can live without taxi services 99% of the time.
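As a back-of-envelope sketch of the subsidy claim above (every per-mile figure here is a hypothetical placeholder I made up for illustration, not an actual Uber cost):

```python
# Hypothetical back-of-envelope numbers -- none of these figures come from
# Uber; they only illustrate why a $3 fare for a ~3.5 mile pooled ride
# looks unsustainable without subsidy.
fare = 3.00                   # what the rider pays for the pooled trip
miles = 3.5                   # approximate trip length
driver_cost_per_mile = 1.20   # assumed driver pay + vehicle cost, per mile

trip_cost = miles * driver_cost_per_mile   # total cost to serve the trip
subsidy = trip_cost - fare                 # money lost per trip, pre-overhead

print(f"cost: ${trip_cost:.2f}, fare: ${fare:.2f}, loss: ${subsidy:.2f}")
```

Under these assumed numbers the fare covers barely 70% of the trip's cost, before any corporate overhead; raising the fare to break even is exactly the monopoly-pricing scenario in question.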

What characteristics of Uber will allow it to remain a monopoly? Technology? Intellectual Property? Network effects? Lower costs than competitors? Autonomous vehicles? Or something else?

- Network Effects - if I'm using a taxi service, I don't really care if my friends or family use it. Network effects will definitely make me try it out, but ultimately it will be based on price.

- Intellectual Property / Technology - I don't mind making a phone call, rather than using a fancy app, if making a phone call to a cab company saves me money versus a monopoly

- Lower Costs - Will Uber's economy of scale allow them to get gasoline at a significantly lower cost than competitors? Or buy cars and maintenance at discount? Or what other cost advantage will monopoly Uber have?

- Intellectual Property / Technology - Maybe Uber's dispatching system is better than the competitors'. What's to stop Google from coming along and integrating a comparable dispatching system?

- Autonomous Vehicles - Are Uber's autonomous vehicles at Level 4 autonomy?

What's to stop a local low-tech taxi cab company from coming along and offering profitable, sustainable, and most importantly (relatively) attractive non-monopoly prices?


I'm sad that I'm saying this, but why wait for a permit when there's little to no chance of actual drawback or punishment for being reckless?



The real question is how hard is this permit to obtain? I'm also guessing the permit limits Uber from doing what they want completely.


> I'm also guessing the permit limits Uber from doing what they want completely.

Yes, it is in fact a key feature of laws and statutes that they (sometimes) keep people from just doing whatever they want.


So that's a reason to just ignore it?


Self-driving Uber car blows red signal at crosswalk in SF:

https://www.youtube.com/watch?v=_CdJ4oae8f4


Updated Uber statement:

"This incident was due to human error. This is why we believe so much in making the roads safer by building self-driving Ubers. This vehicle was not part of the pilot and was not carrying customers. The driver involved has been suspended while we continue to investigate."

https://techcrunch.com/2016/12/14/uber-looking-into-incident...


CA law says every autonomous car in testing needs to have an operator in charge, so technically all incidents are due to human error.

Also: Uber is not listed in the CA DMV list of companies allowed to test autonomous vehicles? https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testi...


They seem to claim that it's assistive technology, not self-driving and therefore not required to register as autonomous: http://arstechnica.com/cars/2016/12/uber-tests-self-driving-...


Funny, because they say "self driving" on their own website:

https://newsroom.uber.com/san-francisco-your-self-driving-ub...


Sounds like a Tesla PR play: call it Autopilot, except when the context is liability.


The airplane technology that Tesla's Autopilot is named after is also an assistive technology, not a replacement for pilots. The complaint people have around the name is that people don't understand what autopilot is, not that Tesla are equivocating.


>people don't understand what autopilot is

Much effort is expended on branding and naming. This would, at a bare minimum, include a basic consideration of how a word is commonly perceived, irrespective of whether the common perception is accurate.

Tesla is fully aware of the cachet that the name confers upon the brand, while displaying it prominently and relegating the absolution and disclaimers to the finer print. I like Tesla. I admire Musk's acumen and vision. But, here, they are certainly equivocating and it's irresponsible.


Language evolves. If the vast majority of people understand by Autopilot the capability of driving alone without assistance, and you know it, I don't care that the proper textbook definition of that technology refers to assisted driving.

You are being disingenuous at best.


I agree that relying on archaic definitions would be sneaky, but I'm not talking about "the proper textbook definition," I'm talking about how it's used in the real world by people who actually create and use autopilot systems in airplanes. Just because some people use "quantum" to mean "huge" (e.g. "It's a quantum leap forward") doesn't mean somebody is being disingenuous to use it in the sense of "a minimal amount."


Yeah, well, they say they're not a taxi service, too.


Unsure if the DMV requires a human operator, since Uber's argument for not needing the DMV's permission is that there is a human operator in the car, and therefore it's not a self-driving car.


Uber Marketing: "self driving car!"

Uber Legal: "assisted driving"

Pick one.


I find it telling that they do not actually say whether the car was in autonomous mode when it ran the red light, only that it was "due to human error". Was the human error that they didn't assume manual control and stop the car?


I hope they wouldn't be that shortsighted. If they start suspending employees because of getting caught on camera not reacting to the car's screw up in time they're going to create a lot of distrust and encourage concealment of issues that they'll need to know about to make a safe solution.


Obviously the error was made by the human who wrote the software that drove the car... hence, human error. QED.


We can only assume it was in fact in autonomous mode.


They should release the interior video of the incident, then. It's their car and their employee (not one of their contractors) and with no passenger there's no privacy concern either.


> This incident was due to human error. This is why we believe so much in making the roads safer by building self-driving Ubers.

My new side project is a "The Aristocrats" joke, except it ends with "The Gig Economy!"


And the classic Monty Python reference: "those responsible have been sacked."


Seems like a reckless statement on a couple of fronts. For one, it's openly disrespectful to the human drivers upon whom they've relied to this point, seemingly regarding them as nuisances which must be discarded post-haste.

And, of course, it begs the question: what would their response have been if it was due to a technical glitch, which may yet happen? Their statement could easily be used against them.


> And, of course, it begs the question: what would their response have been if it was due to a technical glitch, which may yet happen?

That's not a problem at all. They'll just use "git blame" and fire the programmer whose "human error" caused the accident.


Why on earth should we believe them?


Does it matter who was driving? An Uber company car, involved in testing, ran a red light. That is a demerit to the operation overall.


If a driver is driving a car and the car knows it has a red light, it has to take control from the driver and stop the car. Over. Why else do you think we need driverless cars? Literally, to fix drivers' mistakes. It's Uber's fault; it was computer error. What else do you think Uber would say?


I mean, it is due to human error. The human error resides somewhere within Uber HQ.


Why would an employee/contractor be suspended for a minor traffic violation? Nobody was hurt. I'd rather they talk about taking steps to train the people and algorithms better.


Did you actually see the video? The car drove through the red light long (in relative terms) after the signal turned red. That an accident didn't happen is sheer luck.


Also, note the pedestrian in the crosswalk who is aware enough to see the car coming and not get hit. I (like many in San Francisco) have crossed that crosswalk many, many times, often with my kids (it leads to the Yerba Buena Gardens complex), and it's sickening to see this. I agree with the earlier commenters: if they want to maintain that this is human error, Uber should release all the data they have on this.


Yeah, and that's the crossing from Yerba Buena to SFMOMA. It's obviously designed to give pedestrians a lot of breathing room. Autonomous or not, blowing through that crossing was Very Bad.


Looks like the video was taken by a dash cam from a police car - am I reading that right?

If that's true, I wonder why the cop didn't pull the Uber over.

edit: This also leads to another question: how does a cop pull a self-driving car over? Does the car know to respond to police lights? And if it does pull over, then who does the cop talk to?


The video, published by the San Francisco Examiner, was captured by a dashcam mounted inside a vehicle operated by Luxor Cab, one of SF’s licensed cab companies.

https://techcrunch.com/2016/12/14/uber-looking-into-incident...

The car certainly should know how to respond to police lights/sirens. I was reading Google's monthly reports on their cars and they were talking about adding a pull over response to their system.


It's a taxi, actually. I'm sure they were delighted to get this video.


It was a taxi.


> And if it does pull over then who does the cop talk to?

It's going to be even funnier when the patrol car is an unmanned self-driving bot itself.


Maybe it learned a bit too much from San Francisco drivers?


If it uses Uber drivers for training, heaven help us: I’ve seen illegal U-turns, constant double parking, reversing down 1-way streets, moving right turns through red lights, sudden stops, speeding 25+ mph over the legal limit, lane changes in intersections, lots of turning without signals, lack of proper yielding to obvious pedestrians, cyclists nearly side-swiped, etc. (Some subset of) Uber drivers are one of the worst hazards on the road.


An Uber driver deliberately pushed me (a cyclist) out of the way with the hood of his car when I tried to take a picture of him parking on an expressway. I filed a police report and a complaint with Uber, but nothing came of it.


Curious, have you reported those things to uber/given appropriately bad ratings? (Uber always claims the rating system should filter unsafe actors out quickly)


That only works for misbehavior that inconveniences the rider. If it only hurts those on the other side of the windscreen, it's recklessness as a service.

Sure, every user has some threshold beyond which they would not be fine with it anymore, but the bad boy image Uber so carefully crafted over the years provides a considerable amount of self selection so that those who would care most never even install the app.


I don’t have Uber installed on my phone. I mostly walk or take the bus/subway. This is just my observations as a pedestrian.


oh, then I misunderstood. Would still be curious how passengers actually act in these cases, since "I don't want to give a bad rating" seems to be a big thing.


I give bad ratings any time a driver acts like this. I have no problem with that and if Uber wants to stop servicing me because I handed a bad driver a bad rating, I am ok with that.


Why wouldn't you want to give a bad rating for bad driving?

I almost always give 5 stars, but with the notable exception being distracted or aggressive drivers.


I don't want to criticize you in particular, but I have a problem with "always give 5 stars." Are you telling me that your average Uber ride is 100% perfect? I find that hard to believe; moreover, basic probability would make that unlikely.

Sorry for ranting; I wish more people would give realistic ratings to things instead of starting with 100% perfect and decrementing for misses. It should start in the middle, at 50%, and go up or down based on experience. That would give a much better set of ratings that mean something.


I agree with you that ratings should mean something. However, Uber requires an absolute minimum score of 3.8 for drivers to stay on the platform, and a 4.6 minimum for any area that has reasonable driver supply (eg: everywhere). This means every time you rate a driver a 4 it's effectively a statement that they should be fired.
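Taking the 4.6 cutoff in that comment at face value (the figure is the commenter's claim, not verified against Uber's actual policy), the arithmetic is stark: if every rating is either a 4 or a 5, the mean is 4 + p, where p is the share of 5-star ratings, so staying above 4.6 requires a perfect score from at least 60% of riders.

```python
def min_five_star_share(cutoff: float, low: float = 4.0) -> float:
    """Minimum fraction of 5-star ratings needed to hold a mean of `cutoff`,
    assuming every non-5 rating equals `low`.

    mean = 5*p + low*(1 - p)  =>  p = (cutoff - low) / (5 - low)
    """
    return (cutoff - low) / (5.0 - low)

# With a 4.6 cutoff and "good" rides rated 4, 60% of riders must give 5 stars.
print(round(min_five_star_share(4.6), 2))   # 0.6
# At the claimed 3.8 floor, any mix of 4s and 5s clears the bar (share <= 0).
print(min_five_star_share(3.8) <= 0)        # True
```

In other words, under these assumptions a 4 is not a "good" rating in the platform's eyes; it is most of the way to a firing.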


> However, Uber requires an absolute minimum score of 3.8 for drivers to stay on the platform, and a 4.6 minimum for any area that has reasonable driver supply (eg: everywhere). This means every time you rate a driver a 4 it's effectively a statement that they should be fired.

No, it's a statement that the quality is what the customer would rate as a 4 on a 5-point scale. If Uber's expectations of ratings are unreasonable based on the way its customer base assigns ratings, then it is Uber, not customers, who should adjust.


Whether Uber or any other company does this doesn't change my premise. Uber, just like others, is further baking in the problem and encouraging people to treat 100% as the status quo, which means we have nothing to look forward to. I enjoy being pleasantly surprised when service is above average.

It's the same problem the game industry has with reviews. If I were to market a game as "so-and-so gave it 7/10," no one would care, but if I say 9/10, 9.9/10, or better yet 10/10, then we have a marketing headline.

I wish more people would understand this.


The correct answer is to stop using Uber if you don't like their rating system, not to risk people's livelihoods because it doesn't fit your mental model of how the world should work.


We've traded one set of problems (with taxis) for another set of problems.


problems before: dirty taxis, disrespectful drivers, limited availability in some neighborhoods, often opaque and expensive pricing

problems now: we don't have enough flexibility to give 4 stars for a good but not perfect ride


Uber needs to replace its five-star system with a binary rating system: "Good enough" or "Deserves to be fired." I mean, that is basically what they have today, but they really should make it clearer.


If a driver's average rating drops below around 4.5 they get kicked off. By design you can't give a driver less than 5 unless you want them to lose their job.


Why do you phrase this in a way that implies it is my problem? That is a mechanism Uber created because of the problem I originally called out.

If we didn't have this rating problem, the 4.5 threshold would probably be much closer to 50%.


It's your problem because of the knowledge you now have. 1-3 stars are masturbatory bullcrap; the actual rating system is binary.

It's a similar problem to the one many professors face in schools with grade inflation. You are not holding any system accountable by using your own sensible scale instead of the one that matters; you're just punishing the student.


> Are you telling me that your average Uber ride is 100% perfect?

If you think of the service as a means of conveniently getting to a given destination and not as a performance in artfully begging for a good rating, then 100% should not be too unlikely.

Once you start giving extra points for nonessential extras, a driver who is polite and always on time will get a lower average rating than one who is 20% unreliable but hands out warm towels and free cake.


I am not expecting a performance, but I do expect to be picked up where I dropped the pin, I expect the driver to follow traffic laws, and I expect the driver not to make abrupt turns because of traffic. So yes, most rides are not perfect. I don't know what city you use Uber in, but in the ones I do, the drivers are far from skilled. Frankly, this is one reason I feel safer in a taxi: in my experience the driver is much more skilled at the job of driving.

The Uber driver who just moved to the city and doesn't know his way around without Google Maps is going to have a difficult time in major metropolitan areas.


Because the driver will retaliate by giving you a bad rating.


Does the driver see individual ratings? In most pools the driver immediately rates the passenger, anyway.


If the behavior resulted in the passenger getting to where they had to be on time, as opposed to 15 minutes late, then I'd assume they'd reward the driver for this behavior.

In fact it would be interesting to see how ride times compare between driven and driver-less Ubers. I suspect that initial results will show that driverless cars are slower and that this might make some people prefer human drivers.


If you're not in the car, how can you possibly know whether it's Uber or not?


There's a sticker on every Uber, isn't there?

http://www.centredaily.com/news/local/article64550272.html


I've never seen one with a sticker. Don't know why that'd vary from country to country but I guess it does.


That, and 90+% of the time it's a Toyota Prius.


In some places Uber cars have stickers on their windshield


All the others are bad, though a right turn on red is legal after a stop.


Left turn on red is legal as well in CA wherever two one-way streets intersect.


And then people laugh at me for taking the train. ¯\_(ツ)_/¯


Been watching too many Muni buses.


Nice video, but the evidence doesn't make clear whether that was human or computer error. It would be nice to make that distinction somewhere.


There is a red light. The car, which is being driven by a computer, did not stop. That is a computer error.


The Uber self-driving cars have a human in the driver's seat that is able to take control of the vehicle. It's not clear whether the computer or the human was in control when the car blew the red light.


Some variation of that comment has been oft-repeated on this thread and it's exactly what Uber wants.

One of the main functions of that human is to accept blame (or at least obfuscate culpability) and absolve the tech in the event of an accident.

There are a ton of scenarios wherein it would be virtually impossible for a human--especially one lulled into inaction and passivity for most of the ride--to intervene quickly enough to avert an accident once it became apparent that the tech was failing.


And you're absolving the legally responsible human, so you can blame the implied-Evil corporation, because... why?

Humans who are lulled into passivity by highway driving don't get to escape the blame if they crash because they couldn't respond quickly enough if the situation changes suddenly, regardless of 'tech failing'.

If you have SatNav and it says 'drive down here' and you do, and it's a one-way street, you don't say "that's exactly what Garmin want, a human to absolve them when their tech fails". Because the driver is in the position of responsibility, the tech is not.


>so you can blame the implied-Evil corporation, because... why?

Because the "implied-Evil corporation" offers technology which claims to be capable of doing something, entreats you to confer upon it some degree of trust, then holds you responsible when it fails. Because lives can be lost as a result of such failures and the company's stance represents a cynical and cavalier regard for this fact. Because, they want the cachet (and the cash) without the responsibility.

>Humans who are lulled into passivity by highway driving

That analogy isn't even close to holding. In that case, you are understood to be in control of the vehicle. In fact, you are actively giving minor corrective inputs at all times. It's nowhere near the same as the car saying, "here, let me do that for you", but you are supposed to sit at the ready, hands hovering slightly over the wheel and foot over the brake, ever scanning and ready to take over? That's asinine, and it's a flawed model. At a minimum, your reaction time will be slowed as you must first recognize that the car is not responding properly in, perhaps, a split second when it suddenly does something unexpected.

Look, it's simple: the car needs to either be fully ready and safe, or it's not. This have-it-both-ways Tesla-invented PR spin that says "our tech is statistically safer, but it's your fault for not correcting it even if it does screw up" is a pernicious subterfuge and I personally don't like how it defines our relationship with technology.


A car is not 100% safe even with a human driver. If the tech makes fewer mistakes than a human, then the tech is better. It's your responsibility to choose what is better: you, the tech, or a taxi driver.

IMHO, every self-driving car must be equipped with a black box to capture all information in case of an accident, and there should be a special commission to investigate and to make rules to follow, so self-driving cars will be safer for others, and that is all. A self-driving system manufacturer should be punished only if these rules are not followed, by suspending its license to manufacture self-driving systems. Currently, we have almost no such rules at all.

IMHO, the first rule for self-driving cars, until they are mature, must be better communication between computer and human: as the driver, I want to see how the computer sees the road and obstacles, its planned actions, and the alternative actions I could pick. I want the computer to project its map onto the windshield and present me with three buttons: a big red «stop» button, a left button, and a right button, so I can see the road situation and the computer's understanding of it, and correct the computer or make a full stop.


Because the "implied-Evil corporation" offers technology which claims to be capable of doing something

Are you suggesting that the 'something' it claims to be capable of doing is, explicitly, "driving 100% without human intervention"?

because you seem to be pushing "it claims to be able to drive without human intervention and cannot", which would be misleading and probably grounds for lawsuits about fitness for purpose.

but ... I don't think that's what it claims to be able to do.

> In that case, you are understood to be in control of the vehicle. In fact, you are actively giving minor corrective inputs at all times. It's nowhere near the same as the car saying, "here, let me do that for you", but, you are supposed to sit at the ready, hands hovering slightly over the wheel and foot over the brake, ever scanning and ready to take over? That's asinine, and it's a flawed model.

You, as the human driver, are responsible for control of the vehicle. If you find the terms of some assistive device 'asinine' because you won't be able to be in control, then do not figuratively click 'I agree to the terms and conditions'.

You, the responsible driver, would never believe a child telling you they were capable of driving safely, it would be dumb, you know it's not true - and you're not legally allowed to hand over responsibility for vehicle control to them anyway.

Like you're not allowed to hand over responsibility to a piece of software, no matter how enticing or convincing the marketing blurb seems. And if you do hand over control to it, and then fail to take back control and an accident happens, you should be responsible.

> Look, it's simple: the car needs to either be fully ready and safe, or it's not.

Look, it's black-and-white thinking which doesn't reflect reality. But if you want to force it to binary, then it always comes out false. The car is never 100% ready, you never hand over control, problem avoided, case closed.

Saying "it has to be 100% ready, I don't believe it's 100% ready, but I'm still going to hand over control, even though I think what I'm doing is asinine because I can't respond quickly enough - and then it crashes which is totally not my fault because I knew I couldn't take over in time!" is have-cake-and-eat-it nonsense logic.


>Are you suggesting that the 'something' it claims to be capable of doing is, explicitly, "driving 100% without human intervention"?

That's exactly what it's attempting to do at a given point in time. It's just disclaiming fitness and holding the human driver responsible for ensuring that it doesn't screw up.

>find the terms of some assistive device

You've been had. You're buying into the alternative terminology scheme that oscillates between disclaimer words like "assistive" and sexy PR words like "Autopilot". And, the reality is that the car is not "assisting". It is taking full control from the driver and completely operating the vehicle for significant intervals, leaving the driver only to monitor it. I don't know how to make it any simpler.

Beyond that, it's no secret that these companies are trying to achieve full autonomy, and they are already making claims that they are safer than humans, even denigrating human drivers in the process. But, they want those same fallible human drivers to be responsible for taking over when their tech fails. That's what's asinine.

>You, the responsible driver, would never believe a child

>you're not allowed to hand over responsibility to a piece of software

Huh? I'm not sure if you're completely missing the point on purpose, but there's really not much more I can add.


The video doesn't show whether it was a person or a computer driving. And Uber claims it was human error, though I'm not inclined to take that as proof.

https://techcrunch.com/2016/12/14/uber-looking-into-incident...


> which is being driven by a computer

The point is that we don't know that. It may have been under driver control at the time.

There's probably a good argument for having some sort of external indicator of whether or not a self-driving car is under computer control or human control. Maybe a small roof mounted beacon that has to be lit up or something.


But do we know whether it was under computer control at the time?



Uber released a statement saying that it was human error.

https://techcrunch.com/2016/12/14/uber-looking-into-incident...


There should be special fines for self-driving cars making errors. It would force companies to be more careful and lead to better engineering. Also, competing manufacturers could "turn in" other manufacturers' cars that misbehave to the police.


Also, any changes to the software run on self-driving cars should be stored in an "escrow" service for at least X months before it is uploaded (preferably by an independent party) to the cars, to allow for sufficient testing.

We don't want to end up in the situation where car manufacturers make quick updates to cover flaws.

If the software is broken, the cars should be grounded, the software should be fixed and tested for at least X hours of driving.


Whoops.


This is really worrying for Uber. It's supposedly burning 2 billion dollars a year. It doesn't own any fleet of vehicles or employ its drivers. Nor are there sustainable cost savings that don't involve lighting a pile of cash on fire. It desperately needs driverless cars before Tesla and Google kill its business off.

Now Uber has finally hit a regulatory wall which it won't be able to pay off. It's likely to hit more such obstacles until it realizes it's not really in the taxi business but in the business of acquiring market share with low-interest-rate capital.

We won't see an Uber IPO anytime soon.


It's not clear that Uber has hit a regulatory wall. The DMV wrote them a stern note, yes. Uber has received various scary-sounding threats of government sanction and has overcome them in the past (for example, at one point a German judge threatened to fine them 250k euros per car per day). They have often been successful in arguing their case or in going above or around uncooperative regulators.

The California state government is not a monolith, and Uber is an experienced lobbyist and negotiator. It's certainly possible that the DMV will prevail here, but not a certainty. And even if the DMV does prevail, Uber can get the appropriate permits and then resume, or else just pilot the program in one of the 49 other states.


Exactly. Uber is likely already following all of the permit requirements ($5 million insurance and experienced drivers), so they just need to apply for one. That assumes they don't challenge it in court based on the bill's ambiguous language about what constitutes an autonomous vehicle, which sounds like their plan. They've demonstrated themselves to not be afraid of some bad press or the courtroom.

Hardly a major roadblock at the pre-market testing stage for such a big company already operating in Pittsburgh. They could easily shift testing to other, more welcoming states and countries in the meantime.


>Now Uber has finally hit a regulatory wall which it won't be able to pay off.

I wouldn't be surprised if they just can say "sorry, misunderstanding", apply for a permit like everybody else and continue with a bit more oversight.


That's what will happen, but it's fashionable to predict Uber's imminent demise.


Well, and look at all the free press they got out of this.


Michigan just passed a set of very pro-self-driving cars laws: http://fortune.com/2016/12/09/michigan-self-driving-cars/

So, Uber only hit the wall in SF, and can still continue testing elsewhere.

That said, I don't like the aggressive gambling with pedestrians' lives that Uber and Tesla do.


>That said, I don't like the aggressive gambling with pedestrians' lives that Uber and Tesla do.

We gamble with pedestrians' lives every time we get in a car.

No one has seen how these vehicles perform in real world scenarios. It seems very possible they could perform better than humans on average.


My guess is that the levels of these systems will differ. Some systems will be better than humans (on average), and some systems will be worse. The gamble I mention is that Uber and Tesla are putting too many machines on the road at once without having good priors for their current level of competence.


> The gamble I mention is that Uber and Tesla are putting too many machines on the road at once without having good priors for their current level of competence.

And from whom do you expect this data to come? How many is too many? Why?

As far as I'm concerned, as long as a human can take control at a second's notice, the risk here is very much overblown.

I haven't seen an instance where a company wasn't taking reasonable precautions and was acting foolishly. I'd love to be proven otherwise...

There's a reason why companies waited until 2016 to start putting these on public roads. And even then they are usually limited like Teslas. They are taking it slowly like responsible engineers...

Now that full data capture and machine learning is being added pretty much every ride is contributing to the quality of these systems.


The problematic part for me was Tesla bragging about the safety of Autopilot before they had enough significant data. Basically saying, "We still don't have any fatalities, so hell yeah, we must be safer than a human!"

The sad part for skeptics like me is that we couldn't really oppose this kindergarten argumentation. We had to wait for the first fatality (which happened, as you all must know).

So now, if I follow Tesla's own logic, I'm only one death away from saying that Tesla Autopilot is far worse than the average human driver...

And this actually is gambling with human life.

PS: Notably, if these gamblers are wrong, this could easily backfire on the whole autonomous driving industry.
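
The small-sample problem can be made concrete with the "rule of three": observe zero events over an exposure N, and the one-sided 95% upper confidence bound on the rate is roughly 3/N. A sketch, using the ~130M Autopilot miles Tesla cited around the first fatality and the oft-quoted ~94M miles per US road fatality; both figures are assumptions for illustration, not audited numbers:

```python
def poisson_upper_95(events, miles):
    # One-sided 95% Poisson upper bound on fatalities per 100M miles.
    # Exact values of 0.5 * chi2.ppf(0.95, 2*(k+1)), tabulated here to
    # avoid a scipy dependency; k=0 gives the "rule of three" (~3/N).
    upper = {0: 2.996, 1: 4.744, 2: 6.296}
    return upper[events] / miles * 1e8

autopilot_miles = 130e6    # assumed exposure
human_rate = 1e8 / 94e6    # ~1.06 fatalities per 100M miles

# Even with zero fatalities, the data couldn't rule out a rate more
# than double the human baseline:
print(poisson_upper_95(0, autopilot_miles))  # ~2.3 per 100M miles
```

In other words, the "no fatalities yet" period was statistically consistent with Autopilot being materially worse than the human baseline, which is exactly the skeptic's complaint above.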


> And from whom do you expect this data will come from?

If they have that much money, they can build their own test city and populate it with volunteers. I'll be thrilled to see Travis Kalanick walk over the mock street 100x per day while the cars are being tested.

> As far as I'm concerned as long as a human can take control at a seconds notice, the risk here is very much overblown.

That worked out splendidly for Air France flight 447 and many others.


https://www.cnet.com/roadshow/news/trump-advisers-elon-musk-...

Uber has the ear of the president-elect. That's worth quite a lot in terms of being able to ignore regulations.


I recommend reading the actual letter from the DMV. It's impressively well-written, balanced, and thoughtful about autonomous technology. I was not expecting a response like that from the government.


As an outside observer, California seems to be dealing with the whole self-driving car matter really sensibly.


Link?

Don't even know what DMV is.


It's embedded in the article.


Department of Motor Vehicles.


The only thing that might surpass Google's inability to execute on their autonomous cars is Uber's congenital need to break the law for no reason at all.


Uber has the garages for both Otto and these cars on Harrison, between 3rd and 4th streets, on the side of the lanes headed to I-80W/101S. When they need to park either, particularly the semi-trucks, they get people in vests to come out, stop pretty much all lanes of traffic, and slowly park their delicate vehicles... Glad to see this story.


Uber tries to break the law to get a competitive advantage yet again. Innovation at its finest.


Can anyone tell me if these things are smart enough to slow down enough through a puddle to avoid splashing a pedestrian?


Does Tesla have this DMV permit for their autopilot feature?


Why would they need it, except for testing their feature on public roads in vehicles operated by Tesla themselves?


Uber is a company built on ignoring laws. Everything they did in the beginning was against some form of taxi or car hire regulation, in the name of disruption. (Regardless of what you think of the laws in question, they were still being broken in the eyes of most regulators)

I don't know how I feel about that type of corporate philosophy (or Facebook's "move fast break things") in light of projects where people can actually get killed.


Uber exemplifies the libertarian / anarchist subculture in tech. It seems to be a winning strategy for them so far.

Edit: to clarify my position. This is why software will never be regarded as an engineering profession. There is a complete disregard for everything that would make us a profession. I'm saddened with the way the world has gone in 2016. It's like we rebelled against adulthood on a global scale.


It's not 100% clear from the article that the permit applied to Uber. And the incident in question of a car running a red light was apparently being driven by a human.

> "This vehicle was not part of the pilot and was not carrying customers. The driver involved has been suspended while we continue to investigate."

If their self-driving tech is still driver-assisted, then shouldn't Tesla cars also be ordered to stop driving? Or did Tesla also require a permit?

From the DMV legislation:

> (a) “Autonomous mode” means an autonomous vehicle, as defined by this article, that is operated or driven without active physical control by a natural person sitting in the vehicle’s driver’s seat. An autonomous vehicle is operating or driving in autonomous mode when it is operated or driven with the autonomous technology engaged.

To me that sounds like it would cover what Tesla and Uber are doing if the driver is entirely hands-off. But this part is less clear:

> (b)(d) “Autonomous vehicle” means any vehicle equipped with technology that has the capability of operating or driving the vehicle without the active physical control or monitoring of a natural person, whether or not the technology is engaged, excluding vehicles equipped with one or more systems that enhance safety or provide driver assistance but are not capable of driving or operating the vehicle without the active physical control or monitoring of a natural person

https://www.dmv.ca.gov/portal/wcm/connect/211897ae-c58a-4f28...


> And who is losing here exactly?

Uh, pedestrians in crosswalks, apparently. I'd also add:

1) drivers who were lied to about their potential take-home pay
2) customers who will have to eat 50% fare hikes once Uber gains monopoly power in order for them to ever become profitable
3) employees who work under a CEO who lauds the company as "Boober" in honor of all the fringe benefits he gets for running it
4) investors who will lose value once the company finally implodes


Isn't it a fun fact that throughout history (barring the morally exceptional[0]), each human only ever chose to do things that were in their own best interest? Yet, for as long as humans have existed, we have seen a consistent improvement[1] in the average human's life.

[0] Funny thing about the morally exceptional: I can't think of any which continue to improve our current day-to-day lives.

[1] Improvement is subjective. But objectively (compared to centuries past), our housing situation has gotten better (more roofs, less exposure to the cold/heat), the average human doesn't worry about getting sick from bad water or getting bitten by a snake while in their bed, the average human dies less often of disease and overall dies at a later age, and the average human is able to spend more time on hobbies rather than necessities.


Regarding [0]: The entire open source industry. Voluntary fire brigades. And so on.

I'm surprised you ignore all those so easily.


As you said yourself: "open source industry"

Voluntary fire brigades, et al:

My bad on the miscommunication; I was talking about "throughout history". What I meant was: I can't think of any [morally exceptional people] who [past their own lifetimes] continue to improve our current day-to-day lives.

You are correct though that morally exceptional people do have a (slight) positive impact on my day-to-day. I just haven't seen a case where their contributions transcended time, similar to the contributions of, say, Thomas Jefferson, Alexander Bell, the person who first repeatedly sparked fire, etc.

The contributions of religious figures transcend time. But you could argue either way about their contributions being beneficial to our day-to-day. Again it would be debatable if it was in the best interest of these religious figures to spread their personal ideology. Either of these debates would be inherently biased towards your take on religion.


> The entire open source industry.

The open source industry is far from unbridled altruism.


> Voluntary fire brigades.

It's entirely rational to join a VFD out of self interest.


And yet what you do in a VFD is very often against your own self-interest - you sacrifice a lot of time, and sometimes risk your health and life, and all you get for yourself is bragging rights.


> all you get for yourself is bragging rights.

Not at all! You get the knowledge that the men and women standing beside you at an emergency will be there when it's your house/barn/grain silo that goes up.


I was talking about the autonomous vehicle permit in this particular case. I removed that part of my comment because people are mistaking what I said apparently.


Nobody, until someone gets hurt.

It's a valid political position to believe that people don't need government regulation (laws) to behave themselves, not make mistakes, and not take risks that might lead to mistakes. It's the libertarian/anarchist viewpoint, as mentioned. But that's not how the government in San Francisco, California, or the US works.

Those of us who don't hold to the libertarian or anarchist positions believe that it's a good thing for laws to prohibit risky decisions that are probably going to be totally fine, if the not-fine case is particularly bad. For instance, that's why there are food inspection departments; even if you don't believe anyone is deliberately trying to poison people, there are a set of risky behaviors (not refrigerating things promptly, not washing things, etc.) that stand a small chance of getting people sick, and we believe it's worth stopping those, even if it slows down production, and even if we can't completely eliminate people getting sick.


> It's the libertarian/anarchist viewpoint

That's about as coherent a statement as 'it's the progressive/communist viewpoint'

There is a huge gap between Zero Government anarchism and the belief that government authority should be limited so it does not grow to oppress.


Sorry, that phrasing was unclear; my intention was that this one viewpoint is shared by libertarianism and anarchism, which, yes, are two quite different philosophies as a whole. (Although you may well disagree with that rephrasing.) I was referencing the post above that said "libertarian / anarchist subculture", which I'm guessing was meant similarly.


Just to play devil's advocate here. Assuming Uber gets a permit to test autonomous vehicles in public, how much risk has been mitigated as a result, vs. not having a permit and allowing them to continue testing?

Reading the legislation, the only requirements for the permit are:

- have insurance with up to $5 million

- the driver has a drivers license and has "Instruction on the automated driving system technology", as well as defensive driving training

Those both sound like reasonable, common-sense things to have. I'd also want a company testing their vehicle to have insurance and know how to drive their car. But at the same time if this permit didn't exist, how much less safe would the world be? Really?

They're not stress testing the software, they're not analyzing the code, they aren't testing the competency of the people who built the software, etc, etc. They aren't doing any of that for a good reason. So ultimately this is mostly just theatre or an exercise in letting Uber know who the boss is.

Because I highly doubt Uber doesn't have the insurance or experienced drivers. Nor do I think there will be any significant number of tech companies wreaking havoc on California because they didn't think to buy auto insurance before starting an automobile software company, save a permit.


CA DMV also requires companies testing autonomous vehicles to report all their crashes and disconnects. Here are all those reports.[1][2] 20 companies have signed up. This is basic data collection to help decide when a system is safe for deployment.

Uber apparently tried to evade this minimal level of scrutiny. Looks like it didn't work.

[1] https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/auton... [2] https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disen...


I can think of a number of ways to collect this data without requiring a special permit. The police are in coordination with California, so incidents involving semi-autonomous vehicles can be collected using existing systems. I'm also sure there are means to shut down Uber's autonomous testing via court order.

Additionally insurance and drivers licenses are already regulated.

I'm still not convinced of the necessity. The hammer is always looking for a new nail.

The bar for expanding bureaucracy and adding highly specific oversight is so low and so commonplace that you're labeled a techno-libertarian for questioning the basic economic cost (realistic benefits vs tradeoffs) of doing so.

You may feel more comfortable knowing someone is watching over these specific companies testing process. That doesn't automatically mean the world has been made safer as a result of their time spent writing complex legislation, building regulatory systems, enforcing it, slowing down development of technology, battling cases in court, pushing companies to other states/countries, etc.

Unfortunately questioning the realistic utility of such regulation is no longer the default and is dismissed as the foolishness of some silicon valley tech people disconnected from real life...


The data collection required for autonomous vehicles includes even very minor accidents. The idea is to collect data on what failed and why before somebody gets killed. Unlike regular accident data, all reports are public. This lets us see that Google's repeat problem is being rear-ended when their cars stop while entering an intersection because they detected cross-traffic. It also tells us that Cruise Automation hit a parked car on 4th St in SF for no good reason.

Anybody testing autonomous vehicles has this data. They just have to send it in.


So do you think the public wouldn't be able to inform themselves on which service or vehicle is safer before purchasing these cars without this data? Or that Google, Uber, Tesla, etc. wouldn't already be taking these numbers very seriously and doing everything to minimize them? Or that the public wouldn't truly know the real safety of these vehicles as a whole vs. human drivers... without this mandatory test vehicle permit?


> Or that Google, Uber, Tesla, etc wouldn't already be taking these numbers very seriously and doing everything to minimize them?

Given that we know existing car manufacturers have faked or covered up safety data in the past, you're putting a lot of faith in the goodness of tech companies.


The organization that detected VW was faking the data wasn't actually the government (the California Air Resources Board in this case). It was a privately funded organization... funded in part by technology billionaires:

https://en.wikipedia.org/wiki/International_Council_on_Clean...

...I'm the one questioning the utility of California's system of control. My faith in either actor's earnest participation is therefore not of much use.

I do know customers value safety and honesty of companies they buy/use cars from, and so do the investors who back these companies. Which are the people who keep these companies in business. That pressure exists independently of these permits. Which is why one must realistically question the utility of special government oversight.


I'm definitely a fan of the thesis that government should be more competent, although I don't think that's where you were going with that. :) I'd love to see government regulations that demand that you should stress-test the software, hire someone competent to analyze the code, get licensed engineers to be involved in design and development-process review, etc.

The difference between your devil's-advocate worldview and my worldview is I think not in our perception of facts - we both agree that the government isn't super competent right now - but in our optimism. You (or your viewpoint) aren't optimistic that government regulation will ever get good, and would rather just stop trying, but are pretty confident that individual humans and companies are unlikely to be abusive, or to cause much damage if they are. I'm resigned to expecting that someone, somewhere, will try to get away with the bare minimum required, and in the absence of regulation will just be too stupid to realize they're risking people's lives until people die. But I do have higher expectations of government regulation, and believe that it can be good or at least we can work towards it being good.

(Also, government regulation doesn't have to involve the government doing things. Several industries have so-called "self-regulatory organizations", which mostly exist because of the implicit threat that the government could start regulating more heavily. That is, the SROs only exist because of government letting industry know who the boss is, so there's a distinct positive effect of that posture! The SROs aren't perfect, but to first order everyone is better off: the government isn't interfering with things they don't understand, yet actual, competent oversight with meaningful enforcement powers exists. Self-driving cars are too small an industry at present to meaningfully self-regulate, but it certainly sounds like they won't be very shortly.)


Knowing how to drive and having insurance both sound like reasonable things for people to do, but we still mandate driver licensing and insurance.


It's a gross misrepresentation to claim that ignoring (i.e., breaking) laws is considered acceptable by [generally recognized] libertarian philosophies. Even many anarchist philosophies would reject that, although not all.

Perhaps the concept you're looking for is "anti-authoritarian", although it's entirely possible to be anti-authoritarian without being willing to break laws as one sees fit.


I don't think Uber exemplifies any sort of anarchist culture except perhaps anarchocapitalism which is an instance of the red herring principle [1].

https://ncatlab.org/nlab/show/red+herring+principle


So I have an off-topic question.

I originally got into tech because I hate my life and thought that tech was this outlet that could actually make some change. I come from a world of violence and abuse and never really cared for the powers that be. Tech seemed like the answer.

So my question to the people who don't feel this way: why are you in tech? Serious question. Tech only seems interesting because you can move fast. What value do you find from working in this field if you don't feel this way? If you have to work inside some system?

This comes from a person who lived in an abusive household most of my life and still struggling to find something outside of QA while I work on my own projects, but I don't really get the appeal of this field if all you want is a house with kids and work within the system. It comes off as mediocre to me. So as a person reaching almost 30 who hates life the way it is and wants to see a change, please explain to me what you get out of this.

Thanks for those who answer this seriously.


I switched from Marketing to Computer Science because I got tired of writing multi-page essays based on a few sentences out of a textbook, and I wanted to be in a field of study where I was judged by what I could or couldn't do rather than by how many buzzwords I could fit in an essay.

But there's plenty of other reasons to be in tech: The promise of money is almost as good as the finance industry; The ability to touch a lot of lives and improve people's quality of life is another possibility; The desire to understand technology as it exponentially permeates our lives was another aspect of my decision.


I enjoy working as a software engineer because I have fun building new things, because I find the science behind it intellectually stimulating, and because I feel the work our industry is doing is building the foundations for the future.

Speed is important, but isn't the biggest priority to me. It's best when things move quickly, of course... but my first priority as a professional engineer is to do things correctly, ethically, and safely. It's a point of personal pride.

I don't think there's any part of that that's in conflict with changing the world for the better. Move fast and break things; just don't break other people in the process.

Of course, there's a big difference in what's at stake when building a website, versus someone building a rocket / self-driving car / nuclear reactor / life-critical system. The engineering practices need to be adjusted accordingly in each case.


I'm in tech because I'm an idealist and believe very seriously that, done right, technology can change people's lives for the better. For the record, I don't think I could be convinced work at Uber.


Sorry about your experiences.

Purely anecdotal but I enjoy tech because I enjoy building things and the rabbit hole that is technology is near unendingly deep.

The salaries are great, benefits are great, paid time off is great. It's hard to complain.


I'm not a fan of the "system". But I also quite firmly don't believe that change in any direction, just for the sake of change, is good. There are a ton of things the system does wrong. There are also many things the system does right.

There's a lot of interesting tech where you don't run afoul of regulation but you still massively change lives for the better. Take cell phones - that's been a huge change in the world, and at no point did anyone need to break laws or skirt regulations to make it happen. (I'm not even sure how they could; the only thing that comes to mind is using disallowed frequencies, and even that would have been so much more harmless than anything involving cars.) But they've made an immeasurably huge change in the world, for the better.

Or take the Internet. Or take Wikipedia. Or Twitter. Or mobile camera technology. Or e-commerce. Or the technologies required to run large research clusters to develop new medications. None of these things are things where the system is fundamentally opposed to what you're doing. There may be regulatory fights, sure; someone will be worried about e-commerce and taxes, or encryption and law enforcement, or whatever. You're moving fast, and the system is confused; yes, fight the system.

But you're not calling the very concept of the rule of law into question. That's where I disagree with the libertarian/anarchist technocratic movement: the fact that you sometimes disagree with the law doesn't mean you have to consider it illegitimate.


Terry Pratchett wrote in the foreword to his books that he became an author because it was indoor work with no heavy lifting.

Similarly, I got into tech because I was a smart kid who was good with computers, so it's a well-paying field that I find easy and enjoyable. I'm also someone who's always been very much "within the system" and therefore try to be aware of my privileges. Don't underestimate the value of "house with kids and work within the system". Plenty of refugees are dying on beaches for the possibility of a chance of maybe having that. Even within the comfortable West, it looks a lot more attractive when you're closer to 40 than 30.

Mind you, the reason I didn't go into CS research when I might have had the opportunity was because it was very obviously so much of a dead end; at least with startups there's a chance of people using your product.


I've thought a lot about this and can only conclude that the only reason that drives me (when I could be in other socially and financially better jobs) is the promise of striking gold with some successful B2C or B2B product, be it VR or CRUD apps. Actual CS engineering is interesting but not worth making as many sacrifices for.


> Uber exemplifies the libertarian / anarchist subculture in tech.

You are misusing those terms, as belovedeagle pointed out.

> ... will never be regarded as an engineering profession

> It's like we rebelled against adulthood

Who is "us"? Many, if not most, software engineers would welcome being able to refuse doing anything morally questionable. Many are unhappy about companies disregard for safety, security, privacy, respect for users.

Yet, most are powerless in a world of at-will employment, H1B shackles, NDAs, and lack of professional organizations to protect those who disagree with their employer on ethical grounds.


And for us! Uber pool lowers prices, surge pricing increases availability. Uber would not let passengers ride in an autonomous deathtrap -- the negative publicity would be disastrous to their goals. If this is our libertarian dystopia, I welcome it with open arms.


Does surge pricing actually increase availability? There's no public evidence for it.


[flagged]


Please don't post unsubstantive comments here, and please don't call names in comments: https://news.ycombinator.com/newsguidelines.html.


> to clarify my position. This is why software will never be regarded as an engineering profession.

who cares?


> in light of projects where people can actually get killed.

Compared to what? Most states give out licenses to 16 year old kids with a few hours of driving school. Once you get a license you never get retested for vision or driving ability. There are very dangerous elderly drivers on the road and car accidents are the leading cause of death of 1-44 year olds.

Some allowable risk has to be accepted in order to have progress and a thriving society.


I'm going to talk about my state's perspective because what you're saying sounds like madness.

> 16 year old kids with a few hours of driving school

The standard curriculum for 16 year old drivers is 12-16 hours of in class lecturing, 5 hours of in-car instruction with a licensed professional, a minimum of 50 hours of in-car instruction from a guardian or family member, a nuanced written test, and an hour in-car examination.

That's just to get a restricted license. Until you get your real license 6 months later you have to additionally obey the following:

* No passengers except for your guardian, parent, or instructor.

* No driving after 11 PM.

* Absolutely no cell-phone use (even hands free).

* Any ticketable violation has a mandatory court appearance and will trigger a second round of more difficult driving instruction.

> Once you get a license you never get retested for vision or driving ability.

I have terrible vision and I have had to go through the test three times; after the third time I am now required to wear my glasses to drive. It happens every time you have to renew your license.

> car accidents are the leading cause of death of 1-44 year olds

That's a bit misleading, there isn't much else that kills young people.

My point is that in situations where there is that kind of risk involved, we actually go through a lot of effort to mitigate it.


That's a lot of regulations. I would think that teenage drivers are very safe.

> In 2014, 2,270 teens in the United States ages 16–19 were killed and 221,313 were treated in emergency departments for injuries suffered in motor vehicle crashes. That means that six teens ages 16–19 died every day from motor vehicle injuries. In 2013, young people ages 15-19 represented only 7% of the U.S. population. However, they accounted for 11% ($10 billion) of the total costs of motor vehicle injuries. The risk of motor vehicle crashes is higher among 16-19-year-olds than among any other age group. In fact, per mile driven, teen drivers ages 16 to 19 are nearly three times more likely than drivers aged 20 and older to be in a fatal crash.

> Of the teens (aged 16-19) who died in passenger vehicle crashes in 2014, approximately 53% were not wearing a seat belt at the time of the crash. Research shows that seat belts reduce serious crash-related injuries and deaths by about half

Perhaps someone can pass a law mandating seatbelt use!

[0] https://www.cdc.gov/motorvehiclesafety/teen_drivers/teendriv...

Edit: it's also misleading to suggest that the driver must have a minimum of 50 hours of driving experience. While technically true in NY State, all the applicant has to do is provide a completed Certification of Supervised Driving (MV-262) signed by a parent or guardian, and that's only for teens under 18 years old.


Those teens are driving in all sorts of conditions that autonomous vehicles can't even touch right now.


> Once you get a license you never get retested for vision or driving ability

If you are in California, you will occasionally be required to go to an office in-person for DL renewals where your vision will be tested as part of the process. Many insurance companies also incentivize older drivers taking "mature driver improvement" courses every few years as a refresher. Then of course there's things like traffic school if you are driving and receive a moving traffic violation. This seems straightforward...

Also, I might note this because I went through this process with my father... when your vision is bad enough to fail the DMV's criteria of 20/40 corrected vision with either or both eyes, they have a whole load of things they make you do. With my father he could theoretically hit the 20/40 vision requirement but made mistakes due to scarring from surgeries that blurred vision in only certain areas. He had to get documentation from his doctor as to what kind of driving he could potentially do (e.g. no night driving), got additional mirrors installed in the car, went through a special drive test (after he did his own refresher driving class voluntarily), and had a whole load of restrictions added to his already limited term license (e.g. he had to get a new license after only a couple of years, and every renewal would require a drive test). In the end, it was such a hassle despite zero functional issues driving that he ended up giving up his DL at the last renewal. If his vision had declined significantly from there, DMV would have immediately revoked the license and suspended his ability to receive a new one.

So, sure, seniors and teens alike cause a lot of accidents, but we're not exactly lacking on trying to reduce problems.


Vision tests depend on the state. Colorado and Texas both require vision tests every X years (depends on age in some cases).


What? There were self-driving vehicles in San Francisco? I'll spend the next hour of my life getting up to speed with this. Good stuff!


'Disrupt' is fine, but not when you're potentially putting lives at risk.


>"Safety is our top priority"

Do they want me to die from laughing?


Can't disrupt these days without breaking some laws.


If these people actually gave a damn about the environment they would embrace self-driving cars. One-car-per-person is ludicrously wasteful.


Uber just doesn't get the whole concept of regulations do they?

I understand violating ones that are "dumb" and hurt your business (e.g. taxi medallions), or ones where it's easier to just do it and figure out the results later (e.g. insurance), but this is literally "get the permit first".

They should process this nice and sloooooooooooow.


I don't know about regulations, but Uber has made traveling around my city a cheap and pleasurable experience for me, which was certainly not the case before with the so called "legal and regulated taxis", constantly offering shit service and outright scamming people. When it seems like the regulations are not helping consumers, it's only natural that the consumers will not care if a company follows them.


> Uber just doesn't get the whole concept of regulations do they?

I think you don't get the whole concept of free market capitalism behind Uber - any regulation they can't simply ignore if they choose to hurts their business by definition.


The pedestrian was at fault for not installing Uber's app. This gives new meaning to the term "god mode"... "Install our app, or DIE!"


I really hope California isn't overreacting, possibly due to industry pressure from taxis etc. The state should be doing all it can to support, grow, and encourage this innovation.


There are already 20 companies that have been granted permits. Uber might have been granted one too, but they didn't even apply. https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testi...


These are permits for Autonomous vehicles. Uber couldn't legitimately abide by a permit as they would need to confirm that the "autonomous vehicle is operated for testing purposes only"... and they want to use it as part of their business model straight away.

Without looking into the specific legislation, on the surface it looks like a reasonable interpretation.

All of the references from Uber state they are doing "self-driving"... which, IMHO, isn't autonomous, as it is only Level 3. https://newsroom.uber.com/pittsburgh-self-driving-uber/

I'd expect Uber to obtain a license when they are ready to move towards driverless autonomous development, such as Level 4/5.


Uber doesn't get to decide if they are in violation of California regulations.

As for the permit, there is no guarantee that Uber will get one if they apply for it; the state can refuse such an application. If they keep showing callous disregard for public safety, they may not even be around to get refused by the State of California.


Uber does get to decide whether they're following the law itself, as opposed to the DMV's interpretation of it.

Provided the vehicle requires a human to monitor the actions of the car, it isn't classed as an autonomous vehicle... or do you disagree with my interpretation of the below?

  (1) “Autonomous technology” means technology that has the capability to drive a vehicle without the active physical control or monitoring by a human operator.
  (2) (A) “Autonomous vehicle” means any vehicle equipped with autonomous technology that has been integrated into that vehicle.
http://leginfo.legislature.ca.gov/faces/codes_displayText.xh...

Provided the vehicle design still requires a human to be monitoring the car, it is, by my interpretation, not an autonomous vehicle. Obviously, this will change in the future... and then they will require a permit.


Reading the regulations and DMV's commentary, there seems to be no prohibition on carrying passengers for hire while testing an autonomous vehicle. DMV decided not to allow "commercial vehicles" in this phase, but that means heavy trucks and buses requiring commercial plates.

The draft autonomous vehicle deployment regulations are tougher.[1] They include a data recorder requirement, testing, and demonstrations.

DMV seems to be trying hard here to do a good job. Somebody has to restrain the "move fast and break things" crowd before they leave blood on the pavement.

[1] https://www.dmv.ca.gov/portal/wcm/connect/211897ae-c58a-4f28...


> The state should be doing all it can to support, grow, and encourage this innovation.

I don't think anyone disagrees. California's been very proactive in promoting and allowing self-driving car trials.

But you can't allow just any driverless car on the road without ensuring that it's safe — that's why there's a permit process for it. This isn't just another app — people's lives are at stake here.


> you can't allow just any driverless car on the road

In San Francisco, no less! I'm a huge fan of self-driving cars. I think they should already be legal in the dead of night in Cupertino and Wyoming. But a high-density urban space is not where you experiment with cars.


As someone who uses this cross-walk on occasion, I think the state is reacting just fine.


Whether the car was driving autonomously or not, there was a driver sitting in the driver's seat who is fully responsible for the actions of the car.

If the self-driving capability missed the red light, then it was the driver's responsibility to have intervened and stopped the car.

If I did this in a Tesla with Autopilot enabled, would all Teslas be banned in California? I should be punished as the driver in command of the vehicle.

If I did this in an old banged-up SUV, would the car be banned? No, I'd receive a citation for the offense of running a red.

Which is exactly how this incident should have been handled.


The ban wasn't because of the car running the red light (though the incident may have contributed to it), it's because Uber didn't apply for a permit to test autonomous vehicles in California. Tesla, by the way, received a permit.


The problem with the thrust of your argument, I think, is that you're proposing that self-driving cars are something a driver actually needs to pay more attention to than a standard automobile.


As long as the vehicle requires human intervention, they should not be allowed to enter autonomous mode in a dense, urban environment.


I don't think that the taxi industry has a lot of pull at the state level. Perhaps in some cities, but Uber has generally not had a lot of difficulty besting taxis politically in California thus far.


If they did, you'd think the government would be cracking down on unlicensed taxi services (which is what Uber really is) as an immediate and clearly illegal threat to the regulated market which taxis were originally promised by the state.


Ah, so that's what we're calling the reckless endangerment of pedestrians these days... innovation!


> The state should be doing all it can to support, grow, and encourage this innovation.

Obviously California does a good job at this. The innovative machine that is Silicon Valley has been impossible to replicate anywhere else on Earth, nothing else even comes close.


That happens despite California, not because of California.


Right, that's why there's a Silicon Valley in every state and in every nation. Obviously.


Can't you both be wrong?



