Hacker News
Carla – An open-source simulator for autonomous driving research (carla.org)
156 points by programd on Nov 17, 2017 | hide | past | favorite | 24 comments



There is also...

AirSim from Microsoft - https://github.com/microsoft/airsim

Gazebo car simulator - https://www.osrfoundation.org/simulated-car-demo/

Udacity car simulator - https://github.com/udacity/self-driving-car-sim

It would be great if someone with experience with any of the above could comment on how they compare with CARLA.


Both AirSim and CARLA aim to fill similar gaps in the current state of photorealistic simulators for perception and control research. Here are some differences one of my colleagues told me about (he has used AirSim and beta-tested CARLA):

AirSim

+ Also has quadcopter simulator (besides car)

+ Large environment

+ Easy to add new environments

CARLA

+ Simulator can be stepped

+ Multiple weather conditions

+ Environment has pedestrians and other cars
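To illustrate the "simulator can be stepped" point: in synchronous mode the simulation only advances when the client asks it to, which makes experiments reproducible. A minimal sketch, assuming the modern `carla` Python API (the 2017 beta exposed a different client, so treat the names and settings here as assumptions):

```python
def synchronous_settings(fps=20):
    """Fields for fixed-step simulation: one tick == 1/fps simulated seconds."""
    return {"synchronous_mode": True, "fixed_delta_seconds": 1.0 / fps}

def step_world(num_steps=100, host="localhost", port=2000):
    import carla  # requires the carla package and a running simulator

    client = carla.Client(host, port)
    client.set_timeout(5.0)
    world = client.get_world()

    settings = world.get_settings()
    cfg = synchronous_settings()
    settings.synchronous_mode = cfg["synchronous_mode"]
    settings.fixed_delta_seconds = cfg["fixed_delta_seconds"]
    world.apply_settings(settings)

    for _ in range(num_steps):
        world.tick()  # advance exactly one fixed step, then wait for the client
```

With stepping, a learning agent can take as long as it likes to compute an action between ticks without the world moving underneath it.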


My experience with Udacity's simulator is purely from the perspective of having used it as part of completing their nanodegree program.

First off, it's very easy to use; it's written in Unity and uses something like (or maybe exactly) a WebSocket interface to communicate with the simulated vehicle. I'm not sure what its complete capabilities are, because we used a number of different versions over the course. Our initial uses were with a "closed source" version, but toward the middle of the course it was opened up (I was in the October 2016 cohort, which was either the first or second group going through, so we were very much "beta testers").
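For a flavor of what that interface looks like: the open-sourced bridge exchanged JSON telemetry frames and control replies. A hedged sketch as a bare function (the real bridge ran over a socket.io channel, and the field names here are assumptions based on the open-sourced version):

```python
import json

def handle_telemetry(message, throttle=0.2):
    """Turn one telemetry frame (a JSON string) into a control reply.

    Field names ("speed", "steering_angle", "throttle") follow the
    open-sourced Udacity sim, but treat them as assumptions; the real
    bridge ran these messages over a socket.io connection.
    """
    data = json.loads(message)
    speed = float(data.get("speed", 0.0))

    # Trivial placeholder policy: steer straight, cut throttle past 25 mph.
    if speed > 25.0:
        throttle = 0.0
    return json.dumps({"steering_angle": 0.0, "throttle": throttle})
```

In practice a real controller would replace the placeholder policy with the output of a trained model.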

From my recollection, we could get camera data from one camera in the "center" of the windshield (there were supposedly two other cameras on the left and right corners). These cameras were positioned similarly to those on the real Udacity car, "Carla" (no relation).

We could also get waypoint data, plus a variety of other data from the "car", and we could send steering and acceleration/braking commands. But at no point in the course did we get all of this data at the same time; it was very much tailored to the lesson or project.
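Turning waypoint data into a steering command can be done with a simple proportional controller. A minimal sketch, not the actual course code (the gain and clamp values here are illustrative assumptions):

```python
import math

def steering_from_waypoint(car_x, car_y, car_yaw, wp_x, wp_y,
                           gain=1.0, max_steer=0.436):
    """Proportional steering toward a target waypoint.

    car_yaw is the heading in radians; the return value is a steering
    angle in radians, clamped to roughly +/- 25 degrees.
    """
    desired = math.atan2(wp_y - car_y, wp_x - car_x)
    # Wrap the heading error into (-pi, pi] so the car turns the short way.
    error = math.atan2(math.sin(desired - car_yaw),
                       math.cos(desired - car_yaw))
    return max(-max_steer, min(max_steer, gain * error))
```

The course projects layered PID control and, later, neural networks on top of this kind of loop.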

Another note: I used this simulator on my workstation at home, which is a 6-core AMD system with 8 GB of RAM and an NVIDIA GTX 750 Ti OC video card (getting long in the tooth). I run a heavily tweaked Ubuntu 14.04 LTS on it. I also set up TensorFlow and cuDNN to utilize my GPU. My system had no problem training various CNN models while also using the GPU to run the simulator. It seemed to work well for that, though it probably isn't ideal.

As a simulator, I would say it is fairly simplistic, but it wasn't meant to be a production-ready solution; rather, it's a training/educational simulation for teaching the topic. I'd say it does that part well.


Also, since some are using GTA for these sorts of experiments, I wonder if Grit is suited for this sort of thing as well:

http://www.gritengine.com/

I believe Grit has Lua scripting built in as well. It looks like CARLA has some Python thrown in too, which is good, but there's no direct mention of it.

Edit:

Youtube demo of Grit:

https://www.youtube.com/watch?v=RpORM_it8l8 (from 2014)


I use Gazebo on the regular, but at first glance, only AirSim looks to use rendering realistic enough to test computer vision algorithms with.

I do love that Udacity's simulator is in Unity though.


I just found out about CARLA a few days ago during my regular searching through GitHub for GPL projects. It's very cool that they release the art assets for free; I'm already importing them into my UE4 project.

UE4 has a lot of potential uses beyond gaming that many people don't realize. For example, the camera and film tools really get me excited about the future of digital film.

Right now UE4 is one of the main pieces of software I have compromised on, against my normally staunch position on FOSS, because I just don't have the time or resources to get what I really want off the ground: a Linux-only, Vulkan-only engine.

Epic hasn't delivered on its promises to the GNU+Linux community; for example, we still have no marketplace because the launcher is closed source and therefore Windows-only. But in my book it's certainly better than Unity for Linux-native dev.


I encourage everyone to watch the End-to-end reinforcement learning part of the video...


Reminds me of "python plays GTA V" stream where a Convolutional Neural Network drives through the vast GTA world: https://www.twitch.tv/sentdex


The drunk driver simulation mode is on point, I see.


crashtastic


There is also a simulator from Apollo (Baidu):

https://github.com/ApolloAuto/apollo


Does anyone know if it supports hills and uneven terrain? In the video all I can see is flat land.


Seems like vehicles can also be manually controlled [1]

[1] http://carla.readthedocs.io/en/latest/how_to_run/


It seems to be either simulation (networking) or manual control? I wonder why there's a conflict between them.


This is pretty cool.

I was hoping this was a namesake or somehow linked to Udacity's SDCE nanodegree program, since their actual self-driving vehicle that they use in that program (as the "final project") is named "Carla".

But it seems there's no relation. More simulators are always welcome, though!


Could be useful for humans to practise driving?


3D Driving School Simulator PC


Off-topic, unfortunately, but I've been looking for a long time for a simulator of IoT and electronic devices for home automation. I'd like to simulate the hardware so I can focus on the development.

Any advice?


What does simulating IoT hardware even mean to you? For example, most IoT stuff outputs one or more values when queried or when triggered. Are you looking for something which generates the values in a way which looks like IoT? That is, a random generator which will generate a temperature over time, or a random door sensor generator which will generate open/closed in a pattern similar to a real door? It seems to me that it would be almost trivial for you to write these random generators yourself and feed them with the distribution of (time,values) that you want to randomize. Or are you looking for more intricate simulation of jitters, failures, errors, flaws similar to real embedded IoT hardware? Can you elaborate?
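The "write these random generators yourself" suggestion can be sketched in a few lines. A hypothetical example with the two generators mentioned above, a temperature and a door sensor (the distributions and field shapes are illustrative assumptions, not any particular IoT protocol):

```python
import math
import random

def temperature_stream(hours, base=20.0, swing=5.0, noise=0.5, seed=None):
    """Yield (hour, degrees C) pairs: a 24-hour sine cycle plus Gaussian noise."""
    rng = random.Random(seed)
    for h in range(hours):
        cycle = swing * math.sin(2 * math.pi * (h % 24) / 24.0)
        yield h, base + cycle + rng.gauss(0.0, noise)

def door_stream(events, toggle_prob=0.1, seed=None):
    """Yield 'open'/'closed' states; most of the time the door stays put."""
    rng = random.Random(seed)
    state = "closed"
    for _ in range(events):
        if rng.random() < toggle_prob:
            state = "open" if state == "closed" else "closed"
        yield state
```

Feeding streams like these into the home-automation stack over whatever transport it expects (MQTT, HTTP, etc.) would let the software side be developed without any physical hardware.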


Actually, electronic models and home automation, so I can create a plan and prepare the electronics, sensors, and software in VR.


Might be a bit low-level, but https://circuits.io/ goes into this territory with Arduino and such. I was blown away by how much of the electronics you could simulate there, letting you just focus on the Arduino development instead.


Can you use this to create training data for SDCs?


That's their purpose. Simulators are like dynamic datasets.


Isn't the dataset biased?



