Deepgram is open sourcing Kur (http://kur.deepgram.com)! Kur is the world's first descriptive Deep Learning software. Think of a model, describe it in a simple YAML or JSON file, and train it to get state-of-the-art results. There's no need to code.
Why we built Kur: Prototyping DNNs is a slow process. Most people doing deep learning want to iterate and try out different model architectures and learn from others. It's hard to do this using barebones backends like TensorFlow/Theano or even the higher abstraction of software like Keras.
Kur is not speech specific. It can be used for images (we supply two examples), speech (we supply one example and DG is open sourcing a new audio dataset with it, the DEEPGRAM10), text, etc.
There are CNN layers, RNN, dense, dropout, batch norm, etc. to pick and choose from. The best part? Kur does all the plumbing! You want one input but two outputs? Not a problem, describe that model in Kur!
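For a flavor of what "describe it in YAML" means, here is a rough sketch of a Kurfile model section. The layer names and fields below are approximated from memory of the docs, not copied from a real example, so check kur.deepgram.com for the exact schema:

```yaml
# Approximate sketch of a Kurfile model section (field names are
# assumptions, verify against the official Kur documentation).
model:
  - input: images
  - convolution:
      kernels: 64
      size: [2, 2]
  - activation: relu
  - flatten:
  - dense: 10
  - activation: softmax
    name: labels
```

The idea is that each list item is one layer, and Kur wires the tensors between them for you.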
We're really pumped to be releasing Kur and would love to answer questions if you've got em.
Hi! I'm the core maintainer of Kur. I've also been interested in altcoin trading, and I agree that Kur would be a great place to start. Let me know what progress you make (on Gitter)! Excited to hear from you!
There have been a few frameworks that take this declarative approach: DSSTNE, Twitter's internal framework, and probably others.
DSSTNE had a clearish reason for doing so: automatic model-parallel training. Twitter wanted something simpler than Torch for most of their devs to use.
Thanks for bringing up other frameworks in DL! Amazon's DSSTNE is restrictive, but it's great for their purposes. On the DSSTNE GitHub you can see statements like:
"DSSTNE currently only supports Fully Connected layers ..."
Kur supports the cutting edge: like CNN/RNN.
"DSSTNE Engine works with data only in NetCDF format."
Kur supports the data that you have on hand. You can see in the tutorial (http://kur.deepgram.com/tutorial.html) how easy it is to send brand new data in the familiar Python pickle object.
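To illustrate what "data on hand" can look like, here is a minimal sketch of packaging a dataset as a pickle file. The dictionary keys ("images", "labels") and the filename are illustrative assumptions; the keys you actually use must match the input/output names declared in your Kurfile:

```python
import pickle

# Illustrative only: "images" and "labels" are hypothetical keys that
# would need to match the names declared in your Kurfile.
data = {
    "images": [[0.0, 0.5], [1.0, 0.25]],  # two tiny fake samples
    "labels": [0, 1],
}

# Write the dataset to disk as a pickle file...
with open("train.pkl", "wb") as f:
    pickle.dump(data, f)

# ...and read it back to confirm the round trip.
with open("train.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored["labels"])  # prints [0, 1]
```

The point is that there's no special binary format to convert into first; a plain pickled Python object is enough.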
These are the kinds of gotchas that people doing deep learning run into all the time, and they are a major time sink. Kur relieves you of those duties so you can work on the more interesting parts, like trying novel models :).
We're so glad DL tools are coming out. But there's still tons of progress to be made and Kur is one step along that path—making the user experience more efficient and enjoyable.
We use it and like it because we are trying a lot of models per unit time. And we want to change them, slide in new data, transfer weights, and other things like that, without headaches.
So the real reason is internal efficiency for model prototyping, or put more directly: results per human-hour. If we have to spend time in troubleshooting land, then we're losing our Startup Competitive Advantage™ in DL (the ability to move fast).
I guess I just don't think that's true. Keras is already pretty easy to use for model definition, and this doesn't seem to solve any actual pain points I've encountered.
Oh yeah, Kur allows MUCH faster prototyping. And if you utilize Jinja2 (Kur supports that in the Kurfiles), you really start unlocking the time savings.
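As a hedged example of what Jinja2 templating in a Kurfile can buy you, something like the following lets one value resize the whole model. The variable name and exact resolution rules here are hypothetical; see the Kur docs for how Jinja2 expressions are actually scoped:

```yaml
# Hypothetical sketch: change one setting and every templated layer follows.
settings:
  hidden_size: 256

model:
  - input: data
  - dense: "{{ hidden_size }}"
  - activation: relu
  - dense: "{{ hidden_size // 2 }}"
```

That's where the time savings come from: sweeping architectures by editing one number instead of many layers.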
Great idea!
However, the installation instructions are a little misleading. They say I only need Python 3, while in reality I need a whole bunch of things installed already. Is there a list of all the requirements?
Also, does it work on Windows? "DL for dummies" code should run on Windows :)
The "whole bunch of things" is just the standard system-wide install for working with any Python package: python, pip (which comes with python), and virtualenv. Though I also recommend virtualenvwrapper for setup simplicity.
But I'd like to know what you're about to install on my machine. Where can I find that list?
Any way we can help get it installed for you?
I got a "Kur requires Python 3.4 or later" error even though I have it on my Ubuntu VM:
~# python3 --version
Python 3.4.3
The error says: "Command python setup.py egg_info failed with error code 1 in /tmp/pip_build_root/kur"
Why is it using python instead of python3?
A couple of weeks ago, I went through some struggles installing TF on Ubuntu 16.04 and getting it to see CuDNN (mostly due to various paths not being set up correctly). So forgive me if doing all that with a single command sounds too good to be true, especially if it can't even see that I've got the correct Python already.
Scroll just a little bit down to the “Quick Start For Using pip” section. It has this code:
pip install virtualenv # Make sure virtualenv is present
virtualenv -p $(which python3) ~/kur-env # Create a Python 3 environment for Kur
. ~/kur-env/bin/activate # Activate the Kur environment
pip install kur # Install Kur
kur --version # Check that everything works
git clone https://github.com/deepgram/kur # Get the examples
cd kur/examples # Change directories
kur -v train mnist.yml # Start training!
The key line there is this one:
virtualenv -p $(which python3) ~/kur-env
This line grabs your Python 3 install and makes a virtual environment from it. The rest of the commands activate that environment, install Kur, and fetch the examples, so you end up in a directory where you can just run:
kur train mnist.yml
Then you’ll be training. :)
The list of dependencies is in "setup.py" in the repo; they are all Python packages. The reason you're probably getting the Python 3 error is that you need to set up a virtual environment so that your system can isolate different versions of Python (and Python packages).
In fact, if you use virtual environments, all the dependencies that get installed are confined to that virtual environment and won't affect the rest of your installation. Using virtual environments is definitely a Python "best practice," and if you've never done it before, we walk you through it in the "Quick Start" section of the documentation: https://kur.deepgram.com/install.html#kur-quick-install
If you want to inspect the packages that are installed in the kur-env virtual environment, then (while in kur-env), just do:
pip freeze
And it will print out the installed Python packages.
Looks very interesting! Particularly because I don't know NNs and would love to get a feel for them. I'll give it a try this weekend and share my noob experience.
Thanks! Deepgram AI Research Team
http://kur.deepgram.com http://github.com/deepgram/kur http://kurhub.com