
Thanks! Likewise :)

Thanks for the suggestion! I just added links to the demo applications earlier in the README. All applications are Jupyter notebooks that you can open in Google Colab.

* Examining the emotion palette of actors in a movie: https://evadb.readthedocs.io/en/stable/source/tutorials/03-e...

* Analysing traffic flow at an intersection: https://evadb.readthedocs.io/en/stable/source/tutorials/02-o...

* Classifying images based on their content: https://evadb.readthedocs.io/en/stable/source/tutorials/01-m...

* Recognizing license plates: https://github.com/georgia-tech-db/license-plate-recognition

* Analysing toxicity of social media memes: https://github.com/georgia-tech-db/toxicity-classification




I personally wouldn’t put the Emotion one first on the GitHub README. That was the only one I opened before clicking the license plate one, where I a) saw it was a whole separate GitHub demo and b) opened two files and saw both doing parsing/model loading without any SQL, before getting bored and closing the project.

Maybe I’m not the target market, but the 2nd and 3rd examples in your list here, which actually include SQL queries, were much more interesting and relevant IMO.


Thanks for the helpful suggestion! Just reordered the examples in the README.

Here are the illustrative queries:

  -- Object detection in a surveillance video
  SELECT id, YoloV5(data)
  FROM ObjectDetectionVideos
  WHERE id < 20;

  -- Emotion analysis in movies
  SELECT id, bbox, EmotionDetector(Crop(data, bbox))
  FROM HAPPY JOIN LATERAL UNNEST(FaceDetector(data)) AS Face(bbox, conf)
  WHERE id < 15;
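
These queries assume the videos have already been loaded into the ObjectDetectionVideos and HAPPY tables. A minimal setup sketch, assuming EvaDB's LOAD VIDEO statement; the file names below are just placeholders:

  -- Load the source videos into tables before querying them
  -- (file names are placeholders, not from the tutorials)
  LOAD VIDEO 'traffic_intersection.mp4' INTO ObjectDetectionVideos;
  LOAD VIDEO 'movie_clip.mp4' INTO HAPPY;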



