The black ball proved slightly harder to track than the yellow and red ones, as it didn't have a particularly distinctive hue. The first step towards tracking it was filtering out everything except the green surface, which is demonstrated in the video. Unfortunately, we ran out of time before getting the black tracked.
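For anyone curious what "filtering out everything except the green surface" might look like, here's a minimal NumPy sketch of a hue mask. The hue window and saturation cutoff are placeholder values I picked for a synthetic frame, not numbers from the project; real footage would need tuning:

```python
import numpy as np

def rgb_to_hue(img):
    """Hue in degrees [0, 360) for a float RGB image with values in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    c = mx - img.min(axis=-1)          # chroma
    hue = np.zeros_like(mx)
    m = c > 0                          # grey pixels have undefined hue; leave 0
    rm = m & (mx == r)
    gm = m & (mx == g) & ~rm
    bm = m & ~rm & ~gm
    hue[rm] = ((g - b)[rm] / c[rm]) % 6.0
    hue[gm] = (b - r)[gm] / c[gm] + 2.0
    hue[bm] = (r - g)[bm] / c[bm] + 4.0
    return hue * 60.0

def table_mask(img, lo=80.0, hi=160.0, min_sat=0.2):
    """True where a pixel looks like green cloth (thresholds are guesses)."""
    mx = img.max(axis=-1)
    sat = np.where(mx > 0, (mx - img.min(axis=-1)) / np.maximum(mx, 1e-9), 0.0)
    hue = rgb_to_hue(img)
    return (hue > lo) & (hue < hi) & (sat > min_sat)

# tiny synthetic frame: one cloth pixel, one red-ball pixel, one black-ball pixel
frame = np.zeros((2, 2, 3))
frame[0, 0] = (0.10, 0.60, 0.10)   # green cloth
frame[0, 1] = (0.80, 0.10, 0.10)   # red ball
frame[1, 0] = (0.05, 0.05, 0.05)   # black ball (low saturation, so excluded)
mask = table_mask(frame)
```

Note that the black ball falls out of the mask via the saturation test rather than the hue test, which is exactly why hue alone struggles with it.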
I'm wondering whether you could apply an edge-detection filter to your input frames and combine it with the hue-based detection. I'd assume that, irrespective of the ball color, an edge detect plus posterize would give you a video frame with clear spots where the balls are, which should be relatively easy to find. I remember from the image analysis courses I took in college that spot detection is a very common operation in the automatic analysis of medical images, and hence extensively researched. Once you know where each ball is, you could convert the frame to YUV and use the U and V (chroma) channels to identify each ball's color.
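A rough NumPy sketch of that idea. The central-difference gradient is a stand-in for a real edge detector, and the edge/chroma thresholds are my own placeholders; the point is just that the edge ring locates the ball regardless of its color, and near-zero chroma then identifies it as black:

```python
import numpy as np

def gradient_magnitude(gray):
    """Central-difference edge strength; a crude stand-in for Sobel/Canny."""
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]
    return np.hypot(gx, gy)

def chroma(px):
    """BT.601-style chroma (U, V) of one RGB pixel in [0, 1]."""
    r, g, b = px
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return b - y, r - y

# synthetic frame: green cloth with one dark ball at (20, 20), radius 6
frame = np.full((40, 40, 3), (0.1, 0.6, 0.1))
yy, xx = np.mgrid[:40, :40]
ball = (yy - 20) ** 2 + (xx - 20) ** 2 <= 6 ** 2
frame[ball] = (0.05, 0.05, 0.05)

gray = frame @ np.array([0.299, 0.587, 0.114])
edges = gradient_magnitude(gray) > 0.2      # threshold is a guess
ys, xs = np.nonzero(edges)
cy, cx = ys.mean(), xs.mean()               # spot centre from the edge ring
u, v = chroma(frame[int(round(cy)), int(round(cx))])
is_black = abs(u) < 0.05 and abs(v) < 0.05  # near-zero chroma => achromatic
```

In a real pipeline you'd run connected-component labelling on the edge map to separate multiple balls before classifying each one's chroma.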
Edge detection algorithms are fairly slow and noisy, from what I've found. To do this in real time you either need fast algorithms or you have to cut down the amount of data you process.
Background subtraction works really well for these kinds of problems. You should give it a go; better still if you can build your background model from a rolling average of the image pixels.
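A minimal sketch of what I mean by a rolling-average background, using an exponential moving average over past frames. The `alpha` update rate and the difference threshold are placeholder values you'd tune:

```python
import numpy as np

class RollingBackground:
    """Exponential moving average of past frames as the background model."""

    def __init__(self, alpha=0.05):
        self.alpha = alpha   # update rate: higher adapts faster, forgets sooner
        self.bg = None

    def apply(self, frame, thresh=0.1):
        """Return a boolean mask of pixels that differ from the background."""
        frame = frame.astype(np.float64)
        if self.bg is None:
            self.bg = frame.copy()            # first frame seeds the model
            return np.zeros(frame.shape[:2], dtype=bool)
        diff = np.abs(frame - self.bg)
        if diff.ndim == 3:
            diff = diff.max(axis=-1)          # worst channel decides
        mask = diff > thresh
        self.bg = (1.0 - self.alpha) * self.bg + self.alpha * frame
        return mask

# feed a static scene, then a frame where a "ball" has appeared
model = RollingBackground()
static = np.full((20, 20, 3), 0.5)
for _ in range(3):
    model.apply(static)
moved = static.copy()
moved[5:8, 5:8] = 0.9                         # 3x3 foreground patch
mask = model.apply(moved)
```

The rolling average means slow lighting drift gets absorbed into the background instead of showing up as false foreground, which a single fixed reference frame can't do.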
As for the speed issue, you can often get massive gains by restricting the colour classification and blob detection to the areas of the image that have changed since the previous frame.
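Something like this frame-difference ROI, sketched in NumPy with made-up threshold and padding values. You'd then run the expensive classification only on the returned sub-image instead of the whole frame:

```python
import numpy as np

def changed_roi(prev, curr, thresh=0.05, pad=4):
    """Bounding box (y0, y1, x0, x1) of pixels that changed between frames,
    padded by `pad`, or None if nothing changed. Thresholds are guesses."""
    diff = np.abs(curr.astype(np.float64) - prev.astype(np.float64))
    if diff.ndim == 3:
        diff = diff.max(axis=-1)
    ys, xs = np.nonzero(diff > thresh)
    if len(ys) == 0:
        return None                            # static frame: skip all work
    y0 = max(int(ys.min()) - pad, 0)
    y1 = min(int(ys.max()) + pad + 1, curr.shape[0])
    x0 = max(int(xs.min()) - pad, 0)
    x1 = min(int(xs.max()) + pad + 1, curr.shape[1])
    return (y0, y1, x0, x1)

# example: a 5x5 patch moves in an otherwise static 100x100 frame
prev = np.zeros((100, 100))
curr = prev.copy()
curr[30:35, 40:45] = 1.0
roi = changed_roi(prev, curr)
```

With slow-moving balls the changed region is a tiny fraction of the table, so the per-frame cost of classification drops roughly in proportion.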