My senior year in high school I got hooked on an online multiplayer Tetris game called TetriNET. I eventually created a pretty sophisticated bot to play for me, which absolutely crushed the games. A friend recommended I submit it for consideration in the regional science fair, where I wound up taking first in the computer science category.
Lots of details on the blog [1], including screenshots, source code, and details of the algorithms used.

[1] http://www.mattmazur.com/2009/05/creating-a-tetrinet-bot/
I would be interested to see whether a genetic algorithm performs better than hill climbing in this context. By hill climbing I mean searching the seven-dimensional weight space you've defined for the step that most improves the fitness function. I tend to think hill climbing would converge faster than evolution among random mutations.
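A minimal sketch of what I mean, assuming a score() function that evaluates a 7-element weight vector by playing a batch of games and returning the average result (the name and setup are hypothetical, not from the author's bot):

    import random

    def hill_climb(score, dims=7, step=0.05, iters=500):
        # Start from a random point in the 7-dimensional weight space.
        weights = [random.uniform(-1, 1) for _ in range(dims)]
        best = score(weights)
        for _ in range(iters):
            i = random.randrange(dims)            # perturb one dimension
            candidate = weights[:]
            candidate[i] += random.choice([-step, step])
            s = score(candidate)
            if s > best:                          # keep only improvements
                weights, best = candidate, s
        return weights, best

Since score() is noisy (it depends on the piece sequence), you'd want it to average over enough games that an "improvement" is real.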
I hate to be "that guy", but I'm somewhat confused about why everyone online seems so obsessed with genetic algorithms. They are just one kind of optimization technique, and one of the weakest at that. (Genetic algorithms are what you resort to in the rare cases when no other optimization technique will work.)
As 'lincolnq said, standard hill-climbing would probably work much better for this kind of problem.
In my experience (I worked on convergence analysis of some early GAs), genetic algorithms owe their popularity to a combination of extremely easy implementation, the same "cool" factor that plagued neural-networks-as-classifiers research in the early 90s, and a belief that biological "analogies" will outperform decades of research in stochastic optimization.
Still, they have pedagogical value because they are easy to implement, and for many students they are the first real soup-to-nuts "AI" algorithm implementation. I've seen them serve as a gateway for students to get into CS research, but IMO aspects such as the last-resort principle, the no-free-lunch theorem, and the principles of stochastic optimization are generally under-stressed, which leads to some abhorrent research papers along the lines of "Problem X using GAs".
Standard hill climbing would indeed work better under the assumption that the fitness landscape is fairly unimodal, i.e., has few local optima. In this problem that may or may not be the case. I think that is the motivation for a more "global" search strategy (something that can escape local optima) like genetic algorithms.
That being said, amongst the possible candidates for search strategies, genetic algorithms are fairly lousy. Differential Evolution or CMA-ES would likely work far better.
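For what it's worth, the core of Differential Evolution is only a few lines. A generic sketch (maximizing; the fitness function and parameter values here are stand-ins, not anything from the author's bot):

    import random

    def differential_evolution(fitness, dims=7, pop_size=40,
                               F=0.8, CR=0.9, generations=200):
        pop = [[random.uniform(-1, 1) for _ in range(dims)]
               for _ in range(pop_size)]
        scores = [fitness(x) for x in pop]
        for _ in range(generations):
            for i in range(pop_size):
                # Pick three distinct vectors other than the current one.
                a, b, c = random.sample(
                    [j for j in range(pop_size) if j != i], 3)
                j_rand = random.randrange(dims)   # force at least one mutant gene
                trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                         if random.random() < CR or j == j_rand
                         else pop[i][j]
                         for j in range(dims)]
                s = fitness(trial)
                if s > scores[i]:                 # greedy one-to-one replacement
                    pop[i], scores[i] = trial, s
        best = max(range(pop_size), key=scores.__getitem__)
        return pop[best], scores[best]

The difference-vector mutation adapts its step size to the spread of the population, which is much of why DE tends to beat a plain GA on continuous parameter problems like this one.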
As a counterpoint to "that guy", what I like about GAs is the localization aspect: while modern mathematical methods can do sampling (and so optimization) more efficiently, they seem to be in the "Intelligent Design" camp. GAs can work with extremely local information and let the force of numbers pull the process through.
<grin> Besides, evolution is the optimization method that God chose. </grin>
A population size of 16 is way too low: it's not surprising that he's getting premature convergence. Also, his mutation steps are huge, so his overall process is more like a directed random walk.
More reasonable parameters would be 100-1000 for the initial population, and mutations of +/- 2% on a parameter.
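A rough sketch of what that would look like (the fitness function is a stand-in for whatever game-playing evaluation the bot uses; population size, elite fraction, and mutation rate are illustrative, not tuned):

    import random

    POP_SIZE = 200       # within the 100-1000 range suggested above
    ELITE = POP_SIZE // 5
    MUT_RATE = 0.1       # chance that any given gene mutates
    MUT_SIZE = 0.02      # mutations scale a parameter by +/- 2%

    def evolve(fitness, dims=7, generations=100):
        pop = [[random.uniform(-1, 1) for _ in range(dims)]
               for _ in range(POP_SIZE)]
        for _ in range(generations):
            ranked = sorted(pop, key=fitness, reverse=True)
            parents = ranked[:ELITE]              # keep the top 20%
            pop = parents[:]
            while len(pop) < POP_SIZE:
                p1, p2 = random.sample(parents, 2)
                child = [random.choice(g) for g in zip(p1, p2)]  # uniform crossover
                for j in range(dims):
                    if random.random() < MUT_RATE:
                        child[j] *= 1 + random.uniform(-MUT_SIZE, MUT_SIZE)
                pop.append(child)
        return max(pop, key=fitness)

With a population that size the fitness evaluations dominate the run time, so the game-playing loop has to be fast, but at least convergence stops being an artifact of a tiny population.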