I'm a little concerned about doing that, though. Having seen sites killed stone dead by people blocking things in robots.txt by mistake, and the engines then never looking at them again, I'm very wary indeed of blocking anything I intend to unblock later.
It sounds like you care more about Google traffic than about real users. If you want to do this without having more than a few entries in your robots.txt and your sitemap, you could simply remove the other pages until you're ready to have them spidered, or alternatively put them behind a login and hand out invitations.
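If you do keep a sitemap, it only needs to list the handful of pages you actually want spidered; something like this (example.com and the URLs are just placeholders for whatever your "ready" pages are):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- only the pages that are ready for indexing -->
      <url>
        <loc>https://example.com/</loc>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>https://example.com/about</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Anything not listed there and not linked from a crawlable page is unlikely to get picked up quickly, though a sitemap is a hint rather than a guarantee either way.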
In a nutshell: if you put up millions of pages and tell Google about them, it will index you. If you don't want that, you'll have to make choices about the quantity and/or switch to a different kind of host.
Also, this kind of 'bot trap' tends to attract penalties, so if this isn't some ploy to get traffic out of Google you may want to reconsider how you've laid things out. The difference between a legitimate site with a lot of generated pages and a page-spammer is hard to determine, and Google tends to err on the side of caution.
Since your site is down I can't see how it's organized, but I would think hard about having two separate parts to it. One would be a site that changes slowly, perhaps a descriptive page, maybe with a 'best of' or examples that you cull. Then have the main page that is rapidly updated and keep the 'bots out of that. There's no reason to index the rapidly changing map, is there? Just index a slowly changing pointer to it.
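If you go that way, the robots.txt for the split stays tiny; something like this, assuming the rapidly updated part lives under a path like /map/ (a guess at your layout, not something I can verify with the site down):

    User-agent: *
    # keep crawlers out of the fast-changing section only
    Disallow: /map/

    Sitemap: https://example.com/sitemap.xml

The slow-changing descriptive pages stay crawlable by default, and you're only ever blocking a section you never intend to have indexed, which avoids the "blocked by mistake and never revisited" problem mentioned above.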