
I'd expect it's possible to avoid the loop by changing the generator at the lowest level, but it makes sense to me that you need a loop if you don't control the underlying generator.

Imagine you want to turn a uniform random number in 1..4 into a uniform random number in 1..3. The original draw is your only source of randomness, so the rest has to be deterministic: each outcome in 1..4 must map to exactly one number in 1..3. But 4 isn't divisible by 3, so some output has to receive two of the four inputs while another receives only one, and the three outputs can't all end up with equal probability.
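
For contrast, the usual loop looks like this (a minimal sketch of rejection sampling; rand4() is just a hypothetical stand-in for the fixed 1..4 generator):

    import random

    def rand4():
        # stand-in for the fixed generator: uniform on 1..4
        return random.randint(1, 4)

    def rand3():
        # rejection sampling: pass 1..3 through, redraw on a 4
        while True:
            x = rand4()
            if x <= 3:
                return x

On average that's 4/3 draws per result, but no single call has a fixed bound.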




What if we allow the mapping function to be stateful?


I guess you could save up leftover bits across repeated calls, but that can't guarantee the very first call finishes with a single round of generation.
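
To sketch what "saving up bits" could look like (an entropy-recycling sampler; my own illustration, not something from the parent): the pooled state lowers the average number of draws over many calls, but the first call, or any unlucky one, can still loop.

    import random

    class StatefulRand3:
        def __init__(self):
            self.value = 0   # uniform on 0..range-1
            self.range = 1

        def rand4(self):
            # stand-in for the fixed generator: uniform on 1..4
            return random.randint(1, 4)

        def next(self):
            while True:
                # pull in draws until the pooled range covers three outcomes
                while self.range < 3:
                    self.value = self.value * 4 + (self.rand4() - 1)
                    self.range *= 4
                q = self.range // 3
                if self.value < 3 * q:
                    result = self.value % 3
                    # recycle the quotient: still uniform on 0..q-1
                    self.value, self.range = self.value // 3, q
                    return result + 1
                # reject, but keep the leftover: uniform on 0..(range - 3q) - 1
                self.value -= 3 * q
                self.range -= 3 * q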


Could it cap the maximum number of generator calls though? Rejection sampling is technically O(infinity) because you could reject an unbounded number of times. This isn't a problem in practice but it sure is theoretically annoying. With a cap on the maximum number of calls, it would be O(1).


I don't think so.

If you want a number in 1..3, the generator provides numbers in 1..4, and you cap it at n calls, then you can model the whole thing as a single generator that provides numbers in 1..4^n. Since 4^n is never divisible by 3 (it's always 1 mod 3), there's no way to split that space into three equal parts.

You always end up with unwanted extra possibilities that you still need to handle somehow.
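
A quick check of the modular-arithmetic fact above (my illustration, not from the comment):

    # with a cap of n calls there are 4**n equally likely outcomes,
    # and 4**n is never divisible by 3, so a perfect three-way split fails
    for n in range(1, 11):
        assert 4 ** n % 3 == 1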



