No, algorithms people don't do lossy compression (Human-Computer Interaction people tend to do that). However, you can design algorithms that fail with a certain probability, then just repeat them until they succeed and calculate the expected cost. Especially within information theory, there are a lot of strange techniques that depend on things like this.
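To make that concrete, here's a rough Python sketch (my own illustration, not from the thread), assuming a subroutine that succeeds independently with probability p on each attempt; repeating until success costs an expected 1/p attempts:

    import random

    def trial(p):
        # One attempt of a subroutine that succeeds independently
        # with probability p (so it fails with probability 1 - p).
        return random.random() < p

    def repeat_until_success(p):
        # Retry until the trial succeeds; return how many attempts it took.
        attempts = 1
        while not trial(p):
            attempts += 1
        return attempts

    # Expected number of attempts is 1/p (geometric distribution), so the
    # expected cost is (cost of one trial) / p.
    p = 1 / 8
    runs = [repeat_until_success(p) for _ in range(100_000)]
    print(sum(runs) / len(runs))  # ~8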



What a weird way to create artificial boundaries in the science community.


I think Thomas meant that designing a lossy compression scheme inherently involves making decisions about what parts of the data it is OK to lose -- e.g., in audio processing it might be acceptable to drop frequencies outside the range of human hearing from the compressed output.
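A crude Python sketch of that decision (my own illustration, assuming a 44.1 kHz sample rate and a 16 kHz cutoff; it only shows the "choose what to lose" step, not an actual encoder):

    import numpy as np

    def drop_high_frequencies(signal, sample_rate=44_100, cutoff_hz=16_000):
        # The lossy decision: content above cutoff_hz is deemed OK to lose.
        spectrum = np.fft.rfft(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        spectrum[freqs > cutoff_hz] = 0.0
        return np.fft.irfft(spectrum, n=len(signal))

    # One second of a 440 Hz tone plus a barely audible 18 kHz component.
    t = np.arange(44_100) / 44_100
    x = np.sin(2 * np.pi * 440 * t) + 0.1 * np.sin(2 * np.pi * 18_000 * t)
    y = drop_high_frequencies(x)
    print(np.max(np.abs(x - y)))  # the 18 kHz component has been discarded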


But you can't "just repeat it until it succeeds", because there's a fixed chance (1/128 in the case of the cards) that a De Bruijn cycle occurs more than once in the same sequence, and once the sequence is fixed that failure is deterministic. To "just repeat until it succeeds" you would need to keep rearranging the sequence until you eliminate those redundant cycles, which obviously doesn't work cleanly, because now you have to encode those rearrangements somehow.
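For concreteness, a small Python sketch of the property that breaks (my own illustration; the 1/128 figure comes from the article's card setup and isn't reproduced here): position recovery only works if every length-k window of the sequence is distinct.

    def all_windows_distinct(seq, k):
        # True iff every length-k window (read cyclically) occurs exactly once,
        # which is what lets you recover a position from k consecutive symbols.
        n = len(seq)
        windows = {tuple(seq[(i + j) % n] for j in range(k)) for i in range(n)}
        return len(windows) == n

    # A proper De Bruijn sequence B(2, 3): all 8 cyclic length-3 windows are distinct.
    print(all_windows_distinct("00010111", 3))  # True
    # A sequence with a repeated window: recovering a position from 3 symbols is ambiguous.
    print(all_windows_distinct("00100100", 3))  # False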



