No, it doesn't. You can have O(N) amortized time. Big-O is a bounding function up to a constant factor; it is not necessarily a worst-case bound.
To say that something runs in amortized O(n) time guarantees an upper bound on the average time per operation in a worst-case sequence of operations. It does not deal with average-case time on random or typical data.
I didn't say it did. On the other hand, unless the author of the algorithm is really clueless (edit: or knowingly making a probabilistic statement), I'm sure he meant amortized time.
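To make the distinction concrete, here's a minimal sketch (my example, not from the thread) of the classic amortized-analysis case: a dynamic array that doubles its capacity. A single append can cost O(size) copies, but any sequence of n appends does O(n) total copy work, so each append is amortized O(1) on every input sequence, with no randomness or "typical data" assumption involved.

```python
class DynamicArray:
    """Doubling array used to illustrate amortized cost (illustrative sketch)."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0  # total element copies performed by resizes

    def append(self, x):
        if self.size == self.capacity:
            # Occasional expensive step: copy everything into a bigger array.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        self.data[self.size] = x
        self.size += 1

arr = DynamicArray()
n = 10_000
for i in range(n):
    arr.append(i)

# The resize copies sum to 1 + 2 + 4 + ... < 2n, so the amortized cost per
# append is O(1) even though individual appends can be O(n).
print(arr.copies, arr.copies < 2 * n)
```

This is the "upper bound on the average cost per operation over a worst-case sequence" sense of amortized, as opposed to an expectation over random inputs.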
It could be just an abuse of notation, in the same way that people say that Quicksort is O(n lg n) on average. Sure, on perverse data the best time bound you can prove is O(n^2), but on random data you get expected O(n lg n) time, and on typical data with good partition selection you can generally expect to not go quadratic.
(You could also use the O(n) time median algorithm to construct a truly O(n lg n) Quicksort, but that's just a fun theoretical side-note here.)
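For contrast with the amortized case, here is a sketch of the usual randomized Quicksort (not anyone's code from the thread): picking the pivot uniformly at random gives expected O(n lg n) comparisons on *any* input, while the O(n^2) worst case still exists but is only hit with vanishing probability.

```python
import random

def quicksort(a):
    """Quicksort with a uniformly random pivot (illustrative sketch).

    Expected O(n lg n) time on any input; worst case remains O(n^2),
    but no fixed adversarial input can force it."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    # Three-way partition around the pivot.
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]
```

Swapping `random.choice` for a deterministic median-of-medians selection is what turns this into the truly O(n lg n) variant mentioned above, at the cost of a much larger constant factor.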
> ...an abuse of notation, in the same way that people say that Quicksort is O(n lg n) on average.
I think I may have misunderstood what you meant by this. If I have, I apologise; if I haven't...
Talking about the average-case complexity of algorithms with Big-Oh notation is in no way an "abuse" of that notation. If the average time complexity of an algorithm is `O(f(n))`, then its running time averaged over all inputs of size `n` is bounded above by a constant multiple of `f(n)` (for sufficiently large `n`).
I guess that statement doesn't say that the number of comparisons or swaps or "steps" the algorithm must make to complete is `O(n lg n)` on every input. Perhaps that's what you meant.
Asymptotic bounds can be given in the worst case, the best case, the average case ("in expectation"), or with high probability. All have rigorous definitions and you should go learn about them. There is nothing bastardized about this notation.
>Preprocessing phase in O(M) space and time complexity. Searching phase average O(N) time complexity and O(N*M) worst case complexity.
I don't trust the analysis of someone referring to "average O(N) time"; Big O notation refers to worst-case bounds.
Edit: Okay, based on arguments here and on [1], I'm going to accept that maybe he's just bastardizing the notation.
[1] http://stackoverflow.com/questions/3905355/meaning-of-averag...