O is how you choose to work on something, like how you choose to wash the dishes.
n is how many dishes you have to wash.
Which is faster? Washing, then drying, then putting away each dish one after the other, or washing all of the dishes, then drying all of the dishes, then putting them all away?
If n is 1 then either method is the same. If n is 10 then maybe the first approach is a little less efficient, but when n is 100 then obviously the batched approach is more efficient.
Big O notation is just the formal way of comparing which algorithm is more efficient when applied to large data sets.
Sorry if I've misread your example, but I think this isn't quite correct. While the "batch" method of dish-washing (washing all at once, then drying all at once) is probably more convenient and efficient for humans, if you were to formalize your "dish-washing" algorithms, I believe both of them would have running times in the same class of functions -- ie the same big Oh.
In pseudo-java-code:
void washDishes(Dish[] dishes) {
    // Version 1: grab the sponge and towel again for every single dish.
    for (Dish d : dishes) {
        Sponge s = getSponge();
        s.wash(d);
        Towel t = getTowel();
        t.dry(d);
    }
}
void washDishes(Dish[] dishes) {
    // Version 2: grab the sponge and towel once, then wash and dry every dish.
    Sponge s = getSponge();
    for (Dish d : dishes) {
        s.wash(d);
    }
    Towel t = getTowel();
    for (Dish d : dishes) {
        t.dry(d);
    }
}
Maybe, in practical terms, there would be differences in efficiency, but those are exactly the kinds of constant factors big Oh is meant to abstract, and for the purpose of example, both of these are O(n) given input size (number of dishes) of n.
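If it helps, here's a rough, self-contained sketch of the same idea. The stub classes and the method names washDishesPerDish / washDishesBatched are made up just for this illustration, and every call to getSponge, getTowel, wash, or dry is counted as one unit of work:

    // Rough sketch: count one "unit of work" per getSponge/getTowel/wash/dry call.
    // Dish, Sponge, and Towel are stand-in stubs, not real classes.
    public class DishCounting {
        static long ops = 0;

        static class Dish {}
        static class Sponge { void wash(Dish d) { ops++; } }
        static class Towel  { void dry(Dish d)  { ops++; } }

        static Sponge getSponge() { ops++; return new Sponge(); }
        static Towel  getTowel()  { ops++; return new Towel(); }

        // Version 1: grab the sponge and towel again for every dish.
        static void washDishesPerDish(Dish[] dishes) {
            for (Dish d : dishes) {
                Sponge s = getSponge();
                s.wash(d);
                Towel t = getTowel();
                t.dry(d);
            }
        }

        // Version 2: grab the sponge and towel once, then handle every dish.
        static void washDishesBatched(Dish[] dishes) {
            Sponge s = getSponge();
            for (Dish d : dishes) {
                s.wash(d);
            }
            Towel t = getTowel();
            for (Dish d : dishes) {
                t.dry(d);
            }
        }

        public static void main(String[] args) {
            for (int n : new int[] {1, 10, 100, 1000}) {
                Dish[] dishes = new Dish[n];
                for (int i = 0; i < n; i++) dishes[i] = new Dish();

                ops = 0; washDishesPerDish(dishes); long perDish = ops;
                ops = 0; washDishesBatched(dishes); long batched = ops;

                // Prints 4n vs 2n+2: different constants, same linear growth.
                System.out.println("n=" + n + "  per-dish=" + perDish + "  batched=" + batched);
            }
        }
    }

Running it prints 4n units for the first version and 2n+2 for the second -- different constants, but the same linear growth.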
I hesitated to post this comment, since it's kind of a "well, actually"[0], but unless I'm mistaken this example is fundamentally flawed.
In your first method, you call getSponge() and getTowel() d times each, but in your second method you only call each once. That would be significant for large values of d, if I'm not mistaken.
edit: using "d" to refer to the number of dishes here. that could have caused confusion, my bad.
That's actually exactly what I was trying to point out. It may be significant practically, or might not, and definitely would be if you were really washing the dishes ;) But in terms of big Oh, they're the same.
If you assume that all of the methods (getSponge, getTowel, wash, dry) do the same amount of "work", ie take the same amount of time to execute, then the first function takes 4d units of time (continuing to use d as the number of dishes). The second takes 2d+2 units of time. We define these as (mathematical) functions in terms of d, which compute the time spent for a given input size:
f(d) = 4d
g(d) = 2d+2
It's possible to prove that both of these functions are O(d).
The formal definition of O(n) is:
f(n) = O(n) if and only if there exists some constant c
and some value n0 such that, for all n > n0:
f(n) <= c * n.
It's a way to talk about the growth rate of functions; when people say some algorithm is O(n), they usually mean that the function mapping input size to runtime is O(n).
This is pretty trivial to show with f and g above: pick c = 5 and n0 = 1.
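Spelling that check out (with d playing the role of n): for all d > 1,
f(d) = 4d <= 5d
g(d) = 2d + 2 <= 2d + 2d = 4d <= 5d
so both f and g satisfy the definition with c = 5 and n0 = 1, which is what it means for both to be O(d).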
Roughly, you can drop additive and multiplicative constants when computing the "big Oh" of a function. I'm leaving out an awful lot of formal mathematical details here, partly because I'm rusty, but also because all of this is much better presented elsewhere, in numerous algorithms textbooks and courses. (btw it's possible I've made a mistake in the above, for which I hope you'll forgive me)
Lots of people would recommend the big white book from MIT, but I'm fond of "The Algorithm Design Manual" which is less formal and rigorous, and thus perhaps a little better suited to professional software developers who are just brushing up.
Makes sense. I think I glanced over the text before and after your pseudo-code too fast. You're absolutely right about the constants not affecting the Big Oh part. For some reason I initially interpreted your comment to say "these two algorithms are equally fast". My mistake :o