I'm having a hard time wrapping my head around the big-O notation for a pairing operation. The question is pretty simple: generate all possible pairs for a given list of numbers in an array.
My first guess is to have a nested for/foreach loop and generate the pairs. This is easy enough, and I get that for each of the n numbers I analyze n other numbers, which gives me a complexity of n^2.
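Something like this, as a rough sketch (the array contents and names here are just an example; note that it generates both (1, 4) and (4, 1)):

#include <stdio.h>

int main(void)
{
    int arr[] = {1, 2, 3, 4, 5};                /* example input */
    int count = sizeof arr / sizeof arr[0];

    /* for every element, pair it with every other element */
    for (int i = 0; i < count; i++) {
        for (int j = 0; j < count; j++) {
            if (i == j)
                continue;                       /* don't pair a number with itself */
            printf("(%d, %d)\n", arr[i], arr[j]);
        }
    }
    return 0;
}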
Now, if I try to optimize this further and say that (1, 4) is the same as (4, 1), then for a sorted array like 1, 2, 3, 4, 5 I only run the pairing operation in a nested for loop this way:
for (i = 0; i < count; i++) {
    for (j = i + 1; j < count; j++) {
        // emit the pair (arr[i], arr[j])
    }
}
In this case, I only run through the array fewer than n^2 times. For a sample size of 7 numbers, I ran through the loop 21 times to generate the pairs. I know that this cannot be a log n operation, and I'm tempted to say that this operation converges to n^2, but I don't remember enough from my math or theory classes to answer this definitively. How do I go about this problem?
Context: I had an interview with a similar question, and this was born out of an argument with my friend about whether a pairing operation on a list can ever be better than n^2.
You are correct that you're doing fewer than n^2 operations. The question is how many fewer operations you are doing.
Let's think about how many pairs there are in the array. If each of the n numbers can be paired with (n - 1) other numbers, the total number of pairs possible is n(n - 1). Each iteration of the original for loop generates one of these pairs, so the total number of pairs you generate is n^2 - n, which is O(n^2).
Now, what about if you eliminate duplicate pairs by saying that (1, 4) and (4, 1) are the same? In this case, note that half of those n(n - 1) pairs are extraneous - every unordered pair gets generated twice, once in each order. This means that the number of distinct pairs is (n^2 - n) / 2. This expression is less than n^2, but notice that it is still O(n^2), because big-O notation ignores constants.
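If it helps to see it concretely, here is a small sketch (the array's values don't matter, only how many there are) that counts both versions for n = 7:

#include <stdio.h>

int main(void)
{
    int n = 7;
    int ordered = 0, unordered = 0;

    /* ordered pairs: every i with every j != i  ->  n * (n - 1) */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            if (i != j)
                ordered++;

    /* unordered pairs: j starts after i, so (1, 4) and (4, 1) count once  ->  (n*n - n) / 2 */
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            unordered++;

    printf("%d %d\n", ordered, unordered);      /* prints 42 21 */
    return 0;
}

The 21 here matches the count you got for seven numbers.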
In other words - you are right that you are generating fewer than n^2 pairs, but the total number of pairs you're creating is still O(n^2).
More generally - if you ever decrease the total amount of work that an algorithm does by some constant factor (say, you cut the work in half, or cut the work by a factor of 100), you have not changed the big-O runtime of the algorithm. Big-O completely ignores constants. In order to decrease the big-O runtime, you need to decrease the total amount of work by an amount that is more than a constant; say, by a factor of n, log n, etc.
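To put numbers on that, using the pair count from above: halving it leaves the bound alone, while dividing by a factor of n actually changes it:

\[
\frac{n^2 - n}{2} = O(n^2), \qquad \text{whereas} \qquad \frac{n^2 - n}{n} = n - 1 = O(n).
\]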
Hope this helps!
Remember that big-O notation involves an implied multiplicative constant. So your complexity is still O(n^2) if your running time is <= k*n^2, for some fixed constant k, for all sufficiently large n.
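In symbols, that's just the usual definition, applied to this case:

\[
f(n) = O(n^2) \iff \exists\, k > 0,\ n_0 \ \text{such that}\ f(n) \le k \cdot n^2 \ \text{for all}\ n \ge n_0,
\]
\[
\text{and here}\ f(n) = \frac{n^2 - n}{2} \le \frac{1}{2} n^2 \ \text{for every}\ n \ge 0, \ \text{so}\ k = \tfrac{1}{2} \ \text{works.}
\]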
It's still O(n^2) since now you have exactly half the pairs that you had before introducing the order requirement. Dividing by two does not change the Big O.
Note: the inner loop condition needs to be j <= count - 1 (i.e. j < count); otherwise the last element never gets paired with anything.
Kuroki Kaze 2012-09-21 12:02