It's obvious that we cannot find the min (or max) of an array of length n in strictly fewer than n "steps", since every element has to be examined at least once. It's also well-known that no comparison-based algorithm can sort an array in fewer than roughly n * log n comparisons in the worst case.
In both of these cases the formal proof relies on a simple counting argument: the number of possible inputs that must be distinguished is larger than the number of distinct outcomes the algorithm can reach within the given number of steps.
In essence, this means that any algorithm that works in general (i.e., outputs the correct answer for every possible input) must "pay" the corresponding "price" -- n units for the "max problem" and n * log n units for sorting.
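For concreteness, here is the sorting case as I understand it (a sketch of the standard decision-tree reasoning): a comparison-based sort making t comparisons is a binary decision tree of depth t, each of the $n!$ input orderings must end up at a different leaf, and a tree of depth $t$ has at most $2^t$ leaves, so

$$2^t \;\ge\; n! \quad\Longrightarrow\quad t \;\ge\; \log_2 n! \;=\; \Theta(n \log n).$$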
My question: can we use a similar approach to prove lower bounds on the time complexity of other polynomial-time problems? For example, it would be nice to show, by some clever counting, that some "intuitively quadratic" problem really does require at least n^2 steps, because we absolutely must consider all pairs of the input's elements.
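To state what such a proof would need (in my own informal notation, which may itself be naive): if after $t$ steps an algorithm can have reached at most $S(t)$ distinguishable states (for comparison-based algorithms, $S(t) = 2^t$), and the problem has $N$ inputs that require pairwise distinct outputs, then correctness forces

$$S(t) \;\ge\; N, \quad\text{i.e.}\quad t \;\ge\; \log_2 N \ \text{ when } S(t) = 2^t,$$

and it's not clear to me which natural problems make this inequality give anything as large as n^2.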
I understand that the whole thing is highly informal and maybe a bit naive, but still, how far can we push this counting argument?