Is it possible for an algorithm's complexity to decrease as the input size grows? Put simply: is $O(1/n)$ possible?
3 Answers
Consider an algorithm whose running time is bounded by $f(n)$, and suppose that $f(n) \in O(1/n)$. That means there is some constant $c > 0$ such that for all sufficiently large $n$, $$f(n) \leq c\cdot\frac{1}{n}.$$ For any fixed $c$ and sufficiently large $n$ (namely $n > c$), the right-hand side is strictly less than $1$, which forces $f(n) = 0$, since $f$ maps to $\mathbb{N}$. But even an algorithm that terminates immediately takes at least one step (namely, terminating), i.e., $\forall n\colon f(n)\ge 1$. So no such algorithm can exist.
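To make the contradiction concrete, the argument can be instantiated with any specific constant; $c = 5$ below is chosen purely for illustration:

```latex
% Illustrative instantiation of the argument with c = 5:
% for every n > 5 we have f(n) <= 5/n < 1, and since f maps to
% the naturals, f(n) = 0 -- contradicting f(n) >= 1.
\[
  \forall n > c:\quad
  f(n) \;\le\; \frac{c}{n} \;<\; 1
  \;\implies\; f(n) = 0
  \quad\text{(since } f(n) \in \mathbb{N}\text{)},
\]
\[
  \text{which contradicts } f(n) \ge 1 \text{ for all } n.
\]
```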
Bucket sort's insertion sort step has an expected per-bucket cost of $2 - 1/n$, so a $1/n$ term does show up in an average-case analysis.
Reference: CLRS, Section 8.4.
BUCKET-SORT(A)
1 n ← length[A]
2 for i ← 1 to n
3     do insert A[i] into list B[⌊n·A[i]⌋]
4 for i ← 0 to n − 1
5     do sort list B[i] with insertion sort
6 concatenate the lists B[0], B[1], …, B[n − 1] together in order
$$T(n) = \Theta(n) + \sum_{i=0}^{n-1} O(n_i^2),$$ where $n_i$ is the number of elements placed in bucket $B[i]$.
Taking expectations of both sides and using linearity of expectation, we have
$$E[T(n)] = \Theta(n) + \sum_{i=0}^{n-1} O(E[n_i^2]).$$
CLRS proves that $E[n_i^2] = 2 - 1/n$ when the inputs are uniformly distributed over $[0, 1)$.
Hence step 5 takes expected time $2 - 1/n$ per bucket, and the overall expected complexity of bucket sort is
$$\Theta(n) + n\cdot O(2 - 1/n) = \Theta(n).$$
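A minimal Python sketch of the pseudocode above; it assumes the CLRS precondition that inputs are drawn from $[0, 1)$, and uses Python's built-in sort in place of insertion sort (the per-bucket asymptotics are the same for the tiny expected bucket sizes):

```python
import math

def bucket_sort(a):
    """Bucket sort for values uniformly distributed in [0, 1)."""
    n = len(a)
    # Line 3 of the pseudocode: element x goes into bucket floor(n * x).
    buckets = [[] for _ in range(n)]
    for x in a:
        buckets[math.floor(n * x)].append(x)
    # Line 5: sort each bucket (built-in sort stands in for insertion sort).
    for b in buckets:
        b.sort()
    # Line 6: concatenate the buckets in order.
    return [x for b in buckets for x in b]
```

Each bucket holds one element in expectation, which is why the quadratic per-bucket sorting cost still averages out to the constant $2 - 1/n$.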
By definition, every algorithm (even the trivial one) must perform at least one operation. It is impossible to perform half of a step; computers are discrete machines and work incrementally. As such, $\Omega(1)$ is the lower time bound on all computational operations.

Likewise, any operation accepting $N$ inputs must deal with at least a constant number of them at once (even if that number is zero). It's impossible to start with one unit of space and halve that space indefinitely; at some point your algorithm reaches the atomic unit of space for the computation (one bit, one electron, one string, whatever), and from that point you use either zero or one of those atomic units, arriving at a constant base case. Thus $\Omega(1)$ is also the lower space bound on all algorithms.
Now, it is possible, trivial even, for an algorithm to produce an *output* of that order of magnitude or cardinality. Given $N$ elements, you can return $1/N$ of them: `array[x]` produces an output $1/N$ the size of the input array. But this takes constant time (compute the location of the element by offsetting the array's address by the index times the element size, then return the bits beginning at that offset and continuing for `sizeof(elementType)`).
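The point about constant-time lookup can be sketched briefly; the function name `pick` below is just illustrative. In Python the address arithmetic described above happens inside the interpreter, but the lookup is still $O(1)$ regardless of the array's length:

```python
def pick(array, x):
    """Return 1/N of the input (a single element).

    The output is 1/N the size of the input, yet the lookup is O(1):
    under the hood it is base_address + x * sizeof(element).
    """
    return array[x]
```

So a sublinear-sized *output* is easy; a sublinear (below constant) *running time* is not.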