
How can we perform time complexity analysis on a function that has no loops?

int somefunction(int param) {
  if (something)
    do this;
  else
    do this;
}

Would the time complexity of this function change depending on how many times the function is called within a program? I currently believe that the time complexity is O(n), because it depends on the number of times we call it.


3 Answers


The complexity is typically understood to be for one call. If you want to talk about the complexity of a sequence of calls, then you would need to say so.

In your example, you don't really gain anything by looking at sequences of calls. But there are other situations (with loops or recursive calls allowed) in which you would indeed want to look at sequences of calls. One example is when the function typically takes little time but once in a while decides to do a lot of work. In that case, you need to quantify how "once in a while" compares to "a lot of work". A natural way to do so is to see how long it takes to execute a long sequence of calls (keyword: "amortized analysis").
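
A minimal sketch of the idea (the IntStack name and growth policy below are illustrative, not from your question): appending to a growable array is the textbook case, since most calls are cheap, an occasional call copies everything, and over a long sequence the average cost per call is still constant.

class IntStack {
  private int[] data = new int[1];
  private int size = 0;

  // Usually O(1); occasionally O(size) when the backing array must be doubled.
  // Over n pushes the copies sum to O(n) element moves in total, so the
  // amortized cost per call is O(1) even though a single call can be expensive.
  void push(int x) {
    if (size == data.length)
      data = java.util.Arrays.copyOf(data, 2 * data.length);
    data[size++] = x;
  }
}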

rgrig

Depends on what something and do this are. In particular, do we have to assume global (or object) state?

Consider this:

Set<Integer> set;   // state shared across calls; the concrete Set implementation is left open

void toggle(int param) {
  if ( !set.contains(param) )
    set.add(param);
  else
    set.remove(param);
}

Now the runtime clearly depends on the implementation of set and on the number of entries it holds when toggle is called. Calling toggle $n$ times, starting with an empty set and using pairwise distinct parameters, you might get $\omega(n)$ runtime in total, hence not all calls can run in time $O(1)$.
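
As a hypothetical usage sketch of the snippet above (assuming, say, that set is backed by an unsorted linked list, so that contains scans every stored element), the following driver performs $0 + 1 + \dots + (n-1) = \Theta(n^2)$ work in total, even though no single call contains a loop:

// n calls with pairwise distinct parameters; with a list-backed set,
// call number i scans i stored elements before inserting.
void driver(int n) {
  for (int i = 0; i < n; i++)
    toggle(i);
}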

On the other hand, if somefunction does not access any state outside of its own scope -- assuming the function itself has no persistent state -- then the runtime can only depend on param. That does not mean the runtime is in $O(1)$ -- something and do this may still do non-constant things, such as calling other methods or recursing.
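
For instance (a hypothetical body for your somefunction, not something implied by the question), a loop-free function that touches no outside state can still recurse and therefore take time linear in its argument:

// No loops and no external state, yet the recursion depth is param,
// so the runtime is Θ(param) rather than O(1).
int somefunction(int param) {
  if (param <= 0)
    return 0;
  else
    return 1 + somefunction(param - 1);
}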

Note that depending on the cost model you use, even adding two numbers has non-constant cost.
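
For example (a sketch assuming the logarithmic/bit cost model rather than the unit cost model), adding two numbers of about a million bits each takes time proportional to their bit length, which becomes visible once the operands no longer fit into a machine word:

// A single addition of two ~1,000,000-bit numbers costs Θ(n) bit operations
// in the bit length n; only fixed-width machine words are "unit cost".
java.math.BigInteger costModelDemo() {
  java.math.BigInteger a = java.math.BigInteger.ONE.shiftLeft(1_000_000);
  return a.add(a);
}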

Raphael

The function does not become linear just because it is called from code whose worst case is linear; it retains whatever complexity it has in isolation, which here looks an awful lot like constant time, barring certain odd edge cases. The Big-O of a piece of code is evaluated on its own, not based on the code that happens to call it; you don't consider a single hashtable access linear just because some caller uses the hashtable to process a whole list of items and increment their occurrence counts.
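
As a hypothetical illustration of that last point (countOccurrences is an invented name, not from the question), the loop below is O(n) in the number of items, but each individual hashtable operation inside it is still an expected O(1) step; the linear cost belongs to the caller's loop, not to the map access:

// Counting occurrences: the loop is O(n) for n items, but each merge() on the
// HashMap is an expected-O(1) operation when considered on its own.
java.util.Map<String, Integer> countOccurrences(java.util.List<String> items) {
  java.util.Map<String, Integer> counts = new java.util.HashMap<>();
  for (String item : items)
    counts.merge(item, 1, Integer::sum);
  return counts;
}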