
The following is an excerpt from Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein (aka CLRS):

$\mathcal{O}(g(n,m)) = \{ f(n,m): \text{there exist positive constants }c, n_0,\text{ and } m_0\text{ such that }0 \le f(n,m) \le cg(n,m)\text{ for all }n \ge n_0\text{ or }m \ge m_0\}$

Consider an alternative possible extension of Big-$\mathcal{O}$ notation to multiple variables:

$\mathcal{O}(g(n,m)) = \{ f(n,m): \text{there exist positive constants }c, n_0,\text{ and } m_0\text{ such that }0 \le f(n,m) \le cg(n,m)\text{ for all }n \ge n_0\text{ and }m \ge m_0\}$

Are there any reasons why the first definition is used as an extension instead of the second?
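
For concreteness, the two candidate definitions differ only in the connective: the first requires the bound wherever $n \ge n_0$ or $m \ge m_0$, the second only where $n \ge n_0$ and $m \ge m_0$. A small Python sketch of the two conditions as predicates (hypothetical helper names; a finite spot check, not a proof of membership):

```python
# Illustrative sketch only (hypothetical helper names): the two membership
# conditions as predicates, checked over a finite window -- a spot check,
# not a proof of O-membership.
def holds_or(f, g, c, n0, m0, limit=200):
    """CLRS-style: the bound must hold whenever n >= n0 OR m >= m0."""
    return all(f(n, m) <= c * g(n, m)
               for n in range(limit) for m in range(limit)
               if n >= n0 or m >= m0)

def holds_and(f, g, c, n0, m0, limit=200):
    """Alternative: the bound need only hold when n >= n0 AND m >= m0."""
    return all(f(n, m) <= c * g(n, m)
               for n in range(limit) for m in range(limit)
               if n >= n0 and m >= m0)

# Example: 2n + 3m <= 3(n + m) everywhere, so both spot checks pass.
f = lambda n, m: 2 * n + 3 * m
g = lambda n, m: n + m
print(holds_or(f, g, c=3, n0=1, m0=1), holds_and(f, g, c=3, n0=1, m0=1))  # True True
```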

  • Sorry, but I cannot find any difference between these two definitions. Could you please point it out explicitly? – Commented Apr 6, 2013 at 23:39
  • @WangfanFu: The connective between $n\ge n_0$ and $m\ge m_0$ is an "or" in one version and an "and" in the other. – Commented Apr 6, 2013 at 23:41
  • @sadf: Please notice the recent edit. You do not need to put $\{\text{curly braces}\}$ OUTSIDE of $\TeX$. Just use backslashes, thus: \{f(n,m)\}. You'll see this: $\{f(n,m)\}$. When you put them outside of $\TeX$, the standard spacing and size conventions are not followed. – Commented Apr 7, 2013 at 0:55
  • related: cs.stackexchange.com/questions/105280/… – Commented Feb 23, 2022 at 17:22

1 Answer


Good question. I think the reason why the first definition is "better" is that it can be stated independently of the number of parameters as:

Let $f,g:X\to\mathbb R_+$ for some discrete space $X$. We say that $f \in \mathcal O(g)$ iff there is a positive constant $c$ such that $f(x)\le cg(x)$ for all but finitely many $x$.

Then the usual one-parameter definition results when we take $X$ to be $\mathbb N$, and the one you quote from CLRS when we take $X$ to be $\mathbb N^2$.
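
To make the $X=\mathbb N^2$ case concrete: by De Morgan, the pairs not covered by the CLRS condition "$n\ge n_0$ or $m\ge m_0$" are exactly those with $n<n_0$ and $m<m_0$, a finite $n_0\times m_0$ grid, which is why the "all but finitely many" formulation matches it. A tiny Python sketch of that excluded set (the constants are arbitrary, for illustration only):

```python
# Illustrative only: with the "or" connective, the pairs (n, m) that escape
# the CLRS condition are exactly those with n < n0 AND m < m0 (De Morgan),
# i.e. a finite n0-by-m0 grid.  Constants below are arbitrary for the demo.
n0, m0 = 3, 4
WINDOW = 50  # any finite search window at least (n0, m0) shows the same set

excluded = {(n, m) for n in range(WINDOW) for m in range(WINDOW)
            if not (n >= n0 or m >= m0)}

assert excluded == {(n, m) for n in range(n0) for m in range(m0)}
print(len(excluded))  # 12 = n0 * m0: finite, as the general definition requires
```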

In contrast, your proposed second definition would allow unbounded functions such as $$f(n,m) = \begin{cases} n+m & n=100 \lor m=100 \\ 1 & \text{otherwise} \end{cases}$$ to be $O(1)$, which doesn't seem to be useful for reporting algorithmic complexity.
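
A quick Python sketch of this counterexample (the witness constants $c=1$, $n_0=m_0=101$ are one natural choice, and the check below is only a finite spot check) shows both halves of the claim: the "and"-style bound holds, yet $f$ is unbounded along the line $n=100$:

```python
# Illustrative sketch of the counterexample above.  Under the "and" definition,
# f would qualify as O(1): take c = 1, n0 = m0 = 101.  Yet f is unbounded.
def f(n, m):
    return n + m if (n == 100 or m == 100) else 1

# "and"-style bound: f(n, m) <= 1 for all n >= 101 and m >= 101 (finite spot check).
assert all(f(n, m) <= 1 for n in range(101, 300) for m in range(101, 300))

# But f is unbounded: along the line n = 100 it grows without limit.
print([f(100, m) for m in (10**3, 10**6, 10**9)])  # [1100, 1000100, 1000000100]
```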


More abstractly, remember that asymptotics are always defined relative to some limit $x\to x_0$. In the case above the limit is $x\to (+\infty,+\infty)$. Limits involving infinity are often treated as a special case in real analysis, but there's a natural topology on $\mathbb N^2\cup\{\infty\}$ such that the definition above is equivalent to

We say that $f\in\mathcal O_{x\to x_0}(g)$ iff there is a positive constant $c$ such that $f(x)\le cg(x)$ everywhere in some neighborhood of $x_0$, except possibly at $x_0$ itself.

This view encompasses the asymptotics for $x\to 0$ we see in analysis. Additionally, it offers the opportunity to get your second definition back in the game, if you redefine the topology on $\mathbb N^2\cup\{\infty\}$ to give $\infty$ smaller neighborhoods.

Then the distinction between the two definitions you quote is simply that they are based on two different limits.

  • Great answer! Thanks for the help! – Commented Apr 20, 2013 at 20:51
  • Great answer. But what is CLRS? Would you mind looking at math.stackexchange.com/q/1484899/15523, Henning? – Commented Oct 17, 2015 at 21:59
  • But note that this definition is not perfect either. By this definition, we do not have $mn+1 = O(mn)$ for $m,n\ge 0$. And even if we restrict to, say, $m,n\ge 1$, we do not have $mn = O((m-1)(n-1))$. – Commented Feb 23, 2022 at 17:24
