1

I have already read A Primer of Mathematical Writing by Steven Krantz, which gives extremely good advice about writing mathematics. But I would like to collect some more specific suggestions about notation. I know that there are many different, equally correct notations, that the choice is largely a matter of taste, and that this question is therefore likely to be very subjective. However, I am sure that some choices are generally agreed to be better than others.

To give just one example, I personally feel that ":" is preferable to "|" for expressing "such that", because "|" is also used to mean "divides", and that $\mathbb{N}_0$ is neater than $\mathbb{N}^{>0}$.
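
To illustrate the clash concretely (my own made-up example, not taken from any reference): if the defining condition itself involves divisibility, the bar ends up doing double duty, as in
$$\{\, n \in \mathbb{Z} \mid 2 \mid n \,\} \qquad\text{versus}\qquad \{\, n \in \mathbb{Z} : 2 \mid n \,\},$$
and only the second version can be parsed at a glance.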

So my question is: can you point out some "officially recognized" choices of notation that are generally agreed to be better than others in terms of clarity? Even better, can you recommend some references on this topic?

Dal
  • 8,582
  • 4
    To me $\mathbb N_0$ contains $0$. – Git Gud Oct 03 '14 at 20:47
  • @GitGud really? How would you indicate $\mathbb{N}^{>0}$ then? – Dal Oct 03 '14 at 20:48
  • @GitGud well, I think it depends on the authors anyway. – Dal Oct 03 '14 at 20:48
  • Yes. That's how I learned it in school, and it stuck so hard that it seems completely natural to me. Also, what you write as $\mathbb N^{>0}$ I denote simply by $\mathbb N$, though I confess that $\mathbb N_{>0}$ seems much more common (than $\mathbb N^{>0}$) to me. And yes, it depends on the author; I'm just saying that it's like that for me, not that it's universal. – Git Gud Oct 03 '14 at 20:49
  • As for $\color{blue}\colon$ over $\color{blue}\mid$, I agree. However divisibility is not part of everyday life for everyone, so I can see why it wouldn't be a problem. I still prefer $\color{blue}\colon$ over $\color{blue}\mid$ because I've never seen any case in which $\colon$ could be ambiguous. – Git Gud Oct 03 '14 at 20:52
  • I much prefer $\mathbb Z^+$ over $\mathbb N^{>0}$, because the latter looks clunky to me. As for the meaning of the symbol $\mathbb N$ on its own, I doubt that we're ever going to reach a consensus on whether $\mathbb N$ contains zero or not. Each author has to make that choice (and say explicitly what it is!) based on context. – Jack Lee Oct 03 '14 at 21:00
  • $\mathbb N^\star$. It makes no sense for $\mathbb N$ to exclude $0$, since then it would lack the neutral element for addition: it would not even form a monoid under addition, let alone a semiring together with multiplication, unlike $\mathbb Z$, $\mathbb Q$, $\mathbb R$, $\mathbb A$, $\mathbb C$. – Lucian Oct 03 '14 at 23:07
  • Lucian, I don't think that's the main reason $\mathbf{N}$ has come to include zero. There used to be debate about whether zero actually existed as a number, as many felt it was as artificial as a negative number. Set theory provides an obvious criterion: natural numbers are those that correspond to the cardinalities of sets, and the empty set is a set. However, from a naive perspective, zero feels artificial, which is why the traditional view is still strong at the school level. I do think there is a consensus now among mathematicians that $0 \in \mathbf{N}$, just not in teaching. – user180040 Oct 03 '14 at 23:47

3 Answers

2

I don't necessarily believe that these sources are better, but I can try to answer your question about what's "official."

  1. ISO, the International Organization for Standardization, publishes an official set of mathematical symbols (ISO 80000-2, formerly ISO 31-11). See http://www.ise.ncsu.edu/jwilson/files/mathsigns.pdf . I believe they may have had physicists in mind more than mathematicians, but that's what there is.

  2. Bourbaki's notations are of course hugely influential. And the more time you've spent in France, the more likely you are to think of them as "official." They are responsible for the hilarious $\subsetneq$, and I believe they may also be the reason mathematicians in France and some other countries began referring to the number $0$ as "positive" (as well as "negative").

  • 1
    $>$ is "strictly greater than", as opposed to $\ge$, which is "greater or equal than". In like manner, $>0$ is "strictly positive", as opposed to $\ge0$, which is "positive". Likewise, $<$ is "strictly lesser than", as opposed to $\le$, which is "lesser or equal than". In like manner, $<0$ is "strictly negative", as opposed to $\le0$, which is "negative". Equally, we speak of "strictly increasing/decreasing/monotonous", as opposed to "increasing/decreasing/monotonous", etc. Conceptual uniformity is key to a clearer thinking. – Lucian Oct 03 '14 at 23:02
  • Yes, I understand the underlying reasoning. It was felt that $\leq$ was a more important relation than $<$, and should therefore have the simpler terminology associated with it. The difficulty with this is that you can't turn the entire language on its head lightly. – user180040 Oct 03 '14 at 23:41
1

So there is the German math popularizer Albrecht Beutelspacher, who wrote Das ist o. B. d. A. trivial!: Tipps und Tricks zur Formulierung mathematischer Gedanken (Mathematik für Studienanfänger), roughly "That is trivial w.l.o.g.! Tips and tricks for formulating mathematical thoughts (mathematics for beginning students)".

It is a wonderful resource for writing mathematical proofs in German; his premise is that a proof is a (German) text and should be treated as such. If you read German, it comes highly recommended.

If not, perhaps you will find Terry Tao's comments on notation insightful. Quoting loosely: good notation should be unambiguous, expressive, error-correcting, suggestive, transformable, and so on, and should stay unambiguous and expressive when its pieces are combined. This means the preferred notation can vary depending on the field and context.

Personally, I follow coding practices: What does everyone else use? Is my notation compatible with that? What is this symbol typically used for?

As for $\mathbb N$: honestly, it is best to say what you mean and then be consistent about it; some people use it one way, some the other. I tend to think that $0$ is a natural number, and if it shouldn't be there, you take it out explicitly, for instance by defining $\mathbb N^* := \mathbb N \setminus \{0\}$.
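
One way to enforce that consistency in practice (just a sketch; the macro names are my own invention) is to fix the convention once in the LaTeX preamble and use the macros throughout:

\usepackage{amssymb}                 % provides \mathbb
% Convention for this document: \N includes 0, \Nstar excludes it.
\newcommand{\N}{\mathbb{N}}
\newcommand{\Nstar}{\mathbb{N}^{*}}

Then, if you later change your mind about the convention, there is only one place to update.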

$|$ means different things in different contexts. If you use $:$ instead in set-builder notation, that's perfectly fine and everyone will understand, though I don't have much hope that you'll change the field ;-) Thinking of conditional probability, there $:$ would be an error and $|$ is correct: $\mathbb P(A \mid B)$ makes sense, while $\mathbb P(A:B)$ can mean loads of things, depending on how $:$ is defined...
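
To spell out why the bar is non-negotiable there (a standard identity, added only for contrast): conditional probability is defined by
$$\mathbb P(A \mid B) = \frac{\mathbb P(A \cap B)}{\mathbb P(B)} \qquad \text{whenever } \mathbb P(B) > 0,$$
so in probability the bar already carries a fixed meaning, whereas in set-builder notation the separator is merely a delimiter you are free to choose.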

  • 1
    @Dal: In addition to what WonderfulWonder and others have said, I strongly recommend that you avoid the common beginner error of thinking that adding symbolism increases rigor. You want what you've written to be readable by PEOPLE, so you don't want a wall of unreadable machine-like code. For example, if you don't really need a symbol for something (or would only use the symbol once or twice), then don't introduce it. See this MSE answer for more examples. – Dave L. Renfro May 03 '24 at 10:46
0

One good resource is Florian Cajori's A History of Mathematical Notations, which will at the least give you a glimpse of the sheer variety of ways of notating things. Volume I is available online.

Added later: My looking up Cajori's book on Amazon prompted Amazon's recommendation engine to tell me about Mathematical Notation: A Guide for Engineers and Scientists by Scheinerman$^2$. (It also reminded me of another very nice book on the history of notation: Enlightening Symbols by Joseph Mazur.)

Barry Cipra
  • 81,321