
In several contributions here on M.SE I saw notations like $$a/bc$$ where from the context one could conclude that it actually meant $a/(bc)$, not $(a/b)c$.

Both terms are obviously different in general, and the programming languages I know of use the semantics

a/b*c = (a/b)*c

i.e., they evaluate left-to-right.

The examples I saw all had short "$b$" and "$c$", as in $1/2x$, and obviously did not mean $\frac12x$ but $\frac1{2x}$, as the context revealed.

How should one handle this? I always avoid such ambiguities myself and write the clunky $1/(2x)$, but $1/2x$ could still be misread as $\frac1{2x}$. If typesetting is available, I'd use $\frac12x$ vs. $\frac1{2x}$, or $(2x)^{-1}$, or $0.5x$ vs. $0.5/x$, where the decimal expansions are ugly, IMO, and even uglier for ⅙ etc.

Are there different conventions, e.g. in the U.S., Europe, Russia, Far East?

Using / vs. ÷ wouldn't help either. I have never used the latter, though; I only know it from the division key on calculators.

The question about different conventions depending on the corner of the planet you live in, or school vs. university, arises because there are already related questions whose answers are far from conclusive. For example, this one claims $abc/xyz$ is to be read as $(abc)/(xyz)$, because "dropping the multiplication glyph (·, ×, * or whatever) binds stronger than with explicit mention", i.e. $$abc/xyz\neq a·b·c/x·y·z=abc/x·yz.$$

So my question is less about which exact notation to use and more about cultural differences and how established that juxtaposition rule is.

  • "x/2" is better than all of your proposals for $\frac12x$. – Mark S. Mar 12 '20 at 14:18
  • Personally, if I thought the meaning wasn't clear from context, I'd just write "$a/(bc)$" or "$a$ divided by $bc$" or something else along these lines, which I actually had to do until the last 8 or 9 years (over a decade with ASCII sci.math posts, over two decades with ASCII emails to students and others I corresponded with, etc.). This seems like a solution in search of a problem, unless you're developing or working with a certain programming language. Even in Excel, I don't know what the convention is --- I just use $a/(bc).$ – Dave L. Renfro Mar 12 '20 at 14:28
  • sometimes I type $1/2/x$ in Wolfram Alpha for $1/(2x)$ – J. W. Tanner Mar 12 '20 at 14:29
  • @J. W. Tanner: Yes, that does it in programming; but even worse, there we even have a/b/c ≠ a/(b*c) because of the limited range, rounding, and precision of the types. The two expressions can differ even if $b\mid a$ and $c\mid a$, due to their different overflow characteristics. – emacs drives me nuts Mar 12 '20 at 14:48
  • As to the question of what cultural differences there might be in terms of which interpretation is most common, I am not aware of any study done on it, but you might have some luck browsing the many references included on the knowyourmeme page about this. For me, I avoid it where possible, but were I forced to choose one to interpret it as I would more likely assume that it were $a/(bc)$ for similar reasons as this answer. For the record, I am in the south-eastern United States. – JMoravitz Mar 12 '20 at 14:56
  • I have not in my experience in school dealing with people from other parts of the country or other countries had any reason to pay attention to how others interpret it beyond being aware that the possibility for differences in interpretation is high enough that the notation should be avoided. – JMoravitz Mar 12 '20 at 14:57

1 Answer


I would advise you never to use this notation. You can make your meaning unambiguous by writing $\frac{a}{bc}$ or $a/(bc)$.

However, no reasonable person would write $a/bc$ to mean $(a/b)c$, since that is more succinctly (and unambiguously) expressed as $ac/b$. So if someone does write $a/bc$, it is very likely that they are being lazy and mean $a/(bc)$. (Quite possibly it came from a bad transcription of $\frac{a}{bc}$.)

There is no universal convention for how to interpret the expression; I suspect most mathematicians would say "it's ambiguous". However, as Wikipedia points out,

For example, the manuscript submission instructions for the Physical Review journals state that multiplication is of higher precedence than division with a slash, and this is also the convention observed in prominent physics textbooks such as the Course of Theoretical Physics by Landau and Lifshitz and the Feynman Lectures on Physics.

(i.e. these journals interpret it as meaning $a/(bc)$.)