In several contributions here on M.SE I have seen notation like $$a/bc$$ where the context made clear that it actually meant $a/(bc)$, not $(a/b)c$.
The two expressions obviously differ in general, and the programming languages I know of use the semantics
a/b*c = (a/b)*c
i.e. they evaluate left-to-right.
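For instance, here is a minimal C sketch (the choice of C and the values are mine, just for illustration) confirming that / and * share one precedence level and associate left-to-right:

    #include <stdio.h>

    int main(void) {
        double a = 1.0, b = 2.0, c = 4.0;   /* arbitrary values */

        /* '/' and '*' have the same precedence and associate left-to-right,
           so a / b * c parses as (a / b) * c, not as a / (b * c). */
        printf("a / b * c   = %g\n", a / b * c);     /* (1/2)*4 = 2   */
        printf("(a / b) * c = %g\n", (a / b) * c);   /* 2             */
        printf("a / (b * c) = %g\n", a / (b * c));   /* 1/8 = 0.125   */
        return 0;
    }

The first two lines print the same value; only the explicitly grouped third one differs.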
The examples I saw all had short "$b$" and "$c$", as in $1/2x$, and obviously meant $\frac1{2x}$, not $\frac12 x$, as the context revealed.
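For concreteness (with a made-up value, $x = 3$, just to show how far apart the two readings are): $\frac1{2x} = \frac16$, whereas $\frac12 x = \frac32$.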
How should one handle this? I always avoid such ambiguities and write the clunky $1/(2x)$, but then $1/2x$ could still be misread as $\frac1{2x}$. If typesetting is available, I'd use $\frac12 x$ vs. $\frac1{2x}$, or $(2x)^{-1}$, or $0.5x$ or $0.5/x$, where the decimal expansions are ugly, IMO, and even uglier for ⅙ etc.
Are there different conventions, e.g. in the U.S., Europe, Russia, or the Far East?
Using / vs. ÷ wouldn't help either. I have never used the latter, though; I only know it as the division key on calculators.
The question about different conventions, depending on the corner of the planet you live in or on school vs. university, is there because related questions already exist and their answers are far from conclusive. For example, this one claims that $abc/xyz$ is to be read as $(abc)/(xyz)$ because "dropping the multiplication glyph (·, ×, * or whatever) binds more strongly than writing it explicitly", i.e. $$abc/xyz \neq a\cdot b\cdot c/x\cdot y\cdot z = abc/x\cdot yz.$$
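If I read that claimed rule correctly, it amounts to $$abc/xyz = \frac{abc}{xyz}, \qquad a\cdot b\cdot c/x\cdot y\cdot z = \frac{abc}{x}\,yz,$$ which indeed differ in general.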
So my question is less about which exact notation to use and more about cultural differences and how established that juxtaposition rule is.
a/b/c ≠ a/(b*c) because of the limited range, rounding, and precision of the types. It's not even the same if $b\mid a$ and $c\mid a$, due to the different overflow characteristics of the two expressions. – emacs drives me nuts Mar 12 '20 at 14:48
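To illustrate the overflow part of that comment, here is a small C sketch with made-up 32-bit values of my own choosing (the rounding/precision part for floating point is not shown):

    #include <stdio.h>
    #include <inttypes.h>

    int main(void) {
        /* Made-up values: b and c both divide a, yet b*c = 10^10
           wraps around modulo 2^32 in unsigned 32-bit arithmetic. */
        uint32_t a = 4000000000u;   /* divisible by b and by c */
        uint32_t b = 100000u;
        uint32_t c = 100000u;

        /* Dividing step by step never overflows:
           4000000000/100000 = 40000, then 40000/100000 = 0. */
        printf("a/b/c   = %" PRIu32 "\n", a / b / c);     /* prints 0 */

        /* b*c wraps to 1410065408, so the grouped form gives a
           completely different result: 4000000000/1410065408 = 2. */
        printf("a/(b*c) = %" PRIu32 "\n", a / (b * c));   /* prints 2 */
        return 0;
    }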