Let $e_{j}$ be the $j^{th}$ standard basis vector (i.e., $1$ in position $j$ and $0$ elsewhere). First note that $Ae_{j} = c_{j}$, where $c_{j}$ is the $j^{th}$ column of $A$. Consequently, $\frac{\|Ae_{j}\|_{1}}{\|e_{j}\|_{1}} = \|c_{j}\|_{1}$. Also, note that the ratio $\frac{\|Ax\|_{1}}{\|x\|_{1}}$ is scale invariant, so it suffices to bound it for $x$ with $\|x\|_{1} = 1$.
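For a concrete illustration, take
$$A = \begin{pmatrix} 1 & -2 \\ 3 & 4 \end{pmatrix}$$
Then $Ae_{2} = (-2, 4)^{T} = c_{2}$, and $\|c_{1}\|_{1} = 4$ while $\|c_{2}\|_{1} = 6$, so the second column has the maximum 1-norm.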
Let $j^{*}$ be the index of the column of $A$ with maximum 1-norm. Now, any vector $x \in \mathbb{R}^n$ with $\|x\|_1 = 1$ can be written as
$$x = \sum_{j=1}^{n} \alpha_j e_j$$
where
$$\sum_{j=1}^{n} |\alpha_j| = 1$$
Note that
$$Ax = \sum_{j=1}^{n} \alpha_j A e_j = \sum_{j=1}^{n} \alpha_j c_j$$
By the triangle inequality and absolute homogeneity of the norm, the 1-norm of $Ax$ satisfies
$$\|Ax\|_1 \le \sum_{j=1}^{n} \|\alpha_j c_j\|_1 = \sum_{j=1}^{n} |\alpha_j| \cdot \|c_j\|_1 \le \|c_{j^{*}}\|_1 \sum_{j=1}^{n} |\alpha_j| = \|c_{j^{*}}\|_1$$
The second inequality holds because $\|c_j\|_1 \le \|c_{j^{*}}\|_1$ for every $j$, and the final equality uses $\sum_{j=1}^{n} |\alpha_j| = 1$ (the weights $|\alpha_j|$ act as convex-combination coefficients, even though the $\alpha_j$ themselves may be negative). Thus $\|Ax\|_1 \le \|c_{j^{*}}\|_1$ for every $x$ with $\|x\|_1 = 1$, and the bound is attained at $x = e_{j^{*}}$ by the first observation, so $\max_{\|x\|_1 = 1} \|Ax\|_1 = \|c_{j^{*}}\|_1$, which is the desired result.
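As a numerical sanity check (a minimal sketch in NumPy, not part of the proof; the test matrix, seed, and sample count are arbitrary choices), the code below verifies on a random matrix that no unit-1-norm vector exceeds the maximum column 1-norm and that $e_{j^{*}}$ attains it. NumPy's `numpy.linalg.norm(A, 1)` computes exactly this induced norm, the maximum absolute column sum.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))          # arbitrary test matrix

# The claimed value of the induced 1-norm: the maximum column 1-norm.
col_norms = np.abs(A).sum(axis=0)
max_col = col_norms.max()
j_star = int(col_norms.argmax())

# NumPy's matrix 1-norm is the maximum absolute column sum; it should agree.
assert np.isclose(np.linalg.norm(A, 1), max_col)

# No random x with ||x||_1 = 1 exceeds the bound ...
for _ in range(10_000):
    x = rng.standard_normal(A.shape[1])
    x /= np.abs(x).sum()                 # normalize so ||x||_1 = 1
    assert np.abs(A @ x).sum() <= max_col + 1e-12

# ... and the standard basis vector e_{j*} attains it.
e = np.zeros(A.shape[1])
e[j_star] = 1.0
assert np.isclose(np.abs(A @ e).sum(), max_col)

print(f"induced 1-norm = max column 1-norm = {max_col:.6f} (column {j_star})")
```

Random sampling can only probe the bound from below, which is why the check at $x = e_{j^{*}}$ is needed to confirm the maximum is actually achieved.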