In short, my question – which has come up in a mechanism design setting I'm working on – is the following. Let $f,g\colon \mathbb{R} \rightarrow \mathbb{R}$ be continuous functions and let $f$ be non-decreasing and non-negative. Then for each $y\in\mathbb{R}$ consider the function $$ r_y\colon \mathbb{R} \rightarrow \mathbb{R}\colon x \mapsto f(x)\cdot y-g(x). $$ What do $f,g$ have to look like for $r_y$ to be globally maximized at $x=y$ for all $y\in \mathbb{R}$?
Of course, if $f,g$ are somewhat "well-behaved", we can go through the usual motions. For all $y$ we must have $r_y'(y)=0$, i.e. $f'(y)\,y-g'(y)=0$, so $g'(y)=f'(y)\,y$, and via the fundamental theorem of calculus $$ g(x)=C+\int_0^x f'(\zeta)\,\zeta\, d\zeta. $$
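As a quick sanity check of this formula (on a simple example pair of my own choosing, not part of the question itself): take $f(x)=x$, so $f'(x)=1$ and $$ g(x)=C+\int_0^x \zeta\, d\zeta = C+\frac{x^2}{2}. $$ Then $$ r_y(x)=xy-\frac{x^2}{2}-C $$ is a downward parabola in $x$ with vertex at $x=y$, so $r_y$ is indeed globally maximized at $x=y$ for every $y$.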
But my calculus/real analysis knowledge is insufficient to understand whether the premises of the problem ensure that $f,g$ are sufficiently well-behaved or to understand what happens if they are not well-behaved.
Here are some more thoughts of mine. It's been a while since I took advanced calculus, so some of this might be confused.
Instead of talking about derivatives, we can talk about (ratios of) differences. In particular, for all $d>0$ and $y\in \mathbb{R}$, we must have $r_y(y+d)-r_y(y)\leq 0$ and $r_y(y)-r_y(y-d)\geq 0$. Rearranging a little (and re-indexing the second inequality by replacing $y$ with $y+d$), this is equivalent to: for all $d>0,y\in \mathbb{R}$, $$ y(f(y+d)-f(y))\leq g(y+d)-g(y)\leq (y+d)(f(y+d)-f(y)). $$ This is nice because it's similar to the relationship between the derivatives. (Dividing by $d$ and letting $d$ go to $0$ gives the relationship between the derivatives if they exist.) But it doesn't immediately answer questions about what kinds of $f$ are allowed or how to construct the corresponding $g$. (See below for problematic examples.)
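For what it's worth, the chain of inequalities can be checked numerically. Here is a small script (my own sketch, using the example pair $f(x)=x$, $g(x)=x^2/2$ from the "well-behaved" case, which is not part of the question itself):

```python
# Sanity check of  y*(f(y+d)-f(y)) <= g(y+d)-g(y) <= (y+d)*(f(y+d)-f(y))
# for the example pair f(x) = x, g(x) = x**2/2, which satisfies g'(x) = f'(x)*x.

def f(x):
    return x

def g(x):
    return x * x / 2

def inequality_holds(y, d, tol=1e-12):
    """Check both halves of the displayed inequality at (y, d), with a
    small tolerance for floating-point error."""
    df = f(y + d) - f(y)
    dg = g(y + d) - g(y)
    return y * df <= dg + tol and dg <= (y + d) * df + tol

ys = [v / 10 for v in range(-50, 51)]   # grid of y values in [-5, 5]
ds = [0.01, 0.5, 3.0]                    # a few step sizes d > 0
print(all(inequality_holds(y, d) for y in ys for d in ds))  # True
```

(For this pair the check is easy to do by hand too: $g(y+d)-g(y)=yd+d^2/2$, which is squeezed between $yd$ and $yd+d^2$.)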
Lebesgue's theorem on the differentiability of monotone functions seems to imply that $f$ and $g$ are differentiable almost everywhere. (The previous point implies that if $f$ is monotone, $g$ is monotone on $(-\infty,0)$ and on $(0,\infty)$.) So we can talk about the derivatives of $f$ and $g$ almost everywhere. But we might not be able to obtain (an "allowed") $g$ via integration of $g'(x)=f'(x)x$. For example, if $f$ is the Cantor function for $x\in[0,1]$, then $f'=0$ almost everywhere, so $$ \int_0^x f'(\zeta)\,\zeta\, d\zeta = 0, $$ right (see, e.g., sect. 2 here)? But if $f$ is the Cantor function and $g=0$, then $r_{\frac{1}{2}}(\frac{1}{2})<r_{\frac{1}{2}}(1)$. That is, $r_{\frac{1}{2}}$ does not have a maximum at $1/2$. So can $f$ be the Cantor function? That is, if $f$ is the Cantor function, is there a corresponding $g$ (s.t. $r_y$ has a maximum at $y$ for all $y$) and if so, what is the corresponding $g$? More generally, can $f$ be a function that we can't find by integrating its derivative and if so, is there a way of finding the corresponding $g$?
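To make the Cantor counterexample concrete: with $g=0$ we get $r_{1/2}(x)=\frac{1}{2}f(x)$, so $r_{1/2}(1/2)=\frac{1}{2}f(1/2)=\frac{1}{4}$ while $r_{1/2}(1)=\frac{1}{2}f(1)=\frac{1}{2}$. Here is a small numerical illustration (the `cantor` implementation below is my own sketch, via the standard base-3 digit algorithm):

```python
# With f = Cantor function on [0,1] and g = 0, the function
# r_{1/2}(x) = f(x) * 1/2 is NOT maximized at x = 1/2.

def cantor(x, depth=40):
    """Cantor function on [0, 1] via the base-3 digit algorithm:
    read ternary digits of x until a digit 1 (a removed middle third,
    where f is flat); map digits 0 -> 0 and 2 -> 1 into a binary expansion."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    value, scale = 0.0, 0.5
    for _ in range(depth):
        x *= 3.0
        digit = int(x)
        x -= digit
        if digit == 1:                  # x lies in a removed middle third
            return value + scale
        value += scale * (digit // 2)   # ternary digit 0 -> bit 0, 2 -> bit 1
        scale /= 2.0
    return value

y = 0.5
r = lambda x: cantor(x) * y - 0.0       # g = 0
print(r(0.5), r(1.0))                   # 0.25 0.5, so r_{1/2}(1/2) < r_{1/2}(1)
```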
The above point asks whether $g$ might not be an integral over $g'$. Another question is whether $g'$ (or $f'$) must be integrable at all. Of course, there are continuous functions whose derivatives are not integrable, but the examples I am aware of aren't even monotone (not to mention our other requirements for $f,g$).
One standard move that people use to replace derivatives is to work with sub-/super-derivatives (subgradients/supergradients). But that seems to require the functions in question to be convex/concave, and neither $r_y$ nor $f$ nor $g$ has to be convex or concave everywhere.