I will show that there is a unique function $f = f_a$ which obeys $f(1)=0$ and $f(x) = f(x-1) + \log^a x$, and which is convex on $(e^{a-1}, \infty)$. The functional equation then gives a unique extension to $(1,\infty)$; I am not sure whether $f$ is convex there. (I will write $\log^a x$ for $(\log x)^a$, as the original poster does.)
**Construction.** Set $g(x) = \log^a x$. Then
$$g'(x) = a \frac{\log^{a-1} x}{x} \ \mbox{and}$$
$$g''(x) = \left( \frac{d}{dx} \right)^2 \log^a x = \frac{a (a-1 - \log x) \log^{a-2} x}{x^2}.$$
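(As a quick numerical sanity check of these two formulas, and nothing more, here is a comparison against centered finite differences; the test values of $a$ and $x$ and the step size are arbitrary.)

```python
# Finite-difference sanity check of the closed forms for g'(x) and g''(x);
# the test values of a, x and the step h are arbitrary.
import math

def g(x, a):    # g(x) = (log x)^a
    return math.log(x)**a

def gp(x, a):   # claimed g'(x) = a (log x)^(a-1) / x
    return a * math.log(x)**(a - 1) / x

def gpp(x, a):  # claimed g''(x) = a (a - 1 - log x)(log x)^(a-2) / x^2
    return a * (a - 1 - math.log(x)) * math.log(x)**(a - 2) / x**2

a, x, h = 2.7, 5.3, 1e-4
# both prints should be small (about 1e-6 or less)
print(abs((g(x + h, a) - g(x - h, a)) / (2 * h) - gp(x, a)))
print(abs((g(x + h, a) - 2 * g(x, a) + g(x - h, a)) / h**2 - gpp(x, a)))
```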
For $x \in \mathbb{C} \setminus (-\infty, -1]$, set
$$h_2(x) = - \sum_{n=1}^{\infty} g''(x+n).$$
(If $a<2$, we also have to remove $x=0$, since then the term $n=1$ has $\log(x+1)=\log 1 = 0$ raised to a negative power.) We have $g''(x+n) = O((\log n)^{a-1}/n^2)$, so the sum converges, and does so uniformly on compact sets; thus $h_2(x)$ is an analytic function. Moreover, for $x \in (e^{a-1}, \infty)$ we have $x+n > e^{a-1}$ for all $n \geq 1$, so every $g''(x+n)<0$ and hence $h_2(x)>0$. By construction, we have $$h_2(x) - h_2(x-1)=g''(x).$$
Also, easy estimates give $h_2(x) = O(\log^a x/x)$ as $x \to \infty$.
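As a numerical aside (my own sketch, not part of the proof), one can approximate $h_2$ by cutting the series off at an arbitrary $N$ and estimating the dropped tail by the corresponding integral of $g''$, which is $g'(x+N+\tfrac12)$ up to a tiny error; the exponent $a=2.5$ and the sample point below are arbitrary. The sketch checks the relation $h_2(x)-h_2(x-1)=g''(x)$ and the positivity beyond $e^{a-1}$.

```python
# Numerical sketch of h_2(x) = -sum_{n>=1} g''(x+n): truncate at an arbitrary N
# and estimate the dropped tail by the integral of g'' (midpoint rule), which
# equals g'(x + N + 1/2).  The exponent a = 2.5 is just a sample value.
import numpy as np

a = 2.5

def gpp(t):
    return a * (a - 1 - np.log(t)) * np.log(t)**(a - 2) / t**2

def h2(x, N=10_000):
    head = -np.sum(gpp(x + np.arange(1, N + 1)))
    tail = a * np.log(x + N + 0.5)**(a - 1) / (x + N + 0.5)  # ~ -integral of g'' over the tail
    return head + tail

x = 7.0
print(h2(x) - h2(x - 1), gpp(x))   # the two numbers should agree closely
print(np.exp(a - 1), h2(x) > 0)    # x = 7 > e^(a-1) ~ 4.48, so h_2(x) > 0 here
```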
Put $h_1(x) = \int_{t=1}^x h_2(t)\, dt$. Then $h_1(x) - h_1(x-1)$ has derivative $h_2(x) - h_2(x-1) = g''(x)$, so $h_1(x) - h_1(x-1) = g'(x) + C$ for some constant $C$. Sending $x$ to $\infty$, we have $$\lim_{x \to \infty} \bigl( h_1(x) - h_1(x-1) \bigr) = \lim_{x \to \infty} \int_{t=x-1}^x h_2(t)\, dt = 0,$$ since $h_2(t) = O(\log^a t/t)$; also $\lim_{x \to \infty} g'(x) =0$, so $C=0$.
Integrating again, put $h(x) = \int_{t=1}^x h_1(t)\, dt$. So $h(x) - h(x-1) = \log^a x + C$ for some constant $C$. This time, I couldn't figure out whether or not $C=0$. But if it isn't, that's okay: set $f(x) = h(x) - C(x-1)$. Now the conditions $f(1) = 0$ and $f(x)-f(x-1) = \log^a x$ hold, and $f''(x) = h''(x) = h_2(x)$, which, as we observed, is $>0$ for $x>e^{a-1}$.
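Continuing the numerical aside, here is a rough sketch of the whole construction $h_2 \to h_1 \to h \to f$, with the two integrations done by `scipy.integrate.quad`; the cutoff $N$, the exponent, the point used to estimate $C$, and the point used to test the functional equation are all arbitrary choices, and none of this is needed for the proof.

```python
# Sketch of the construction h_2 -> h_1 -> h -> f(x) = h(x) - C*(x - 1).
import numpy as np
from scipy.integrate import quad

a, N = 2.5, 10_000

def gpp(t):
    return a * (a - 1 - np.log(t)) * np.log(t)**(a - 2) / t**2

def h2(x):   # truncated series plus an integral estimate of the dropped tail
    head = -np.sum(gpp(x + np.arange(1, N + 1)))
    return head + a * np.log(x + N + 0.5)**(a - 1) / (x + N + 0.5)

def h1(x):
    return quad(h2, 1, x)[0]

def h(x):
    return quad(h1, 1, x)[0]

x0 = 4.0                                  # estimate C from h(x) - h(x-1) = log^a(x) + C ...
C = h(x0) - h(x0 - 1) - np.log(x0)**a

def f(x):
    return h(x) - C * (x - 1)

x1 = 6.5                                  # ... then test the functional equation elsewhere
print(f(x1) - f(x1 - 1), np.log(x1)**a)   # should agree
print(f(1.0))                             # f(1) = 0
print(h2(x1) > 0)                         # f''(x1) = h_2(x1) > 0 since x1 > e^(a-1)
```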
**Uniqueness.** Suppose there were some other $C^2$ function $\tilde{f}(x) = f(x)+r(x)$ which met the required criteria. Then $r(x)-r(x-1)=0$, so $r$ is periodic with period $1$. Suppose for the sake of contradiction that $r$ is not constant. (If $r$ is constant, then the constant is zero, as $r(1)=0$.) Then $r''$ is a periodic function with average $0$, and it is not identically zero (a periodic $r$ with $r'' \equiv 0$ would be constant), so $r''(y)<0$ for some $y$, and then $r''(y)=r''(y+1)=r''(y+2)=\cdots$. But $f''(y+n) = h_2(y+n) = O(\log^a n/n) \to 0$, so $\tilde{f}''(y+n) = f''(y+n) + r''(y+n) < 0$ for $n$ large, contradicting the convexity of $\tilde{f}$ on $(e^{a-1}, \infty)$.
**Convexity for all $x$?** What remains is the question: Is $f''(x)>0$ for all $x \in (1,\infty)$? Or, equivalently, is
$$\sum_{n=1}^{\infty} g''(x+n)<0$$
for all $x \in (1, \infty)$? I expected that the answer would be "no", and that I would just have to search a little to find a counterexample to finish this answer. But, so far, numerical computations suggest the answer is always "yes". I think I could prove this by unenlightening bounds, but instead I'm going to go to bed and see if I think of a better strategy in the morning.
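For what it is worth, the sort of quick scan I mean looks like the following (only a sketch: the sample exponents, the grid, and the cutoff are arbitrary, and a positive minimum on a finite grid is of course not a proof of anything).

```python
# Scan a truncated-plus-tail approximation of h_2(x) = -sum_{n>=1} g''(x+n)
# over a grid in (1, infinity) for a few sample exponents a, and report the
# most negative sampled value.  None of the choices below is canonical.
import numpy as np

def h2(x, a, N=10_000):
    t = x + np.arange(1, N + 1)
    head = -np.sum(a * (a - 1 - np.log(t)) * np.log(t)**(a - 2) / t**2)
    tail = a * np.log(x + N + 0.5)**(a - 1) / (x + N + 0.5)  # integral estimate of the rest
    return head + tail

xs = np.linspace(1.01, 30.0, 300)
for a in (1.5, 2.0, 2.5, 3.0):
    print(a, min(h2(x, a) for x in xs))   # minimum of h_2 over the sampled grid
```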