Yes, your conjecture is true. In fact, primitive recursive functions can perform a huge variety of set-theoretic tasks, which makes the primitive recursive form of wellfounded recursion straightforward... or at least, straightforward to anyone who's familiar with the ordinary proofs of wellfounded recursion. There are some additional complexities, but it's largely the same proof.
I'll proceed while assuming the reader is familiar with basic primitive recursive functions. Namely: addition, multiplication, exponentiation, truncated subtraction, floor division, and the remainder function are all primitive recursive. We also have the ternary conditional operator, which can be used to implement all boolean logic and casewise definitions. We say a relation is PR whenever its indicator function is PR, and note that the $<$ and $=$ relations are PR. We of course have the summation operator, in that $\sum_{n\leq N} f(n)$ is a primitive recursive function of $N$ whenever $f$ is PR, and likewise there's the repeated product operator. The two-argument min function $(x\land y) = \min(x,y)$ is PR and, just as with sums and products, we get the repeated min operator $\bigwedge_{n\leq N} f(n) = \min\{f(n) : n\leq N\}$. A separate construction gives the bounded minimization operator $\min\{n\leq N : \phi(n) \}$, which is a PR function of $N$ whenever $\phi$ is a PR predicate (with some fixed default value, say $N+1$, when no such $n$ exists). There are analogous constructions for maximums as well.
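For readers who like to see these operators concretely, here's a minimal sketch in ordinary Python: just loops standing in for the formal PR constructions, with the function names being my own.

```python
def bounded_sum(f, N):
    # sum_{n <= N} f(n)
    return sum(f(n) for n in range(N + 1))

def bounded_min_value(f, N):
    # /\_{n <= N} f(n) = min{f(n) : n <= N}
    return min(f(n) for n in range(N + 1))

def bounded_minimization(phi, N):
    # min{n <= N : phi(n)}, returning N + 1 as a default when no witness exists
    for n in range(N + 1):
        if phi(n):
            return n
    return N + 1

assert bounded_sum(lambda n: n, 4) == 10
assert bounded_min_value(lambda n: (n - 3) ** 2, 6) == 0
assert bounded_minimization(lambda n: n * n > 10, 10) == 4
```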
Primitive Recursive Operations on Finite Sets
We identify $\mathbb{N} = V_\omega$ as being the set of hereditarily finite sets, in accordance with the Ackermann coding. We'll prove that a huge range of set-theoretic functions and predicates are primitive recursive, under this coding.
Definition: For $S,x\in\mathbb{N}$, we say $x\in S$ whenever $\lfloor S/2^x\rfloor \equiv 1 \pmod 2$, which is PR. Notice $\forall x,\ x\in S \implies x<S$, since $x\in S$ implies $2^x\leq S$ while $x<2^x$.
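As a quick illustration of the coding (an informal Python sketch, with `member` being my own helper name): the number $S$ codes the set whose elements are the positions of the $1$-bits of $S$.

```python
def member(x, S):
    # x ∈ S  iff  floor(S / 2**x) is odd  (Ackermann coding)
    return (S // 2**x) % 2 == 1

# 6 = 2**1 + 2**2 codes the set {1, 2}
assert member(1, 6) and member(2, 6) and not member(0, 6)
# x ∈ S implies x < S
assert all(x < S for S in range(1, 200) for x in range(S + 1) if member(x, S))
```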
Under the above definition, we generally have $\{x : \phi(x)\} = \sum_{x\in\mathbb{N}} 2^x\cdot P(x)$ for any proposition $\phi$, where $P$ is the indicator function for $\phi$. This works so long as the set defined by $\phi$ is finite, since then the summation is finite and produces exactly the code of that set. In case $\phi$ has additional free variables, the corresponding setbuilder operation can be proven primitive recursive so long as the members of the constructed set can be bounded by some PR function of those variables. This is clarified and proven as follows.
Theorem: Suppose $\phi(x,v_1,\cdots,v_n)$ is a PR relation, and suppose there's a PR function $J(v_1,\cdots,v_n)$ where generally $\phi(x,\overline{v}) \implies x\leq J(\overline{v})$, then the function $\overline{v} \mapsto \{x : \phi(x,\overline{v})\}$ is also PR.
proof: Simply notice $\{x : \phi(x,\overline{v})\} = \sum_{x\leq J(\overline{v})} 2^x\cdot P(x,\overline{v})$, where $P$ is the indicator for $\phi$. The latter summation is clearly primitive recursive, and so the former setbuilder operation must also be primitive recursive. $\square$
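Here's an informal Python sketch of this set-builder construction; the names `setof`, `J`, and `member` are only illustrative.

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def setof(phi, J, *v):
    # {x : phi(x, v)} = sum_{x <= J(v)} 2**x * P(x, v), assuming phi(x, v) implies x <= J(v)
    return sum(2**x for x in range(J(*v) + 1) if phi(x, *v))

# Example: {x ∈ S : x is even}, using the trivial bound x ∈ S ⟹ x < S
S = 2**1 | 2**2 | 2**4            # codes {1, 2, 4}
evens = setof(lambda x, S: member(x, S) and x % 2 == 0, lambda S: S, S)
assert evens == 2**2 | 2**4       # codes {2, 4}
```

The example here is really an instance of the Specification operation discussed next, with the trivial bound $x<S$.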
The above theorem will be used extensively, as it's often trivial to construct the requisite upper bound on a PR predicate. A special case of the above is Specification, used to construct subsets. Given any PR predicate $\phi(x,\overline{v})$, the function $\left<S,\overline{v}\right> \mapsto \{x\in S : \phi(x,\overline{v})\}$ is a PR function, since we get a trivial upper bound given by $x<S$. We will also frequently use the fact that PR relations are closed under bounded quantification, as proven below.
Theorem: Suppose $\phi(x,v_1,\cdots,v_n)$ is a PR relation, then the relation $\psi(X,\overline{v})$ defined by $\exists(x\leq X), \phi(x,\overline{v})$ is also PR.
proof: If $\phi$ has indicator $P$, then $\psi$ has indicator $Q(X,\overline{v}) = \max\{P(x,\overline{v}) : x\leq X\}$, proving $\psi$ is PR. $\square$
The above theorem only works for existential quantification, but we get universal quantification with a similar proof; just use min instead of max. We also get quantification bounded by $\in$, since we have $\exists(x\in X), \phi$ equivalent to $\exists(x\leq X), x\in X \land \phi$, which is clearly PR relative to $\phi$. The above two theorems combined easily prove the following results.
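Here's a sketch of quantification bounded by $\in$, reduced to quantification bounded by $\leq$ as just described (again ordinary Python with illustrative names).

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def exists_in(X, phi):
    # ∃(x ∈ X) phi(x)  iff  ∃(x <= X) (x ∈ X and phi(x)), realized as a bounded max
    return max((1 if member(x, X) and phi(x) else 0 for x in range(X + 1)), default=0) == 1

def forall_in(X, phi):
    # ∀(x ∈ X) phi(x), realized as a bounded min
    return min((1 if (not member(x, X)) or phi(x) else 0 for x in range(X + 1)), default=1) == 1

X = 2**3 | 2**5                       # codes {3, 5}
assert exists_in(X, lambda x: x > 4)
assert forall_in(X, lambda x: x % 2 == 1)
```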
Definition: Let $|S| = \sum_{x<S} \operatorname{In}(x,S)$, where $\operatorname{In}(x,S)$ is the indicator function for $x\in S$. Clearly this is the set theoretic cardinality operation, which is seen to be primitive recursive.
Definition: $\bigcup S = \{x : \exists(Z\in S), x\in Z\}$, which is PR since $x\in\bigcup S \implies (\exists Z, x\in Z\in S) \implies x<S$.
Definition: $\mathcal{P}(S) = \{Z : Z\subseteq S\}$, which is PR since $Z\subseteq S \implies Z\leq S$.
Definition: Given PR function $F(x,v_1,\cdots,v_n)$, define $F^\to(X,v_1,\cdots,v_n) = \{F(x,\overline{v}) : x\in X\}$. This $F^\to$ is also PR, since $y\in F^\to(X,\overline{v}) \implies y\leq \max\{F(x,\overline{v}) : x<X\}$ provides a PR upper bound.
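To make the last few definitions concrete, here's an informal sketch of $|S|$, $\bigcup S$, $\mathcal{P}(S)$, and $F^\to$ under the coding, with Python loops standing in for the PR bounded sums; the names are mine.

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def card(S):
    # |S| = number of elements of S
    return sum(1 for x in range(S) if member(x, S))

def union(S):
    # ∪S = {x : ∃(Z ∈ S), x ∈ Z}, with bound x ∈ ∪S ⟹ x < S
    return sum(2**x for x in range(S)
               if any(member(Z, S) and member(x, Z) for Z in range(S)))

def powerset(S):
    # P(S) = {Z : Z ⊆ S}, with bound Z ⊆ S ⟹ Z <= S
    # (Z ⊆ S iff the bits of Z are among the bits of S)
    return sum(2**Z for Z in range(S + 1) if Z & S == Z)

def image(F, X):
    # F→(X) = {F(x) : x ∈ X}
    return sum(2**y for y in {F(x) for x in range(X) if member(x, X)})

S = 2**1 | 2**3                                  # codes {1, 3}
assert card(S) == 2
assert union(S) == 2**0 | 2**1                   # ∪{{0}, {0,1}} = {0, 1}
assert powerset(S) == 2**0 | 2**2 | 2**8 | 2**10 # {∅, {1}, {3}, {1,3}}
assert image(lambda x: x + 1, S) == 2**2 | 2**4
```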
Definition: Let $\left<x,y\right> = \{\{x\}, \{x,y\}\}$ denote the Kuratowski ordered pair. Also let $X\times Y=\{p : \exists(x\in X),\exists(y\in Y), p=\left<x,y\right>\}$, which is primitive recursive since $X\times Y\subseteq \mathcal{P}(\mathcal{P}(X\cup Y))$ supplies a PR upper bound on its members.
Definition: We say $F:X\to Y$ when $F\subseteq X\times Y$ and $\forall(x\in X), \exists!(y\in Y), \left<x,y\right>\in F$, which is seen to be a primitive recursive relation. Similarly define $(X\to Y) = \{F\in \mathcal{P}(X\times Y) : (F:X\to Y)\}$, which is also primitive recursive.
Definition: $\operatorname{dom}(f) = \{x : \exists y, \left<x,y\right>\in f\}$, and similarly $\operatorname{ran}(f) = \{y : \exists x, \left<x,y\right>\in f\}$. Note that all such $x,y$ obey $x,y<f$, giving the equivalent definition $\operatorname{dom}(f) = \{x<f : \exists(y<f), \left<x,y\right>\in f\}$, which is primitive recursive.
Definition: $f[x] = \min\{y\in\operatorname{ran}(f) : \left<x,y\right>\in f\}$ whenever $x\in\operatorname{dom}(f)$, and otherwise $f[x] = 0$. This is primitive recursive.
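And a sketch of Kuratowski pairs, domains, and application $f[x]$ under the coding. The formal argument searches over all $x,y<f$; the demo below takes a small explicit search bound only so the loops finish quickly, since coded pairs grow very fast.

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def pair(x, y):
    # <x, y> = {{x}, {x, y}}, with {x} coded as 2**x and {x, y} as 2**x | 2**y
    return 2**(2**x) | 2**(2**x | 2**y)

def dom(f, bound):
    # dom(f) = {x : ∃y, <x, y> ∈ f}; formally bound = f suffices, but is impractical here
    return sum(2**x for x in range(bound)
               if any(member(pair(x, y), f) for y in range(bound)))

def apply(f, x, bound):
    # f[x] = least y with <x, y> ∈ f, defaulting to 0 when x ∉ dom(f)
    for y in range(bound):
        if member(pair(x, y), f):
            return y
    return 0

f = 2**pair(0, 2) | 2**pair(1, 3)    # codes the function {<0,2>, <1,3>}
assert dom(f, 10) == 2**0 | 2**1     # dom(f) = {0, 1}
assert apply(f, 0, 10) == 2 and apply(f, 1, 10) == 3 and apply(f, 5, 10) == 0
```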
Proof of Wellfounded Recursion
Theorem: Let $\left<\mathcal{C},\prec\right>$ be a primitively recursible order. Then for any primitive recursive $G:\mathbb{N}^2\to\mathbb{N}$, there exists a primitive recursive $F:\mathbb{N}\to\mathbb{N}$ such that $\forall(c\in\mathcal{C}),\ F(c)=G(c,F\restriction_c^{\prec})$, where $F\restriction_c^{\prec}$ denotes (the code of) the finite function $\{\left<x,F(x)\right> : x\prec c\}$.
proof: Firstly, given any $c\in\mathcal{C}$, we define $\operatorname{seg}(c) = \{x\in\mathcal{C} : x\prec c\}$. That's a PR function, since $\prec$ and $\mathcal{C}$ are PR relations, and since, moreover, the definition of a primitively recursible order gives a PR upper bound on the largest $x$ obeying $x\prec c$. Next, we stratify $\mathcal{C}$ into a cumulative wellfounded hierarchy of finite sets, compatible with $\prec$, which will help with the recursion. Explicitly, we define $C_0 = \emptyset$ as the empty set, and use the following recursive rule.
$$C_{N+1} = \{x\leq N : x\in\mathcal{C} \land \operatorname{seg}(x)\subseteq C_N\}$$
This is primitive recursive due to all our previous theorems, and the fact that the $\operatorname{seg}$ function is PR. Note also that the $C_N$ sets are increasing, i.e. $C_N\subseteq C_{N+1}$, by an easy induction on $N$. Moreover, we can prove via wellfounded induction that every $c\in\mathcal{C}$ eventually appears in some $C_N$ set. In fact, the primitive recursive function $c\mapsto R(c)$ defined by $R(c) = \max(\{c\}\cup\{x : x\prec c\}) + |\operatorname{seg}(c)|+1$ will obey $c\in C_{R(c)}$. To see this, argue by wellfounded induction on $\prec$: each $y\in\operatorname{seg}(c)$ has $R(y)\leq R(c)-1$ (using that $\prec$, being an order, is transitive and irreflexive, so every $x\prec y$ also has $x\prec c$ and $\operatorname{seg}(y)\subsetneq\operatorname{seg}(c)$), hence $y\in C_{R(y)}\subseteq C_{R(c)-1}$; combined with $c\leq R(c)-1$ this gives $c\in C_{R(c)}$. This fact will be extremely useful in just a moment.
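For intuition, here's how the $C_N$ hierarchy looks for the concrete order $\prec\;=\;\in$ on $\mathcal{C}=\mathbb{N}$, where $\operatorname{seg}(c)$ is just (the code of) $c$ itself (informal Python sketch).

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def subset(A, B):
    return A & B == A          # A ⊆ B iff the bits of A are among the bits of B

def seg(c):
    # {x : x ∈ c} is coded by c itself under the Ackermann coding
    return c

def C(N):
    # C_0 = ∅,   C_{N+1} = {x <= N : seg(x) ⊆ C_N}
    level = 0
    for n in range(N):
        level = sum(2**x for x in range(n + 1) if subset(seg(x), level))
    return level

# For ≺ = ∈, every c already appears by stage c + 1 (so certainly by stage R(c))
assert all(member(c, C(c + 1)) for c in range(12))
```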
To complete our proof, we just perform a basic sequential recursion while relying on the $C_N$ hierarchy. First, given any object function $H\in\mathbb{N}$, we define $H\restriction_c^\prec = H\cap (\operatorname{seg}(c)\times \operatorname{ran}(H))$, which is clearly a primitive recursive function of $H,c$. Whenever $H$ is a function with $\operatorname{seg}(c)\subseteq\operatorname{dom}(H)$, this obeys $H\restriction_c^\prec = \{\left<x, H[x]\right> : x\prec c\}$, which is analogous to how we defined the restriction $F\restriction_c^\prec$ for a proper function $F:\mathbb{N}\to\mathbb{N}$. Define $H_0=\emptyset$ as the empty function, which incidentally has $\operatorname{dom}(H_0)=\emptyset = C_0$, and recursively define $H_{n+1}$ like so.
$$H_{n+1} = \{\left<c, G(c,H_n\restriction_c^\prec)\right> : c\in C_{n+1}\}$$
Finally, just define $F(c) = H_{R(c)}[c]$. It's easily shown that this $F$ obeys the required properties, where in particular the restriction of $F$ to the domain $C_n$ is exactly $H_n$. This works primarily because each $c\in C_{n+1}$ obeys $\operatorname{seg}(c)\subseteq C_n$ and thus $\operatorname{dom}(H_n\restriction_c^\prec) = \operatorname{seg}(c)$, and also because we always have $c\in C_{R(c)} = \operatorname{dom}(H_{R(c)})$ so that $F(c)$ is well defined. This results in $F(c) = H_{R(c)}[c] = G(c, H_{R(c)-1}\restriction_c^\prec) = G(c, F\restriction_c^\prec)$ as required. The sequence of $H_n$ is primitive recursive due to our theorems, and thus $F$ is also primitive recursive, so our proof is completed.
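To see the whole construction at work, here's a simplified model in Python: it uses ordinary dicts in place of the coded functions $H_n$ (the coded Kuratowski pairs grow too fast to demo directly), takes $\prec\;=\;\in$ on $\mathcal{C}=\mathbb{N}$, and takes $G(c,h) = 1+\max(\operatorname{ran}(h))$ (with value $0$ on the empty function), so that $F$ computes von Neumann rank.

```python
def member(x, S):
    return (S // 2**x) % 2 == 1

def seg(c):
    # predecessors of c under ∈
    return [x for x in range(c) if member(x, c)]

def G(c, h):
    # h models H_n restricted to seg(c); this particular G computes von Neumann rank
    return 1 + max(h.values()) if h else 0

def F(c):
    # H_0 = ∅;  H_{n+1} = {<x, G(x, H_n ↾ seg(x))> : x ∈ C_{n+1}};  F(c) = H_{R(c)}[c]
    H, stage = {}, 0
    while c not in H:
        stage += 1
        # dom(H) plays the role of C_n; for ≺ = ∈, c is guaranteed to appear by stage c + 1
        H = {x: G(x, {p: H[p] for p in seg(x)})
             for x in range(stage) if set(seg(x)) <= set(H)}
    return H[c]

# rank(∅) = 0,  rank({∅}) = 1,  rank({∅,{∅}}) = 2,  rank({{∅,{∅}}}) = 3
assert [F(c) for c in (0, 1, 3, 2**3)] == [0, 1, 2, 3]
```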
$\square$