
Is there a non-category-theoretic argument for the fact that the primitive elements of the universal enveloping algebra $U(\mathfrak g)$ of a Lie algebra $\mathfrak g$ (over a field of characteristic $0$) are precisely the elements of $\mathfrak g$ itself? I have found one such argument here, but, as the other answer clarifies, it relies heavily on the language of category theory, with which I am not quite familiar. So I am looking for a proof that does not involve the heavy machinery of category theory.
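
For concreteness, the Hopf algebra structure on $U(\mathfrak g)$ I have in mind is the standard one (stated here in case conventions differ): the comultiplication $\Delta$, counit $\varepsilon$ and antipode $S$ are determined on elements $x \in \mathfrak g$ by

$$\Delta(x) = x \otimes 1 + 1 \otimes x, \qquad \varepsilon(x) = 0, \qquad S(x) = -x,$$

with $\Delta$ and $\varepsilon$ extended as algebra homomorphisms and $S$ as an algebra anti-homomorphism. An element $u \in U(\mathfrak g)$ is called primitive if $\Delta(u) = u \otimes 1 + 1 \otimes u.$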

Any suggestion regarding this would be greatly appreciated. Thanks in advance.

Anacardium
    To make the question somewhat more self-contained, please remind us what a primitive element is. Also, I do not see a lot of category theory at work there. The language looks formal but all that is happening is that things get lifted from the ("easier", "quotient-like") graded algebra to the ("more complicated") filtered original algebra, as is not uncommon in this theory at all. I mean, that's kind of what PBW is about itself. – Torsten Schoeneberg Apr 25 '23 at 18:31
  • @TorstenSchoeneberg: The primitive elements $x$ of a Hopf algebra $A$ are those which get mapped to $1 \otimes x + x \otimes 1$ by the comultiplication (coproduct) map $\Delta.$ Do you know of any book which deals with this problem in a somewhat elementary way? E.g. I have searched Abe's book but couldn't find it there. – Anacardium Apr 25 '23 at 18:37
  • I meant for you to edit it into the question, and for that, you might want to explain to your readers how the comultiplication map is defined in the case at hand, i.e. the universal enveloping algebra. -- I do not know a book as demanded, but I also think it's not clear what fits your definition of "elementary". As said, I do not think there is much category theory used in the linked posts. – Torsten Schoeneberg Apr 25 '23 at 18:46
  • @TorstenSchoeneberg: If we know the PBW theorem then I don't think any category-theoretic result is necessary. I am trying to approach the problem along the following lines: for a given Hopf algebra $H$ let $P(H)$ denote the set of all primitive elements of $H.$ Now suppose that $\mathfrak g \subsetneq P(U(\mathfrak g)).$ Take an ordered basis $\mathfrak B = \{X_1, \cdots, X_n\}$ of $\mathfrak g$ and extend it to a basis $\mathfrak B_1 = \{X_1, \cdots, X_m\}$ of $P(U(\mathfrak g))$ for some $m \gt n.$ Let $X \in \mathfrak B_1 \setminus \mathfrak B.$ contd... – Anacardium Apr 26 '23 at 05:02
  • Then, by virtue of the PBW theorem, $X$ can be written as a finite linear combination of monomials in $\mathfrak B.$ But that already gives a non-trivial linear dependence amongst the monomials in $\mathfrak B_1,$ contradicting the PBW theorem for $U(P(U(\mathfrak g))).$ – Anacardium Apr 26 '23 at 05:06
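
The sketch in the last two comments tacitly uses that $P(U(\mathfrak g))$ is a Lie subalgebra of $U(\mathfrak g)$, so that $U(P(U(\mathfrak g)))$ and its PBW basis make sense. That much is a short check: if $a, b$ are primitive, then, since $\Delta$ is an algebra homomorphism and $a \otimes 1$ commutes with $1 \otimes b$,

$$\Delta([a,b]) = [\Delta(a), \Delta(b)] = [\,a \otimes 1 + 1 \otimes a,\; b \otimes 1 + 1 \otimes b\,] = [a,b] \otimes 1 + 1 \otimes [a,b],$$

so $[a,b]$ is again primitive.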
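
To illustrate the definition from the comments above with a small sanity check (assuming characteristic $0$ and using that $\Delta$ is an algebra homomorphism): every $x \in \mathfrak g$ is primitive by construction, whereas a product of two elements of $\mathfrak g$ is not, since

$$\Delta(xy) = \Delta(x)\Delta(y) = xy \otimes 1 + x \otimes y + y \otimes x + 1 \otimes xy,$$

and the cross terms $x \otimes y + y \otimes x$ are non-zero for $x, y \neq 0$ (e.g. $\Delta(x^2) = x^2 \otimes 1 + 2\,x \otimes x + 1 \otimes x^2$). This already suggests why only degree-one elements should be primitive.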
