My textbook, Introduction to Probability by Blitzstein and Hwang, gives the following definition of a discrete random variable (p. 94):
> A random variable $X$ is said to be discrete if there is a finite list of values $a_1, a_2, \dots, a_n$ or an infinite list of values $a_1, a_2, \dots$ such that $P(X = a_j \ \text{for some $j$}) = 1$.
Initially, this definition seemed a bit weird to me. It says that $X$ is discrete if the probability that $X$ equals some value $a_j$ on the list, for some $j$, is $1$; in other words, it is certain that $X$ takes one of the listed values. But would I be correct in presuming that the author defines a discrete random variable this way in order to contrast it with a continuous random variable, for which the probability that the random variable equals any particular value is $0$?
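To make the contrast concrete, here is a small worked comparison of my own (not from the book). For a fair die roll $X$, the finite list $a_j = j$ for $j = 1, \dots, 6$ satisfies the definition, since the events $\{X = a_j\}$ are disjoint:
$$P(X = a_j \ \text{for some } j) = \sum_{j=1}^{6} P(X = j) = 6 \cdot \tfrac{1}{6} = 1.$$
By contrast, if $X \sim \mathrm{Unif}(0,1)$, then $P(X = a) = \int_a^a 1 \, dx = 0$ for every $a$, so by countable additivity any candidate list $a_1, a_2, \dots$ gives
$$P(X = a_j \ \text{for some } j) = \sum_{j=1}^{\infty} P(X = a_j) = 0 \neq 1,$$
and no continuous random variable can satisfy the definition. Is this the kind of contrast the authors have in mind?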