A bit of background on why I'm asking this:
Take the sequence of positive integers, ℤ+, which has the same cardinality as ℤ:
1, 2, 3, 4, 5, 6, ...
Suppose we create a number in base 11, using x as the extra (eleventh) digit, by concatenating the numbers and delimiting them with x.
1x2x3x4x5x6x...
This is a perfectly legal number in our base-11 scheme.
Now scale it:
0.1x2x3x4x5x6x...
This would be a real number in the interval [0, 1).
In other words, this real number represents the sequence of positive integers and corresponds to a single point p on the real number line, 0 ≤ p < 1.
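To make the construction concrete, here is a minimal sketch (my own, not part of the question; the helper name encode_base11 and the truncation length are assumptions, since the actual digit string is infinite). It reads the decimal digits 0–9 at their usual values and the delimiter x as the base-11 digit ten:

```python
# Hypothetical sketch of the encoding described above: decimal digits keep their
# usual values, the delimiter x is read as the base-11 digit ten, and the digits
# are summed positionally in base 11 to give a point in [0, 1).

def encode_base11(seq, num_terms=20):
    """Encode a (truncated prefix of a) sequence of positive integers as 0.n1 x n2 x n3 ... in base 11."""
    digits = []
    for n in list(seq)[:num_terms]:
        digits.extend(int(d) for d in str(n))   # decimal digits of n, reused as base-11 digits
        digits.append(10)                       # the delimiter x, valued ten
    value, place = 0.0, 1.0
    for d in digits:
        place /= 11
        value += d * place
    return value

print(encode_base11(range(1, 100)))   # the point representing 0.1x2x3x4x5x6x...
```

The truncated value is of course only an approximation of the infinite expansion, but it shows how a sequence determines one point in [0, 1).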
Take any other sequence of positive integers, transform it the same way, and it too can be represented as a point p on the real number line, 0 ≤ p < 1:
0.10x20x30x40x... times ten
0.1x4x9x16x25x... squares
0.41x8x20x9x5x... some random sequence
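Reusing the hypothetical encode_base11 helper from the sketch above on truncated prefixes of these examples gives a distinct point for each sequence:

```python
# Usage of the sketch above (encode_base11 as defined there), applied to
# truncated prefixes of the example sequences.
print(encode_base11(range(10, 100, 10)))           # 0.10x20x30x40x...  (times ten)
print(encode_base11(k * k for k in range(1, 20)))  # 0.1x4x9x16x25x...  (squares)
```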
All finite sequences of a given length can also be represented this way. For example, all finite sequences of length 2: {1, 1}, {1, 2}, {1, 3}, ... can be ordered and represented by a single number in ℝ:
0.1x1x1x2x1x3x...
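As a sketch (the particular enumeration of the pairs is left open in the question, so the list below is only illustrative), the same delimiter trick turns any ordered list of length-2 sequences into one digit string:

```python
# Hypothetical helper: build the base-11 digit string for an ordered list of
# finite sequences, joining everything with the delimiter x.
def digit_string(sequences):
    return "0." + "x".join("x".join(str(n) for n in s) for s in sequences)

print(digit_string([(1, 1), (1, 2), (1, 3)]) + "x...")   # 0.1x1x1x2x1x3x...
```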
So we have the set of all sequences, finite and infinite, each representable as a point p, 0 ≤ p < 1.
These representations can be rescaled into any interval [q, r), where r may be taken as close to q as we like, but always with r > q.
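One concrete way to do the rescaling (not spelled out above, but standard): the affine map p ↦ q + (r − q)·p sends [0, 1) onto [q, r), so each encoded point p lands at a point p' = q + (r − q)·p with q ≤ p' < r.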
Given that [q, r) is an arbitrarily small but bounded interval in ℝ, and that ℤ+ is represented as a single point p', q ≤ p' < r, can it be argued that the size of this set is larger than |ℤ| but smaller than |ℝ|?
I guess it all hinges on whether the cardinality of any bounded interval of ℝ equals |ℝ|, hence the title of the question.