90

If you calculate the area of a rectangle, you multiply the height by the width and the unit comes back squared. Example: 5 cm * 10 cm = 50 cm²

In contrast, if you calculate the size of an image, you also multiply the height by the width, but you get back the same unit - pixels - that the height and the width had before multiplying. Example: what you actually calculate is 3840 pixels * 2160 pixels = 8294400 pixels

What I would expect is: 3840 pixels * 2160 pixels = 8294400 pixels²

Why is it that the unit is not squared when multiplying pixels?

Raphael
JFFIGK

11 Answers

239

Because "pixel" isn't a unit of measurement: it's an object. So, just like a wall that's 30 bricks wide by 10 bricks tall contains 300 bricks (not bricks-squared), an image that's 30 pixels wide by 10 pixels tall contains 300 pixels (not pixels-squared).

David Richerby
123

I have a different answer from other folks: pixel is the correct unit for areas, and you do need dimensional analysis. The discrepancy is that the pixel in "3840 pixels wide" is not the same unit as the pixel in "the display has 8294400 pixels". Instead, "pixel" is a natural-language abbreviation for different units at different times, and it takes some context and judgment to expand the abbreviation appropriately.

The unabbreviated form is "3840 pixel-widths wide x 2160 pixel-heights tall = a bazillion pixel-areas" (and one "pixel area" is definitionally equal to "pixel-width * pixel-height" for rectangular pixels).

N.B. it is frequently assumed that pixel width and pixel height are equal (as in the CSS discussion in the other answer), and even without that assumption the above assumes that pixels are rectangular -- and these assumptions are often but not always true!
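To make that bookkeeping explicit, here is a minimal dimensional-analysis sketch (my own illustration; the unit names pixel_width and pixel_height are just labels for the units described above): multiplying a quantity in pixel-widths by a quantity in pixel-heights yields a quantity in pixel-width * pixel-height, i.e. pixel-areas.

```python
from collections import Counter

class Quantity:
    """A number with a unit expressed as exponents, e.g. {'pixel_width': 1}."""
    def __init__(self, value, units):
        self.value = value
        self.units = Counter(units)

    def __mul__(self, other):
        # Dimensional analysis: values multiply, unit exponents add.
        return Quantity(self.value * other.value, self.units + other.units)

    def __repr__(self):
        unit_str = "*".join(f"{u}^{e}" if e != 1 else u
                            for u, e in sorted(self.units.items()))
        return f"{self.value} {unit_str}"

width  = Quantity(3840, {"pixel_width": 1})   # 3840 pixel-widths wide
height = Quantity(2160, {"pixel_height": 1})  # 2160 pixel-heights tall

print(width * height)  # 8294400 pixel_height*pixel_width
# One "pixel area" is, by definition, 1 pixel_width * 1 pixel_height,
# so this is 8294400 pixel-areas -- i.e. 8294400 pixels in everyday usage.
```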

Daniel Wagner
45

A pixel is already a two-dimensional object

In your example, you use centimeters as a contrast. Centimeters are a unit of length, which is by nature a one-dimensional measurement. When measuring areas, we need to talk about square centimeters, which defines the unit as a two-dimensional square whose sides are each 1 cm long. When discussing volume, we talk about cubic centimeters, which defines the unit as a three-dimensional cube whose edges are each 1 cm long.

Since a pixel is an abstract concept, and not a strict unit of measure, it makes sense for it to exist purely as a two-dimensional object. When you're measuring something like screen resolution, you could consider the measurement to use implied pixel widths, e.g. 1920 pixel widths x 1280 pixel widths.

As an interesting note, there is a three-dimensional pixel, called a voxel, which is defined as a three-dimensional prism with a pixel for each face.

@KlaymenDK had a great note: pixels are so abstract and context-sensitive that they are not necessarily square. This pushes them further from 'unit of measure' and into 'count of objects' territory.

Sophie Altair
16

As others have said, it's because a pixel is an object. If you'd like to think of it in terms of units, the equation is technically 3840 pixel-widths * 2160 pixel-heights = 8294400 pixels (where you can think of a pixel as 1 pixel-width * 1 pixel-height).

reffu
13

Pixels are weakly typed units. Just like 1 can be coerced into an integer, floating-point value, or string in a weakly typed language, a "pixel" is coerced into whatever unit makes sense in context.

If we were to more strongly type the unit, we'd probably have several:

  1. pixel-width;

  2. pixel-height;

  3. pixel-diagonal; and

  4. pixel-area.

As you correctly point out, if we assume that pixels are squares, then
$$\left[\text{pixel-area}\right] = \left[\text{pixel-side}\right]^2,$$
such that it'd make more sense to speak of square-pixels when discussing image size in terms of pixel-sides.

The thing's just that, if we're talking image size, then a "pixel" is meant to be coerced into a pixel-area, not a pixel-side.
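As a rough sketch of what the "strongly typed" version might look like (my own illustration with hypothetical type names, not an established library), each unit gets its own type, and only the combinations we choose to define have meaning:

```python
from dataclasses import dataclass

@dataclass
class PixelWidths:
    n: float
    def __mul__(self, other):
        # Only "width times height" is defined in this toy model; it yields an area.
        if isinstance(other, PixelHeights):
            return PixelAreas(self.n * other.n)
        return NotImplemented

@dataclass
class PixelHeights:
    n: float

@dataclass
class PixelAreas:
    n: float

print(PixelWidths(3840) * PixelHeights(2160))  # PixelAreas(n=8294400)
# PixelWidths(3840) * PixelWidths(2160) raises a TypeError, because
# "width times width" has no meaning we chose to define here.
```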

Note: Pixels are units

A unit is literally anything we use to express measurements of some sort. Sure, pixels are objects, but so are other units - there's no conflict there. For example, in the US, we still measure lengths in feet.

Object-defined units of measurement were the historical norm, so there's nothing unusual about pixels being object-defined units. There are obvious shortcomings to such definitions, though, so standardization efforts have been made in recent history. For example, feet are now defined more formally than as the length of a person's foot.

That said, the same is happening to the pixel unit:

A device-independent pixel (also: density-independent pixel, dip, dp) is a physical unit of measurement based on a coordinate system held by a computer and represents an abstraction of a pixel for use by an application that an underlying system then converts to physical pixels.

-"Device-independent pixel", Wikipedia

Nat
7

You may want to look at this in another way:

$$3840\, \frac{\mathrm{px}}{\mathrm{scanline}}\times2160\, \mathrm{scanline}=8294400\,\mathrm{px}.$$

I.e. treat pixels as two-dimensional objects, much like bricks in a wall, and just count them along the display width and height.

Ruslan
6

The width of an image is measured in a discrete, two-dimensional space.

The image is 3840 pixels across. This means there is a horizontal band of 3840 pixels (which are each 2 dimensional regions) that cross the space. We aren't using pixel as a unit of measurement -- we are actually counting things called pixels.

When we measure how tall it is, we measure 2160 pixels in a vertical band to the top of the image. Again, pixel isn't a unit of measurement, it is a thing we are counting.

If you take a grid of things that is 3840 wide and 2160 tall, you end up with 3840*2160 of them. This is counting.

We could also describe the image as 3840 pixel_widths wide and 2160 pixel_heights tall, and multiply those distances. Then we'd get an area of 3840*2160 (pixel_width * pixel_height). This is an area calculation.

These happen to have the same numerical value because pixel_width*pixel_height = pixel_area, and X pixels have an area of X pixel_area.

A difference between these calculations appears when you have non-square pixels and you rotate. Something 10 pixel_widths wide rotated 90 degrees may not be 10 pixel_heights tall. At the same time, the rotation should preserve area (up to rounding).

The ratio between width and height on a pixel is called its aspect ratio. CRTs often had an effective aspect ratio of 1.11 if I remember correctly.
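As a numeric sketch of the last two paragraphs (my own illustration; the pixel sizes are made up), with non-square pixels a feature's width in pixel_widths and its height in pixel_heights change under a 90-degree rotation, while its area is preserved:

```python
# Hypothetical non-square pixel: 1.11 units wide, 1.00 unit tall
# (an aspect ratio of 1.11, as mentioned for CRTs above).
pixel_width = 1.11
pixel_height = 1.00

# A feature 10 pixel_widths wide and 4 pixel_heights tall.
w_px, h_px = 10, 4
physical_w = w_px * pixel_width   # 11.1 physical units
physical_h = h_px * pixel_height  #  4.0 physical units

# Rotate the feature 90 degrees: its physical width and height swap, ...
rotated_w_px = physical_h / pixel_width   # ~3.6  pixel_widths wide (not 4)
rotated_h_px = physical_w / pixel_height  # 11.1  pixel_heights tall (not 10)

# ... so the side counts change, but the area is preserved (up to rounding).
print(w_px * h_px)                  # 40    pixel-areas before rotation
print(rotated_w_px * rotated_h_px)  # ~40.0 pixel-areas after rotation
print(physical_w * physical_h)      # 44.4  square physical units either way
```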

Yakk
5

A pixel on a screen is like a block in your city. You don't say "I love the houses on this city block²", since a block is by definition the square itself, not the side of the square.

user541686
3

Pixels are a discrete unit whose count you want to know. Practically, it is like counting potatoes in several bags. Use one of the two lengths as a plain number, e.g. 1600 * 900 pixels, but not 1600 pixels * 900 pixels. Leaving one number without a unit yields pixels as expected (instead of pixels squared) and is mathematically correct.

rexkogitans
1

Technically, a pixel is not a unit of measurement. In optical design, pixels are measured in radians. In photogrammetry, particularly when performing interior orientation, pixels are measured in millimeters. For example, if you have a sensor with 5-micron pixels and you talk about a line of pixels 1000 wide, that line is 5 mm wide, and it is neither 1 mm nor 0 mm tall - it is 0.005 mm tall. Pixels are inherently 2-dimensional, sort of. When you include tonal information, pixels are actually 3-dimensional. 4th-, 5th-, or 6th-dimensional pixels are, of course, voxels. Within the realm of hyperstacks, each nth dimension is defined and can decompose into its constituent pixels across any dimensionality.

To think about this more concretely (pun intended): if you have a cinder-block wall and the blocks are 9x18 inches, then a 10x10-block wall will be 90x180 inches, or 10 blocks wide and 10 blocks tall. I suppose that if you had a 1000x1000-block wall you could say that you had a megablock, but then there would need to be a base-2 / base-10 debate.

Bottom line: those pixel dimensions might be convenient integers when stored in an image file, but in the real world a single pixel can have any number of dimensions greater than 2 and sizes of any precision.
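As a short sketch of the arithmetic in the sensor and cinder-block examples above (my own illustration, reusing the figures from the answer):

```python
# Sensor example: 5-micron (0.005 mm) pixel pitch.
pixel_pitch_mm = 0.005
pixels_in_line = 1000
print(pixels_in_line * pixel_pitch_mm)  # 5.0 -- the line is 5 mm wide (and one pixel, 0.005 mm, tall)

# Cinder-block example: 9 x 18 inch blocks, a wall 10 blocks by 10 blocks.
block_height_in, block_width_in = 9, 18
rows, cols = 10, 10
print(rows * block_height_in, cols * block_width_in)  # 90 180 -- a 90 x 180 inch wall
```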

-1

The squaring happens when you multiply units, and a pixel is not a unit. You can do that with pixels per inch etc., but not with the pixel alone.
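For instance, pixels per inch really is such a composite unit; a quick sketch of that calculation (my own illustration, with a hypothetical 27-inch 3840 x 2160 display) might look like:

```python
import math

# Pixel density: pixel counts divided by a physical length form a genuine
# composite unit, pixels per inch (PPI).
width_px, height_px = 3840, 2160
diagonal_inches = 27

diagonal_px = math.hypot(width_px, height_px)  # ~4405.8 pixels along the diagonal
print(round(diagonal_px / diagonal_inches, 1))  # ~163.2 pixels per inch
```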