Andrew M. answered 05/23/19
Software Developer
Your calculation rests on a slightly wrong assumption. When you multiply 256*256*256, what you are actually computing is the number of possible colors a single pixel could show, not the amount of data a display stores. If every pixel had to store all of those combinations, each picture you have would be terabytes in size!
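To put a number on it (a quick sketch in Python, just to show the arithmetic):

```python
# 256 levels each of red, green, and blue gives the number of *possible*
# colors for one pixel, not an amount of stored data.
print(f"{256 ** 3:,}")   # 16,777,216 possible colors per pixel
```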
Imagine each of the ~2 million pixels of your monitor (or a picture) being allocated a memory space to store its color data. We could format the color data in much more efficient ways, but imagine we just store a number formatted like RRRGGGBBB, where each channel runs from 000 to 255; a pure green pixel would be 000255000. Stored as nine decimal digits, that's a mere 9 BYTES of data.
So 9 bytes for each of the 2 million pixels = 18 megabytes. Not that bad.
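Here's the same arithmetic worked out as a rough sketch, assuming a 1920x1080 display (which is where the "~2 million pixels" comes from) and the 9-digit text encoding described above:

```python
# Rough storage estimate, assuming a 1920x1080 display (~2 million pixels)
# and the 9-digit "RRRGGGBBB" text encoding described above.
width, height = 1920, 1080
pixels = width * height                  # 2,073,600 pixels
bytes_per_pixel_text = 9                 # nine decimal digits per pixel
total_bytes = pixels * bytes_per_pixel_text
print(f"{total_bytes:,} bytes = about {total_bytes / 1_000_000:.1f} MB")
# 18,662,400 bytes = about 18.7 MB
```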
Also, image formats have other little tricks to reduce size. For example, if there is a red square in the image that is, say, 30 pixels by 30 pixels, each of those pixels won't redefine its color; the format will just say "I'm the same color as the last pixel," which saves space. (This idea is essentially run-length encoding; see the sketch below.)
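Here's a minimal run-length-encoding sketch of that idea; the pixel values and the function name are just for illustration, not how any particular image format actually does it:

```python
# Minimal run-length encoding sketch: collapse runs of identical pixels
# into (color, count) pairs instead of repeating the color every time.
def run_length_encode(pixels):
    encoded = []
    for color in pixels:
        if encoded and encoded[-1][0] == color:
            # Same color as the previous pixel: just bump the count.
            encoded[-1][1] += 1
        else:
            encoded.append([color, 1])
    return encoded

# A row with a long run of "red" pixels compresses to a couple of pairs:
row = ["red"] * 30 + ["blue"] * 2
print(run_length_encode(row))   # [['red', 30], ['blue', 2]]
```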
EDIT: Also, if we store the color data as raw binary bytes instead of decimal digits, each channel (0–255) fits in a single byte, so we can define every color using only 3 bytes instead of 9.
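For example (a quick sketch, using Python's bytes type just to show the size difference), the same green pixel shrinks from 9 bytes to 3:

```python
# One pixel stored as 9 decimal digits vs. 3 raw bytes (one per channel).
r, g, b = 0, 255, 0                          # a pure green pixel
as_text = f"{r:03d}{g:03d}{b:03d}"           # "000255000" -> 9 bytes
as_binary = bytes([r, g, b])                 # b'\x00\xff\x00' -> 3 bytes
print(len(as_text.encode()), len(as_binary)) # 9 3
```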