So What Is All This DPI Stuff Anyway?
Understanding Image Resolutions

By: Gary Gray

 

I meet a lot of photographers who don't fully understand the difference between DPI and PPI. To make matters worse, I come across web designers and printing professionals who don't seem to understand it either.

 

Ever have somebody tell you that an image needs to be 300 DPI? I have, and by itself it means absolutely nothing. DPI and PPI are two different things, and each is meaningful only relative to the output medium you intend to use for displaying an image.

 

Why is any of this important?

 

Let me put it in perspective. You spent a lot of money on your camera, your software, books, printers, etc. Perhaps you like to enter your photographs in contests. Perhaps you like to make nice prints for family and friends. Perhaps you want to sell your photographs and earn money. It's a competitive world out there, and photography is no different. For every 10 photographers who don't know and understand the fundamentals here, there are 2 or 3 who do, and you are competing with them for that 1st place award or that print sale. Why invest in the art if you aren't going to learn it? If you aren't interested in the details, you'd be better off buying a point & shoot camera, relying on its buttons and automatic features, and simply living with the results. That is not a recipe for getting better or becoming a better photographer.

 

DPI is Dots Per Inch, and it is often used as a blanket term for defining the resolution of an image. But this usage varies from person to person, and it is seldom used accurately.

 

PPI is Pixels Per Inch, and it is often unwittingly used interchangeably with DPI by the uninformed. Pixels are unique elements of visual information and can be created in a number of ways. Our digital cameras generate pixels, not dots. Digital images are made up of pixels, not dots. When displayed, a pixel may be rendered by one or more dots, depending on the output device.

 

I'm going to attempt to explain all of this, and hopefully I'll be able to keep it as simple as possible. The big issue here is really simple mathematics, so unfortunately I'm going to have to explain it using math. Mostly multiplication and division, so that should be simple enough, don't you think?

 

If I haven't frightened you off by now, keep reading.

 

For the purpose of this discussion, I'm going to leave our camera's sensor out of the equation, as that is a whole other explanation and isn't relevant to what I'm about to explain.

 

PPI and DPI are relative: relative to the size of the image you want to produce, how you are reproducing that image, and what resolution you need to produce a given size of image at reasonable quality.

 

DPI and PPI are not interchangeable terms, even if they appear to be. There is, however, a relationship between them, depending on what you are trying to accomplish.

 

Let's begin with our image file.

 

We take a photograph using our digital camera. The camera's sensor converts the image information into a digital photograph, and that photograph is stored on a memory card inside the camera as an image file. We download the image file to our computer and then open it with our photo editing software.

 

I use Adobe Lightroom as a photo editor, so this discussion will be written from that perspective. The rules and concepts don't change if you're using another photo editor, so not to worry.

 

When we are viewing our image on a computer monitor, the image we are viewing is made up of pixels. These pixels are arranged in rows and columns, and when arranged properly, give us an accurate representation of what the sensor in our camera recorded. As we know, digital cameras come in all different configurations and models, with different resolutions. These resolutions are normally expressed as megapixels. I'm going to base this conversation on a fictional camera that uses a 12 megapixel sensor. Your camera may be different, but the rules are the same.

 

Our 12 megapixel camera has produced an image file that is 4245 pixels in the horizontal dimension (left to right in each row) and 2830 pixels in the vertical dimension (top to bottom in each column). If we multiply 4245 x 2830, we get 12,013,350 pixels, or roughly 12 megapixels (mega meaning million).
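
If you want to check that arithmetic yourself, here is the calculation in Python, using the fictional sensor's dimensions from above:

```python
# Megapixel math for the article's fictional 12 MP sensor.
width_px = 4245   # horizontal pixels in each row
height_px = 2830  # vertical pixels in each column

total_px = width_px * height_px
print(total_px)                        # 12013350
print(round(total_px / 1_000_000, 1))  # 12.0 (megapixels)
```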

 

We've established that our base camera resolution is 4245 x 2830 pixels. This is the native resolution of our sensor, and everything concerning our photograph is represented by these original pixels, all 12 million of them. If we add or subtract pixels by resizing the file in our photo editor, we alter the resolution of the image.

When we enlarge the image (up-rezing), the software examines the image and creates pixels that didn't exist. This isn't adding resolution; it's creating fake information. When we reduce the size of the image (down-rezing), we remove pixels and rebuild the image from fewer of them, so image information is deleted. Both up-rezing and down-rezing decrease the true resolution of your image. You cannot increase the resolution of an image by adding pixels; you simply dilute the original information with a computer's guess at what the extra pixels would have contained. Down-rezing simply throws information away.

Unfortunately, we have to re-rez our image files at some point if we intend to display them in some medium, be it a print, a computer monitor, an email or a web page. The output medium dictates the actual resolution (in pixels) required to display the image with acceptable results. The final resolution of the image for that particular output (display) medium can be measured in pixels per inch (PPI).
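
To make the up-rez/down-rez point concrete, here is a minimal sketch using the Pillow library (my choice of tool for illustration, not something this article prescribes; "photo.jpg" is a hypothetical full resolution file). Round-tripping through a small size restores the pixel count but not the lost detail:

```python
# A sketch of down-rezing and up-rezing with Pillow (pip install Pillow).
# "photo.jpg" is a hypothetical 4245 x 2830 pixel original.
from PIL import Image

original = Image.open("photo.jpg")

# Down-rez: pixels are discarded and the image is rebuilt from fewer of them.
small = original.resize((600, 400), Image.LANCZOS)

# Up-rez: the software interpolates pixels that were never captured.
# The result has the original pixel count, but no more real detail than `small`.
big_again = small.resize(original.size, Image.LANCZOS)
big_again.save("photo_roundtrip.jpg")
```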

 

Monitor.

 

The most common method of viewing our digital photographs is to display them on our computer monitors. The image may be displayed in our editing software, in an email or over the internet on a web page. Not all computer monitors are created equal, so an image that appears at a given size and color on one monitor may appear at a different size and color on another person's monitor. Monitors come in all different sizes and types. Today, the most common type is the LCD (Liquid Crystal Display) monitor; ten years ago, CRT (Cathode Ray Tube) monitors were the most common. There are fundamental differences between the two, but for our discussion I'm going to stick to LCD monitors, as that is most likely what you have.

 

The Liquid Crystal Display monitor is basically designed to give you rows and columns of tiny visual elements called pixels. The spacing of these pixels is called pixel density, and it is what PPI refers to. Each pixel in the display is a distinct entity and can be used to represent a single pixel (or more, or less) of a digital photographic image. A basic LCD monitor may have a resolution of 72 PPI or, as in the case of my Mac Cinema Display, 100 PPI. That means for every inch across the face of my monitor there are 100 pixels, and for every inch from top to bottom there are 100 pixels. The more pixels per inch (PPI) your monitor has, the greater its pixel density.
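
The relationship is simple division: physical size on screen equals pixels divided by pixel density. For example, in Python, using the two densities just mentioned (the 800 pixel width is just an illustration; it comes up again later):

```python
# Physical width on screen = image pixels / monitor pixel density (PPI).
image_width_px = 800

for monitor_ppi in (72, 100):
    inches = image_width_px / monitor_ppi
    print(f"{monitor_ppi} PPI monitor: {inches:.1f} inches wide")
# 72 PPI monitor: 11.1 inches wide
# 100 PPI monitor: 8.0 inches wide
```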

 

When we talk about resolution relative to an LCD computer monitor, we are talking PPI. To understand the relationship between the image file and the size of the image on our monitor, we must know something about how the monitor is configured to operate as well as how it is designed. Take a look at your operating system's display settings and you'll discover that you can set the resolution of your display. You probably have a number of choices, such as 800 x 600 or, in the case of my 30 inch Apple monitor, 2560 x 1600. The mathematical relationship between horizontal and vertical resolution is known as the "aspect ratio." Monitors come in a wide range of aspect ratios, most of them different from the typical digital image file's aspect ratio, and desktop monitors typically differ from laptop monitors. Still, we can generally choose the resolution our monitor displays to suit what we subjectively find easiest on our eyes. That setting does not change the actual number of pixels built into the screen; it only changes how each image pixel is rendered by the monitor relative to the size we see. Our computer's video card and monitor re-rez the image on the fly and give us the visual information the way we want to see it.
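
If you know your monitor's native resolution and its diagonal size, you can work out both its pixel density and its aspect ratio. A short Python sketch, assuming the 30 inch, 2560 x 1600 display mentioned above:

```python
# Pixel density and aspect ratio of a 30 inch, 2560 x 1600 display.
import math
from math import gcd

width_px, height_px, diagonal_in = 2560, 1600, 30.0

ppi = math.hypot(width_px, height_px) / diagonal_in  # diagonal pixels / inches
d = gcd(width_px, height_px)
print(f"{ppi:.1f} PPI, aspect ratio {width_px // d}:{height_px // d}")
# 100.6 PPI, aspect ratio 8:5 (16:10) -- a 3:2 image file won't fill the screen
```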

 

Has anybody ever sent you a photograph through email, and when you open it, it takes forever to load and comes up four times larger than your computer screen? A full resolution file may be 8-20 megabytes in size, and nobody needs or wants a picture in their mailbox that takes up that much space just so they can have a look at it. One normally doesn't need a full resolution image file for viewing on a computer screen or television; it is normally only when editing a file that we need the full resolution version. If we want to take an edited file and send it to somebody via email, we need to lower the image resolution to something more manageable.

There are a few ways to do that. One is to use the export function built into your editing software, which makes all the calculations for you so you don't have to think. Another is to calculate yourself what size your file needs to be and export the file to exactly that size. To calculate how large an image file needs to be for computer viewing, you need a basic understanding of the dimensions required for proper viewing. Assuming a typical LCD computer monitor has a pixel density between 72 and 100 PPI, a quick calculation tells you that a 600 x 400 pixel image will display about 6 inches wide on a 100 PPI monitor (600 pixels / 100 pixels per inch = 6 inches), close to a 4x6 print. We begin with our 12 megapixel image at 4245 x 2830 pixels and end up with an image of much smaller pixel dimensions. The resulting file is much, much smaller, and the photograph still holds reasonable quality at the new size.

Personally, I normally convert my images to about 800 pixels on the long side. I know from the stats of my web pages that most visitors have monitors displaying resolutions above 1024 x 768 pixels; the most common size is 1280 x 800 pixels. By sizing my images to 800 pixels on the long dimension, they display nicely on just about any monitor without taking up a bunch of hard drive space and without overwhelming the viewer with something too large or too small to look at. An 800 pixel wide image will look sharp on a monitor up to about 11 inches wide (800 pixels / 72 PPI is about 11 inches). Anything larger is overkill; you're wasting bandwidth, time and storage space on both ends, yours and the viewer's.
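
Here is how that export might look in code. Again a minimal sketch using the Pillow library (my choice, not the article's; the filenames are hypothetical):

```python
# Export a down-rezed copy for email or the web: 800 pixels on the long side.
from PIL import Image

LONG_EDGE = 800

original = Image.open("photo.jpg")      # e.g. 4245 x 2830 pixels
copy = original.copy()                  # work on a copy, never the original
copy.thumbnail((LONG_EDGE, LONG_EDGE), Image.LANCZOS)  # keeps the aspect ratio
copy.save("photo_web.jpg", quality=85)
print(copy.size)                        # (800, 533)
```

Note that thumbnail() only shrinks; it will never up-rez a file that is already smaller than 800 pixels.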

 

When you convert your image to something smaller, remember to make a smaller copy and use that copy. Don't resize the original file; once you change the original, you've done irreversible editing to it. Keep your original image files in a safe place and always make a copy for sending or posting on the internet. Adobe Lightroom and other programs are great at non-destructive editing of your photographs: you simply export a copy of the file at the dimensions you need.

 

If you are exporting images for display on a television, even a high definition television, the sizing is simple to figure out: the pixel dimensions of the image don't need to exceed the resolution of the television. A typical HD television displays 1080p (1920 x 1080), which means an image doesn't need to be larger than 1080 pixels in the short dimension to produce a full resolution result. If you think you're playing it safe by making slide show images with more pixels than that, you're fooling yourself. All that happens is that the television scales the picture down, the files take up more space, and the photographs take longer to load.
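
The fit-to-screen calculation is the same division as before, applied to both dimensions. A small Python sketch using the article's 12 megapixel file and a 1920 x 1080 screen:

```python
# Largest useful export for a 1080p television (1920 x 1080 pixels).
screen_w, screen_h = 1920, 1080
img_w, img_h = 4245, 2830

scale = min(screen_w / img_w, screen_h / img_h)     # fit inside the screen
print(int(img_w * scale), "x", int(img_h * scale))  # 1620 x 1080
# Anything with more pixels than this just gets scaled down by the TV.
```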

 

Understanding the resolution of your display devices will allow you to make optimally sized image files.

 

Prints.

 

Using our 12 megapixel camera's file resolution, we can determine what a proper and visually acceptable PPI would be by simply dividing the image dimensions (horizontal or vertical) by the number of inches. For example, to make a suitable 4x6 print of our image, we can divide the horizontal dimension of 4245 pixels by 6 (inches). This gives us a native resolution of 707.5 pixels per inch of print. That is overkill; you don't need to, and probably can't, print at 707 pixels per inch. You can size a file down quite a bit and still get a very good quality 4x6 print; for anything larger, you'll need larger image dimensions. So a minimum pixel dimension for a good 4x6 print (assuming a minimum of 150 pixels per inch) turns out to be about 900 x 600 pixels. If you want a full 300 PPI, just multiply 6 x 300 to get 1800 pixels on the long edge; an 1800 x 1200 pixel file is technically all you'll ever need for the best possible quality 4x6 print.
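
Here is that 4x6 arithmetic in Python, for anyone who wants to adapt it to other sizes:

```python
# Print math for a 4 x 6 inch print from a 4245 x 2830 pixel file.
width_px = 4245

native_ppi = width_px / 6        # pixels available per printed inch
print(round(native_ppi, 1))      # 707.5 -- far more than a printer can use

# Minimum and "best possible" pixel dimensions for a 4x6 print:
print(6 * 150, "x", 4 * 150)     # 900 x 600   (150 PPI floor)
print(6 * 300, "x", 4 * 300)     # 1800 x 1200 (full 300 PPI)
```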

 

If we wish to make a larger print, say 8x10, we can figure out the native resolution using the same approach. Divide 4245/10 (inches) and we come up with a 10 inch wide image that contains 424.5 pixels per inch. Let's make a larger print, say an 11x14 inch print: divide 4245/14 (inches) and we come up with 303.2 pixels per inch. Now, in case you haven't noticed, I've only been working in one dimension, using the longer horizontal dimension for my calculations. Take the same formula for the 11x14 print and apply it to the shorter dimension: 2830 pixels / 11 inches gives a resolution of 257.3 pixels per inch. As you can see, the horizontal and vertical resolutions come out slightly different. The trick is to establish a minimum acceptable resolution for the medium in which we wish to display our image. For the sake of argument, we are making prints, and to maintain a quality reproduction in print form we need a minimum resolution of 150 pixels per inch. When figuring out how large a print you can make without a significant loss of quality, this is the calculation to use. So, how large a print can a 12 megapixel camera make and still look subjectively good? Simple: divide the pixel dimensions by 150 (pixels per inch) for a very good estimate. Since the vertical component of our image file has slightly less resolution than the horizontal component, the maximum good print size is set by the short side: 2830 pixels / 150 pixels per inch = 18.87 inches. The largest print we could rationally expect to look good from a 12 megapixel image file is about 18.87 x 28.3 inches. Since that isn't a common frame size, in the real world you can expect to print a 12 megapixel image up to 16x20 inches and have a potentially good print. Other factors need to be considered, but the camera's native resolution will handle it. For more on print size, see my article "The 300 PPI Print Myth."
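
The maximum-print-size rule from the previous paragraph reduces to a one-line function. A sketch, with the 150 PPI floor as a parameter:

```python
# Largest "good" print from a file, given a minimum acceptable PPI.
def max_print_inches(short_px, long_px, min_ppi=150):
    return short_px / min_ppi, long_px / min_ppi

short_in, long_in = max_print_inches(2830, 4245)
print(f"{short_in:.2f} x {long_in:.2f} inches")  # 18.87 x 28.30 inches
```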

 

Quality is subjective, though. What one person believes to be good, another may not. For prints, I've learned through experience that I can see a difference between prints at 150 pixels per inch and prints at 200 pixels per inch. What I've also learned is that above 200 PPI, it is almost impossible to discern more photographic detail in a print by any means short of a strong magnifying glass. At 150 PPI, the extra detail is discernible under magnification but not by eye at a normal viewing distance (a couple of feet away, for the sake of discussion). I'm going to leave the print resolution discussion here; if you're interested in learning more about print sizes vs. resolution, I have written another article that explains my reasoning in more detail. Let's be realistic, though: most photographers aren't making a lot of very large prints. Most of our prints are going to be 11x14 or smaller, simply because that's about as big a print as most affordable inkjet printers can make. Most common are 4x6, 5x7 and 8x10.

 

Using this simple math, you can determine how large a print you can make and what the PPI of your image file needs to be in order to make a good quality print.

 

Now I'm going to throw in a wrench: not all printers are created equal. Here's where we get off Pixels Per Inch (PPI) and start talking Dots Per Inch (DPI).

 

Dots Per Inch (DPI) is primarily a term used for printing. For the sake of our discussion, I'm going to talk about InkJet printing. There are other methods of printing, but I'm trying to keep it simple and in a perspective most of us deal with in our personal photography.

 

Dots Per Inch (DPI) is mainly a measure of how many dots of ink an InkJet printer can squirt onto a piece of printer paper. A pixel is not a dot as far as a printer is concerned; a pixel is a unit of information that has to be represented by creating dots of ink. Since each pixel can be one of millions of colors with different saturations and hues, a printer has to take each pixel and internally calculate how to accurately render it on paper. In simplest terms, an InkJet printer uses 4 colors and mixes them to reproduce a full spectrum of potential color: Cyan, Magenta, Yellow and Black, or CMYK. Look at the ink cartridges in your InkJet printer and notice that there is no white ink. To keep it simple, white is the color of your paper, so no ink is used to print pure white.

 

Our InkJet printer must squirt tiny dots of ink onto the paper to reproduce a pixel of information. Since we have 4 colors of ink, it is reasonable to assume that there will probably be 4 different dots of ink used for every pixel. The printer's resolution is determined by the size of the dots it can squirt onto the paper; this is where we get into Dots Per Inch, or DPI. The actual size of each dot is not going to be the size of a single pixel; it is more likely to be smaller. The best InkJet printers can reproduce up to 1440 dots per inch (DPI). In fundamental terms, you divide the printer's resolution by 4 (one for each color) to estimate the PPI resolution the printer can reproduce. So, for example, an Epson Stylus 2200 InkJet printer with a resolution of 1440 dots per inch can make prints with resolutions up to 360 pixels per inch. Wow, that's great... and it is, but the truth is there's more to it than just dots per inch, as all printer dots are not created equal. We're going to leave it at this theoretical point to keep the discussion simple. Not all InkJet printers are equal. If you look at InkJet technology, you'll find that newer photo quality printers all have a resolution of at least 1200 DPI. Some older InkJet printers only offer resolutions up to 600 or 800 DPI; divide 800 DPI by 4, and you'll find such a printer can only reproduce an image at up to 200 PPI. There are still many InkJet printers out there that can't render an image above 200 PPI; mostly they are "all in ones" or really cheap InkJets. If you're going to invest in a printer, at least get one that can do the job you want it to do, and for photographs, that means a high quality photo InkJet.
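
The divide-by-four estimate from this paragraph, as a tiny Python helper (it reflects the article's simplification, not a full model of printer behavior):

```python
# Rough estimate: printer DPI / number of ink colors ~= maximum printable PPI.
def printer_max_ppi(dpi, ink_colors=4):
    return dpi / ink_colors

print(printer_max_ppi(1440))  # 360.0 -- e.g. an Epson Stylus 2200
print(printer_max_ppi(800))   # 200.0 -- an older or budget InkJet
```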

 

So, clear as mud?