Subject: Re: dpi and pixels
From: Dick Margulis <ampersandvirgule -at- WORLDNET -dot- ATT -dot- NET>
Date: Mon, 21 Jun 1999 16:20:43 -0400
Sylvia,
You're not the first person to stumble over some
confusing terminology.
Your monitor has an attribute called dot pitch, measured in
millimeters. A decent monitor may have a dot pitch of .28
mm (smaller is better), and the carton the monitor was
shipped in may have had ".28 dpi" imprinted on it
(instead of the more accurate ".28 mm dot pitch").
That has NOTHING to do with the question you asked, but
keep in mind the "dpi" abbreviation.
Halftone screens, used to prepare photos, tint screens,
etc., for printing, are measured in lines per inch (lpi).
A really high-quality press can lay down 175 lpi
four-color pictures. The daily newspaper may use 85 lpi
or 110 lpi, resulting in clearly visible halftone dots.
This has VERY LITTLE to do with the question you asked,
but keep it in the back of your mind.
A laser printer's resolution is measured in dots per inch
(dpi). That tells us how close together the black dots
are that are used to paint the individual characters of
type as well as all other graphics.
Now there is a relationship between lpi (halftone lines
per inch) and dpi (laser printer dots per inch) when you
are producing a halftone on a laser printer. To simulate
the optical process of halftone photography in software,
we need to be able to create pseudo-halftone dots on the
laser printer that vary in size. Otherwise we could not
show varying shades of gray on a regular grid.
If we have a 300 dpi laser printer, the rule of thumb is
that the maximum lpi is one-quarter of that, or 75 lpi.
That way, we can paint from 0 to 16 laser printer dots in
a 4 dot by 4 dot square, yielding 17 levels from white to
black. On a true 600 X 600 dpi laser printer, we can
simulate a 150-line (150 lpi) halftone in principle.
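If it helps to see that rule of thumb as arithmetic, here
is a minimal Python sketch. The 4 X 4 cell is the same
assumption just described; nothing here is specific to
any particular printer.

    # Rule of thumb: a square cell of printer dots simulates one
    # variable-size halftone dot, so max lpi = dpi / cell size,
    # and gray levels = dots in the cell + 1 (all white .. all black).
    def halftone_stats(printer_dpi, cell_size=4):
        max_lpi = printer_dpi / cell_size
        gray_levels = cell_size * cell_size + 1
        return max_lpi, gray_levels

    print(halftone_stats(300))  # (75.0, 17)
    print(halftone_stats(600))  # (150.0, 17)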
Now then, what does any of this have to do with scanning?
Not much, except that "dots per inch" shows up in that
context, too.
What happens if you scan at a resolution of 1000 dots per
inch by 1000 dots per inch? You now have one million dots
per square inch. On a color scanner, each dot may
translate into up to three bytes of information (256
levels for each of the three primary colors of light:
red, green, and blue). So you need 3 MB for each square
inch.
At 300 X 300 dpi, you need 270,000 bytes per square inch.
What happens if you scan a 4" X 5" photo at that
resolution? You need over 5 MB to store that photo.
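If you want to check that arithmetic yourself, here is a
minimal Python sketch. It assumes an uncompressed scan at
three bytes per dot, as above; real TIFF files add
headers and may be compressed.

    # Uncompressed RGB scan size: dots per square inch times
    # 3 bytes per dot, times the scanned area.
    def scan_size_bytes(width_in, height_in, dpi, bytes_per_dot=3):
        dots = (width_in * dpi) * (height_in * dpi)
        return dots * bytes_per_dot

    print(scan_size_bytes(1, 1, 1000))  # 3,000,000 bytes per sq. in.
    print(scan_size_bytes(1, 1, 300))   # 270,000 bytes per sq. in.
    print(scan_size_bytes(4, 5, 300))   # 5,400,000 bytes (over 5 MB)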
What happens if you use that photo on your monitor, on a
laser printer, on a printing press, or on the Internet?
Let's see (I'll gather the arithmetic into one short
sketch after this rundown):
On your monitor, which might be set at a resolution of 96
pixels per inch (first time I've used the word _pixels_
in this post, you will note), the 4" X 5" photo will take
up about 12" X 15" of screen, based on one pixel per
"dot" of scan. Of course you can display it at a reduced
size if you like. This still has NOTHING to do with
whether your monitor's dot pitch is .28 mm, .24 mm, or
.34 mm.
On a 600 X 600 dpi laser printer, with each scanner dot
converted to a halftone dot, at 100% size, the picture
will come out 8" X 10" (someone correct me if I did the
arithmetic wrong on this one).
On a 300 X 300 dpi laser printer, under the same
conditions, you'd get a 16" X 20" print. (Remember, on
the scanner, this was a 4" X 5" photo scanned at 300
dpi!)
On a webpage, you would have to convert the picture to a
.jpg file and reduce it to some manageable size before
you could even post the darn thing.
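Here is that size arithmetic gathered into one Python
sketch. The 96 ppi monitor, the 150 lpi and 75 lpi
halftones, and the 300 dpi scan are the same figures used
above; this is just an illustration, not a prepress tool.

    # pixels = inches scanned x scan dpi; output size = pixels
    # divided by the output device's per-inch resolution, when
    # each scan dot maps to one pixel or one halftone dot.
    SCAN_DPI = 300
    PHOTO_IN = (4, 5)  # 4" X 5" original

    pixels = tuple(side * SCAN_DPI for side in PHOTO_IN)  # (1200, 1500)

    def output_inches(pixels, per_inch):
        return tuple(p / per_inch for p in pixels)

    print(output_inches(pixels, 96))   # monitor, 96 ppi: (12.5, 15.625)
    print(output_inches(pixels, 150))  # 600 dpi printer, 150 lpi: (8.0, 10.0)
    print(output_inches(pixels, 75))   # 300 dpi printer, 75 lpi: (16.0, 20.0)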
I know you're still confused, but reread this. Make a
chart if you have to. And realize that "per inch" shows
up in a lot of different units and a lot of similar
acronyms whenever you start talking about graphics and
electronic media in the same context.
Hope this helps,
Dick
Sylvia Braunstein wrote:
> I am confused. I was requested to scan a high resolution picture of
> 1000x1000 TIF format. I assumed it was pixels.
> My boss said that 300 dpi (dot per inch) were enough otherwise the file
> would be way too big.
> Now, I am confused between the two of them.
> Can anybody help me understand the difference and how you measure the
> resolution?