Could someone explain scanner resolutions please (1200x4800 vs 600x3600, etc)

DonBlack

Senior member
I'm trying to decide between two scanners and I don't understand what I'm looking at. My two options are:

1) up to 1200x4800 dpi (optical)/up to 9600 dpi (enhanced); 48-bit color; 8-bit grayscale

or

2) up to 600x3600 dpi (optical)/up to 9600 dpi (enhanced); 36-bit A/D color; 24-bit external color; 8-bit grayscale

But what do the optical numbers mean? Is the better one overkill? I would be scanning mainly photos for archival purposes (TIFF or PNG formats). Thanks guys.
 

ripthesystem

Senior member
AFAIK (and correct me if I'm wrong), the optical numbers are the resolutions the scanner hardware is physically capable of capturing. The enhanced numbers are just that: enhanced, i.e. interpolated in software. The color bits are the color depth, so on all of these counts, theoretically, the higher the numbers across the board, the better.
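If it helps to see what "enhanced" actually buys you, here is a minimal sketch (my own illustration, not any vendor's actual driver code; the file name and dpi values are made up): scanning at an interpolated dpi is roughly the same as scanning at the optical dpi and upsampling the result in software.

```python
# Rough illustration only: "enhanced" resolution is essentially interpolation.
# The file name and dpi figures below are hypothetical.
from PIL import Image

OPTICAL_DPI = 600     # what the sensor physically captures
ENHANCED_DPI = 9600   # what the driver advertises

scan = Image.open("photo_scanned_at_600dpi.tif")
scale = ENHANCED_DPI // OPTICAL_DPI

# Bicubic upsampling invents the extra pixels; no new detail is recorded.
enhanced = scan.resize((scan.width * scale, scan.height * scale), Image.BICUBIC)
enhanced.save("photo_enhanced_9600dpi.tif")
```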

If you are truly scanning for archival purposes and file size/storage isn't an issue, then go for the better optical scanner. You won't be able to see a difference between the two on your screen, but at some future date in a pro print shop you will.

hth
ripthesystem
 

DonBlack

Senior member
Thanks. The reason I don't want to go with the better one is because we're talking all-in-one devices here and the better one costs $475 more than the cheaper one (the main expense being HP's damn new LIO interface that doesn't work with my JetDirect print server). So, you believe that I wouldn't be able to tell the scanners apart on my screen or when printing now ... but I might in the future? By archiving btw, I meant silly family stuff. Nothing business or important. =)
 

ripthesystem

Senior member
Originally posted by: DonBlack
Thanks. The reason I don't want to go with the better one is because we're talking all-in-one devices here and the better one costs $475 more than the cheaper one (the main expense being HP's damn new LIO interface that doesn't work with my JetDirect print server). So, you believe that I wouldn't be able to tell the scanners apart on my screen or when printing now ... but I might in the future? By archiving btw, I meant silly family stuff. Nothing business or important. =)


Ohh... sorry, I work at a library, and by archiving we mean ARCHIVING, as in last-till-the-end-of-time type stuff ;) If you are talking all-in-one machines for family pics, then the cheaper one should be fine, and home printing from it should be fine as well. You will not be able to tell the difference on screen: most monitors are 72 dpi, and even l33t design monitors only push the 92-100 dpi mark, so anything over that you won't see on screen. The only way you should see a significant difference between those high resolutions is if you have something printed at a press with super-high-resolution printers.

hth
ripthesystem

 

DonBlack

Senior member
Gotcha. That jibes with what I've heard as well. Now, when they say 600x3600, are they saying horizontal x vertical? That doesn't really seem to make sense... And so you're saying that, in this example, a monitor can at best display 100 dpi, which is much less than both the "600" and the "3600" in 600x3600, right? Thanks again!
 

kgraeme

Diamond Member
When they give two measurements, they are basically hoping you look at the bigger number and ignore the smaller one. Unfortunately, the lower number is technically the more accurate representation: the smaller figure is the true optical resolution of the sensor across the scan head (the horizontal direction you asked about), while the larger figure is just how finely the stepper motor can move the head down the page. Think of the smaller number as the actual resolution and the larger number as a "support" resolution that helps improve the quality of the smaller one. Ignore the "enhanced" numbers.

The color numbers work much the same way. In an RGB TIFF file you only have 24 bits of color you can save. The scanners that offer 36- or 48-bit color actually scan at that depth, then downsample to 24-bit. On the one hand you could say it's just throwing away the excess color, but really the driver uses the larger color space and fairly complex downsampling algorithms to make the 24-bit result better than if you had just scanned at 24-bit in the first place.
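As a toy illustration of that last step (this is not any scanner driver's real algorithm, just the simplest possible version): a 48-bit scan is 16 bits per RGB channel, and writing a standard 24-bit TIFF means reducing each channel to 8 bits.

```python
# Toy example of 48-bit -> 24-bit reduction; real drivers apply tone curves
# and smarter mappings before this step. The image data here is just random.
import numpy as np

rng = np.random.default_rng(0)
scan_48bit = rng.integers(0, 2**16, size=(600, 400, 3), dtype=np.uint16)

# Naive downsample: keep only the top 8 bits of each 16-bit channel.
tiff_24bit = (scan_48bit >> 8).astype(np.uint8)

print(scan_48bit.dtype, "->", tiff_24bit.dtype)  # uint16 -> uint8
```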

Deciding which resolution of scanner to get depends in part on the final output you want. Do you want 5x7 photos? 8x10? Bigger? In books and magazines, photos are usually printed at about 300 dpi. Line-art drawings, though, require much higher resolutions, with 1200 dpi not uncommon. That's for book/magazine publishing. With a home inkjet printer, even a photo-quality inkjet, you can get away with as low as 150 dpi for photos. And as ripthesystem mentioned, pictures for on-screen display are around 72 dpi.

Either of these scanners will do more than 300 dpi, so the question becomes: what size is the source material you are scanning? Are you scanning something small that you want to blow up? If so, by how much? Are you trying to take a 3x5 photo and make an 11x17 print to hang on the wall? A 3x5 image scanned at 1200 dpi will let you get a 12x20 print at 300 dpi. That same 3x5 image scanned at 600 dpi will give you a 6x10 print at 300 dpi. Remember, though, that even with the lower-end scanner you have some wiggle room, since you aren't printing for Architectural Digest. I have often printed at 150 dpi, which would give you up to the 12x20 size from a 600 dpi scan. With a good inkjet photo printer it takes a really good eye to tell that it's been printed at the lower resolution.
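The arithmetic is simple enough to put in a few lines of Python if you want to plug in your own sizes (the numbers below are just the examples from this post):

```python
# Largest print (in inches) you can make at a given print dpi from a scan.
def max_print_size(source_inches, scan_dpi, print_dpi):
    w, h = source_inches
    return (w * scan_dpi / print_dpi, h * scan_dpi / print_dpi)

print(max_print_size((3, 5), 1200, 300))  # (12.0, 20.0) -> 12x20 at 300 dpi
print(max_print_size((3, 5), 600, 300))   # (6.0, 10.0)  -> 6x10 at 300 dpi
print(max_print_size((3, 5), 600, 150))   # (12.0, 20.0) -> 12x20 at 150 dpi
```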

So given all that, and your need for this for home, get the cheaper one!
 

glugglug

Diamond Member
Actually, a 0.28 mm dot pitch monitor (low-end nowadays) is rated to display discrete pixels at 90.71 DPI. You can typically sync at resolutions that would work out to over 150 DPI, but the pixels are then blurred together a bit. "L33+" quality monitors can display discrete pixels accurately at 120+ DPI and sync to modes over 200 DPI.
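For reference, that 90.71 figure is just 25.4 mm per inch divided by the dot pitch; a quick sketch (the 0.21 mm pitch is my own example of a higher-end tube, not a quoted spec):

```python
# Dots per inch implied by a CRT's dot pitch (25.4 mm per inch).
MM_PER_INCH = 25.4

def dpi_from_dot_pitch(pitch_mm):
    return MM_PER_INCH / pitch_mm

print(round(dpi_from_dot_pitch(0.28), 2))  # 90.71 DPI for a low-end 0.28 mm tube
print(round(dpi_from_dot_pitch(0.21), 2))  # ~120.95 DPI for a finer 0.21 mm tube
```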

72 DPI is what all the Macintosh non-multisync monitors run at, and it is the resolution WYSIWYG programs like Word and Acrobat assume so that the image on screen is EXACTLY the size it will be on paper at the "100%" zoom setting.

Either way, your point that even a lousy scanner is better than the best monitor on the market is correct. However, you will most likely be able to see the difference when you output the scanned images to a printer.

 

glugglug

Diamond Member
On the issue of whether you can see the difference when you print it:

For images with low contrast (like most photos), most people will not be able to tell the difference between 300 dpi and 600 dpi, and you could go even lower before most people would notice without looking closely.

For high-contrast images (line drawings, text), it is easy to see the difference between 300 dpi and 600 dpi, and a lot of people can tell the difference between 600 and 1200.