megahertz vs megapixel

draggoon01

Senior member
May 9, 2001
858
0
0
so it occurred to me that we have entered into a megapixel race of sorts, and it reminded me of the megahertz race we are in right now (which I would argue is slowing down). The question is, are they the same?

The way they seem similar to me is that for both we are quickly reaching the point of diminishing returns. It's the point where people don't really need the newer/faster/better technology, but will simply buy it because it falls into their price range; and at this point the race slows because many will sit happy with what they have.

True, there's more to computer performance than MHz, more to picture quality than megapixels, and more to car performance than horsepower, but these do make generally good overall indicators.

Several others and I have posted about the questionable need for more MHz. Although most want more, it's generally agreed that any real need is limited to gaming, encoding, or professional apps. Since gaming is slowly going the way of the console rather than the PC, there is even less appeal to the general public for more MHz.

But is megapixel the same? I tend to think so, and would like to hear how people would disagree. The magic number of 300 dpi is considered film quality. For a standard 4x6 print, you need about 2 megapixels to reach that dpi. And that's the size most people print at. But suppose people want 8x10 prints. That would need about 7 megapixels. Right now 2 megapixels is the minimum you can buy, becoming 3 this year. I would say in 3-5 years 7 megapixels will become the minimum. And what then? Sure, with higher megapixels you will at the same time get better innards and lenses and stuff, as well as more breathing room to crop or zoom, but will the same thing that's happening to computers happen to megapixels? After CPUs reached 1 GHz as the minimum speed (desktop, not laptop), the need for speed dwindled and the dancing Intel lab suits lost favor (surely the Centrino ads won't be as effective).
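
To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own, assuming one image pixel per printed pixel at 300 PPI):

# megapixels needed to print a given size at a given pixels-per-inch density
def print_megapixels(width_in, height_in, ppi=300):
    return (width_in * ppi) * (height_in * ppi) / 1e6

print(print_megapixels(4, 6))    # 2.16 -> about 2 megapixels for a 4x6
print(print_megapixels(8, 10))   # 7.2  -> about 7 megapixels for an 8x10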

What do you think?
 

vetteguy

Diamond Member
Sep 12, 2001
3,183
0
0
I think some of your points are valid, but you're forgetting one important thing: marketing.

Consumers of PC hardware and digital cameras often fall into several groups: budget (cheapest/best for the lowest cost), power users (want the fastest/best no matter the cost and know what they're doing), and "power consumers" (want the fastest/best not because they know what it is, but because they want something better than everyone else). When someone goes in to buy a digital camera, they're either thinking of cost, or they're being swayed by sales pitches ("3 is good, but 5 is the BEST!"). Then there are those who come in knowing what's going on, but often those types buy their equipment online and away from the marketing hype.

I don't know if it's fair to compare digital cameras with processors. By the time we hit the 7-10 megapixel range being the "bottom", there will no doubt be some new type of digital photographic technology to take its place (what that may be, I'm not sure). Think about 8-track -> cassette -> CD -> DVD -> ?... once something became mature, there was a replacement ready. Since digital cameras are more of an appliance than PC hardware/technology, I think they fall under that progression track.
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
i for one want better than film quality in something "digital" :p as for mhz, can't get enough for games at least.
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: draggoon01

The magic number of 300dpi is considered film quality.

No, it pretty obviously isn't. 300 dpi is the quality output of low-end laser printers from almost 15 years ago. Magazines are printed at 2400 dpi, and I believe professional film development prints are also at 2400 dpi. That's 5.76 megapixels per square inch, or 138 megapixels for a 4x6 print. Simply put, today's digital cameras are not yet even close to the printed resolution of a traditional film camera.

Personally I have 3 megapixels on my CRT screen. That's better than most digital cameras today.

 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
it's generally agreed that any real need is limited to gaming, encoding, or professional apps
The fact is, the things folks tend to point to as proof that we are getting diminishing returns on performance are merely tasks that can already be performed with yesterday's technology, like word processing and traditional communication... sure, an eMachine can do it, but so can a telephone and a typewriter, and a TV and a VCR.

Face it, a computer's lack of power remains the biggest barrier to it reasonably performing new tasks we can't even imagine yet, not an abundance of it.
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
with megahertz, it's all in what you do.

with megapixels, it's all in the camera. it's composed of several things: the lens quality, the CCD/CMOS quality, the post processing, and how the post processing is carried out.

with most CCDs, a pixel is a dot of a single color on the sensor. so, roughly speaking, if a camera is 3 megapixels, it's about 1 megapixel for each color. however, that's not technically true, since there are usually more green pixels than red or blue; the actual 3 megapixel image is interpolated. Foveon has a 3-layer CMOS-based sensor with a layer for each color: different wavelengths of light reach different depths in the silicon, and the sensor uses that to stack the layers, so each pixel location actually captures all 3 colors, allowing for better reproduction. then there is megapixel count. sure, a Foveon 3 megapixel is better than a standard 3 megapixel, but non-Foveon, traditional CMOS sensors are reaching very high megapixel counts (double digits), and likewise, with better post processing, they are creating excellent images. and then there is Fuji's SuperCCD technology: by using a different geometric orientation of the photosites, Fuji's SuperCCDs are capable of better quality than traditional CCDs. of course you can't forget old-school 3-CCD tech, which uses a series of optics to place the image on 3 separate CCDs, one per color.
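
to make the interpolation point concrete, here's a tiny sketch (my own idealized numbers, not from any vendor) of the color samples a "3 megapixel" sensor actually captures:

total_pixels = 3_000_000

# Bayer-style mosaic: each photosite records one color,
# laid out as roughly 50% green, 25% red, 25% blue
bayer = {"green": total_pixels // 2, "red": total_pixels // 4, "blue": total_pixels // 4}

# Foveon-style stacked sensor: every location records all three colors
foveon = {"red": total_pixels, "green": total_pixels, "blue": total_pixels}

print(bayer)   # the missing samples in each channel get interpolated
print(foveon)  # full color at every pixel location, no interpolation needed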

with cameras, you get what you pay for: lens, post processing techniques and ccd quality.

and one more thing: you can fit more pixels into a CCD/CMOS chip if you increase the size of the actual sensor. so, we have medium format digital cameras in the works and on their way to market.
 

Sid59

Lifer
Sep 2, 2002
11,879
3
81
I'd like to also add that I don't think 300 dpi is the magic number for equaling film. Ever taken a negative and scanned it on a professional film scanner? I highly doubt the output is only 4 megs. I think my professor said it reaches into the 100s of MBs. So take it for what it's worth...

In actuality, a point-and-shoot disposable camera has better quality than some low- and mid-end digitals.
 

RyanM

Platinum Member
Feb 12, 2001
2,387
0
76
Originally posted by: glugglug
Originally posted by: draggoon01

The magic number of 300dpi is considered film quality.

No, it pretty obviously isn't. 300 dpi is the quality output of low-end laser printers from almost 15 years ago. Magazines are printed at 2400 dpi, and I believe professional film development prints are also at 2400 dpi. That's 5.76 megapixels per square inch, or 138 megapixels for a 4x6 print. Simply put, today's digital cameras are not yet even close to the printed resolution of a traditional film camera.

Personally I have 3 megapixels on my CRT screen. That's better than most digital cameras today.

Ummm, no.

You're confusing DPI with PPI. A lot of people do this, which pisses the hell outta me.

First off, to answer the first question, you can't have too many megapixels. At least, professionals can't. A 2 megapixel camera will produce a 300 PPI image at 4 x 6, a 3 megapixel camera will do a 300 PPI image at 5 x 7, and a 4 or 5 megapixel camera can't do a 300 PPI 8 x 10. If people actually want to print a GREAT 8 x 10, they'd need 7.2 megapixels to print at 300 PPI.

Personally, when I take a picture, I want to be able to crop the hell out of it, and still be able to print it anywhere from 4 x 6 to 13 x 19 - And currently, there's maybe one camera on the market that can do that. 15 megapixels. Unfortunately, it costs an ungodly amount, so currently, I stick to traditional film methods for most of my photo work.

Which I hate. I'm waiting for the day a 20 megapixel camera with individual sensors for RGB becomes the same price as today's 4 megapixel cameras. Unfortunately, cameras aren't improving in quality at nearly the same pace as CPUs. Most people are fine printing 4 x 6 and 5 x 7, and don't care that in the long run, they may want to print 8 x 10. Lack of demand means a lack of product development for something better.

Now, back to the original statement.

When Draggoon said that 300 dpi is the accepted photo-quality print output, what he meant is that 300 PPI is the accepted digital resolution for source images to be printed from.

You are correct that the actual print resolutions need to greatly exceed 300 dpi. But there isn't an inkjet printer on the market that still only does 300 dpi.

Also, you can't compare the requirements for halftone printing with inkjet printing. They are two totally different methods of printing, and are not equivalent. A 2400 dpi halftone is going to look nowhere near as good as a true 2400 x 2400 inkjet print. Of course, 2400 x 2400 inkjets don't exist yet; the best professional-end ones are 2880 x 1440 and use 6- or 7-color processes. This gives them image quality that STILL exceeds a 2400 dpi halftone, and by a huge margin.

Do not confuse pixels and dots. PPI and DPI are related, but it's not a 1:1 correlation. A 300 PPI source image is the accepted resolution for photo-quality printing, regardless of the DPI specs on the analog side.
 

RyanM

Platinum Member
Feb 12, 2001
2,387
0
76
Originally posted by: Sid59
I'd like to also add that I don't think 300 dpi is the magic number for equaling film. Ever taken a negative and scanned it on a professional film scanner? I highly doubt the output is only 4 megs. I think my professor said it reaches into the 100s of MBs. So take it for what it's worth...

In actuality, a point-and-shoot disposable camera has better quality than some low- and mid-end digitals.

Again, you're forgetting the PPI factor.

You have a negative, measuring 1 inch by 1.4 inches. You want to be able to print it out in 8 x 10.

An 8 x 10 image, at 300 PPI, needs to be 2400 x 3000 pixels total. In order for that negative to be scanned in and end up at that resolution, you'd need to scan at 2400 dpi.

However, if you have a 4 x 6 image, and you plan on scanning and reprinting at 4 x 6, then you only need to scan at 300 dpi.

You're right about file sizes though. A 300 PPI 8 x 10 source image is going to be around 21.6 megabytes. A 300 PPI 4 x 6 source will be around 6.48 megs.
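
For anyone who wants to check that math, here's the arithmetic spelled out (assuming an uncompressed 24-bit image, i.e. 3 bytes per pixel):

def uncompressed_mb(width_in, height_in, ppi=300, bytes_per_pixel=3):
    # 24-bit color = 3 bytes per pixel, no compression
    return width_in * ppi * height_in * ppi * bytes_per_pixel / 1e6

print(uncompressed_mb(8, 10))  # 21.6 MB for an 8 x 10 at 300 PPI
print(uncompressed_mb(4, 6))   # 6.48 MB for a 4 x 6 at 300 PPI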
 

Sid59

Lifer
Sep 2, 2002
11,879
3
81
Originally posted by: MachFive
Originally posted by: Sid59
I'd like to also add that I don't think 300 dpi is the magic number for equaling film. Ever taken a negative and scanned it on a professional film scanner? I highly doubt the output is only 4 megs. I think my professor said it reaches into the 100s of MBs. So take it for what it's worth...

In actuality, a point-and-shoot disposable camera has better quality than some low- and mid-end digitals.

Again, you're forgetting the PPI factor.

You have a negative, measuring 1 inch by 1.4 inches. You want to be able to print it out in 8 x 10.

An 8 x 10 image, at 300 PPI, needs to be 2400 x 3000 pixels total. In order for that negative to be scanned in and end up at that resolution, you'd need to scan at 2400 dpi.

However, if you have a 4 x 6 image, and you plan on scanning and reprinting at 4 x 6, then you only need to scan at 300 dpi.

You're right about file sizes though. A 300 PPI 8 x 10 source image is going to be around 21.6 megabytes. A 300 PPI 4 x 6 source will be around 6.48 megs.

No, I was using an example to point out how his terminology is wrong. Also, you are defending him, and you may not know whether he even knows his terminology. You posted great information and I learned from it. I was also pointing out that high-end film scanners do not scan at 300 dpi resolution.
 

RyanM

Platinum Member
Feb 12, 2001
2,387
0
76
Ah, see I assumed that you were also using the terminology wrong, and were referring to 300 dpi on the digital side, not the analog side.

Good to know my posts were informative.
 

jonmullen

Platinum Member
Jun 17, 2002
2,517
0
0
Originally posted by: draggoon01
Since gaming is slowly going the way of the console rather than the PC, there is even less appeal to the general public for more MHz.

What do you think?

I would have to disagree with this. Believe it or not, CS, Unreal Tournament 2003, Hitman, Hitman 2, BF 1942, Doom 3 (coming soon, I hope), Soldier of Fortune 2, Jedi Knight 2, and countless other games will most likely be PC-based only. There are only a handful of games that I play on a console, and I have an X-Box and a PS2 and most of the previous consoles. Granted, Halo is a great game and so is the GTA series, but both of those are out on the computer or are planned for release. The console will never be able to match the desktop in gaming quality. I mean, look at Max Payne: the picture quality on the computer blows even the X-Box away. The same can be said for countless other cross-platform games. That's what I think.