What do you think of Foveon chips for digicams?

Doggiedog

Lifer
Aug 17, 2000
12,780
5
81
Probably because they have invested so much in CCD technology, while Foveon, being merchant silicon, gives them no advantage over their competitors (i.e., it commoditizes their products).
 

Wallydraigle

Banned
Nov 27, 2000
10,754
1
0
The technology is intriguing and has many applications, but consumer imaging probably isn't the best one, at least as far as the technology has been developed so far. The problem is that with the different layers of the chip being sensitive to different wavelengths, when it comes time to put the layers back together as an image, the layer on the bottom is going to be less exposed than the layers on top of it. To overcome this you either have to boost the gain of that layer, which adds chroma noise in that channel, or use software algorithms to bring its levels back up to those of the top layer. Either way, the channels end up clipping at different points of under- and overexposure, so shadows and highlights would take on color casts. The way they've chosen to overcome that limitation thus far has been to clip all the colors whenever one would have been clipped, leaving overexposed areas in black and white. These limitations seem to be fundamental flaws in the technology that can't be overcome with what we know today.
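Here's a rough Python sketch of the tradeoff I mean (made-up gain numbers, not Foveon's actual processing): boosting the weaker bottom layer amplifies its noise along with its signal, and once any one channel would clip you have to clip all three together to keep blown highlights neutral instead of color-shifted.

```python
import numpy as np

# Illustrative only: three stacked layer readings (top, middle, bottom),
# with the bottom layer receiving less light and therefore needing more gain.
FULL_SCALE = 1.0
LAYER_GAIN = np.array([1.0, 1.4, 2.2])  # hypothetical per-layer gains

def reconstruct(raw_layers: np.ndarray) -> np.ndarray:
    """raw_layers: (H, W, 3) stack of top/middle/bottom layer readings."""
    # Equalize exposure across layers; the bigger the gain, the more that
    # channel's noise is amplified (the chroma-noise problem).
    rgbish = raw_layers * LAYER_GAIN

    # Because of the different gains, the channels clip at different raw
    # levels. To avoid color casts, collapse the pixel to a neutral value
    # whenever any channel clips, so blown highlights come out black and white.
    any_clipped = rgbish.max(axis=-1, keepdims=True) >= FULL_SCALE
    neutral = np.minimum(rgbish.mean(axis=-1, keepdims=True), FULL_SCALE)
    out = np.where(any_clipped, neutral, rgbish)
    return np.clip(out, 0.0, FULL_SCALE)

# Example: a pixel whose bottom layer clips after gain is applied
pixel = np.array([[[0.70, 0.55, 0.50]]])
print(reconstruct(pixel))  # all three channels collapse to the same neutral value
```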

Applications where true color isn't necessarily the goal might benefit greatly from Foveon though. Imagine a Foveon chip on the Hubble that could take photographs of stars and such in UV, IR, and various other non-visible bands, all simultaneously.

Something else to think about is that the eyes of living things have mosaic light-sensitive organs like contemporary CCD and CMOS sensors, rather than layered ones like the Foveon.

I don't think that Foveon chips will replace contemporary CCD or CMOS sensors any time soon.
 

Shockwave

Banned
Sep 16, 2000
9,059
0
0
Originally posted by: lirion
The technology is intriguing and has many applications, but consumer imaging probably isn't the best one, at least as far as the technology has been developed so far. The problem is that with the different layers of the chip being sensitive to different wavelengths, when it comes time to put the layers back together as an image, the layer on the bottom is going to be less exposed than the layers on top of it. To overcome this you either have to boost the gain of that layer, which adds chroma noise in that channel, or use software algorithms to bring its levels back up to those of the top layer. Either way, the channels end up clipping at different points of under- and overexposure, so shadows and highlights would take on color casts. The way they've chosen to overcome that limitation thus far has been to clip all the colors whenever one would have been clipped, leaving overexposed areas in black and white. These limitations seem to be fundamental flaws in the technology that can't be overcome with what we know today.

Applications where true color isn't necessarily the goal might benefit greatly from Foveon though. Imagine a Foveon chip on the Hubble that could take photographs of stars and such in UV, IR, and various other non-visible bands, all simultaneously.

Something else to think about is that the eyes of living things have mosaic light-sensitive organs like contemporary CCD and CMOS sensors, rather than layered ones like the Foveon.

I don't think that Foveon chips will replace contemporary CCD or CMOS sensors any time soon.

That's a hell of a good answer!
:beer: