DPChallenge: A Digital Photography Contest
 

DPChallenge Forums >> Hardware and Software >> Cropped Sensor ... WHY ?
Showing posts 26 - 50 of 60
02/07/2006 07:36:51 PM · #26
This topic gets regularly discussed (at least a couple times a month) over at DPReview. You should see the length of the threads over there :-P
There is no really simple explanation; it is a somewhat complicated subject. The real source of all the confusion is, big surprise, something called "Circle of Confusion" or CoC for short. It's traditionally defined as the smallest detail that you can make out when viewing a print at a given distance. If you know the print size and viewing distance, you know the CoC, and you can work out what the CoC size on the (film or digital) original was (divide by the magnification). The original is defined as the film negative or positive; the digital original is the image projected on the sensor.
Now, as it turns out, if you define an "ideal" viewing distance for a print that maintains the same relationship of distance to print width (or diagonal), the CoC *on the original* remains the same, regardless of the print size. For 35mm film, a commonly used number is 0.030mm.
Now the real fun starts. If resolution beyond 0.030mm at the sensor plane is not useful, because it will not be seen in the final print, then 2400x1600px is all I ever need on a 35mm film frame.
Huh? that just does NOT compute.
Well, the Nyquist limit for the resolution of the sensor states that the smallest resolvable detail is approximated by 2 times the pixel pitch, and therefore all we need is 2 pixels per 0.03mm (30 microns), so pixel size need be no smaller than 15 microns; therefore 2400x1600px fills a 36x24mm frame.
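The arithmetic in that paragraph can be checked in a few lines; the numbers here are exactly the ones quoted above, nothing more:

```python
# The "2400x1600 is all you need" arithmetic, using the numbers above.
coc_um = 30                     # conventional 35mm CoC: 0.030 mm = 30 microns
pitch_um = coc_um // 2          # Nyquist: two pixels per smallest detail
width_px = 36_000 // pitch_um   # pixels across a 36 mm frame width
height_px = 24_000 // pitch_um  # pixels down a 24 mm frame height
print(pitch_um, width_px, height_px)   # 15 2400 1600
```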
Now we know that even the coarsest pitch full-frame DSLR on the market today (Canon 5D) has a pixel pitch of 8.2 microns; nearly four times the "required" number. Why the overkill? Why do 1.6-crop DSLRs have pixel pitches even smaller, around 6 microns? Let's throw out all the crap above and get to the bottom of things, shall we?
Well, OK, let's not throw it out entirely, just redefine it a bit. Let's start with the dinner plate analogy proposed above. The plate is the image circle. Draw your rectangle, either FF or cropped, and the area within is what your sensor has to work with. The level of detail in this projected image is finite, limited by the quality of the optical system. SLR lenses resolve anywhere from 40 line pairs/mm to just over 100 line pairs/mm, depending on lens quality (higher is better, of course). The width of a single line pair corresponds to the Nyquist limit required to match sensor resolution to the image detail present. At 100 lp/mm, we have a Nyquist limit of 10 microns, so we need a pixel pitch of 5 microns. At 40 lp/mm, a sensor with 12.5 micron pixels will match lens resolution, and the lens will look soft to a camera with much smaller pixels. Now THIS is something that makes sense, and squares with what we know about the images thrown by the lenses we really use. Good, now what do we do with it?
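The lp/mm-to-pixel-pitch conversion used above, as a one-liner; the 100 and 40 lp/mm figures are the ones quoted in the paragraph:

```python
# Pixel pitch needed to match a lens resolving lp_per_mm line pairs per mm.
# One line pair spans 1000/lp_per_mm microns; Nyquist wants two samples per pair.
def matching_pixel_pitch_um(lp_per_mm):
    return 1000 / (2 * lp_per_mm)

print(matching_pixel_pitch_um(100))  # 5.0 microns
print(matching_pixel_pitch_um(40))   # 12.5 microns
```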
OK, here's the good part. Define CoC for any given camera as the Nyquist limit of the sensor. Make sure the lenses used are up to resolving at or somewhat beyond the Nyquist limit for THAT camera. We prefer beyond to minimize the lens's contribution to image degradation; it doesn't drop to zero just because the lens resolution equals the Nyquist limit for the camera. Now calculate DoF based on this CoC. You'll get results that ensure that the zone of focus actually appears to be very nearly fully in focus across the full calculated DoF. Using the larger CoC calculated from the 35mm = 0.030mm equivalency "rule" always results in over-estimated DoF for DSLRs. If we care about the image being sharp to near the limits of the recording system, we must use a smaller number.
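A sketch of what "calculate DoF based on this CoC" looks like, using the standard hyperfocal-distance formulas. The 50mm, f/2.8, 3m, and 8.2-micron figures are illustrative assumptions, not anything prescribed in the post:

```python
# Standard hyperfocal DoF formulas with CoC set to the sensor's
# Nyquist limit (2 x pixel pitch), per the argument above.
def dof_limits(f_mm, N, subject_mm, pixel_pitch_mm):
    c = 2 * pixel_pitch_mm                  # CoC = Nyquist limit of the sensor
    H = f_mm ** 2 / (N * c) + f_mm          # hyperfocal distance
    near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
    far = (subject_mm * (H - f_mm) / (H - subject_mm)
           if subject_mm < H else float("inf"))
    return near, far

# 50 mm at f/2.8, focused at 3 m: 8.2 um pitch (5D-like) vs the pitch
# implied by the traditional 0.030 mm film CoC (15 um):
print(dof_limits(50, 2.8, 3000, 0.0082))   # narrower zone of focus
print(dof_limits(50, 2.8, 3000, 0.0150))   # the over-estimated "film" DoF
```

As the post claims, the 0.030mm convention always reports a deeper DoF than the finer Nyquist-based CoC.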
For printing, we can still use the same analysis we've always used, but when considering only the digital original, the situation is actually simpler (if any of the above is simple).

That's my story, and I'm stickin' to it ;-)
02/07/2006 07:41:26 PM · #27
Originally posted by Bear_Music:

Originally posted by kyebosh:


Well no not quite the same, the dof will be different (more DOF for the 1.6x crop).

That is, if you shoot with the same settings and distance with a 100mm on 1.6x vs a 160mm on FF.


This is not true. For the same reproduction ratio, at a given physical aperture, DOF is identical regardless of the focal length of the lens. At a 1:1 reproduction ratio, my 60mm macro has exactly the same DOF as a 105mm macro lens. I used to believe what you are saying here (essentially, that a 60mm macro lens would have more DOF than a 105mm macro lens), but it turns out not to be true.

R.

YES, that's correct; in fact, I was the one who informed you of this. HOWEVER, it does NOT stay true when you change the size of the sensor.

edit: I think we're pretty much saying the same stuff though, lol.

Yes, a 1DsII file cropped to 1.6x framing, with the same lens and all else equal, will produce about the same image, including DOF. But if you instead change the lens and don't crop the 1DsII file, then the DOF will be greater on the 1.6x camera.
//www.dofmaster.com

Message edited by author 2006-02-07 19:51:41.
02/07/2006 07:58:52 PM · #28
Originally posted by kirbic:

That's my story, and I'm stickin' to it ;-)

I'm now confused and walking around in circles ...: )
02/07/2006 08:09:13 PM · #29
Originally posted by GeneralE:

Originally posted by kirbic:

That's my story, and I'm stickin' to it ;-)

I'm now confused and walking around in circles ...: )


Let me know when you've come full circle!
02/07/2006 08:11:46 PM · #30
Man, but I love reading these things. Keep up the input.
02/07/2006 08:18:21 PM · #31
Originally posted by kirbic:


Well, the Nyquist limit for the resolution of the sensor states that the smallest resolvable detail is approximated by 2 times the pixel pitch, and therefore all we need is 2 pixels per 0.03mm (30 microns), so pixel size need be no smaller than 15 microns; therefore 2400x1600px fills a 36x24mm frame.


This begs the question: why isn't 2400x1600 pixels adequate for a full-frame sensor? Nyquist's theorem has proven reliable in the past and is commonly used to define the bandwidth of digital signal analyzers. Obviously, there is more to image quality than can be defined by Nyquist. Even if we disregard luminance and attempt to image black-on-white line pairs, we fall far short of what is predicted by Nyquist. IMO the difference lies in that CoC. For a pixelated sensor (as opposed to film grain), the minimum size is strictly limited by the pixel size. Could it be that the film grain size creates an interference pattern that is perceived as detail that cannot be recreated with discrete pixels? I don't know the answer, and this is probably a bit simplistic, but we ain't there yet!
02/07/2006 08:34:26 PM · #32
Originally posted by ElGordo:

This begs the question: why isn't 2400x1600 pixels adequate for a full-frame sensor? Nyquist's theorem has proven reliable in the past and is commonly used to define the bandwidth of digital signal analyzers. Obviously, there is more to image quality than can be defined by Nyquist.


It is adequate... just barely, to define the detail visible in a print with average human eyes, viewed at a "normal" viewing distance, not inspected closely.
Just like with any signal, if you want to reproduce the nuances, eliminate aliasing, and a host of other good things, you need to oversample. If you oversample by just 2x (linear), you wind up right between the resolution of the 5D and 1DsII. :-)
This exercise makes a good case for producing a camera whose sensor DOES outresolve the lens to be used. Pixelation, aliasing, moire, edge jaggies, all will be eliminated through oversampling, at the considerable cost of LOTS more pixels. When viewed at 100%, though, the image will be quite soft, and tweakers will work to wring all the detail possible out of the system until... you guessed it, they're once again up against the Nyquist limit :-P
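The "right between the 5D and 1DsII" remark checks out numerically; the megapixel figures in the comments are the commonly quoted nominal ones, used here only for comparison:

```python
# 2x linear oversampling of the "just adequate" 2400x1600 frame from earlier.
base_w, base_h = 2400, 1600
w, h = base_w * 2, base_h * 2     # 2x in each linear dimension = 4x the pixels
mp = w * h / 1e6
print(w, h, mp)                   # 4800 3200 15.36
# Nominal comparisons: 5D ~12.8 MP, 1Ds Mark II ~16.7 MP; 15.36 MP sits between.
```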
02/07/2006 08:36:18 PM · #33
Again, with all due respect: CoC "relative to photography" is only a function of light as it is refracted [through the lens glass] and then through a small opening [the aperture].

To understand the Circle of Confusion, you must first remember that light is reflected off the subject(s) in all directions (except for specular light, which reflects within a narrow angle). Your camera gathers all the light rays from each point on the subject which are "reflected" toward the camera and fall within the "circle created by the opening of the diaphragm." You may visualize this as a cone which has its apex at the point of reflectance on the subject and its circular base in the diaphragm opening.

Now, if the lens is focused on the point from which these rays are being reflected, the lens will bend the rays back in an inverted cone of light inside the camera with its apex creating a point of light on the film plane.

When the lens is focused on a nearer or farther distance, the point will fall in front of or behind the film plane. If the rays come to a point in front of the film plane, they will cross and spread into another cone until it strikes the film. (As an aside, this is precisely why a given lens projects exactly the same image dimensions on any camera; how much of that image is captured is related only to how large or small your film/sensor is at that plane!)

In this case the base of this 3rd cone registers on the film as a circle of light rays which are not in register. This circle of unfocused light rays is a circle of confusion.

Since the angle of the cone of light is determined by the diameter of the diaphragm opening, the diameter of the circle of confusion is affected by changes in the diaphragm opening. Smaller diaphragm openings produce narrower angles in the cone, therefore smaller bases (smaller circles of confusion).

That is why "stopping down" increases depth-of-field, and it is never related to a particular camera body!
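The cone geometry above can be put into numbers with a generic thin-lens, similar-triangles sketch; the focal length, f-numbers, and distances are illustrative values, not anything from this thread:

```python
# Blur-circle diameter from defocus, by similar triangles on the light cone.
# A point off the focus distance images ahead of or behind the film plane;
# the cone keeps spreading and strikes the film as a disc.
def blur_circle_mm(f_mm, N, focus_mm, subject_mm):
    A = f_mm / N                               # aperture diameter: base of the cone
    v_focus = 1 / (1 / f_mm - 1 / focus_mm)    # film-plane position (thin lens)
    v_subj = 1 / (1 / f_mm - 1 / subject_mm)   # where the subject point focuses
    return A * abs(v_subj - v_focus) / v_subj  # disc diameter at the film plane

# Stopping down from f/2.8 to f/8 shrinks the disc in proportion (by 8/2.8):
print(blur_circle_mm(50, 2.8, 3000, 4000))
print(blur_circle_mm(50, 8.0, 3000, 4000))
```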

In kirbic's case, however, it's the ability of the eye, whose lens and iris are roughly a given for any human. Therefore the math to determine the resolving power of the eye (which is affected by the same CoC, since the eye does indeed have an aperture too, the iris) is roughly the same for everyone.

JF


02/07/2006 08:39:11 PM · #34
Originally posted by kirbic:

That's my story, and I'm stickin' to it ;-)


Sometimes you see a Doonesbury or Cathy comic strip that you're sure is very funny or educational, but there's just too much text to bother reading. I see lots of big words and numbers in that mass of copy, so I'll just assume you know what you're talking about. ;-P
02/07/2006 08:39:55 PM · #35
Seems that we are trying to use "magnification" in many ways. I suggest using the Webster's dictionary version but even that lends itself to interpretation.

On the other hand, "crop factor" is the ratio of the sensor to a full-frame sensor, and can be used for many important things like calculating the FF-equivalent focal length and the slowest shutter speed at which you can handhold the lens.

DOF depends on focal length and aperture, but it also depends on human perception and the print size--smaller prints have greater DOF. A 640x480 picture on the web has large DOF compared to a 20x30 print, or to a 100% view (or 400% view for the pixel peepers).

A working calculation of DOF is to imagine three points: the beginning of the area in focus, the focal point, and the end of the area in focus (two of them are very subjective). Every point in your photo is affected by the crop factor in the same way, including these three points. (DOF calculation is much more complex than this, but this is a fair approximation.)

One analogy that may help visualize the crop factor issues is film. Imagine getting a rectangular punch that punches out the central APS-C sized part of a 35mm negative. Both a FF and an APS-C sized negative can be enlarged to fit onto a 4x6, 5x7, 8x10, etc. sized paper. This will help you visualize the size of the image and the relationship to lens focal length.

The film analogy falls apart when you start talking about resolution (but Bear_Music will probably correct me with an analogy to film grain :-)

The pixel spacing (pitch) of the 5D FF sensor is the same as the 1D MII 1.3 crop sensor, but the 5D sensor has more total pixels. A crop from the 5D that's the same size as the 1D MII would have the same number of pixels, so (in theory) you should be able to crop the 5D image to get the same resolution at the 1.3 crop factor that the 1D MII has.

The 1D MII has the same number of pixels as the 20D, but the crop factor is different, so if you took the APS-C sized crop from the 1D MII, you'd have fewer pixels.
There may be some slight advantage to the larger pixels of the 1D MII in DOF, and the larger pixels may be more forgiving of lens issues (the circle of confusion is caused by the lens, so if the pixel is larger, the COC will cover fewer pixels.)
The larger pixels are better for gathering more light (so you can get a higher ISO), but the newer cameras have improved processing which gives a higher signal-to-noise ratio and therefore usable higher ISOs. (That is, the 20D may be as good at ISO 3200 as the 1D MII is.)
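The cropping arithmetic in the two paragraphs above, sketched with the commonly quoted nominal pixel dimensions; treat those as assumptions, since exact spec-sheet numbers differ slightly:

```python
# Pixels remaining after cropping one crop-factor's framing down to another,
# assuming the pixel pitch stays fixed (as it does between the 5D and 1D MkII).
def crop_pixels(width_px, height_px, from_crop, to_crop):
    scale = from_crop / to_crop
    return round(width_px * scale), round(height_px * scale)

# 5D (~4368x2912, full frame) cropped to 1.3x framing:
w, h = crop_pixels(4368, 2912, 1.0, 1.3)
print(w, h, round(w * h / 1e6, 1))   # ~7.5 MP, close to the 1D MkII's ~8.2 MP

# 1D MkII (~3504x2336, 1.3x) cropped to 1.6x framing:
w, h = crop_pixels(3504, 2336, 1.3, 1.6)
print(w, h, round(w * h / 1e6, 1))   # ~5.4 MP, well short of the 20D's ~8.2 MP
```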

Message edited by author 2006-02-07 20:42:24.
02/07/2006 08:51:43 PM · #36
Well, this is a bird of a very different wing for the most part... The story with pixel size is really that larger is more useful. I.e., while both the 5D and 1Ds-II are roughly full-frame sensors, the sensor dots on the 5D are a bit larger (since there are fewer of them). With fewer, larger photosites, the amount of light hitting each one is greater, making it more responsive to the light and better at low light levels.

However, a 5D image cannot be enlarged as far when printed at a given DPI... for example, I can print a 1Ds-II image at 40x70 from the same lens, same lighting, same f-stop, etc., whereas the 5D may perhaps only go to 35x55, all else remaining equal on the print under a loupe...

Originally posted by ElGordo:

Originally posted by kirbic:


Well, the Nyquist limit for the resolution of the sensor states that the smallest resolvable detail is approximated by 2 times the pixel pitch, and therefore all we need is 2 pixels per 0.03mm (30 microns), so pixel size need be no smaller than 15 microns; therefore 2400x1600px fills a 36x24mm frame.


This begs the question: why isn't 2400x1600 pixels adequate for a full-frame sensor? Nyquist's theorem has proven reliable in the past and is commonly used to define the bandwidth of digital signal analyzers. Obviously, there is more to image quality than can be defined by Nyquist. Even if we disregard luminance and attempt to image black-on-white line pairs, we fall far short of what is predicted by Nyquist. IMO the difference lies in that CoC. For a pixelated sensor (as opposed to film grain), the minimum size is strictly limited by the pixel size. Could it be that the film grain size creates an interference pattern that is perceived as detail that cannot be recreated with discrete pixels? I don't know the answer, and this is probably a bit simplistic, but we ain't there yet!
02/07/2006 09:06:43 PM · #37
Originally posted by jefalk:

Again, with all due respect: CoC "relative to photography" is only a function of light as it is refracted [through the lens glass] and then through a small opening [the aperture].

To understand the Circle of Confusion, you must first remember that light is reflected off the subject(s) in all directions (except for specular light, which reflects within a narrow angle). Your camera gathers all the light rays from each point on the subject which are "reflected" toward the camera and fall within the "circle created by the opening of the diaphragm." You may visualize this as a cone which has its apex at the point of reflectance on the subject and its circular base in the diaphragm opening.

Now, if the lens is focused on the point from which these rays are being reflected, the lens will bend the rays back in an inverted cone of light inside the camera with its apex creating a point of light on the film plane.

When the lens is focused on a nearer or farther distance, the point will fall in front of or behind the film plane. If the rays come to a point in front of the film plane, they will cross and spread into another cone until it strikes the film. (As an aside, this is precisely why a given lens projects exactly the same image dimensions on any camera; how much of that image is captured is related only to how large or small your film/sensor is at that plane!)

In this case the base of this 3rd cone registers on the film as a circle of light rays which are not in register. This circle of unfocused light rays is a circle of confusion.

Since the angle of the cone of light is determined by the diameter of the diaphragm opening, the diameter of the circle of confusion is affected by changes in the diaphragm opening. Smaller diaphragm openings produce narrower angles in the cone, therefore smaller bases (smaller circles of confusion).

That is why "stopping down" increases depth-of-field, and it is never related to a particular camera body!

In kirbic's case, however, it's the ability of the eye, whose lens and iris are roughly a given for any human. Therefore the math to determine the resolving power of the eye (which is affected by the same CoC, since the eye does indeed have an aperture too, the iris) is roughly the same for everyone.

JF


Yep, precisely.
Now, using this light-cone-based definition of CoC, what diameter of this CoC will make it the same size as the smallest detail renderable for a particular sensor?

Answer: The Nyquist limit (2* pixel pitch) for the sensor in question!


02/07/2006 10:51:38 PM · #38
God my head hurts!
02/07/2006 11:06:03 PM · #39
Originally posted by jbsmithana:

God my head hurts!


Mission accomplished! ;-)
02/07/2006 11:07:55 PM · #40
Actually it isn't a breakdown of the Nyquist sample rate; it's more an issue of sample rate combined with Bayer demosaicing, which increases the required pixel count.
02/07/2006 11:19:27 PM · #41
Originally posted by Gordon:

Actually it isn't a breakdown of the Nyquist sample rate; it's more an issue of sample rate combined with Bayer demosaicing, which increases the required pixel count.


True, Bayer demosaicing means that for individual color channels, particularly blue and red, there is less detail due to undersampling, so higher pixel densities would seem to be required. If, however, we consider perceived detail, which depends mostly on luminosity data, the ability of the best systems to render such detail closely approaches the Nyquist limit. AA filters are another fly in the ointment: they essentially ensure that the imager does NOT render detail up to the limit, or aliasing and moire will be too visible.
So in the final analysis, yes, some oversampling is not at all a bad thing. I don't know if I'm quite ready to deal with 48Mpx DSLR images though!
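The undersampling point is easy to see from a single Bayer tile: in the usual RGGB mosaic, green gets half the photosites and red and blue a quarter each.

```python
# Channel sampling densities in one 2x2 RGGB Bayer tile.
tile = [["R", "G"],
        ["G", "B"]]
sites = [c for row in tile for c in row]
for channel in "RGB":
    print(channel, sites.count(channel) / len(sites))
# R and B are sampled at a quarter of the sites, G at half, which is why
# the red and blue channels carry the least detail before demosaicing.
```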
02/07/2006 11:31:07 PM · #42
The color issue is because if a particular wavelength arrives as, in that good ole term, a perfectly focused cone point of light, and the sensor's pixels are actually groups of dots right next to one another [each sensitive to R, G, or B], then, in a manner of speaking, a single red point of light might just possibly not fall on a red sensor dot.

So we get right back full circle (now the pun is intended): the small CoC for each point of printed resolution has to be large enough to flood every color dot of the group.

Whereas one of the reasons slide film has such great resolution is that it's set up in layers on top of one another, rather than the side-by-side arrangement of sensor dots!

JF
02/07/2006 11:34:19 PM · #43
From what I gather, nothing is enlarged (zoomed) or reduced (cropped); what is captured is either less or more.

02/07/2006 11:54:50 PM · #44
For a standard convex lens, all parallel rays passed through it will be focused (refracted) to a point referred to as the principal focal point.

The distance from the lens [glass] to that point is, by definition, the principal focal length "F" of the lens.

The lens "strength" in diopters is defined as the inverse of the focal length in meters. For a thick lens made from spherical surfaces, however, the focal distance will differ for different rays, and this change is called spherical aberration. That's why "aspherical" is a big sales point.

Not quite as notable: the focal length for different wavelengths (or colors) also differs very slightly, and this is called chromatic aberration.

JF
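The diopter definition above in code; the focal lengths are just example values:

```python
# Lens strength in diopters = 1 / focal length in meters
#                           = 1000 / focal length in millimeters.
def diopters(focal_length_mm):
    return 1000.0 / focal_length_mm

print(diopters(50))    # 20.0 diopters for a 50 mm lens
print(diopters(500))   # 2.0 diopters for a 500 mm lens
```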

02/08/2006 12:15:59 AM · #45
Originally posted by Bear_Music:

Originally posted by kyebosh:


Well no not quite the same, the dof will be different (more DOF for the 1.6x crop).

That is, if you shoot with the same settings and distance with a 100mm on 1.6x vs a 160mm on FF.


This is not true. For the same reproduction ratio, at a given physical aperture, DOF is identical regardless of the focal length of the lens. At a 1:1 reproduction ratio, my 60mm macro has exactly the same DOF as a 105mm macro lens. I used to believe what you are saying here (essentially, that a 60mm macro lens would have more DOF than a 105mm macro lens), but it turns out not to be true.

R.

This used to confuse the heck out of me too. What Rob said may be totally non-intuitive, but it's absolutely true: "For the same reproduction ratio, at a given physical aperture, DOF is identical regardless of the focal length of the lens". This is important stuff, so let me try to rephrase it in case that helps anyone.

DOF is a function of five variables:
* Focal length
* Sensor size
* Pixel-pitch
* Aperture
* Camera-to-subject distance

As Rob implied earlier, you can get exactly the same image if you change both the focal length and sensor size at the same time but keep the other three variables fixed. If you stand in the same place, use the same aperture, and choose cameras with the same pixel-pitch, then a 160mm lens on a full-frame camera gives you the same image as a 100mm lens on a 1.6x camera. The subject will be the same size in both images, and the images will have the same DOF and perspective.

What used to confuse me was this: If you change the focal length and camera-to-subject distance but keep the other three variables fixed, then you can still get images with identical DOF, but they will have different perspective. Say you take a photo with the 160mm lens. Then you switch to the 100mm lens, but use the same aperture and same camera (so sensor size and pixel-pitch are fixed). Then you walk towards the subject until it's the same size in the viewfinder and take the second photo. The two images will have the same DOF but different perspective (the background will look smaller in the 100mm photo).

So changing the crop and changing the focus distance can have the same effect on DOF, but will always have different effects on perspective.

BTW, if anyone actually wants to read even more about this (yeah, likely story!), the Wikipedia DOF article is actually quite good.
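For anyone who wants to poke at the "same settings, same distance" comparison from post #27 numerically, here is a sketch using the standard hyperfocal formulas. The CoC is scaled from the 0.030mm full-frame convention by the crop factor, which is one common choice (and exactly the convention the thread is arguing about; CoC = 2 x pixel pitch gives different numbers). All values are illustrative.

```python
# Total DoF (far limit minus near limit) from the standard hyperfocal formulas.
def dof_mm(f_mm, N, subject_mm, coc_mm):
    H = f_mm ** 2 / (N * coc_mm) + f_mm      # hyperfocal distance
    near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
    far = (subject_mm * (H - f_mm) / (H - subject_mm)
           if subject_mm < H else float("inf"))
    return far - near

# Same spot (5 m), same f/4: 160 mm on full frame vs 100 mm on a 1.6x body,
# with the full-frame 0.030 mm CoC scaled down by the crop factor.
ff = dof_mm(160, 4.0, 5000, 0.030)
crop = dof_mm(100, 4.0, 5000, 0.030 / 1.6)
print(ff, crop)   # under this CoC convention the 1.6x body shows more DoF
```

Which answer you get depends entirely on which CoC convention you adopt, which is the crux of the whole thread.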
02/08/2006 11:58:28 AM · #46
I think some are making this too complicated and overlooking some fundamentals. DOF is solely a function of the lens, its aperture and focus distance. Changing sensor size cannot possibly affect DOF.
That said, if you change the lens because of a small sensor size then you have changed the DOF by virtue of the lens.
02/08/2006 12:04:16 PM · #47
Originally posted by gusto:

From what I gather, nothing is enlarged (zoomed) or reduced (cropped); what is captured is either less or more.


Roughly correct, yes. Though the resolution within the larger or smaller captured area, and the pixel pitch, will also change.

Oh and the picture should be upside down and backwards - but that's picky ;)
02/08/2006 12:11:38 PM · #48
Originally posted by ElGordo:

I think some are making this too complicated and overlooking some fundamentals. DOF is solely a function of the lens, its aperture and focus distance. Changing sensor size cannot possibly affect DOF.
That said, if you change the lens because of a small sensor size then you have changed the DOF by virtue of the lens.


I know it seems this way, but in fact, if you change the sensor size, you must ask whether you're also changing the pixel pitch. If you are, then the DoF WILL change. If you are assessing DoF based on a print, and you change the sensor size but hold the print size constant, the DoF that's apparent in the print can change, even if pixel pitch is held constant. It's all in the math, and unfortunately it is a somewhat complicated and generally misunderstood situation.
02/08/2006 12:22:15 PM · #49
Originally posted by kirbic:

Originally posted by ElGordo:

I think some are making this too complicated and overlooking some fundamentals. DOF is solely a function of the lens, its aperture and focus distance. Changing sensor size cannot possibly affect DOF.
That said, if you change the lens because of a small sensor size then you have changed the DOF by virtue of the lens.


I know it seems this way, but in fact, if you change the sensor size, you must ask whether you're also changing the pixel pitch. If you are, then the DoF WILL change. If you are assessing DoF based on a print, and you change the sensor size but hold the print size constant, the DoF that's apparent in the print can change, even if pixel pitch is held constant. It's all in the math, and unfortunately it is a somewhat complicated and generally misunderstood situation.


But changing pixel size alters only how the image is captured (and perceived); it still does not alter the DoF of the lens. There are many things that can be done to an image after it is focused on a plane, but altering DOF is not one of them.
02/08/2006 12:26:24 PM · #50
No, the DOF of the lens does not change, but the DOF you actually use does change if you change the size of the sensor. For all practical purposes it makes sense to take it into consideration, because of the perceived DOF.



DPChallenge, and website content and design, Copyright © 2001-2025 Challenging Technologies, LLC.
All digital photo copyrights belong to the photographers and may not be used without permission.