Rangefinder

Views on the photographic universe by Erwin Puts

Linear or nonlinear?


This question might surprise you. But there are good arguments for discussing this mathematical concept in the photographic environment. Assume you have a 24 Megapixel sensor, like the one in the current M series. Image quality (a very elusive concept) should be replaced by information capacity, but for now it is the Image Engineering calculations that dominate the discussion. The Image Engineering analysis of sensor performance (always including a lens) correlates well with perceived image quality. The German magazine Color Foto is a true believer in this software. A recent issue has a report on the Leica M10-D. The result, at ISO 100, is 1931 line pairs per image height (lp/ih). One would assume that doubling the number of pixels on the sensor would also double the lp/ih. This is the linear approach: if y = x, then doubling x doubles y. As it happens, there is also a report on the Nikon Z7 with 45.7 Megapixels, almost twice the number of pixels of the Leica sensor. The result? At ISO 100 it is 2822 lp/ih. The Z6 (with 24.5 Mp) has 1988 lp/ih. The lenses used are different, of course. This might have some influence, as may the selection of the JPG format. The results of the Z7 are intriguing. The Z7 has 90% more pixels than the Leica M10, but delivers only 46% more resolution. A non-linear result! Roughly twice the number of pixels results in only 1.5 times the resolution.
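The relation becomes clear with a little arithmetic. Since lp/ih is a one-dimensional measure, it should scale with the number of pixels per image height, that is with the square root of the total pixel count. A minimal sketch (my own check on the published figures, not part of the IE software):

```python
from math import sqrt

m10_mp, m10_lpih = 24.0, 1931.0   # Color Foto figures quoted above
z7_mp = 45.7

linear = m10_lpih * (z7_mp / m10_mp)        # naive linear scaling
sqrt_law = m10_lpih * sqrt(z7_mp / m10_mp)  # lp/ih follows pixels per height
print(f"linear prediction: {linear:.0f} lp/ih")         # ~3677
print(f"square-root prediction: {sqrt_law:.0f} lp/ih")  # ~2665
print("measured Z7: 2822 lp/ih")
```

The measured value sits close to the square-root prediction and far below the linear one.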
Another comparison: the Leica has a pixel pitch of 6 micron, the Z6 of 5.9 micron and the Z7 of 4.3 micron. The APS-C sensor of the Ricoh GR III has a pixel pitch of 3.9 micron with 24 Mp and a resolution of 2075 lp/ih. Presumably the pixel size is more important than the sensor size. The Leica M8 is living proof of this argument!
If and when Leica decides to increase the pixel count for the next generation of the M camera, it will be somewhere between the 24 Mp of the current model and the ±65 Mp of the Leica S models. Not being in a position to compete with future versions of the SL (let us assume 45 Mp), the final count would lie somewhere between 24 and 45: 34.5 Mp, which happens to sit neatly between both extremes. The increase in the number of pixels would then be ±40%. The predicted increase in resolution would be between 0.5 × 40% and 0.75 × 40%, that is between ±20% and ±30%, or 1900 × 1.25 = 2375 lp/ih: not a result to be really happy with. Assuming the usual tolerance of 5% for the bandwidth of the measured results, these figures only give the direction of thinking; the exact values are less important.
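Applying the same square-root scaling to the hypothetical 34.5 Mp sensor lands at the lower end of the band estimated above:

```python
from math import sqrt

current_mp, current_lpih = 24.0, 1900.0   # rounded figures used above
future_mp = 34.5                          # hypothetical next M sensor

factor = sqrt(future_mp / current_mp)
print(f"resolution gain: {100 * (factor - 1):.0f} %")              # ~20 %
print(f"predicted resolution: {current_lpih * factor:.0f} lp/ih")  # ~2278
```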
The same argument can be found in the discussion about film emulsions that can record 200 lp/mm and film emulsions that can 'only' record 80 lp/mm. With 80 lp/mm almost every visually relevant detail in a scene can be captured. But the price of the higher resolution is slow speed, careful focusing and the use of a tripod. In handheld shooting the increase in resolution cannot be exploited. Again, assuming that the M camera will remain the champion of handheld, snapshot-style photography, the level of resolution that the current sensor supports is more than adequate for the task. Leica could improve the imaging chain, and especially the demosaicing section, for enhanced clarity and best results.
UPDATE June 10: There is some confusion here:
Let us first collect the basic figures of the measurements, based on the IE software, all at ISO 100:
Ricoh GR III: 24 Mp and 2075 lp/ih (pixel pitch 3.9 micron)
Leica M10-D: 24 Mp and 1911 lp/ih (pixel pitch 6 micron)
Nikon Z6: 24 Mp and 1988 lp/ih (pixel pitch 5.9 micron)
Nikon Z7: 46 Mp and 2822 lp/ih (pixel pitch 4.3 micron)
It is universally assumed that in order to double the resolution, one needs a four-times increase in area: to double the resolution of the Leica sensor (24 Mp) one needs a pixel count of 4 × 24 = 96 Mp. This increase would (theoretically!) elevate the resolution from 1900 to 3800 lp/ih.
The Nikon Z7 has only twice the pixel count of the Leica sensor, and therefore its resolution should be less than double: it is in fact 2800 lp/ih. The sensor of the Ricoh with 24 Mp has 2075 lp/ih with a comparable pixel pitch. This is important to note, because the Nyquist limit is related to the pixel pitch. For a pixel pitch of 4 micron, the spatial period is 0.008 mm per cycle, giving a Nyquist frequency of 125 lp/mm (application of the Kell factor of 0.7 gives 87.5 lp/mm). The measured 2000 figure, apparently counted as lines rather than line pairs, corresponds to 1000 lp, or 64 lp/mm for the 15.6 mm image height. So there is some room for improvement, at least theoretically. The pixel size of 6 micron for the Leica gives a period of 0.012 mm per cycle, or 83.3 lp/mm. The 1900 lp are for the image height of 24 mm, which is 79 lp/mm. Including the Kell factor, which says that you can only reliably resolve 70% of the Nyquist frequency, the practical resolution limit of the Leica sensor would be 0.7 × 83 = 58 lp/mm. The Leica imaging chain is better than that of the Ricoh! Or one could also claim that the JPG demosaicing of the Leica is more aggressive and that spurious resolution is spoiling the results.
The Nikon Z7, with its pixel pitch of 4.3 micron (rounded to 4 micron here), would be able to resolve 0.7 × 125 lp/mm = 87.5 lp/mm. The image height is 24 mm and the resolution is 2800 lp/ih, which is 117 lp/mm, compared to a Nyquist frequency of 125 lp/mm. Compare the measured resolution with the calculated Nyquist limit and the Kell-adjusted limit (the sketch after the table reproduces this arithmetic):
Leica M10-D: measured 79 lp/mm; Nyquist 83.3 lp/mm; Kell limit 58 lp/mm
Nikon Z7: measured 117 lp/mm; Nyquist 125 lp/mm; Kell limit 87.5 lp/mm
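A small sketch of the arithmetic behind the table (pitches in micron, with the Z7's 4.3 micron rounded to 4 as in the text):

```python
def nyquist_lp_mm(pitch_um):
    # one line pair needs two pixels, so the spatial period is 2 * pitch
    return 1000.0 / (2.0 * pitch_um)

KELL = 0.7  # Kell factor as used above

for name, pitch_um, measured in [("Leica M10-D", 6.0, 79.0),
                                 ("Nikon Z7", 4.0, 117.0)]:
    nyq = nyquist_lp_mm(pitch_um)
    print(f"{name}: measured {measured:.0f} lp/mm, "
          f"Nyquist {nyq:.1f} lp/mm, Kell limit {KELL * nyq:.1f} lp/mm")
```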
The measured resolution is quite close to the Nyquist number. This is not surprising, because the IE software uses the Nyquist calculation as the limiting factor in its calculations. This limiting value would be reached at the point where the contrast is almost zero. Not very useful! The Kell factor is used because there is a contrast level below which there is no visual difference between two adjacent lines. A contrast difference of 15% is the minimum, and the Kell factor is in many cases too conservative.

Now the calculation. Doubling the pixel count (from 24 Mp to 46 Mp) raises the resolution from 1900 lp/ih to 2800 lp/ih. That is an increase of 47%, or a factor of about 1.5. This is indeed a one-dimensional relation: it compares only one direction, not the area. And here is the confusion. The resolution is measured one-dimensionally, in line pairs per mm, and this resolution is identical in the horizontal and the vertical direction. The pixel pitch, however, is a square measure (the 6 micron length of the Leica pixel is the same in both directions: the pixel has a square area!). Now an example: assume that we would like to have the resolution of the Z7 for a new Leica sensor. Going from 1900 to 2800 lp/ih means increasing the resolution in both directions, so the number of pixels for the same sensor size must grow by the square of the resolution factor: (2800/1900)² × 24 ≈ 52 Mp. That is slightly more than the 46 Mp of the Z7, whose processing chain apparently extracts a little more resolution than the pixel count alone predicts. The 52 Mp number would require a pixel pitch of 4.1 micron, which corresponds to a Nyquist value of 122 lp/mm or roughly 2920 lp/ih. If we require a doubling of the 1900 lp/ih, we need a decrease in pixel pitch from 6 to 3 micron, for a Nyquist limit of 166.7 lp/mm. This implies an increase of the pixel count to 96 Mp: four times the current 24 Mp.
Mixing the concept of the number of pixels in a certain area with the concept of the resolution along a one-dimensional line may be the reason for much confusion. The Nyquist frequency is a one-dimensional measure, assuming a square pixel, and determines the resolution of the system. The resulting pixel pitch then defines the number of pixels per sensor area.
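A short sketch makes the one-dimensional/two-dimensional distinction concrete (my own arithmetic, full-frame 36 × 24 mm assumed): halving the pitch doubles the one-dimensional Nyquist resolution but quadruples the pixel count, reproducing the 96 Mp figure above.

```python
def sensor_stats(pitch_um, width_mm=36.0, height_mm=24.0):
    lp_mm = 1000.0 / (2.0 * pitch_um)  # 1-D Nyquist limit
    mp = (width_mm * 1000 / pitch_um) * (height_mm * 1000 / pitch_um) / 1e6
    return lp_mm, mp                   # 1-D resolution, 2-D pixel count

for pitch in (6.0, 3.0):
    lp_mm, mp = sensor_stats(pitch)
    print(f"pitch {pitch} micron: {lp_mm:.1f} lp/mm, {mp:.0f} Mp")
# pitch 6.0 micron: 83.3 lp/mm, 24 Mp
# pitch 3.0 micron: 166.7 lp/mm, 96 Mp
```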

Handheld external exposure meters


NOTE: to keep this report as short as possible I have chosen not to illustrate all meters. There is enough info on the internet to get a view of any of these meters.

The following exposure meters have been evaluated.

______________________________
Weston Master V, 1963 - 1972
Sekonic L-398A Studio de Luxe III, 1978 - current
Minolta Flashmeter VI, 2003, later known as Kenko KFM-2100
Sekonic L-478D with 5° attachment, 2012 - current
Sekonic L-758D, 2001 - 2017
Sekonic L-858D, 2017(?) -current
Gossen Lunasix 3, 1966 - 1980 (Lunasix 3s)
Gossen Mastersix, 1983 - 1990
Gossen Starlite 2, 2001 (Starlite) - current
Gossen Digisky, 2011 - current
____________________________

This list spans a period from 1950 to 2020. Handheld external exposure meters were required when mechanical cameras were made without a meter. At first exposure meters were considered a toy, not worthy of the true photographic craftsman, who could estimate the intensity of the scene illumination by experience or who used one of the many tables to 'calculate' the exposure values. Using an exposure meter slowed down the speed of taking pictures. It has been said of Cartier-Bresson that he could guess the exposure values (aperture and speed combinations) quite accurately. Never mind that his darkroom assistants had to cope with severely under- and overexposed negatives. The reverse story is also told: C-B knew what his film could handle and, by experience, only took pictures when the ambient light fell within the latitude of his pre-set values. Two problems, however, had to be confronted: coping with the contrast range within a scene, and the fact that film reacts differently to the intensities of the brightness areas. The classical examples are the black cat in a dark barn and the white bear in a snowy landscape. The thick two-layer emulsions of those days could handle a wide contrast range and also over- and underexposure. Experience taught the photographer to underexpose when confronted with a dark scene and to overexpose when the scene was brighter than the average one.
The problem of accurate exposure became more pressing when the miniature format gained popularity. The quality of the print depends heavily on the exposure: over- and underexposure reduce the resolution, and the tiny areas of tonal difference become more difficult to reproduce in print.
Through-the-lens exposure metering was incorporated in so-called professional cameras, like the Konica Autoreflex-T, introduced in 1968. This particular model was designed for the professional market, with its TTL metering and fully automatic exposure control setting the norm for all professional cameras since. The lenses for the Konica model were also optically particularly good and suitable for professional tasks, but were overshadowed by the more popular Nikon and Canon ranges and the mythical Zeiss and Leitz ranges.
The last film-cartridge-loading camera models used extremely sophisticated exposure algorithms, based on the analysis of thousands of film rolls and of typical and atypical scenes. When Nikon introduced its matrix exposure system and Canon its multi-segmented metering system, the classical handheld exposure meter faced extinction! The fatal blow for this instrument came when the solid-state sensor replaced the film emulsion: now every pixel could become an exposure meter. The idea of combining focus points with exposure points was the next step.
Whatever the sophistication, the evaluation of the luminance values of every pixel results in one specific EV (exposure value). The algorithm by which the software weighs and averages the individual values is unknown. The user is hardly interested, because the selected EV is almost always accurate, and at least sufficient as a base for subsequent post-processing. A cursory look at the histogram is not a real alternative to this program, because the histogram only shows the distribution of the brightness values over the range of 0 to 255. It does not tell you which parts of the scene have which values. And it is still left to the photographer to decide which parts of the scene are important and what the best exposure for these parts is.
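The actual algorithms are proprietary, but the principle can be illustrated. A toy sketch (my invention, not any manufacturer's method) of a center-weighted scheme reducing segment luminances to one EV, using the commonly quoted reflected-light calibration constant K = 12.5:

```python
from math import log2

# Toy illustration only: the real in-camera weighting is unknown.
# Luminances in cd/m^2; K = 12.5 is an assumed calibration constant.
def segment_ev(luminance, iso=100.0, K=12.5):
    return log2(luminance * iso / K)

def weighted_ev(luminances, weights):
    evs = [segment_ev(l) for l in luminances]
    return sum(w * e for w, e in zip(weights, evs)) / sum(weights)

# four corner segments plus a heavily weighted centre segment
luminances = [400.0, 80.0, 120.0, 60.0, 200.0]
weights    = [1.0,   1.0,  1.0,   1.0,  4.0]
print(f"proposed exposure: EV {weighted_ev(luminances, weights):.1f}")  # ~10.3
```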
One of the best implementations of TTL exposure measurement can be found in the famous Olympus OM-3 and OM-4. There is a spot meter option, coupled to a highlight and shadow selection. Aim the spot at a bright area and the highlight option will increase the exposure by two stops, ensuring that this area sits at the top of the characteristic curve. For the dark areas the opposite rule applies. It is a crude but effective approximation of the Zone System. The only caveat is the fact that the spot diameter changes with the focal length in use. This is a problem with all so-called narrow-angle meters in a camera body. One of the advantages of the handheld spot meter is the consistency of the one-degree spot: whatever you aim at, it is always a one-degree measurement.
Here we have the first reason why it makes sense to use an external exposure meter: the spot meter option. Measuring several different brightness areas in the scene and averaging them gives a good idea of the final EV. The second advantage is the option of incident metering. Much has been written about this technique. The idea is simple: the illuminance of a scene depends on the brightness of the main source, and measuring this source is the only thing that matters. The various reflectance values of the parts of the scene are not relevant: with a sun high in the sky it makes no difference whether we take a picture of a white or a dark wall, the EV is identical. Incident measurements were the favorite of many Directors of Photography on film sets: it is easy to measure the main and the secondary light sources and balance both. The Sekonic Studio de Luxe is the latest version of such a meter. It has limited sensitivity (only down to EV 4 at ISO 100). It is a battery-less meter, a great companion for battery-less mechanical cameras (for example the Leica M3, the Leica M-A, the Nikon F and the Canon rangefinders).
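The two metering equations can be put side by side. In this sketch the calibration constants are commonly quoted values (K ≈ 12.5 for reflected light, C ≈ 250 for a flat incident receptor; individual makers deviate), and the walls are assumed to be ideal diffuse reflectors:

```python
from math import log2, pi

def ev_reflected(luminance, iso=100.0, K=12.5):
    return log2(luminance * iso / K)       # luminance in cd/m^2

def ev_incident(illuminance, iso=100.0, C=250.0):
    return log2(illuminance * iso / C)     # illuminance in lux

lux = 80000.0                              # bright sunlight, order of magnitude
print(f"incident reading: EV {ev_incident(lux):.1f}")   # same for any wall
for wall, reflectance in [("white wall", 0.90), ("dark wall", 0.05)]:
    luminance = lux * reflectance / pi     # ideal diffuse (Lambertian) surface
    print(f"reflected off {wall}: EV {ev_reflected(luminance):.1f}")
```

The incident reading is identical for both walls; the reflected readings swing several stops with the reflectance, which is precisely the argument for incident metering.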
The main advantage of the Studio de Luxe is its simple operation, combined with a maximum of information. The large dial shows at one glance all combinations of aperture and speed for still photography and cine, ISO settings and EV values. Under- and overexposure by one and two stops can be easily selected. In fact this dial tells you everything you really need to know about correctly exposing a scene. The method of measuring reflected light is supported, but cannot be recommended.

The other meter of this type is the Weston Master, here in its final version (V). The Weston Master V is a more modern version of the original Weston Master I, made in England from 1947. The Westons are famous for their accuracy and for the special shape of the Invercone dome, which converts the meter into an incident-type instrument. The shape of the dome gives accurate EVs even when the main subject is backlit. The dial presents the user with a number of controls: in addition to the ones on the Sekonic meter, there is a brightness-range scale and a method for reading extremely low and extremely high light areas. Reflected readings are also quite easy and deliver accurate results.

The evolution of the exposure meter shows an increasing tendency towards flash metering and radio control of external flash units. The parallel trend is towards more sensitivity, flexibility and versatility.
More sensitivity combined with the simplicity of the traditional scales is the strong point of the Gossen Lunasix. Its range is now extended to -4 EV, a level almost matched by the new Sekonic L-858D.
The Lunasix was a purely photographic meter. Gossen produced the next generation of meters taking advantage of the then-new electronic circuits and microprocessors. The Mastersix was a very versatile meter with additional accessories for specialized applications: it could master all photographic tasks and also the basic photometric tasks. It has the same sensitivity as the Lunasix (-4 EV) and can be used as a scientific instrument.
Its size is a limitation, and the latest addition to the Gossen range, the Digisky, is more compact (Mastersix: 70 x 130 x 34 mm versus Digisky: 60 x 139 x 16 mm). The sensitivity has been reduced to -2.5 EV, still enough for most scenes. The meter compensates for this with an extended range of features, like flash readings, mixed readings for flash plus ambient light and, most importantly, a range of options for the cinematographer. A color display supports the analysis of the main values. The meter lacks a spot option; if this is important, the Gossen range has the Starlite 2.
This meter incorporates all measurement modes of the Mastersix and adds the unique option of Zone System readings. DIP switches are needed to change between function groups. The meter is very sensitive, and it takes some time to get stabilized readings. The versatility for cine and photometric measurements is a bit much for most users, but when needed everything is there, including a range of flash readings.
The layout is ergonomic, but the DIP switches ask for some pre-configuration to get optimum one-hand operation.
The change from incident light to reflected light via the spot meter is done by turning a ring on the measurement unit. This is a bit inconvenient, and the Minolta Flashmeter VI has a better solution: changing between the two modes is easy. The meter is sensitive down to -2 EV (the spot only to 2 EV) and has separate buttons for Average, Highlight and Shadow readings. The range can be adjusted in the Custom Settings; the standard is -3 stops for Shadow readings and +2.5 stops for Highlight readings. The latitude function shows the latitude of the preferred film emulsion and indicates when the measured subject contrast range is inside or outside the specified latitude. The Custom Functions are quite flexible and must be the maximum that an analogue meter can handle.
The Sekonic L-758D was announced in 2001. It is a unit optimized for still photography with digital cameras. There is a wired connection to a computer to set the specific latitude of the sensor response: the minimum and maximum light levels that the sensor can handle. The cine and photometric values had to be omitted, and a specific version, the L-758C, was introduced for movie requirements. The basic L-758D has a flexible way of combining readings from ambient (spot and incident) and flash light. These modes require study and insight into what is happening at the scene. This requirement is indeed one of the main reasons to use a separate handheld meter: using a meter gives insight into what lies behind the simple final exposure value. The camera may respond with a specific f-number and shutter speed, but there is no information why the camera has selected these parameters.
The L-758D can measure low light levels down to -2 EV (spot 1 EV), can calculate and change the ratio of flash and ambient light readings, and can do averaged readings and light contrast analysis.
During the first decades of the twenty-first century, color LCDs and touchscreens became the rule for smartphones and cameras. Sekonic and Gossen went back to the drawing board and came back with the L-478D and the Digisky.
The L-478D has a sensitivity of -2 EV with the Lumisphere (reflected light with a separate attachment: 3 EV) and incorporates all functions of the L-758D and C, but without the spot meter facility. The separate attachment (for five-degree measurement) is cumbersome and not really a fine addition. For ambient (incident) light readings, flash metering and analysis, and personalized display settings for still photography and cinema, the instrument is quite unique. There is one problem with all these flash meters: the measurement of high-speed flash (stroboscopic and non-studio flash). Studio flash is rather slow and most meters can catch the peak of the flash energy. On-camera and in-camera flash has a very short duration, and this cannot be measured by standard flash meters.
The recent L-858D does away with this critique. It combines the touch screen of the L-478D with the options found on the L-758D/C and extends even further on the software flexibility. There are no longer buttons on the instrument; everything is changed via the options on the touch screen. This is both flexible and slow: instead of just pressing a button, you now have to wade through a menu of options to find the one you need. The sensitivity has been enhanced to -5 EV for incident readings and -1 EV for the spot meter. The ability to measure HSS flash peak values and flash duration is a bonus for photographers who use flash for freezing movement, but will be of limited use for the available-light photographer who needs to measure dimly lit scenes.

The measurements for comparison
All measurements were done in a diffusely lit room with all meters (in incident mode) pointing in the same direction. Spot metering was done in the same room, with a grey card made by Fotowand as the target. The ISO setting is 100.
Below are the results in EV:
Weston Master V (Invercone): 9.7
Sekonic L-398A Studio de Luxe III (Lumisphere): 10.3
Minolta Flashmeter VI: incident 10.3; spot (1°) 10.5
Sekonic L-478D: incident 10.4; spot (5°) 10.5
Sekonic L-758D: incident 10.7; spot (1°) 10.8
Sekonic L-858D: incident 10.5; spot (1°) 10.6
Gossen Lunasix 3: incident 10.0; reflected 10.5
Gossen Mastersix: incident 10.2; reflected 10.5
Gossen Starlite 2: incident 10.4; spot (1°) 10.8
Gossen Digisky: incident 10.0; reflected 10.5
Note: the EV values for the incident and spot meter modes are not directly comparable, because they are generated under different conditions. The target for the spot readings is the grey card. This method is good for comparing the calibration of the meters.

Between the reading of the Weston Master and the Sekonic L-758D there is a difference of one stop. This is partly due to the different methods of calibration. Every meter needs to be calibrated against a known light source. In addition there is the design philosophy of where the average reading should be placed on the characteristic curve. The Weston Master clearly assumes that a slight overexposure (for black and white film) is good for the shadow detail in the scene. The Sekonic on the other hand assumes that slight underexposure is the best solution (it is, for slide film and general sensor capture). The Sekonic can be individually calibrated, so the choice is of secondary importance. All other meters are within a third or half a stop of each other: hardly important for the average black and white or colour negative film. This difference is only significant when working at the extreme ends of the film or sensor latitude.
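How much of such a gap can calibration alone explain? The EV shift between two reflected-light calibration constants is log2(K2/K1); the K values below are illustrative, not the actual Weston or Sekonic constants:

```python
from math import log2

# EV shift caused purely by a difference in calibration constant.
K_low, K_high = 10.6, 14.0   # illustrative values only
print(f"calibration offset: {log2(K_high / K_low):.2f} stops")  # ~0.40 stops
```

Calibration alone accounts for a fraction of a stop; the rest of the one-stop gap is the design philosophy described above.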

Why use a handheld exposure meter?
This question needs to be answered from several perspectives: functional, educational and fun.
Functionally, the exposure meter is required when using a meterless camera (which is obvious), when a true one-degree spot meter reading is needed, and when there is substantial use of flash in ambient-light scenes. It makes sense to use an external exposure meter even when the camera has a built-in TTL metering system and a sensor: the exposure latitude of a sensor is less than commonly assumed, and a spot meter is a great tool to make sure the exposure of the important areas of the scene stays within the usable contrast range. An incident reading is also advisable when the scene has an unusual brightness distribution. In all other cases there is no functional argument for using an external meter: the modern built-in TTL metering systems were and are remarkably efficient.
The educational role of the external meter is the most important asset of this method of exposure control: taking spot readings of important areas in a scene, averaging these values and comparing them with the values that the camera proposes is often interesting and an inspiration for thinking. Using the Zone System or the Highlight and Shadow buttons lets the photographer effectively control the tonal values of the scene, irrespective of whether it is recorded on film or by digital means.
The fun factor is the last but certainly not the least argument. Not being dependent on what the exposure algorithms inside the camera propose, and making the decision oneself, based on a thorough analysis of the scene and a knowledgeable prediction of the behavior of the recording medium, is fun. The technological dependency in photography is generally quite high, and the exposure is too important to leave to some program, however efficient.

What meter do I use?
I have two meters in my bag:
The Sekonic Studio de Luxe for battery-independent incident readings
The Gossen Starlite 2 for spot metering, Zone System analysis and photometric analysis (this is fun and not directly related to photographic tasks)

If only one meter is required (because of space requirements in my small Billingham Hadley bag), I select the Gossen Digisky, the Minolta Flashmeter VI or the Sekonic L-858D (for its high sensitivity). The choice depends on the envisaged application (Gossen: compact; Minolta: ergonomics; Sekonic: flexibility).
When using a classic camera (Leica M3, Canon VIL or Canon F-1), I match the meter to the camera: the easy-to-handle Weston Master or the scientific Mastersix.
Luckily I have the choice!





The current landscape of lens reviews


The basics
Any lens (excluding the enormously expensive lithographic lenses, corrected only for UV light) has aberrations. Aberrations are like Russian dolls: correct the third order and the fifth order will pop up; correct all fifth-order aberrations and seventh-order aberrations will emerge, and so on. The task of the optical designer was and is the balancing of all these aberrations in order to fulfill the main goal of all optics: transfer the subject points to image points without any loss of information. A daunting task that every designer solves individually. Modern design is supported by software (in many cases the Code V program). Most software works with merit functions that are translated into numbers. Given the number of lens elements, the program then proposes an optical system (lens curvatures, thicknesses and distances between lens elements). The optical designer now has the task of constructing the merit function. The main and big problem is to find a merit function that can represent the practical requirements of the intended user.
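To make the idea of a merit function concrete, here is a deliberately naive sketch (invented aberration stand-ins, nothing like a real ray tracer such as Code V): two free curvatures, three aberration terms, and a weighted sum of squares handed to an optimizer.

```python
from scipy.optimize import minimize

# Toy stand-ins for aberration polynomials; real programs trace rays instead.
def merit(params, w_spherical=1.0, w_coma=2.0, w_astigmatism=0.5):
    c1, c2 = params                 # two free 'curvatures'
    spherical   = c1**3 - 0.4
    coma        = c1 * c2 - 0.1
    astigmatism = c2**2 - 0.2
    return (w_spherical   * spherical**2 +
            w_coma        * coma**2 +
            w_astigmatism * astigmatism**2)

result = minimize(merit, x0=[0.5, 0.5], method="Nelder-Mead")
print("balanced curvatures:", result.x)
print("residual merit:", result.fun)   # not zero: the terms conflict
# Change the weights and the 'optimal' design changes with them: exactly the
# designer's problem of deciding what 'good' imagery is supposed to mean.
```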
This is indeed the primary problem: how to find a balance between aberrations that supports ‘good’ imagery, where ‘good’ has a considerable bandwidth of interpretation. Once fixed, the optical system has to be manufactured and the tolerances of the process define the final and actual performance.
Scientists at the Zeiss company studied this problem quite intensively and discovered that image contrast (global and local) is the decisive factor in the appreciation of image quality, once the limits of human vision are taken into account. It has been known for a long time that the optimal human eye can only resolve details above a certain size: it has been established that the eye can resolve at most 10 line pairs per mm at a distance of 25 cm. Ctein claims that any person can distinguish between 20 and 30 lp/mm on a print, but that statement is certainly not supported by scientific or empirical evidence.
The problem
Reviews of lenses have the task of analyzing the characteristics of a lens and relating these characteristics to a lens profile, such that a prospective user knows what to expect of the lens. This requirement is more problematic than it sounds. There are basically two approaches to lens evaluation: (1) the scientific lab analysis and (2) the personal and subjective analysis of selected characteristics of the lens in question. I have to stress the element of 'selected', because any lens is a compromise between conflicting demands. When the reviewer is most interested in close-up performance, because that is what interests the reviewer, all practical 'tests' and the weight of the conclusions will be focused on this aspect. When a reviewer is most interested in finding colour fringes, all his work will be focused on finding these colour defects. What these characteristics mean in practice, or how important they are for the majority of users, is not considered.
The current landscape
Analyzing the performance of lenses is still an important element of many blogs, review sites and magazines.
In the past, lens testing was a specialized occupation. Projection of patterns (lines, points) and MTF measurements were the classical ways of doing this. Zeiss used their own MTF equipment and most others (including Leitz) used projection patterns. Considerable expertise was needed to analyze these patterns. The important issue was always how to relate the results to practical requirements.
Since the digital turn, the camera itself has become the virtual test lab. Software is used to analyze the results on the sensor surface. Many magazines and some blogs/sites use the software of Image Engineering or of Imatest to present their findings. These findings are what the designers of the software find important. While often interpreted as tests of camera sensors or of lens optics, the fact is that these tests always measure both elements together. Not considered are the problems of the hardware, the accuracy of the camera and lens, the influence of the software that translates the pixel photon count into visual images, and so on. All measurements have a tolerance level of, let us say, 5%. The often used parameter of lines per image height is presented with semi-scientific precision, and camera/lens performance is ranked according to these values: when one system is measured at 2856 l/ih and another at 2788 l/ih, there is no discussion that the real values might be 2713 and 2928, in fact reversing the order. Even if the tolerance were 1%, there is no discussion of the real values behind the measurement values.
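The tolerance argument in numbers (a quick check of the figures above):

```python
# With a 5 % measurement tolerance, the two quoted results overlap,
# so the ranking between them is not secure.
def interval(measured, tol=0.05):
    return measured * (1 - tol), measured * (1 + tol)

a, b = 2856.0, 2788.0
lo_a, hi_a = interval(a)
lo_b, hi_b = interval(b)
print(f"system A: {lo_a:.0f} .. {hi_a:.0f} l/ih")   # 2713 .. 2999
print(f"system B: {lo_b:.0f} .. {hi_b:.0f} l/ih")   # 2649 .. 2927
print("order can reverse:", lo_a < hi_b)            # True
```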
It is questionable whether the numerical results can be directly related to the practical requirements, but one requirement is obeyed: the consistency of the tests and the consistency of the lab conditions.
This requirement is not guaranteed by the many subjective evaluations that ask for attention on the web.
These personalized and very subjective reviews fall into two extremes. On one side are the subjective reviews that are full of the words 'impression', 'feeling' and 'opinion'. The choice of these words indicates the intention of the reviewer: not an objective analysis, but a description of the emotions evoked by using a specific lens or camera body. A good example is the website of Joeri van der Kloet. A cursory glance at the discussion platform of TheOnlinePhotographer will show you that almost every comment made there, on whatever topic, is at best simply an expression of personal preference and at worst the continuation of some myth. It is amazing what internet discussions can reveal about the human psyche!
The other extreme is the pseudo-scientific approach. A good example is the website of Mr Chambers. His recent review of the Leica Summilux-SL 1.4/50 mm ASPH is a fine example of this attitude. First of all, he does not describe the test conditions and uses a range of different pictures to support his statements (lab rule #1 is to state the conditions of measurement and to hold as many influencing parameters as possible constant). Secondly, he uses the method of pixel shifting to artificially enlarge the resolution of the sensor, presumably to improve the resolution yardstick for the optical performance. Martin Doppelbauer has a sensible explanation why this is a fake option. My own experiments with the Olympus Pen-F and its option of enhanced resolution are also not impressive: in fact the results are worse than the original ones! Thirdly, he is obsessed with the phenomenon of distortion and the use of software to correct this so-called aberration. Again without scientific evidence, there is the claim that this software correction will reduce the optical performance. Any modern book about optical design will tell you that distortion can easily be corrected mathematically without loss of information.
The last issue I would like to address is the pseudo-scientific language that is supposed to disguise the subjective bias of the reviewer.
I am always amazed that there are individuals who seem to be impressed by the length of a review (erroneously described as 'depth').
Geoffrey Crawley tried to review a lens in at most a hundred words, and he thought that would be enough to give a serious profile of the reviewed lens. He also refused to add illustrations, because they are easy to misinterpret or invite putting too much value on some phenomenon.
The best test for an optical system is still the brick wall and a slide film.



Leica Special Editions



Manufacturers of photographic equipment know that any product today has a limited production time with a general lifecycle: a product announcement is hyped by the ubiquitous cheerleaders publishing on social media; then the product becomes available and sales are initially high; then sales flatten and even dry up. The product managers know this and carefully hold back some improvements for inclusion in future model changes. When a new market is required, a totally 'new' product is designed. 'New' is not the correct designation: the competition has these features already, or a clever modular construction allows for a new mix of features. Canon is a master in this area.
Leica follows a different strategy. Given the high investment in a new or improved model and the long period of amortization (required because of low production volumes), a new model can only be announced every two or three years (compared to the six to twelve months that a typical Japanese manufacturer requires). As soon as a camera model approaches the flat part of the sales cycle, the company starts producing special models, especially in the M range.
This was the obvious sales strategy for the period between 1980 and 2000, when a confusing number of special editions was announced.
Not one of these special models could improve or efficiently support the photographer in taking better pictures (technically and/or artistically). This statement is also true for the most current special M models. While previous editions used iconic models of the past as a source of inspiration, current special editions are fashion statements, like those of the haute couture world.
The most recent edition is the Kravitz Monochrom 'Drifter' edition, one of the most appalling and ugly versions in the history of the special editions. The ASC edition occupies a good second place, because of the simulated gold plating of the lenses. The subtleties of previous models are gone and the design statement is more aggressive and, may I say (?), desperate.

[Image: Leica M Monochrom 'Drifter' with Summicron-M 2/28, front view]

The girl with a Leica



Most Leica aficionados are familiar with the iconic picture by Rodchenko (Girl with a Leica) from 1934. A book with the same title was published in 2017, written by Helena Janeczek. It is about Gerda Taro, the companion of Robert Capa. She was a very good photographer herself, even better than Capa. She died in the Civil War in Spain. The story is simple: Capa got a second-hand Leica from a friend, but hardly used it. He gave the camera to Gerda, who used it all the time. Capa himself used the Contax camera and also had a Rolleiflex with him.
The Leica people stubbornly insist that the Capa pictures (the well-known 'dying soldier' is one of them) were made with a Leica, which is not true. It is a fact beyond doubt that the D-Day pictures were made with a Contax. The story behind these pictures is rather fascinating, and A.D. Coleman is one of the main investigators trying to find the truth. It seems that the actual heroism of Capa is less impressive than the myth. And the poor guy in the darkroom did not ruin the rolls of film, as is often claimed.
Even the dying soldier may be a staged scene. No one knows for sure.
The book about Taro is a very pleasant read. It seems to be available in Italian (the original) and in a Dutch translation, which I am reading.



[Image: cover of La ragazza con la Leica]



By the way:

All orders for my new book, The Leica Path, have been shipped.