On Landscape – The online magazine for landscape photographers

Camera Colour – First Tests

Variation in Camera Colour Rendering

Tim Parkin

Amateur photographer who plays with big cameras and film when in between digital photographs.

I’ve written about sensor colour in a previous issue of On Landscape under the title “The Myth of Universal Colour”. In that article I looked at the quite common preconception that if different cameras create pictures with different colours then these can be easily corrected in Photoshop.

To recap the reasoning, you need to know that ‘colour’ is a perceptual construct (i.e. it gets made up in our heads), which means that the colour red is not a property of an object but a property of the combination of the light hitting the object, the object itself and the ‘sensors’ observing it (e.g. the rods and cones in the eye).

Light is actually a spectrum of many different frequencies, or colours (e.g. the sun’s light is made up of a full rainbow of them). You can imagine it like a paint-mixing set: you can get a green colour by starting with a green pigment, or you can mix a green by starting with a blue pigment and a yellow pigment.

Now normally these two colours will both end up looking green to our eye, because that is what paints are mixed for. What they look like to a camera sensor may be different though.

Imagine the sensor happens to have extra sensitivity to blue. The pure green pigment will look OK but the mixed green pigment will tend towards a bluer colour. This effect is known as ‘metameric failure’.
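To make the mechanism concrete, here is a small numerical sketch. Everything in it is invented (Gaussian stand-ins for pigments and filter curves, and the ‘mixed’ surface is a purely mathematical construction), but it shows how two spectra can be indistinguishable to one sensor yet distinct to another:

```python
import numpy as np

# Toy model of metameric failure. All curves and numbers are invented.
wl = np.arange(400.0, 701.0, 5.0)  # wavelength axis in nm

def bump(centre, width):
    """Gaussian stand-in for a pigment reflectance or filter sensitivity."""
    return np.exp(-((wl - centre) ** 2) / (2.0 * width ** 2))

# R, G, B channel sensitivities for two hypothetical cameras;
# camera B's blue filter is shifted 20 nm towards green.
sens_a = np.stack([bump(600, 45), bump(540, 45), bump(460, 45)])
sens_b = np.stack([bump(600, 45), bump(540, 45), bump(480, 45)])

def raw_rgb(sens, spectrum):
    """Crude integration: dot product of spectrum with each channel curve."""
    return sens @ spectrum

# Surface 1: a single 'pure green' pigment.
pure_green = bump(530, 25)

# Surface 2: a mix of three other pigments, with weights solved so that
# camera A records *exactly* the same raw RGB as for the pure green.
pigments = np.stack([bump(460, 35), bump(575, 35), bump(630, 35)])
weights = np.linalg.solve(sens_a @ pigments.T, sens_a @ pure_green)
mixed_green = weights @ pigments

same_on_a = np.allclose(raw_rgb(sens_a, pure_green), raw_rgb(sens_a, mixed_green))
same_on_b = np.allclose(raw_rgb(sens_b, pure_green), raw_rgb(sens_b, mixed_green))
print(same_on_a, same_on_b)  # camera A can't tell them apart; camera B can
```

No real camera behaves like this toy, but any pair of cameras with different filter curves will disagree about some pairs of surfaces in just this way.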

So why do I think this is a problem? Well I’ve been using a combination of film and digital for some time and when I started with film I was also using a Canon 5Dmk2. Looking at the images side by side when I first started was like night and day. The colour from the 5D2 was very poor in comparison. However, when I went back to write my original article about a year ago I looked at the images again and was presented with the Lightroom dialogue “do you want to update your image to the latest camera raw engine” (or something like that - I forget the actual words). When I selected “yes” the colour improved enormously (Another good reason for shooting in raw - check on some of your old pictures to see if you can see the difference).

However, there were still some discrepancies. These were particularly in lush foliage, where grasses in particular would look a very yellow green, and in the colour of geology, where shadows tended to go brown on my Canon.

I thought that this was something intrinsic to film, and to some extent it is. However, it was when I went out with Dav Thomas a few times, and we shot similar scenes on both our digital cameras and his came out looking remarkably film-like in comparison to the 5D2, that I realised that different cameras really do see colour fundamentally differently.

So at this point I was on a quest to find out why – hence the original article. But that doesn’t answer the question of how it affects day-to-day photography. From what I have seen in a couple of ad-hoc tests, for many things there is little difference, but the only way to get some real answers is to take some real photographs.

My first opportunity to do so was up at Southerscales near Ingleborough where I dropped in on Joe Cornish and David Ward running their Yorkshire workshop. Fortunately Bruce Cairns and Dave Mead had a Nikon D800 and Canon 5D mark 3 respectively. We also had a Sigma DP2, a Panasonic LX5 and an iPhone 4S to add a bit of extra intrigue.

We included a ColorChecker target in the image so we could colour balance the images. Sadly only the Canon and the Nikon allowed this, as the ColorChecker software requires a DNG raw file of the right format. We balanced the other cameras as well as possible by eye, however.

The main point of interest, however, was the difference between the D800 and the 5D3. The only downside of the test was that the location didn’t really have much variety of foliage of the sort that I’ve seen problems with, but as a proof-of-concept test it would have to do. We chose an area with some grasses and mosses as well as some yellow lichens.

This first image shows the overall scene difference between the D800 and the 5D3. Not a lot of difference, is there? However, the comparison afterwards shows a much smaller area, and we can start to pick out a few colour differences. I’ve used a colour dropper to sample a few different areas and reproduced them as colour squares on the image.


Now depending on your sensitivity to colour, these could look either completely the same or quite different. I think I’ve become quite attuned to these changes, and the difference in the greens, for instance, is quite large. The yellows look very similar, as does the mustardy green of the mosses. The shadows are close-ish, but I can see the slight red tinge to the shadows on the rocks that I have observed in Canon files before.

It is definitely the greens that show the most difference though. Let’s have a proper look at them – I’ve upped the vibrance by quite a bit in the next shot, though not outside the bounds of what some photographers I’ve seen on the interwebs would use.


Now the differences become a lot more apparent, especially in the shadows of the Canon shot.

Now this looks quite bad for the Canon, but you have to remember I’m pushing things quite hard here; the shadow colours could be fixed in post-processing, and perhaps the ColorChecker Passport didn’t do as good a job as it could have. However, the casual user won’t be aware of these tweaks, and it’s probably fair to say that they shouldn’t need to be.

You can make fixes to the Canon colour to bring it in line with the Nikon, but you can’t apply them globally, because some of the mosses are the same colour as the grasses, and fixing that colour globally makes the mosses look wrong. To fix it like this you’d have to create a mask and only paint over the grasses. The fix is a 10-point hue shift towards blue – a process that both Joe Cornish and I have discovered through general experimentation. Here’s the Canon file, followed by the Nikon file and finally the corrected Canon file.






Canon with 10 point hue shift in greens (masked for grasses only)

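The masked fix described above can be sketched in code. This is only a rough approximation: the green hue band, the mapping of Lightroom’s ‘10 points’ onto a hue angle, and the sample pixel values below are all guesses.

```python
import colorsys
import numpy as np

def shift_greens(img, mask, amount=10.0):
    """Nudge green hues towards blue, but only where mask is True.

    img: float RGB array in [0, 1]; amount: Lightroom-style 'points'
    (the -100..+100 slider), mapped here onto a fraction of 60 degrees.
    """
    out = img.copy()
    for y, x in zip(*np.nonzero(mask)):
        h, s, v = colorsys.rgb_to_hsv(*img[y, x])
        if 70 / 360 <= h <= 170 / 360:  # only touch the green band
            h = min(h + amount / 100 * (60 / 360), 170 / 360)  # towards blue
            out[y, x] = colorsys.hsv_to_rgb(h, s, v)
    return out

# Tiny 1x2 example: one masked 'grass' pixel and one unmasked 'moss' pixel.
img = np.array([[[0.3, 0.6, 0.2], [0.4, 0.5, 0.1]]])
mask = np.array([[True, False]])
fixed = shift_greens(img, mask)
# The grass pixel cools off (less red); the moss pixel is untouched.
```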

It’s also true that it may be the Nikon that has the wrong colour and it’s only through a little more testing that we’ll find out.

The next step is to get a few more cameras together and to test them in spring, when we have a greater variety of greenery around – a subject that we are sure is the most difficult to deal with.

Finally we also tried some other cameras and I reproduce the results here. The Panasonic LX5 did remarkably well, producing results that were very close to the Nikon's. The iPhone put in a respectable showing but with a very small dynamic range and a strange red tint in the shadows.

The camera that produced the strangest result was the Sigma DP2. Its results varied so dramatically from the rest of the cameras that either the rest of the cameras were very, very bad in a remarkably consistent way or the Sigma has a very idiosyncratic colour reaction. The particularly strange result, apart from the fact that the greens look quite brown, is in the grassy straw in the background, which shows mostly green in all the other cameras but pinky orange on the DP2.

This test has definitely produced enough of a difference to warrant more research, and we'll be taking a range of cameras out on a nice greeny spring day (if we get any). If you have any observations about these sorts of colour differences, we'd love to hear about them.

Panasonic LX5

Sigma DP2

iPhone 4S



  • Jon Tainton

    Aside from the camera sensor issues, I’d also be interested in hearing your thoughts and other photographers’ experiences with:

    1. Raw Conversion software handling colour.
    2. The potential effect of lenses in influencing colour.

    FWIW, with a manually set white balance, I’ve personally noticed subtle variability in colour rendition between OEM lens models, and a subtly different colour rendition again with third-party lenses. It’s all a bit bewildering to be honest …

    • Theoretically the filtering effect of the glass could add a colour change beyond a global ‘tint/temp’. I might have to buy a bunch of lenses to test with – oh well! As for raw converters, I’ll try and report on that when we get a good comparison as well.

      • Jon Tainton

        I know this isn’t strictly in line with the article title, but at least with digicams there’s a chance to alter camera settings in the menu, whereas a lens’s character/performance appears to be fixed. Given the price of cine lenses, cinematographers have conducted some interesting tests on cine, modified stills and stills lenses; their observations and preferences are skewed to their craft/style, but some insights, I think, are relevant to landscapes too. Some links:

        A series of lens characterisation tests, with test 3 an outdoor performance evaluation: http://www.ryanewalters.com/Blog/blog.php?id=7418575377215665840

        and a 5D2 test with Zeiss/Canon/Leica lenses: http://www.hurlbutvisuals.com/blog/2011/12/lens-test/

        The results might suggest that with lenses a ‘horses for courses’ type scenario could be adopted, with a lens from manufacturer A fine for Autumnal colour, a lens from manufacturer B optimal for neutral scenes and so on, maybe?

      • I once had an issue shooting stereos where I discovered afterwards that I had a UV filter on one camera and a skylight on the other! The images and hardware were otherwise identical, apart from subtle non-uniform colour casts… which were pretty much impossible to correct.

        I therefore expect that lens glass and multicoatings will have a big influence across manufacturers.

  • AlexeyD

    Great article Tim – it’s good to see that this subject gets some attention, and I personally think it is more important than megapixels or high-ISO performance, which lately seem to be the main driving factors for camera manufacturers. The subject of colour in digital cameras has been a long-debated topic on some Russian photographic forums lately, with some interesting examples (some even in pairs, with shots of the same place) from a variety of cameras. It was interesting to see there a comparison between the D800 and the old Kodak 645 Pro back (the latter is well known for its good colour reproduction) where, colour-wise, the Kodak imho still wins (the particular sample showed lots of greenery, where the Kodak resolved a lot more variation in green tones and the Nikon’s looked pretty much uniformly green).

    For me the quest for good colour started when I got back to film. I had been using Nikons for quite some time, but as I kept upgrading the cameras (D70->D200->D300->D700) I could not help but notice that, colour-wise, some of my older photos were better. With the D300/D700 I started spending a lot of time tweaking colours in PP without completely satisfactory results, and no profiles would help the matter. Then I got back to MF film and the colours there were simply incomparable – so much better. A lot of digging around, listening to an Iliah Borg lecture and communicating with other people helped me gain at least a little understanding of the issue. Film has a lot of people working on it, and not just engineers – much of the involvement is from people with an understanding of (or degree in) art, painting etc. I.e. film colour responses, and the colour twists that exist in the chemical layers, are designed to give aesthetically pleasing colours, not just to document what the colour is in reality. Harmonic hue shifts, while retaining certain neutral colours the eye can cling to, are characteristic of film – the Kodachrome samples here http://pavelkosenko.wordpress.com/2012/03/28/4×5-kodachromes/ are a good example of this. In digital, however, the sensor designers are not striving for this, and by the looks of it the experience of film is being lost. Sensor designers seem to strive to have colour filters match the eye’s response curves (rather than have cleaner colour separation), and nowadays they are driven by the design goal of producing a camera/sensor capable of shooting a black cat in a dark room handheld, rather than caring about colour reproduction. This all results in sensors capturing weakly separated colour, and lots of effort trying to regain it later in PP by profiling and applying tricks like hue twists. The problem, however, is that these weakly separated channels cause information loss at the point of capture, and no profile will be able to fix that.

    It’s all quite sad really. The D800, which has better colour than the D300/D700 and is a step in the right direction, is still not quite as good as Nikon’s own D2X or the Sony a900 colour-wise. I wanted to stay with the Nikon system due to the lenses I have, so in the end I opted for a Kodak SLR/n instead of a D800. Having shot with it for half a year, all I can say regarding colour is wow. I recently tried it with RPP’s true film colour profiles and was able to come quite close to Velvia and Ektachrome (I had some shots done in summer on both films and the Kodak SLR/n – not exactly the same subject but taken on the same day at the same time). My experience with the Kodak SLR/n is that the colour out of it is more pliable (for lack of a better term) – it is good to start with, but it survives a lot more post-processing than, say, the D300/D700’s did without breaking apart. From conversing with the owners of other cameras with good colour (D2X, E-PL1, a900) and trying their raw files, it seems to be the case there too.

    I also wanted to make two points on the article.

    The X-Rite colour checkers differ from sample to sample quite substantially. In addition, the spectral response of their grey ramp differs between patches, so not all grey patches can/should be used for WB in all light conditions. From my unsatisfactory experience with the X-Rite Passport and some information gathering, the QPCard 203 is a lot better (for camera profiling), with a lot less sample variation.

    Also, using LR/ACR for colour-related comparisons may not be ideal. The ACR engine uses twisted profiles to compensate for the poor colour separation of modern cameras. This effectively applies hue twists to make well-known colours fit what they should be (hue-wise) when profiling. In modern cameras these twists can be quite substantial, to compensate for weak colour separation. Think of them as tweaks to the camera profile. With new versions of ACR you just get newer profiles with more of them – which potentially explains why you saw an improvement just by moving to a new ACR process version. The twists, however, have another drawback – they are applied for certain lighting conditions and to certain colour hue/brightness ranges, so as soon as you start tweaking exposure (or related controls in ACR like shadows, highlights, WB etc.) some colours may no longer fit the twisting range, resulting in colour shifts. There are tools to untwist the profiles, which can give something closer to what the camera actually produces. ACR is also notorious for having hidden defaults (i.e. with all settings at 0 it is not exactly how the raw looks – some exposure compensation and channel blurring/NR is applied silently).

    You may want to look at the RPP raw converter ( http://www.raw-photo-processor.com ) – that was actually created by photographers, for photographers. It also has an untagged output mode which can produce a TIFF file without using any profiles, so you see the colour more or less as it comes out of the camera. And for pure raw file analysis there is the excellent RawDigger ( http://www.rawdigger.com/ ).

  • chris ashwin

    Hi Tim, great article – something I hadn’t really considered until reading it. Having read it, I can certainly relate to the theory of ‘sensor casts’. I have long suspected a blue cast on my 5D mkII. I usually shoot in the daylight setting and adjust accordingly in Lightroom and/or Photoshop; however, when I compare the camera’s daylight setting to the daylight setting within Lightroom the difference is huge. The same applies across all the settings, i.e. auto WB on camera has a distinct blue cast when compared to Lightroom’s auto, and the same applies to cloudy and shade. The more critically tuned my eyes become to colour, the more I see a consistent blue cast straight from camera.

  • danfascia

    This has been another fascinating debate; the pattern emerging is that I always seem to really enjoy your heavily physics/science-based posts, Tim.

    Remembering back to the last time you visited this topic, you loosely concluded (as I remember) that one could not replicate the colour signature (characteristics) of one sensor easily onto the raw file from another… and I really wanted to add the suffix YET to that.

    Surely, theoretically as long as a sensor is able to record signal in the same wavelength (colour value) as another, then it would be possible to adjust / morph values in those areas of the individual RGB curves from one sensor to match another. Naturally, if there is low signal we may introduce noise in doing so and we may get no result if the sensor has not actually recorded any data in the zone we are attempting amplification on.

    To a former sound engineer, this may sound familiar I imagine. Pattern matching spectral analysis can be used to sample the “sonic signature” of a recording and morph the EQ compensation adjustments onto a target recording quite easily and it is used all the time for “digital remastering” of old recordings nowadays amongst other things. Back 20 years ago we didn’t have plugins to do it and you depended on a set of Golden Ears to listen to a piece and replicate the EQ for you. I feel that this is the stage we are at with digital imaging, we are qualitative and not yet quantitative in our approach.

    Is my belief that this should be entirely feasible within the realm of digital imaging an oversimplification? Or is it that we just do not have such tools as the analytical EQ yet, and we don’t “spectrum analyse” our image files in the same way?

    • AlexeyD

      > Surely, theoretically as long as a sensor is able to record signal in the same wavelength (colour value) as another, then it would be possible to adjust / morph values in those areas of the individual RGB curves from one sensor to match another.

      It is not that simple. If one of those two sensors has a weaker CFA than the other, then it will not be able to distinguish between all of the different input signals that the other sensor can – it will give the same response to some of the different inputs. The result is pretty much the same as lossy compression: no mathematical function will be able to restore the lost detail. In this case you will theoretically be able to “morph” the response of the sensor with greater selectivity (stronger CFA) to the one with lower selectivity, but not the opposite.

  • Seamuscamp

    “It’s also true that it may be the Nikon that has the wrong colour and it’s only through a little more testing that we’ll find out.”

    This is very subjective, isn’t it? It reminds me of a perennial discussion at school around 60 years ago – how do we know that the green we see isn’t the red that someone else sees? Of course we don’t; and usually it doesn’t matter much. What we see is the product of a lot of physical, chemical and electrical interactions. In my own case I see significantly different colours from each eye – I have a minor cataract in my left, giving a muddy tint to some colours, and a clear plastic lens in my right – so God only knows what the composite is.

    I wish you luck; but whatever conclusions you ultimately reach are likely to be valid only for you and your equipment. Different examples of the same camera are likely to give different responses. Never mind computer, monitor, paper, ink, printer…

    • Sadly not – these are repeatable colour differences, and not just overall casts (which can be corrected easily in Photoshop) but hue shifts with certain substances that have metameric shifts. And yes, people do see slightly different colours, but the ‘delta’ difference in hue tends to be consistent (i.e. people judge the differences between colours more consistently than the absolute shade).

      e.g. if you shoot a picture of a scene and then compare the result with the scene, you can see what parts don’t match. If you do that with another camera, you can give a comparative value of which camera is more accurate.
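That comparative value could be as simple as an average error against known chart patches. Here is a minimal sketch with invented numbers – a proper comparison would use CIE Lab delta-E rather than plain RGB distance, but the idea is the same:

```python
import numpy as np

# Hypothetical reference values for three chart patches (0-1 RGB) and what
# two cameras recorded after white balancing. All numbers are invented.
reference = np.array([[0.40, 0.50, 0.25],   # grass green
                      [0.60, 0.55, 0.20],   # lichen yellow
                      [0.30, 0.28, 0.27]])  # neutral grey

camera_1 = np.array([[0.41, 0.50, 0.27],
                     [0.60, 0.54, 0.21],
                     [0.31, 0.28, 0.28]])

camera_2 = np.array([[0.45, 0.48, 0.22],
                     [0.63, 0.52, 0.18],
                     [0.30, 0.29, 0.27]])

def mean_error(measured, ref):
    """Average per-patch Euclidean distance: lower means closer to the chart."""
    return float(np.linalg.norm(measured - ref, axis=1).mean())

print("camera 1:", round(mean_error(camera_1, reference), 4))
print("camera 2:", round(mean_error(camera_2, reference), 4))
```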

      And different examples of the same camera will have similar colour shifts, because it’s the spectral response of the Bayer filter material and the spectral sensitivity of the silicon that dictate the response. It’s just physics, but quite complicated physics.

      The effect of individual variations in colour is minor compared with the effects I’m looking for.

      • AlexeyD

        It would be interesting to test, in this respect, how today’s cameras (sensors with their CFAs) hold up in terms of colour vs film. A theme for another test perhaps? I have drawn conclusions for myself, but this of course is very subjective, and it would be interesting to see a more thorough and systematic approach (like you took in your fantastic article series comparing various films).

      • Seamuscamp

        “If you shoot a picture of a scene and then compare the result with the scene then you can see what parts don’t match. If you do that with another camera you can give a comparative value of which camera is more accurate.”

        That just seems wrong. As the saying goes, “you can’t step in the same river twice.” Go back to any site repeatedly and colour rendition with the same camera and the same settings is almost certain to be different. The idea that subjective comparisons could end with an objective value seems far-fetched.

        That is not to deny that there are real differences between cameras and between camera-lens combinations. Indeed the RAW conversion software from DxO addresses exactly this problem for a wide variety of cameras and lenses:
        “Each camera model performs this color rendition in its own way because camera manufacturers know that absolute color fidelity is usually not what is expected by photographers. Each camera therefore produces its own “image look”, very much in the same way that each film type produces a particular “image look”.”

        So the problem can be addressed in the laboratory in controlled and repeatable conditions but not, I suggest by the human eye in variable conditions.


        I have had DxO Optics Pro for a few years now and find it easy to use. Another DxO product (which addresses a problem of no interest to me personally) is the Film Pack, which purports to reproduce digitally the profiles of film stocks such as Velvia, Provia and Tri-X.

        • Hi Seamus – I agree that from day to day and hour to hour the light might change, but not from minute to minute in consistent light, and I’ll take a few shots side by side at the same time just as a check that minute-to-minute variation isn’t screwing things up.

          The results will be shown ‘as is’, and people will be able to make their own subjective judgements. I don’t plan on giving cameras a ‘score’, although I will publish my own personal conclusions (as per the Big Camera Test).

          And yes the raw converters also make a difference so they should be included.

          I’ll possibly try to do it under artificial, consistent lighting, but I’m not sure how much use that is for landscape photographers, as the whole issue is the way daylight, subject and camera spectral responses combine.

          We’ve tried the Film Pack in a previous article and it didn’t really do anything beyond some ‘flavour’ presets – it certainly didn’t simulate anything I could recognise…

          • Seamuscamp

            I won’t pretend to believe that the DxO Optics methodology produces “accuracy” (though it does correct for some “inaccuracies”); and the “film pack” philosophy is certainly dodgy. But both approaches try to escape from subjective assessments – and your methodology is not just subjective but only individually relevant. Specifically, you assume that one specific rendition is correct and then assess other renditions by eye, for adherence to that standard. This is the very antithesis of repeatability and is inherently biased. The point is underlined by your own words:
            “It’s also true that it may be the Nikon that has the wrong colour and it’s only through a little more testing that we’ll find out.”
            I suppose it depends what you mean by “a little”; and what you mean by “wrong”.

            This is an interesting exercise which shows that camera-lens partnerships matter when it comes to colour rendition.

            • I don’t think I’ve published a methodology yet, Seamus. I was planning to use a Pantone reference to pick colour matches for certain parts of the live scene, but this is possibly not the most important aspect. It’s the preference of rendition that is more important.

              Who cares if a camera is getting it wrong, if it’s getting it wrong in a believable and aesthetic way? Otherwise why would Velvia 50 and Portra have such a big following? Both render subjects in a way that doesn’t match reality, but they depart from reality in an aesthetic way.

              The main goal of this is to show people that there is a difference and show them what the difference will look like. If we can also show that some renderings match reality better than others then that would be a bonus.

              As far as the Nikon possibly being wrong, I was thinking it would be classified as ziffly doodad wrong by 24.66 meganoodles. ;-)

  • danfascia

    I’m still not convinced by this ‘not possible’ argument. Call me stubborn, but nobody has given a compelling counter scientific argument yet.

    I acknowledge that the Bayer filters are different in every camera etc etc and therefore a different (say) green gets recorded for every value on the histogram, but ultimately the digital photo is merely a sum of a proportionate spread of RGB pixels across a 2d matrix.

    Therefore, surely if you shoot 2 identical (and I mean totally identical) scenes with 2 different cameras, then you match their individual R, G and B curves like for like; then providing both sensors capture some information in the same zones, you should be able to compensate the difference in amplitude of every RGB value on the curves and get a similar “look”.

    If one were to shoot a standard identical scene on every sensor, then compare the variations achieved against the known standard at every R, G and B value combination, I cannot see why it would not be possible to create colour correction coefficients for individual sensors, at least relative to the accepted standard (perhaps the IT8?)
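In its simplest linear form, the correction coefficients Dan describes amount to a 3×3 matrix fitted by least squares from chart readings. Here is a sketch with invented values; it also illustrates the limitation raised elsewhere in this thread, since a single global transform maps identical raw values to identical outputs and so can never re-separate colours the sensor has already merged:

```python
import numpy as np

# Invented raw chart readings from one camera and the target values we
# want them to match (a reference chart, or another camera's readings).
camera_raw = np.array([[0.42, 0.48, 0.30],
                       [0.61, 0.52, 0.22],
                       [0.31, 0.30, 0.29],
                       [0.20, 0.40, 0.15],
                       [0.55, 0.35, 0.25]])

target = np.array([[0.40, 0.50, 0.25],
                   [0.60, 0.55, 0.20],
                   [0.30, 0.28, 0.27],
                   [0.18, 0.42, 0.14],
                   [0.57, 0.33, 0.24]])

# Fit the 3x3 matrix M minimising ||camera_raw @ M - target||.
M, _, _, _ = np.linalg.lstsq(camera_raw, target, rcond=None)
corrected = camera_raw @ M

# The fit brings the camera closer to the target overall, but it cannot
# tell apart any two patches the sensor recorded with identical raw values.
print(np.linalg.norm(camera_raw - target), np.linalg.norm(corrected - target))
```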

    • Hi Dan – take a look at this blog post..


      If you look at the 5D shot you’ll see that two substances that are actually different colours come out as exactly the same colour in the 5D file. You can’t do anything in post-processing to make these two substances look different again.

      This image shows that it isn’t an overall ‘cast’ issue, because the two colours would still be separated if it were. I’ll have to write a longer post about the science behind colour, I think – hopefully that will make things a little clearer.

    • AlexeyD

      > Therefore, surely if you shoot 2 identical (and I mean totally identical) scenes with 2 different cameras, then you match their individual R, G and B curves like for like; then providing both sensors capture some information in the same zones, you should be able to compensate the difference in amplitude of every RGB value on the curves and get a similar “look”.

      And what if one camera gives a metameric response to two different colours and another one does not? I.e., for example, camera one sees orange and light brown as the same colour, while camera two sees them as two different colours. You can get a “similar” look, sure, but the worse the initial colour separation, the less “possible” it becomes. The colours from a camera with weakened colour separation (a weak CFA) tend to break down more, and profiling only helps to a degree. Or rather, profiling becomes very dependent on lighting.

      • Hi Alexey – it’s even worse than you mention. It is entirely possible that two colours could actually swap places completely, e.g. you might have some grasses showing yellowy green and lichen showing pure green on one sensor, while on another the grasses show green and the lichen shows yellow. I’ve actually almost seen this happen when comparing Velvia 50 with the Canon 5D, where Velvia renders fluorescent green lichen as pale yellow and the Canon renders normal green grasses as yellowy green.

        • AlexeyD

          Thanks Tim, you are right – I was appalled when I compared Velvia 50 with the Nikon D300 and spent a long time trying to make the greens and yellows look right (and, as I said, ended up getting a Kodak SLR/n instead ;). I hope this trend of lowering CFA colour fidelity (if I can call it that) is starting to reverse – the D800 is a good example of it going the other direction, but imho it’s not entirely there yet.

  • Carlo Didier

    Thanks for this great article, Tim. Although I’m in the camp of those who don’t see such subtle differences, I perfectly understand the physics and the resulting problems (such as colours which can be separated by one sensor/camera but not by another, for example).

    • Hi Carlo – It’s difficult when there is so little grass in the shot. We’ll see how prominent it is when we shoot some spring greens etc..

  • Adam Pierzchala

    This is really interesting even if simply as a quest for knowledge, but I am not sure where this is leading us. I am a lapsed chemist rather than a physicist and although I can understand why different film emulsions produce different colour renditions, I am quite happy to just accept that there will likewise be colour differences between digital sensors.
    Having said that, what does the difference in colour rendition mean to camera users? As someone who is dabbling with digital on the periphery of my film work, I don’t think that I will be able to experiment with different digital cameras to see which one I like best. That would be very expensive – much more so than buying and trying different film emulsions! Yet if it becomes possible to say with certainty that I prefer the rendering produced by the Cankon to that of the Niktax, I would buy the former. Would that rendering vary with the colour of the light? Certainly we know that films alter their rendition of colour with changing in light, which is only to be expected. Would my preference for the Cankon be negated when the light becomes warmer at sunset, or colder before dawn? Would the next generation of sensors change their rendition, just like new film emulsions do? Perhaps more importantly, do I want to reproduce what I see or to show what the eye misses – ie. show the ‘hidden’ cast from a blue sky or the warmth of sunset. Sticking to AWB washes away much of the hidden colour. And all this without opening the debate about what is ‘correct’ or ‘natural’ colour!
    I really am not sure what benefit there will be to photographers who have already invested possibly thousands of pounds in lenses from one system or another, and are happy with the results, only to find that in a quasi-scientific comparison the colour rendition of the competitor’s brand is perhaps more appealing. I for one could not afford a major system change and would carry on with what I already have. But for someone about to buy into digital capture it would make sense to know how different cameras render colours. So whereas this is a fascinating subject in itself, I don’t think that we should get too hung up about it.

    • Hi Adam,

      I’m not sure if you’re saying that people won’t change systems, or that if they do they won’t care about the colour response of the cameras? Personally I know of quite a few people who have chosen camera systems based solely on the colour response (me, Dav, Joe to an extent).

      I don’t think the colour response we’re talking about changes with the colour of light because it’s the colour relationships that are the important thing, not the overall colour cast.

      At the end of the day people do change camera systems from time to time (witness the large number of people jumping ship to Nikon or picking a compact system camera to use with their primary tool), and if some cameras show more attractive colour relationships for some subjects it makes sense to include that as a criterion for selection.

      I think the problem is that it doesn’t get talked about at all and hence has become an ignored issue. If DxO demonstrated the differences and quantified them in a clearer way than their current metameric index I think people would be very interested. As it is, it’s not even included in their total scoring system.

      As for the benefit to people who have already chosen a system: if we can find out what the variances are, we can suggest ways to counter the problem. E.g. for quite a while Joe has worked with a system of tweaking the greens out of one of his digital cameras in order to get a more natural look, yet when using the IQ180 this hasn’t been necessary. Joe developed this technique organically over time; perhaps we can find a more practical way of arriving at these answers by comparing sensor results.

      Also it seems most camera reviews are more interested in high ISO performance or frame rate or video capability. Perhaps if we raise awareness of colour response, manufacturers might pay a tiny bit more attention to it?

      p.s. I’ve yet to find a photographer who is happy enough with the results of their camera not to buy another…

      • Adam Pierzchala

        Hi Tim,
        OK, I was not aware that colour rendition has differed so much as to persuade people to change and I bow to your wider knowledge. But as always, there are compromises, and a camera giving better colour may have worse resolution, dynamic range or low-light capability etc. In the end we all choose what’s most important to our photographic needs.
        From what I have seen and heard, photographers jump ship mostly because of AF ability, particularly in low light for wildlife or sports and similar photo-journalism. And these photographers are either professionals who can offset equipment changes as a legitimate business expense, or they are well-heeled amateurs who can afford to do so. In fact I got my two best Nikon lenses second-hand, in pristine condition and at an excellent price, precisely because two different guys decided to switch to Canon. Unless the tests show me that Nikon’s digital SLRs are really hopelessly off in their colour rendition, I am not going to invest in another system with new lenses – it’s just too expensive.
        I certainly don’t mean to belittle the interest in this subject, far from it, but I didn’t think that the differences were so great as to be seriously non-fixable (technical term…) in software. When you read reviews of new models in the photo press, you invariably get just a rather superficial comment about the colour, with occasional reference to flat RAW images – as is expected in any case. And quite often the comment is also made that colour imperfection can be corrected in post-processing.
        If in your tests you do indeed show that some cameras have a significantly ‘better’ colour output, and if the mainstream photo press latches on to this, then indeed perhaps camera and sensor manufacturers will start to take notice. That can only be a good thing. There may be a difficulty in defining just what ‘better’ means, because comparing against other cameras, or against film, may not suffice. Granted, one could try to use serious high-tech colour analysers, but can you use them in the field to measure the original subject? Or would you shoot standard subjects (test charts, colour patches…)? And then the different quality or colour temperature of light comes into play as well. You mentioned testing indoors with controlled light; perhaps a slide projector would be a good choice. I admire you for wanting to have a go but I suspect that this will be a very difficult task indeed!

        • Sorry – didn’t mean to sound defensive :-)

          I don’t expect this to be a major issue for most people, but given that most landscape photographers tend to care least for high ISO performance and auto focus, all that is left is the functional handling of the camera, resolution and colour (OK, I simplify).

          And yes, it won’t be particularly easy, but that is probably why it hasn’t been looked at before. I like a challenge too! (Although my PhD tutor might disagree.)

  • Adam Pierzchala

    Well, we agree on what landscape photogs want most from their cameras – so we’re still friends! Have fun, Adam

On Landscape is part of Landscape Media Limited , a company registered in England and Wales . Registered Number: 07120795. Registered Office: 1, Clarke Hall Farm, Aberford Road, WF1 4AL