
The Great Camera Shootout 2011: Episode 1 ~ "The Tipping Point"

  • 0:51 - 0:54
    8 months ago, we started this process
  • 0:54 - 0:56
    that led to what you are about to see right now.
  • 0:56 - 1:00
    Episode 1! Welcome to the Great Camera Shootout 2011!
  • 1:00 - 1:02
    By far, one of the greatest camera evaluations
  • 1:02 - 1:04
    to date.
  • 1:04 - 1:06
    The testing this year was a huge undertaking.
  • 1:06 - 1:08
    We asked Robert Primes, ASC to design
  • 1:08 - 1:10
    and administer his own tests.
  • 1:10 - 1:12
    That's right, we didn't want people to think these
  • 1:12 - 1:15
    tests were biased in any way, and they're not.
  • 1:15 - 1:18
    So Bob created the Single Chip Camera Evaluation
  • 1:18 - 1:21
    or SCCE, an independent organization
  • 1:21 - 1:23
    to conduct the testing.
  • 1:23 - 1:25
    Our documentary is about the process that Bob
  • 1:25 - 1:28
    took to design and conduct his tests, and what
  • 1:28 - 1:31
    you'll see is quite a bit different from what we
  • 1:31 - 1:32
    did last year.
  • 1:32 - 1:34
    Bob designed the tests with his technicians.
  • 1:34 - 1:38
    Bob ran the tests, with zero Zacuto involvement.
  • 1:38 - 1:40
    Bob picked his crew, and Bob picked the cameras
  • 1:40 - 1:43
    to be tested. So essentially, there were two crews
  • 1:43 - 1:46
    on the set. Our Zacuto documentary crew
  • 1:46 - 1:48
    and Bob's SCCE crew.
  • 1:48 - 1:51
    Last year, the testing only focused on DSLRs
  • 1:51 - 1:53
    because they were new, and nobody knew much about them.
  • 1:53 - 1:55
    But a lot of guys chimed in and asked
  • 1:55 - 1:57
    "Why didn't you include the RED?"
  • 1:57 - 2:00
    and the reason is it was a DSLR test.
  • 2:00 - 2:03
    These tests are incredibly complicated, and last
  • 2:03 - 2:06
    year we just wanted to test DSLRs against film.
  • 2:06 - 2:08
    This year, we have a different mission.
  • 2:08 - 2:11
    To test cameras we like to call "cinema cameras".
  • 2:11 - 2:13
    You know, cameras you could potentially use to
  • 2:13 - 2:15
    shoot a feature or a short.
  • 2:15 - 2:17
    One criterion we decided on was that these
  • 2:17 - 2:19
    were to be large sensor cameras, that
  • 2:19 - 2:21
    could only come from the manufacturer's pro line,
  • 2:21 - 2:23
    and not from the consumer divisions.
  • 2:23 - 2:26
    Again, it was Bob's call and he had the final say
  • 2:26 - 2:28
    on what cameras were included.
  • 2:28 - 2:30
    We're testing the Sony F35,
  • 2:30 - 2:31
    and the ARRI Alexa,
  • 2:31 - 2:33
    the RED ONE with the MX sensor
  • 2:33 - 2:35
    at the time the EPIC was not available to us,
  • 2:35 - 2:37
    believe me we begged for it.
  • 2:37 - 2:39
    the Sony F3
  • 2:39 - 2:40
    the Panasonic AF-100
  • 2:40 - 2:45
    35mm film, Kodak stocks 5219 and 5213
  • 2:45 - 2:47
    the Phantom FLEX
  • 2:47 - 2:48
    and WeissCam HS-2
  • 2:48 - 2:50
    the Canon 1D Mark IV
  • 2:50 - 2:51
    5D Mark II
  • 2:51 - 2:53
    and 7D
  • 2:53 - 2:55
    and the Nikon D7000
  • 2:55 - 2:57
    Bob and his team designed 15 different tests
  • 2:57 - 2:59
    that really stressed these cameras.
  • 2:59 - 3:01
    Here's the deal, of course we could have made
  • 3:01 - 3:04
    every one of the cameras look great, in each test.
  • 3:04 - 3:06
    that's not the point of this camera comparison.
  • 3:06 - 3:09
    The point is to show in a stressed scenario,
  • 3:09 - 3:12
    how well each camera can perform against the others.
  • 3:12 - 3:13
    So, I don't want to hear
  • 3:13 - 3:15
    "I know I can make my camera look better"
  • 3:15 - 3:18
    because, with proper lighting we know you can.
  • 3:18 - 3:21
    Plus, a great DP can make any of these cameras
  • 3:21 - 3:23
    look amazing in any of these situations.
  • 3:23 - 3:26
    They shot a combination of scientific charts
  • 3:26 - 3:27
    and real world scenes.
  • 3:27 - 3:31
    Both are necessary in making a meaningful evaluation.
  • 3:31 - 3:34
    Also, a lot of steps were taken to be as fair as possible.
  • 3:34 - 3:37
    Each test used the same PL-mounted lens
  • 3:37 - 3:40
    for all the cameras, except for the D7000,
  • 3:40 - 3:43
    which was not available with a PL mount, so it
  • 3:43 - 3:45
    used Zeiss ZF glass.
  • 3:45 - 3:48
    For the most part, we used the Fujinon 4K-rated zooms.
  • 3:48 - 3:51
    All the manufacturers were invited to be involved
  • 3:51 - 3:53
    with the test and provide a technician with
  • 3:53 - 3:56
    their camera. In the cases where a manufacturer
  • 3:56 - 3:58
    declined to send somebody, Bob assigned a
  • 3:58 - 4:00
    Camera Master to manage that camera as
  • 4:00 - 4:03
    it rotated through all the tests.
  • 4:03 - 4:05
    Each test also had a Station Chief,
  • 4:05 - 4:07
    who kept the test consistent between cameras.
  • 4:07 - 4:11
    And a dedicated team of data wranglers under
  • 4:11 - 4:12
    the direction of Mike Curtis and Bill Hogan to
  • 4:12 - 4:14
    manage all the media.
  • 4:14 - 4:17
    A lot of people say "Everything looks good on the Web"
  • 4:17 - 4:19
    and we have taken great care to compress
  • 4:19 - 4:21
    this documentary so that you can really see
  • 4:21 - 4:24
    the details. So, don't be pissed that this show takes
  • 4:24 - 4:25
    a while to load.
  • 4:25 - 4:28
    Like last year, we wanted to give you the
  • 4:28 - 4:30
    "cinema experience" and we do this in
  • 4:30 - 4:33
    an interesting way. By allowing you to hear comments
  • 4:33 - 4:36
    from viewers in theatrical screenings, which we
  • 4:36 - 4:39
    held in Sydney, Amsterdam, New York, London
  • 4:39 - 4:43
    NAB and Hollywood. You'll hear from indie and
  • 4:43 - 4:47
    feature filmmakers, event shooters, commercial DPs,
  • 4:47 - 4:49
    directors and corporate filmmakers. Even though
  • 4:49 - 4:53
    you aren't seeing this in a 2K theater, hearing
  • 4:53 - 5:00
    from people in the ASC, BSC, ACS, CSC, NSC, ICG
  • 5:00 - 5:04
    and the SOC should help you evaluate the cameras
  • 5:04 - 5:06
    as they did, and give you the theatrical experience
  • 5:06 - 5:08
    on the web.
  • 5:08 - 5:10
    The tests that we'll see in Episode 1 deal with
  • 5:10 - 5:12
    Dynamic Range and Latitude.
  • 5:12 - 5:14
    We'll see an ARRI Dynamic Range test chart,
  • 5:14 - 5:16
    shot by Michael Bravin.
  • 5:16 - 5:18
    And we'll see a pair of scenes that test
  • 5:18 - 5:19
    Underexposure and Overexposure
  • 5:19 - 5:21
    These scenes were lit by Matt Siegel
  • 5:21 - 5:23
    and Nancy Schreiber, ASC.
  • 5:23 - 5:25
    These scenes will help us see the usable latitude
  • 5:25 - 5:26
    of each camera.
  • 5:26 - 5:28
    But first we need to know
  • 5:28 - 5:30
    "What is Dynamic Range?"
  • 5:30 - 5:32
    A camera's dynamic range is the difference
  • 5:32 - 5:37
    between the darkest object a camera can photograph
  • 5:37 - 5:39
    and the lightest.
  • 5:39 - 5:41
    And what we're doing in this station is we're measuring
  • 5:41 - 5:43
    dynamic range. The practice behind this is
  • 5:43 - 5:45
    there is a piece of motion picture film behind it
  • 5:45 - 5:47
    that's checked with a densitometer. And what you
  • 5:47 - 5:49
    do is you set the exposure for the camera,
  • 5:49 - 5:51
    and where you lose detail in the vertical and
  • 5:51 - 5:53
    horizontal lines is your clipping point. And where
  • 5:53 - 5:56
    you lose detail, because of noise in the shadow
  • 5:56 - 5:58
    area is your lowest exposure in your black area.
  • 5:58 - 6:01
    And in between, you end up finding the number
  • 6:01 - 6:02
    of stops of dynamic range.
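
To make the arithmetic behind this measurement concrete: dynamic range in stops is the base-2 logarithm of the ratio between the brightest level a camera records before clipping and the darkest level it can separate from the noise floor. The following is a minimal sketch with hypothetical luminance values; it illustrates the arithmetic only and is not part of the SCCE procedure itself.

```python
import math

def dynamic_range_stops(clip_luminance, noise_floor_luminance):
    """Stops between the clipping point and the noise floor,
    i.e. log2 of the ratio of the two luminance levels."""
    return math.log2(clip_luminance / noise_floor_luminance)

# Hypothetical example: a camera that clips at a scene luminance of
# 4096 units and loses shadow detail to noise below 1 unit measures
# 12 stops of dynamic range.
print(dynamic_range_stops(4096, 1))  # 12.0
```
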
  • 6:02 - 6:05
    Now let's see the actual test footage from
  • 6:05 - 6:07
    these cameras.
  • 6:37 - 6:39
    Jack Holm from Tarkus Imaging took these
  • 6:39 - 6:41
    RAW files from this test and computed these
  • 6:41 - 6:44
    dynamic range numbers for each of the cameras.
  • 6:44 - 6:47
    But the numbers only tell part of the story,
  • 6:47 - 6:50
    there's a difference between calculated dynamic range
  • 6:50 - 6:54
    and what we would call usable exposure latitude.
  • 6:54 - 6:58
    I was surprised that the F3 and the 5D Mark II,
  • 6:58 - 7:01
    delivered the same amount of latitude at 11.2 stops.
  • 7:01 - 7:05
    The, um, 5D I tend to rate it more at 10, you know
  • 7:05 - 7:08
    to be honest. Just to be safe. I mean, personally,
  • 7:08 - 7:14
    11.2 stops is a little bit generous. On the other hand,
  • 7:14 - 7:18
    I think, um, the Alexa, I mean at 14.1, I found when
  • 7:18 - 7:22
    Tony and I did tests, we...we found
  • 7:22 - 7:25
    actually far more latitude in the Alexa
  • 7:25 - 7:27
    than we did on film.
  • 7:27 - 7:29
    I was surprised by the Phantom Flex actually.
  • 7:29 - 7:32
    I always thought of it as just a high speed camera,
  • 7:32 - 7:33
    I didn't think of it as a camera that you'd actually
  • 7:33 - 7:35
    shoot a lot of other stuff with.
  • 7:35 - 7:37
    Definitely the latitude that it had, I didn't
  • 7:37 - 7:39
    expect it to have anything like that.
  • 7:39 - 7:42
    Generally, most of them, you know, from the Sony
  • 7:42 - 7:47
    to the D7000 they're all very very close.
  • 7:47 - 7:49
    Once you get used to the flavor of those,
  • 7:49 - 7:52
    as a lighting guy, it seems that you could be
  • 7:52 - 7:54
    you know, comfortably moving from one to the other
  • 7:54 - 7:56
    But then when you go back to film, it's a whole
  • 7:56 - 7:59
    different ball game man.
  • 7:59 - 8:01
    You know a lot of the cameras up there,
  • 8:01 - 8:03
    have the same, like I don't know if the
  • 8:03 - 8:07
    Canons can see 14...what is it, 11 stops.
  • 8:07 - 8:10
    I mean, 11 stops is certainly not shooting like reversal
  • 8:10 - 8:12
    that's a big amount of latitude.
  • 8:12 - 8:16
    That's the usable, versus unusable latitude, I'm sure
  • 8:16 - 8:18
    you could see a trace of light down there, but..
  • 8:18 - 8:19
    Right, yeah ok.
  • 8:19 - 8:21
    The noise floor...
  • 8:21 - 8:23
    But if it's got more noise in the picture does
  • 8:23 - 8:25
    it count?
  • 8:25 - 8:27
    We also need to see how these cameras record
  • 8:27 - 8:30
    real scenes. To do this, an underexposed scene
  • 8:30 - 8:32
    and an overexposed scene were designed.
  • 8:32 - 8:35
    The Camera Master had to set their camera
  • 8:35 - 8:37
    to record the widest dynamic range and was not
  • 8:37 - 8:40
    allowed to change settings between the two scenes.
  • 8:40 - 8:42
    The combination of both of these scenes will show
  • 8:42 - 8:46
    the usable dynamic range of each camera.
  • 8:46 - 8:48
    The Underexposure Scene, which was lit by Matt Siegel,
  • 8:48 - 8:50
    was designed to intentionally underexpose
  • 8:50 - 8:52
    the camera.
  • 8:52 - 8:55
    The new technology's pretty exciting out there.
  • 8:55 - 8:57
    And what we're finding is that you have cameras
  • 8:57 - 8:59
    that can allow you to be more creative
  • 8:59 - 9:01
    you have the What You See Is What You Get factor.
  • 9:01 - 9:05
    We've been able to light, in a more bold manner.
  • 9:05 - 9:08
    We've been able to take more chances, with our images,
  • 9:08 - 9:10
    because you can see it right away.
  • 9:10 - 9:12
    Like, in this set that's behind me, you know,
  • 9:12 - 9:14
    who in their right mind would light with a
  • 9:14 - 9:16
    practical unit that's only 15 watts?
  • 9:16 - 9:18
    You know, shooting film, that's pretty gutsy.
  • 9:18 - 9:20
    But now with the newer technology the sensitivity
  • 9:20 - 9:22
    that we're going to see in the cameras in these tests
  • 9:22 - 9:25
    allows us to be pretty bold in our choices, allows us to
  • 9:25 - 9:27
    maybe light with higher contrast, maybe allow
  • 9:27 - 9:30
    characters to go into the darkness a little bit more,
  • 9:30 - 9:32
    and take those chances.
  • 9:32 - 9:35
    So it's exciting to be able to push the technology
  • 9:35 - 9:37
    and have pleasant results.
  • 9:37 - 9:39
    Hey man! Tell us what you got
  • 9:39 - 9:40
    cooking here. Good to see ya!
  • 9:40 - 9:43
    Hello! Welcome to Set 3, um we have a nice night
  • 9:43 - 9:46
    interior going here. And, ah, should we take a
  • 9:46 - 9:48
    look at what's happening on the set here?
  • 9:48 - 9:50
    Here's Claudia, and she's settled right here
  • 9:50 - 9:52
    in this lovely position. And what we'll be
  • 9:52 - 9:54
    able to do is have a base exposure, at this point,
  • 9:54 - 9:56
    of an f/2.0 at ASA 320.
  • 9:56 - 9:58
    So this is going to be our base that we work from,
  • 9:58 - 10:01
    and set all the cameras to a nice standard.
  • 10:01 - 10:03
    And then the right side of the frame, is basically
  • 10:03 - 10:06
    4.5 stops underexposed. So what we've done is,
  • 10:06 - 10:09
    there's a little LED hiding back there as well,
  • 10:09 - 10:11
    'cause you want a little bit of separation.
  • 10:11 - 10:13
    We've matched the shadow play back here with
  • 10:13 - 10:19
    the levels on her face. So it's again, an f/0.5
  • 10:19 - 10:21
    and an f/0.75. We have the two kino's, and again
  • 10:21 - 10:24
    very low light levels, just single tube on each
  • 10:24 - 10:26
    of these two-footers, coming through the glass,
  • 10:26 - 10:29
    and the spot reading on that matches our key.
  • 10:29 - 10:31
    So as we bring up this side of the set we'll also
  • 10:31 - 10:33
    be able to see how it compares to the mid tone
  • 10:33 - 10:34
    of the face.
  • 10:34 - 10:37
    We took it down, we took it down, to really test
  • 10:37 - 10:39
    the cameras and then went down to 25 watts.
  • 10:39 - 10:40
    Just too much light.
  • 10:40 - 10:42
    The other thing that's really interesting to see is that
  • 10:42 - 10:45
    we have a little bit of contrast. You know not even,
  • 10:45 - 10:47
    a 2:1 ratio on some of these things, depending on
  • 10:47 - 10:48
    where she stands.
  • 10:48 - 10:50
    To the eye, you don't really see it.
  • 10:50 - 10:52
    Now let's see how each of the cameras did
  • 10:52 - 10:53
    with this challenge.
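
A side note on the stop arithmetic in this setup: each stop of underexposure halves the light, so a side of the set that sits 4.5 stops under the f/2.0 at ASA 320 base receives only a small fraction of the key level. A minimal sketch of that relationship; the 4.5-stop figure comes from the scene as described, and the helper name is hypothetical.

```python
def relative_exposure(stops_under):
    """Fraction of the key level that remains after going down
    by `stops_under` stops (each stop halves the light)."""
    return 0.5 ** stops_under

# The shadow side of the underexposure set is 4.5 stops under the
# f/2.0 @ ASA 320 base exposure, i.e. roughly 4.4% of the key level.
print(relative_exposure(4.5))  # ~0.0442
```
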
  • 12:16 - 12:18
    It's nice to have a reference point.
  • 12:18 - 12:20
    So in this test Bob chose the RED ONE as
  • 12:20 - 12:22
    the reference to the other cameras.
  • 12:22 - 12:24
    This does not mean it performed the best,
  • 12:24 - 12:26
    it's simply a reference.
  • 13:22 - 13:24
    I was impressed with the shadow detail in some
  • 13:24 - 13:26
    of the cameras, which I wasn't expecting to
  • 13:26 - 13:27
    see the shadow detail.
  • 13:27 - 13:30
    I focused on the woman's dress, which was
  • 13:30 - 13:34
    fairly high contrast; it seemed monochromatic,
  • 13:34 - 13:37
    black and white. You almost couldn't see any detail
  • 13:37 - 13:40
    whatsoever and on the cameras that performed better,
  • 13:40 - 13:42
    I could clearly discern the separation between
  • 13:42 - 13:43
    coat and dress.
  • 13:43 - 13:45
    You know, I think it's interesting how about a third
  • 13:45 - 13:49
    of the image was lost in terms of low light sensitivity.
  • 13:49 - 13:52
    And it just became a black, charcoal mass
  • 13:52 - 13:54
    with about a third of the cameras.
  • 13:54 - 13:57
    And I'm surprised that many of them, I think
  • 13:57 - 13:59
    three or four of them, really failed to resolve
  • 13:59 - 14:00
    anything in that area.
  • 14:00 - 14:01
    We've all come through the "digital revolution"
  • 14:01 - 14:04
    I still would have expected film to be better
  • 14:04 - 14:05
    than what it was.
  • 14:05 - 14:07
    The shadow detail in the film,
  • 14:07 - 14:09
    -Yeah I was surprised by that
  • 14:09 - 14:12
    Which would be bound to get worse if you started
  • 14:12 - 14:13
    duping it.
  • 14:13 - 14:18
    Having shot film for so long, um,
  • 14:18 - 14:20
    thinking we always had the edge on
  • 14:20 - 14:23
    the digital realm. And suddenly seeing the differences
  • 14:23 - 14:27
    in the shadow detail, was actually staggering.
  • 14:27 - 14:29
    There was the one with the woman in the shadows,
  • 14:29 - 14:32
    and her dress seemed to have more detail with
  • 14:32 - 14:34
    the 7D versus the 5D.
  • 14:35 - 14:38
    I was surprised by the performance of the F3.
  • 14:38 - 14:41
    It was sharper in the shadows, and it was crisper,
  • 14:41 - 14:43
    compared to the AF-101, I thought it looked
  • 14:43 - 14:44
    a lot better.
  • 14:44 - 14:47
    The cameras are really handling the low lights well, and
  • 14:47 - 14:49
    as we'll see in the test, they're handling the
  • 14:49 - 14:50
    highlights better.
  • 14:50 - 14:52
    So that was the underexposure, now let's
  • 14:52 - 14:54
    see how the cameras handled overexposure.
  • 14:54 - 14:57
    What we're testing here, is that if you have
  • 14:57 - 15:00
    a pretty normally lit scene. Maybe a little
  • 15:00 - 15:03
    high key, but pretty normally lit. And you've
  • 15:03 - 15:08
    got something too hot, you know really really hot
  • 15:08 - 15:11
    Can the camera hold on to those tones?
  • 15:11 - 15:13
    If you could reach in and grab those tones
  • 15:13 - 15:16
    and bring them down. Would they be complete?
  • 15:16 - 15:17
    Or would they be burned out?
  • 15:17 - 15:19
    So, instead of just lighting this so it's
  • 15:19 - 15:21
    perfect for this camera, if we did this for all the
  • 15:21 - 15:22
    cameras, they'd all look the same. You know what
  • 15:22 - 15:24
    I'm saying? "Oh all the cameras are good"
  • 15:24 - 15:25
    And that's not much of a conclusion.
  • 15:25 - 15:27
    It would not tell you which cameras are better
  • 15:27 - 15:30
    at holding highlights. We want to see the latitude
  • 15:30 - 15:33
    of the cameras so the film speed that we used
  • 15:33 - 15:37
    for Matt's challenge, which is "how well can you
  • 15:37 - 15:40
    bring up the shadows" and this one "how well
  • 15:40 - 15:41
    can you bring down the highlights"
  • 15:41 - 15:43
    We demand the same film speed, to keep everything honest.
  • 15:43 - 15:46
    And that tells you the total latitude of the camera.
  • 15:46 - 15:49
    My assignment was to light a day interior.
  • 15:49 - 15:53
    So, I asked for blondes. And we're testing for
  • 15:53 - 15:56
    the color white and black and then the red.
  • 15:56 - 15:58
    We'll see if the polka dots even resolve
  • 15:58 - 15:59
    in any of the cameras.
  • 15:59 - 16:02
    We wanted a lot of hot lights, so I have a leko
  • 16:02 - 16:05
    going on her face in the mirror. And we just have
  • 16:05 - 16:07
    little spots of light here, with all of the little
  • 16:07 - 16:09
    lekos back there. Then what we do is,
  • 16:09 - 16:13
    we make a power window in post and bring it down
  • 16:13 - 16:15
    and see what cameras could handle this amount
  • 16:15 - 16:17
    of overexposure. What I also did,
  • 16:17 - 16:21
    there's one really hot spot, right between
  • 16:21 - 16:23
    the branches. So that's really hot.
  • 16:23 - 16:26
    Most of the backdrop is like 5 stops over and
  • 16:26 - 16:28
    that is 7 stops over key.
  • 16:28 - 16:31
    Let's see how they compare. As a reminder,
  • 16:31 - 16:34
    the window was intentionally overexposed
  • 16:34 - 16:35
    to see how the cameras would handle
  • 16:35 - 16:37
    extreme highlights.
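
For readers wondering what "make a power window in post and bring it down" does to clipped highlights, the sketch below simulates it in linear light: the sensor caps everything above its clipping point, so once areas like a backdrop at roughly 5 stops over key and a hot spot at 7 stops over key both hit that cap, no amount of pulling the window down can separate them again. The clip level and pixel values here are hypothetical, chosen only to illustrate the idea.

```python
import numpy as np

def pull_window_down(linear_pixels, stops, clip_level):
    """Simulate grading a highlight window down by `stops`:
    anything the sensor clipped is stuck at clip_level, so
    detail above the clip point cannot be recovered."""
    captured = np.minimum(linear_pixels, clip_level)  # what the sensor kept
    return captured / (2.0 ** stops)                  # bring the window down

# Scene values relative to key: 1.0 = key, 32 = 5 stops over, 128 = 7 stops over.
scene = np.array([1.0, 4.0, 32.0, 128.0])
# A sensor that only holds 3 stops over key (clip_level = 8) renders the
# 5-stop and 7-stop areas identically after the pull-down: both come back
# as the same flat value, i.e. burned out.
print(pull_window_down(scene, stops=5, clip_level=8.0))  # [0.031 0.125 0.25 0.25]
```
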
  • 17:57 - 17:59
    The Alexa was chosen as a reference
  • 17:59 - 18:01
    in this scene. Again, that doesn't mean it performed
  • 18:01 - 18:04
    the best, it's just a reference.
  • 18:58 - 19:00
    There was something that caught my eye, the 7D
  • 19:00 - 19:03
    for some reason, you could see the details for the
  • 19:03 - 19:07
    7D fairly well, but on the 5D everything else was
  • 19:07 - 19:10
    totally blown out.
  • 19:10 - 19:12
    And then you see the 7D performs a little better,
  • 19:12 - 19:16
    and then I hear somebody say that the 7D
  • 19:16 - 19:19
    has a standard underexposure...which leads it
  • 19:19 - 19:21
    to perform better here.
  • 19:21 - 19:24
    That may look better...but I still prefer the 5D.
  • 19:24 - 19:26
    It's more filmic.
  • 19:26 - 19:28
    Certainly as a photographer, the 5D performs more
  • 19:28 - 19:30
    filmic with its large chip than a camera
  • 19:30 - 19:32
    with a smaller chip.
  • 19:32 - 19:33
    The lack of highlights in the DSLRs
  • 19:33 - 19:35
    -No yeah...
  • 19:35 - 19:38
    It's just remarkable...
  • 19:38 - 19:39
    That's mainly what I'm interested in, actually,
  • 19:39 - 19:41
    is the clipping that goes on.
  • 19:41 - 19:45
    And the Alexa seemed, next to film, to handle it.
  • 19:45 - 19:50
    Everyone said that the 1D Mark IV has a lot better
  • 19:50 - 19:53
    depth, in terms of the darks, the highlights
  • 19:53 - 19:55
    were gone. I mean they were....
  • 19:55 - 19:58
    And the difference, specifically between
  • 19:58 - 20:02
    the three Canon cameras, was quite varied in
  • 20:02 - 20:04
    terms of the highs and the lows. Which surprised
  • 20:04 - 20:06
    me, I would have thought they were a lot
  • 20:06 - 20:07
    closer together.
  • 20:07 - 20:08
    The thing that really sticks out to me, and that
  • 20:08 - 20:09
    screams video, is when the highlights
  • 20:09 - 20:13
    go yellow. And that just drives me crazy.
  • 20:13 - 20:16
    And that's what I liked about the 5D is that it tends
  • 20:16 - 20:19
    to be, just stay white, and I saw that in the
  • 20:19 - 20:23
    F3 and the AF-100 really bad, it seemed like.
  • 20:23 - 20:26
    I've seen the AF-100 do some very awful
  • 20:26 - 20:27
    things with the highlights, and it was very well
  • 20:27 - 20:30
    behaved here, compared to what I've seen in the past.
  • 20:30 - 20:32
    Again, strange to see the F3, sometimes
  • 20:32 - 20:34
    it looks a little too "juicy" in places.
  • 20:34 - 20:38
    Some cameras, I think, clip a channel earlier than others.
  • 20:38 - 20:42
    So you have a color shift, and I've been shooting
  • 20:42 - 20:44
    film forever, and I'm used to that nice, gradual
  • 20:44 - 20:47
    roll off, and going into these beautiful highlights.
  • 20:47 - 20:51
    That's one thing I feel that digital cameras have
  • 20:51 - 20:53
    a ways to go to catch up yet, for the most part.
  • 20:53 - 20:56
    The Alexa looked really good, so you know,
  • 20:56 - 20:59
    we're getting there.
  • 20:59 - 21:02
    A lot of the cameras, were really poor, actually
  • 21:02 - 21:09
    I think the Alexa and the F35, clearly were the best.
  • 21:09 - 21:12
    I was disappointed with the highlights on the F3.
  • 21:12 - 21:14
    But, I loved the lowlights.
  • 21:14 - 21:18
    What surprised me, was that it was not as simple
  • 21:18 - 21:21
    as exposure latitude, or dynamic range,
  • 21:21 - 21:22
    a lot of cameras that did well in the highlights
  • 21:22 - 21:24
    did poorly in the shadows, and vice versa.
  • 21:24 - 21:26
    Which was really interesting to me.
  • 21:26 - 21:29
    One interesting calculation that Jack Holm did
  • 21:29 - 21:31
    was to see how each of the cameras distributed
  • 21:31 - 21:34
    their dynamic range based on an exposure
  • 21:34 - 21:37
    index rating of ISO 800.
  • 21:37 - 21:39
    You can see how there's no standard as to
  • 21:39 - 21:42
    how much highlight or lowlight latitude is recorded.
  • 21:42 - 21:45
    The 7D seems to have more recordable latitude
  • 21:45 - 21:50
    in the highlights than the 5D or the 1D, yet according
  • 21:50 - 21:52
    to their numbers, they actually have more
  • 21:52 - 21:56
    than the 7D, and they're right there with the F35,
  • 21:56 - 22:00
    which, just in highlight rolloff, I know is not correct,
  • 22:00 - 22:02
    and you can see it here, if you go back to the
  • 22:02 - 22:05
    F35, and compare how it recovers.
  • 22:05 - 22:07
    It recovers quite well.
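
The idea behind Jack Holm's distribution calculation is simple: at a chosen exposure index (here ISO 800), each camera's total dynamic range splits into some number of stops above middle grey (highlight latitude) and the remainder below it (shadow latitude), and that split is what varies from camera to camera. A minimal sketch with hypothetical numbers, not the actual SCCE figures:

```python
def split_latitude(total_stops, stops_above_mid_grey):
    """Split a camera's measured dynamic range into highlight and
    shadow latitude around middle grey at a given exposure index."""
    return stops_above_mid_grey, total_stops - stops_above_mid_grey

# Hypothetical example: a camera measured at 13 stops that keeps
# 6 stops of highlight headroom at ISO 800 leaves 7 stops for the shadows.
highlights, shadows = split_latitude(13.0, 6.0)
print(highlights, shadows)  # 6.0 7.0
```
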
  • 22:07 - 22:10
    And what's difficult about this, is that in film
  • 22:10 - 22:15
    there is a sweet spot of 320, that is where the stocks
  • 22:15 - 22:18
    have been tested. You cannot compare that
  • 22:18 - 22:20
    to a digital sensor with a sweet spot around 800.
  • 22:20 - 22:25
    Logically, the sensitivities in the highlights and lowlights
  • 22:25 - 22:28
    are comparable like apples and pears, but that's
  • 22:28 - 22:32
    a very technical subject.
  • 22:32 - 22:34
    But thats just something I noticed.
  • 22:34 - 22:37
    Yeah you know, video cameras don't handle highlights
  • 22:37 - 22:39
    very well, they were blown out, and if we
  • 22:39 - 22:41
    consider how that was shot, they chose a midpoint
  • 22:41 - 22:44
    so that everyone started at the midpoint, and you can go
  • 22:44 - 22:47
    7 stops into the shadows with the F3.
  • 22:47 - 22:49
    You can't go over it very much, and so if you
  • 22:49 - 22:53
    had set your midpoint farther down on the slider,
  • 22:53 - 22:56
    then I think, if you exposed for your highlights,
  • 22:56 - 22:57
    which generally you do with video, you expose
  • 22:57 - 23:00
    for your highlights, and then grade up.
  • 23:00 - 23:03
    You know the new F3 and the AF-100, although
  • 23:03 - 23:06
    they didn't hold up to the bigger cameras, they're
  • 23:06 - 23:08
    like 1/10 the price? And, they didn't seem like
  • 23:08 - 23:12
    1/10 the image, so as somebody who's going to
  • 23:12 - 23:14
    be shooting with my own gear or relatively inexpensive
  • 23:14 - 23:17
    gear, 'cause I don't have the opportunity to rent
  • 23:17 - 23:19
    an Alexa or something like that, for my projects.
  • 23:19 - 23:21
    I'm very excited about what's coming down the pipe here.
  • 23:21 - 23:24
    As professionals who are under a budget the
  • 23:24 - 23:26
    whole time, we all need to know the limitations
  • 23:26 - 23:29
    of what every camera has. If a producer's going to save 300
  • 23:29 - 23:32
    quid on shooting with a 5D, you can fire back
  • 23:32 - 23:35
    and say no because it's going to cost us 20 grand
  • 23:35 - 23:36
    extra in the post.
  • 23:36 - 23:39
    There's no one best camera. There's a million
  • 23:39 - 23:41
    answers, and there's a million best cameras.
  • 23:41 - 23:43
    It's the best camera, for each particular job,
  • 23:43 - 23:46
    but it does help to point out some of the strengths
  • 23:46 - 23:49
    and weaknesses that each camera has,
  • 23:49 - 23:53
    for each individual situation that you might be in.
  • 23:53 - 23:57
    The camera you use, the film stock you use.
  • 23:57 - 23:59
    I mean, they are preferences.
  • 23:59 - 24:04
    Compared to getting a good script, a good director,
  • 24:04 - 24:07
    and a good cast. Man it's a fraction of a percent,
  • 24:07 - 24:12
    of where you're going with the result. And your
  • 24:12 - 24:14
    own interpretation of all those.
  • 24:14 - 24:17
    If you're a great cinematographer, and this is the
  • 24:17 - 24:18
    instrument you're using. You don't say
  • 24:18 - 24:20
    "Oh, it's a piece of shit" and all that.
  • 24:20 - 24:23
    You say "Oh ok, what are it's strengths?"
  • 24:23 - 24:24
    "What are it's weaknesses?"
  • 24:24 - 24:25
    You work into the strengths of it, and you make it
  • 24:25 - 24:27
    look as good as possible. You know,
  • 24:27 - 24:28
    it's not the instrument, it's the....
  • 24:28 - 24:32
    it's what you have to say.
  • 24:32 - 24:34
    We have two more episodes coming that will cover
  • 24:34 - 24:37
    noise, motion artifacts, resolution
  • 24:37 - 24:39
    compression and color.
  • 24:39 - 24:41
    These tests are really interesting, and they've
  • 24:41 - 24:43
    never really been done like this before.
  • 24:43 - 24:46
    Those episodes will be coming in July and August.
  • 24:46 - 24:48
    We also need to thank all the companies that
  • 24:48 - 24:50
    really helped put this whole thing together.
  • 24:50 - 24:52
    Especially Eric Kessler, from Kessler Crane,
  • 24:52 - 24:55
    who was a financial contributor, along with Zacuto,
  • 24:55 - 24:56
    for this documentary.
  • 24:56 - 24:59
    We also need to thank the hundreds of technicians,
  • 24:59 - 25:01
    and volunteers that donated their weekends
  • 25:01 - 25:04
    to put together such an incredible test.
  • 25:04 - 25:07
    The SCCE and the Great Camera Shootout 2011
  • 25:07 - 25:11
    was a huge undertaking, involving thousands of man-hours.
  • 25:11 - 25:13
    Additionally, thanks go to
  • 25:13 - 25:16
    many of the rental houses in LA, especially
  • 25:16 - 25:18
    Clairmont Camera, who donated over two million
  • 25:18 - 25:21
    dollars worth of equipment for the six days of
  • 25:21 - 25:22
    production.
  • 25:22 - 25:24
    People need to realize, that this is not a
  • 25:24 - 25:26
    winner-take-all type test. Some cameras performed
  • 25:26 - 25:28
    well in certain situations, while others performed
  • 25:28 - 25:30
    well in other situations.
  • 25:30 - 25:32
    Not every camera is right for every job,
  • 25:32 - 25:35
    and I think this test shows this very well.
  • 25:35 - 25:38
    It was amazing to see this whole thing come together.
  • 25:38 - 25:40
    This was the biggest production we've ever done.
  • 25:40 - 25:41
    We had a team of producers,
  • 25:41 - 25:44
    two line producers, and hundreds involved.
  • 25:44 - 25:47
    We had four editors working two months, to make
  • 25:47 - 25:50
    one episode, and graphics teams both in house,
  • 25:50 - 25:55
    as well as Lorand Toth with his team in Romania.
  • 25:55 - 25:57
    I hope you enjoyed this episode and we'll
  • 25:57 - 25:59
    see you next month for Episode 2!
  • 25:59 - 26:21
  • 26:21 - 26:24
    English Subtitles by Scott Lynch
Title:
The Great Camera Shootout 2011: Episode 1 ~ "The Tipping Point"
Description:

In the most scientific camera comparison to date, "The Great Camera Shootout 2011: a documentary of the Single Chip Camera Evaluation (SCCE)" premieres with Episode 1: "The Tipping Point." The first episode of the 3-part web series examines three SCCE Tests: The Dynamic Range Test, The Under Exposure Test and The Over Exposure Test.
Robert Primes, ASC, designed and administered the full series of tests. "That's right," says Web Series Director Steve Weiss, "We didn't want people to think that these tests were biased in any way. So Bob created the SCCE as an independent organization to conduct the testing." Additionally, Bob Primes designed the tests with his own technicians and selected the cameras to be tested.

The impressive 12-camera line-up includes: 35mm Kodak 5213 & 5219 Film, Arri Alexa, RED ONE M-X, Weisscam HS-2, Phantom Flex, Sony F-35, Sony F3, Panasonic AG-AF100, Canon 5D Mark II, Canon 1D Mark IV, Canon 7D and Nikon D7000. In addition to the SCCE tests, the web series features commentary from some of the top DPs in the industry, which was filmed at worldwide screening locations in Sydney, Amsterdam, New York, London, Las Vegas (NAB) and Hollywood. "You'll hear from indie filmmakers, event shooters, commercial DPs, directors and corporate filmmakers," says Steve Weiss. "Although you are not watching these tests in a 2K theatrical setting, hearing such commentary from people in the ASC, BSC, ACS, CSC, NSC, ICG and the SOC will help you evaluate the significance of these tests."

This first episode of the series looks at the dynamic range and usable latitude of each camera. "We need to see how these cameras record real scenes," says Jens Bogehegn. Thus, Bob Primes, ASC, designed an under-exposure scene and an over-exposure scene. In regards to methodology, the camera master had to set their camera to record the widest dynamic range and they were not allowed to change any settings between the two scenes. The combination of both of these scenes will show the usable dynamic range of each camera. The featured scenes are shot by Michael Bravin and lit by Matt Siegel and Nancy Schreiber, ASC.

***The featured tests include three scenes: a back-lit test chart shot by Michael Bravin, an under-exposure scene lit by Matt Siegel and an over-exposure scene lit by Nancy Schreiber, ASC.

For more information on "The Great Camera Shootout 2011," go to http://www.zacuto.com/the-great-camera-shootout-2011. To watch previous episodes of "The Great Camera Shootout 2010," visit http://www.zacuto.com/shootout

Commentary in this Episode Features:
Academy Award Winner Russell Boyd ASC, ACS (Master & Commander: The Far Side of the World, Ghost Rider, Forever Young); Academy Award Nominee Don McAlpine ASC, ACS (Moulin Rouge, Patriot Games, X-Men Origins: Wolverine); Calvin Gardiner ACS; BAFTA TV Award Winner Nic Knowland, ASC (The Final Passage, Around the World in 80 Days); Emmy Award Nominee & ASC Nominee Gale Tattersall, ASC (House, Ghost Ship, Wild Orchid); Mykelti T. Williamson, Actor (Forrest Gump, 24, CSI: NY); Ken Glassing (CSI: Miami); Philip Bloom (Sophia’s People, Greenpeace: Voices of Change); Terry Hopkins (London Film School); Dan Chung, James Mathers (Digital Cinema Society), David Wexler (Broken, The Wind) and more.

CAST & CREW
The web series documentary features two different independent crews. The SCCE Crew: Administrator Robert Primes, ASC; Station Chiefs: Michael Bravin, Stephen Lighthill ASC, Nancy Schreiber ASC, Matt Siegel and Mike Curtis; Line Producer Josh Siegel. The Shootout 2011 Crew: Editor Karen Abad, Graphic Designer Chris Voelz, Producers: Daniel Skubal, Scott Lynch, Jens Bogehegn and Eric Kessler; Web Series Director Steve Weiss.
