  #3  
Old 7th December 2021, 12:11 AM
John King
 

Quote:
Originally Posted by Mike O'Pray
Terry, I have seen this mentioned before, and in theory it sounds sensible to use the same method for both test strips and final print. But in practice, does the interval method (say, 3 strips of 4 secs to reach the 12 secs strip) differ enough from a single 12 sec exposure for the final print to make a difference that the average printer's eye (that's me) can detect?

I think it was Ralph Lambrecht who said that he can see a 1/12th of a stop difference in a print. I doubt I can, but assuming this is the average-to-good printer's ability, then from the f-stop tables 1/12th of a stop on a 12 sec exposure works out to about 0.7 secs. So even with good "printer's eyes", the difference between, say, 3 or 4 intervals of 4 or 3 secs and one straight 12 sec print would have to exceed 0.7 secs to be noticeable. Is it more than 0.7 secs? It doesn't seem likely to me.

That sounds like a finer difference than I can see.

Mike
I doubt very much that anyone can see a difference of 1/12th of a stop. Human eyes are not made that way. Possibly, if you are using a hard grade (4 and above), you should be able to see a difference of 1/4 of a stop, or perhaps 1/6th. But 1/12th? Nope.

Try it using the method where you calculate the exposure in stops, with step increases of 1/4 of a second, and you will be able to decide for yourself.
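For anyone wanting to check the arithmetic in the quoted post, here is a minimal sketch (not from the thread; the 12 sec base and 4 sec intervals are the figures Mike used) showing how 1/12th of a stop on a 12 sec exposure comes out at roughly 0.7 secs, and how the cumulative linear intervals add up to the same nominal time as one straight exposure:

```python
# One stop doubles the exposure, so a 1/12-stop increase multiplies
# the time by 2**(1/12). The extra time over a 12 s base is:
base = 12.0
one_twelfth_stop = base * (2 ** (1 / 12) - 1)
print(round(one_twelfth_stop, 2))  # ~0.71 s, matching the 0.7 s figure

# Cumulative linear test-strip intervals vs one straight exposure:
intervals = [4.0, 4.0, 4.0]   # 3 strips of 4 s
total = sum(intervals)
print(total)                   # 12.0 s nominal, same as the single print

# For the intermittent exposure to be visibly different from the
# straight 12 s print, the effective discrepancy would need to
# exceed that ~0.71 s threshold (on the 1/12-stop assumption).
```

Note this only covers the timing arithmetic; any real difference between intermittent and continuous exposure would come from the paper's intermittency behaviour, which the numbers above cannot predict.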