Hey fellow Pier 1/9 members,
I've noticed a lot of great targets being added to the Pier 9 queue.
I've also noticed many RGB jobs where L is not added to the filter list.
I would like to understand why L was not selected. Do you not "need" L for your integration process?
To me, L is a great way to add detail and reduce noise in the final image while saving a lot of imaging time compared to pure RGB. The logic and the physics/statistics behind it are pretty clear: some great astrophotographers even suggest spending ~50% of the total imaging time on L and only 50% or less on the RGB channels. So I wonder why some of you don't add L to the RGB job list at all?
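As a rough back-of-envelope sketch of that statistics argument (the flux numbers are purely illustrative assumptions, not measured values; the only real physics here is that photon-limited SNR scales with the square root of the collected signal):

```python
import math

def snr(flux_per_hour, hours):
    # In the photon-noise-limited regime, SNR scales with the
    # square root of the total collected signal.
    return math.sqrt(flux_per_hour * hours)

# Illustrative assumption: each colour filter passes 1 flux unit/hour,
# and the broadband L filter passes roughly the combined band (~3 units/hour).
total_hours = 12.0

# Pure RGB: 4 h per channel; detail is limited by each channel's SNR.
rgb_channel_snr = snr(1.0, total_hours / 3)   # sqrt(4)  = 2.00

# LRGB with 50% on L: 6 h of broadband L carries the detail budget.
l_channel_snr = snr(3.0, total_hours * 0.5)   # sqrt(18) ~ 4.24

print(f"RGB channel SNR: {rgb_channel_snr:.2f}")
print(f"L channel SNR:   {l_channel_snr:.2f}")
```

Under these toy numbers the L stack ends up with roughly twice the SNR of any single colour channel for the same total telescope time, which is the usual argument for spending a large fraction of the session on L.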
CS
Martin
A little over a month ago there was a "debate" about collecting L data, and it really depends on the target and focal length. Because of the smaller pixel size and higher sensitivity of the new CMOS sensors, L data is not as important for capturing detail. When I go after dark nebulae or IFN I will capture L data, but with bright targets I will skip it. If, after stacking and processing the RGB channels, the image is still too noisy or missing detail, I will shoot a session of L data.
Also, shorter pier submissions help with scheduling and keep the data moving.
Daniel
Hello Martin.
LRGB is quite an old approach to imaging. It was originally used when binning the colour channels at a higher rate, taking advantage of the increase in signal while using the L channel to keep the resolution. With more modern processing techniques the L filter is becoming more and more redundant: creating a pseudo-luminance by combining the RGB channels works extremely well, especially on a setup with a very high sampling rate like Pier 9. Software binning is the better choice for CMOS, and since every filter is captured at 1x1, that option stays open for every channel.
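A minimal sketch of that pseudo-luminance step, assuming calibrated and registered R, G and B stacks as NumPy arrays (the equal weights are just an illustrative default, not a recommended calibration):

```python
import numpy as np

def pseudo_luminance(r, g, b, weights=(1.0, 1.0, 1.0)):
    """Combine registered R, G, B stacks into a synthetic luminance.

    A weighted average keeps all of the photons already captured for
    the colour channels; unequal weights could compensate for filter
    bandwidth or sensor QE differences."""
    wr, wg, wb = weights
    return (wr * r + wg * g + wb * b) / (wr + wg + wb)

# Averaging three independent frames cuts random noise by ~sqrt(3).
rng = np.random.default_rng(42)
shape = (64, 64)
r, g, b = (rng.normal(100.0, 9.0, shape) for _ in range(3))
lum = pseudo_luminance(r, g, b)
print(f"per-channel noise ~9.0, pseudo-L noise: {lum.std():.2f}")
```

The point is that no extra telescope time is spent: the luminance is synthesised from data the RGB job already collected.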
Personally I rarely use luminance and have actually taken it out of my setup at home. Having said that, there are times when luminance is a good option, e.g. for IFN. Even then, because IFN has quite a high signal in the IR, a clear filter with no IR blocking is actually better; unfortunately we don't have that option.
In conclusion, I am of the mind that luminance only needs to be used on certain objects and is not needed for the majority.
Cheers
Peter
Peter Shah
Roboscopes Observatory Controller
Luminance locks in the sensor's irregular response to different wavelengths of light: most sensors are most sensitive around the blue/green boundary, less sensitive to high-frequency blue, and even less sensitive to red and the red/IR boundary.
So luminance, while being a great shortcut in some situations (e.g. home setups with limited exposure time and no ability to acquire data all night, even on nights that are totally clear), is imperfect and locks a frequency bias into images that can be avoided by taking individual R, G and B data.
Here, theoretically, we have enough good sky time for plenty of useful acquisition.
Since Pier 9 is massively oversampled at bin 1, we can reduce noise (but not detail) by software binning the calibrated subs, and reduce the amount of adjustment needed in colour balancing by specifying more red subs than blue and green.
The new RGB calculator (thanks, Manuel) highlights this; have a play with it.
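The software binning mentioned above can be sketched as follows, assuming a 2-D NumPy image (the 2x2 factor and the frame values are illustrative):

```python
import numpy as np

def software_bin(img, factor=2):
    """Average-bin a 2-D image by `factor` along each axis.

    Averaging factor*factor pixels reduces the per-pixel random noise
    by ~factor (standard error of the mean), at the cost of sampling,
    which is harmless on a heavily oversampled setup."""
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor   # crop to a clean multiple
    blocks = img[:h2, :w2].reshape(h2 // factor, factor,
                                   w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

# Example: a flat synthetic sub with Gaussian noise.
rng = np.random.default_rng(7)
sub = rng.normal(1000.0, 20.0, (512, 512))
binned = software_bin(sub, factor=2)
print(f"noise before: {sub.std():.1f}, after 2x2 bin: {binned.std():.1f}")
```

Binning in software after calibration, rather than in hardware at capture, keeps the 1x1 data intact so the binning factor can be chosen per target during processing.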