Wednesday, 28 June 2023
  18 Replies
  526 Visits

Hello chaps

As we approach nebula season I thought I would take the opportunity to start a discussion on the relative merits of scheduling RGB vs LRGB jobs.

It's an old argument, but what is not old is the technology and hardware we all have access to. I think there are certainly some CCD-era aftershocks still permeating the astro community.

Modern CMOS cameras are so much more sensitive than CCD. AI deconvolution and noise reduction software are game changers (I'm looking at you, BlurXTerminator). BXT provides the biggest jump in image quality I have seen, dollar for dollar, and goes a long way towards removing the 'need' for a dedicated Lum stack.

So!

With these amazing new tools and such sensitive hardware, do we need dedicated Lum any more? Can we not just synthesise a Lum from the stacked RGB image? The synthetic Lum can be processed as normal and will be a *perfect match* for the RGB, because that's where it came from.

L is a component of every R, G and B image anyway. This means we can put more imaging time in where it matters.
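As a rough sketch of what I mean (assuming numpy and astropy are available; the filenames and the equal weighting are just placeholders, not a recipe):

# Minimal sketch: synthesise a luminance master from stacked R/G/B masters.
# Filenames are placeholders for your own stacks.
import numpy as np
from astropy.io import fits

r = fits.getdata("master_R.fits").astype(np.float64)
g = fits.getdata("master_G.fits").astype(np.float64)
b = fits.getdata("master_B.fits").astype(np.float64)

# Equal weights shown here; weights tuned to your sensor and filters work too.
syn_l = (r + g + b) / 3.0

fits.writeto("synthetic_L.fits", syn_l.astype(np.float32), overwrite=True)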

CMOS cameras are sensitive to different wavelengths, peaking in the red, so we need less Red than we need Blue. This means that, using G2V star analysis, we can absolutely maximise the signal and colour balance of our image before we put it into processing.

When we colour calibrate we are effectively asking the software to boost some colours and suppress others. If we provide raw files with a perfect R:G:B colour ratio, or as near as possible, the software has to 'do' less and more signal gets through.

The idea here is twofold: get more signal that isn't going to be dampened during colour calibration, and use less time for the same productivity on the pier. What is the point of shooting 1:1:1 R:G:B when, for example, PixInsight is going to boost the blue and dampen the red? The imaging time saved can be recycled for more relevant signal.

On a side note, narrowband imaging also benefits from ratios, due to the massive abundance of Ha. One hour of Ha should be accompanied by much more OIII and SII due to their rarefied nature. This is more of an art, I think, but rarefied OIII and SII need some serious love :)

I've asked Robin Glover (of SharpCap) for his views on this and will post here if I get a response.

Perhaps we could do a short test?

An example of RGB only:

https://www.astrobin.com/xvxxym/

 

Cheers

 

Pete

 


“There are no bad pictures; that's just how your face looks sometimes.”

― Abraham Lincoln


10 months ago
·
#6338

Just a thought.. does anyone know if Hubble uses a Lum channel?


“There are no bad pictures; that's just how your face looks sometimes.”

― Abraham Lincoln


10 months ago
·
#6339

Just a thought.. does anyone know if Hubble uses a Lum channel?

Yep, it has a full-light filter on each of its instruments.

BTW 

I thought the CMOS sensors used on most Roboscopes piers were way more sensitive to blue and cyan light - you can see that in the relative intensity of subs acquired on pier 14. There may be some attempt to correct for that in the relative transmissivity of the broadband filters...

 

Interesting about the RGB vs LRGB question - I personally like luminance (as, like you say, I have always used it!) - but I suspect it cements in the uneven response of the sensor to different photon energies. Not using luminance will definitely allow us (in most cases) to increase individual sub length, as less energy is transmitted to the sensor.

I think that, as we are not attempting spectroscopy or photometry with the camera, a degree of subjective inaccuracy in the data we gather and process matters less in the way we treat it to make strong images. The most important elements that we can control are the levels of all noise (including LP and moon glow), focus and seeing.

10 months ago
·
#6341

Hi ChuckNorris,

I have been thinking about this too, and over the last few months I have looked into using RGB only versus LRGB. In fact, I am about to abandon Lum completely. We could indeed run a test on the basis of existing datasets and see what we get.

 

Hi pebster2005

I have looked into this and indeed, the QE is lower for R than for G and B. I have mapped some filters to this curve (see attached image), which led me to calculate the frame-count ratios that need capturing (see attached image).
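For anyone who wants to reproduce the arithmetic, here is a rough sketch of the idea (the per-filter response values below are illustrative placeholders, not readings from the attached chart):

# Rough sketch of the frame-ratio arithmetic. The per-filter system
# response values (sensor QE x filter transmission) are placeholders.
response = {"R": 0.60, "G": 0.85, "B": 0.80}

# Scale sub counts inversely with response so every channel collects
# roughly equal signal; anchor the strongest channel at 70 subs.
strongest = max(response, key=response.get)
base_subs = 70

for f in ("R", "G", "B"):
    subs = round(base_subs * response[strongest] / response[f])
    print(f"{f}: {subs} subs")  # R: 99, G: 70, B: 74 with these numbers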

CS
Manuel


Manuel
Roboscopes General Technical


10 months ago
·
#6344

Hi, 

In full agreement about abandoning a luminance channel on the majority of targets; I'm just wondering about capturing IFN.

Regarding optimising the numbers of subs per filter, I may well be misunderstanding this.

Does that mean that when submitting a request we adjust the numbers per filter to optimise the integration time? I would be most grateful if someone would tell me whether that is correct, or what it is I'm not understanding.

Cheers and CS, 

Ray 


Ray
Roboscopes Guinea Pig


10 months ago
·
#6347

Hi, 

In full agreement about abandoning a luminance channel on the majority of targets; I'm just wondering about capturing IFN.

Regarding optimising the numbers of subs per filter, I may well be misunderstanding this.

Does that mean that when submitting a request we adjust the numbers per filter to optimise the integration time? I would be most grateful if someone would tell me whether that is correct, or what it is I'm not understanding.

Cheers and CS, 

Ray 

Yes Ray, it's something I have been banging on about with you all forever LOL

Broadband with 6200 sensor:

For example, it saves wasting time gathering extra data through the blue filter, as the chart shows you need to concentrate more on the red filter due to the lower figures in the sensor's QE curve.

Manuel's simple analysis will give you 95% of a G2V calibration without doing all the work, and as ChuckNorris says, it will save you having to suppress channels because you have taken way too much on any one particular channel.

 

Narrowband:

This does not apply

ps

I hope more members read this discussion and chip in. Even if you stick with LRGB rather than RGB, optimising your sub quantities to suit your sensor's QE will make a huge difference to productivity and post-processing. I look forward to reading more from all of you on this subject :)

 

Steve


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6352

Steve, 

You may well look forward to reading more on this  from members, but not from me you won't. :) 

This is sarcasm at its finest. 

When I got up this morning, little did I realise that not only are parallel universes real, but that I'd actually woken up in one. A parallel Universe!

Great to be here, and so completely different to the one I was in yesterday. In that one it wasn't always the case that you got everything you wanted. I recall that all too often in that Universe you'd request a specific number of something, only to find that when you received them you'd have to reject a few, if not occasionally all of them.

I wonder what else I'm going to discover is different. Steve, I hope you don't mind me asking, but by any chance do you have a full head of hair?

Yours sarcastically, 

Ray 


Ray
Roboscopes Guinea Pig


10 months ago
·
#6354

Steve, 

You may well look forward to reading more on this  from members, but not from me you won't. :) 

This is sarcasm at its finest. 

When I got up this morning, little did I realise that not only are parallel universes real, but that I'd actually woken up in one. A parallel Universe!

Great to be here, and so completely different to the one I was in yesterday. In that one it wasn't always the case that you got everything you wanted. I recall that all too often in that Universe you'd request a specific number of something, only to find that when you received them you'd have to reject a few, if not occasionally all of them.

I wonder what else I'm going to discover is different. Steve, I hope you don't mind me asking, but by any chance do you have a full head of hair?

Yours sarcastically, 

Ray 

Still bald in every version of the multiverse, Ray. You are nuts, you know that, don't you? :)

 

Steve


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6355

Very interesting, this is something I've been thinking about quite a bit recently...

On one hand, pretty much all of the famous imagers on Astrobin shoot luminance.  The argument has always been that it adds more detail, so the real test would not be on nebulae as in the original image posted, but rather on galaxies.  Indeed, even Bart Delsaert, the author of said nebula image, shoots luminance on his galaxies.

But... Juan Conejero (CEO of PixInsight) is of the opinion that luminance is only useful as a "shortcut" to get more data when on a time budget, and that pure RGB is the way to go for longer integrations.  So there is at least one major figure in the other corner.  Speaking personally, I just prefer my pure RGB images to those I've processed with luminance...

Regarding B/G/R balance, much thanks to Steve for pointing me towards researching the 6200 QE curve.  Quite clear that red needs more integration time!  The next question is whether one should be running different exposure lengths for each filter, e.g. 180s blue / 300s red, or whether it would be sufficient simply to run more frames for the "weaker" colour?  My gut feel is toward the former...

I look forward to further contributions to this fascinating thread :)

 

 

 

10 months ago
·
#6356

Very interesting, this is something I've been thinking about quite a bit recently...

On one hand, pretty much all of the famous imagers on Astrobin shoot luminance.  The argument has always been that it adds more detail, so the real test would not be on nebulae as in the original image posted, but rather on galaxies.  Indeed, even Bart Delsaert, the author of said nebula image, shoots luminance on his galaxies.

But... Juan Conejero (CEO of PixInsight) is of the opinion that luminance is only useful as a "shortcut" to get more data when on a time budget, and that pure RGB is the way to go for longer integrations.  So there is at least one major figure in the other corner.  Speaking personally, I just prefer my pure RGB images to those I've processed with luminance...

Regarding B/G/R balance, much thanks to Steve for pointing me towards researching the 6200 QE curve.  Quite clear that red needs more integration time!  The next question is whether one should be running different exposure lengths for each filter, e.g. 180s blue / 300s red, or whether it would be sufficient simply to run more frames for the "weaker" colour?  My gut feel is toward the former...

I look forward to further contributions to this fascinating thread :)

 

 

You are quite right, it can be done in two ways :)

Option 1: shorter exposures

This was the way it was always done traditionally, e.g. 100 seconds in red and 70 seconds in blue.

This needs a wide array of calibration files that are not always available when imaging remotely, especially as newer CMOS cameras have no shutter.

Option 2: fewer subs

100x 180s exposures in red and 70x 180s in blue.

It's simply a different take that allows you to work within the normal exposure ranges you already have calibration frames for.

Whilst purists would argue that Option 1 is better, and I am inclined to agree, it's always a trade-off between perfection and what's achievable whilst still delivering acceptable payback/results.

For me, altering the number of subs rather than the length of exposures gets you a long, long way towards perfection without all the hassle of taking 100, 70 & 67 second exposures to suit each filter :)
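A minimal sketch of how both options fall out of the same weighting (the 100:70 red:blue figures just reuse the example above; treat them as placeholders, not measured values):

# Both scheduling options derived from one per-filter weighting.
weights = {"R": 100, "B": 70}

# Option 1: equal sub counts, exposure length scaled per filter.
for f, w in weights.items():
    print(f"Option 1 - {f}: {w}s subs")  # 100s red, 70s blue

# Option 2: one standard exposure length, sub count scaled per filter.
STANDARD = 180  # seconds; a length you already hold calibration frames for
for f, w in weights.items():
    print(f"Option 2 - {f}: {w} x {STANDARD}s subs")  # 100x red, 70x blue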


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6357

I've obviously been labouring under a false impression. I thought that, with the same filter, the higher the altitude at which a sub is taken, the higher the signal-to-noise ratio will be.

Is it not the case that a sub captured with a blue filter with a higher transmission factor than red, but at a low altitude, may only achieve the same S/N ratio as a red-filter sub captured at a higher altitude? Does that not negate the point under discussion?

When we can run sequences such as RRRGGGBBB then dither, I'll come on board. In the meantime I'm happy to remain a nutter. Software will allow the creation of a reasonable image without having a precise match of filters. That's already something we have to do most of the time. 

A final thought.

Have a look at the number of hours of imaging time we waste through incomplete jobs. Quite often a good percentage of a job is collected, only for other jobs to then take priority. The job sits there week after week with nothing being done, and the target may eventually set. If, say, 75 percent was collected, it could be that one filter got very few subs. That's wasted time. This is a far bigger issue.

Cheers, 

Ray 

 

 


Ray
Roboscopes Guinea Pig


10 months ago
·
#6358

I've obviously been labouring under a false impression. I thought that, with the same filter, the higher the altitude at which a sub is taken, the higher the signal-to-noise ratio will be.

Is it not the case that a sub captured with a blue filter with a higher transmission factor than red, but at a low altitude, may only achieve the same S/N ratio as a red-filter sub captured at a higher altitude? Does that not negate the point under discussion?

When we can run sequences such as RRRGGGBBB then dither, I'll come on board. In the meantime I'm happy to remain a nutter. Software will allow the creation of a reasonable image without having a precise match of filters. That's already something we have to do most of the time. 

A final thought.

Have a look at the number of hours of imaging time we waste through incomplete jobs. Quite often a good percentage of a job is collected, only for other jobs to then take priority. The job sits there week after week with nothing being done, and the target may eventually set. If, say, 75 percent was collected, it could be that one filter got very few subs. That's wasted time. This is a far bigger issue.

Cheers, 

Ray 

 

 

Ray, altitude differences in signal are a totally different discussion from the one we are having now; imagine all filters working at the same altitude. Do you not agree the response curve of the sensor affects the relative integration time required with each filter?

Modern sensors and imaging techniques are changing, and all of us old CCD dinosaurs (me included) need to be willing to discuss new ideas and ways of doing things to get the best out of what we do. There is no right or wrong, but failing to adapt or to listen to new ideas in our hobby is a waste :)

"Have a look at the number of wasted hours of imaging time we get through incomplete jobs. Quite often a good percentage of a job is collected only for other jobs to then take priority. The job sits there week after week with nothing being done and the target may eventually set. If say 75 percent was collected it could be that one filter collected very few subs. That's wasted time. This is a far bigger issue. "

Ray, you appear to have something to say despite the fact we seem to have discussed this very topic a million times. If we put bad weather, breakdowns etc. to one side, the sky above us is half a tennis ball. Most of the time members have been imaging only one part of that ball, despite the fact you have 130° of sky all around to use. Also, nobody takes any account of how the moon waxes, wanes and rises at different times depending upon its phase. Very few forward-plan or communicate their intentions to one another.

Broadband jobs over the last few years have had overly long subs when in most cases 60-120s is fine. In narrowband, almost every job is 300-second subs despite the telescope being f/3.6 and that length only being needed for dim objects. Not to mention way, way too much Ha gets taken compared to the other filters, which wastes imaging time that could be spent on the OIII & SII filters, producing more well-rounded datasets.

Our imaging algorithms are very powerful but work best if members get on board with how they work. Here is an option for you: nominate a member to run the pier for you, including all the planning, or you could volunteer if you wish?

Honestly Ray, if you want to, I do not mind, as it means the members have a channel to put job suggestions through, and that person can concentrate on learning the system and getting the best out of the pier :)

Plus it's less work for me... LOL

Steve


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6359

When we can run sequences such as RRRGGGBBB then dither, I'll come on board. In the meantime I'm happy to remain a nutter. Software will allow the creation of a reasonable image without having a precise match of filters. That's already something we have to do most of the time. 

I missed this part Ray

This is the whole point of the topic!

Just because we can easily process into a great RGB image the way it's being done now, why not discuss a new technique to help us? It has zero ill effect on you processing a "reasonable image" and may even make more images available.

Existing method

  • 100x blue 180s
  • 100x red 180s
  • 100x green 180s

Why not discuss being able to do, for example:

  • 100x red 180s
  • 70x blue 180s
  • 67x green 180s

Look at the time savings alone - 300x 180s subs is 15 hours, while 237x 180s is under 12 - not to mention it will make processing easier :)

Re RRRGGGBBB etc.: the only way you can run a system 100% your own way, Ray, is if it's in your back yard, or if you run your own remote system and watch it like a hawk. Scheduling software is more efficient than a human, as it does not need sleep, does not forget, and does not colour its "personal" preferences towards one particular object; math does not work that way :)

 

Steve


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6360

I was thinking... Another measure we can use to determine which filter needs more integration is to check the noise levels in each channel after combination?  For example, in the past I have noticed more noise in the blue channel for most of my remotely acquired images, so I was rather surprised to see the CMOS QE curve stronger in the blue...  But then I realised those previous images were acquired with a CCD sensor (16803 mostly), and sure enough a quick search revealed that this sensor has a lower QE in the blue...  I love learning new things in this hobby!

I agree with Ray that downloading a 66% job only to find zero green frames is a bit of a gutshot...  Perhaps we can try making some small adjustments in the way we submit requests to reduce the chances of this happening?  I notice there are 10 filter lines available on the submission form - would splitting the frames over several lines (ie 30xRGBx3 instead of 90xRGB) spread their acquisition out more evenly?  Or perhaps multiple submissions for the same target would be a better way to do it?  Some guidance from those with a better understanding of the backend would be appreciated :)
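For anyone who wants to try that noise comparison, here is a rough sketch of what I have in mind (assumes astropy; the filenames are placeholders, and ideally you would measure a background-only crop rather than the whole frame):

# Compare per-channel noise after integration with a robust sigma
# estimate on each master. Filenames are placeholders.
from astropy.io import fits
from astropy.stats import mad_std

for name in ("master_R.fits", "master_G.fits", "master_B.fits"):
    data = fits.getdata(name)
    print(f"{name}: robust sigma = {mad_std(data):.4g}")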

10 months ago
·
#6361

I was thinking... Another measure we can use to determine which filter needs more integration is to check the noise levels in each channel after combination?  For example, in the past I have noticed more noise in the blue channel for most of my remotely acquired images, so I was rather surprised to see the CMOS QE curve stronger in the blue...  But then I realised those previous images were acquired with a CCD sensor (16803 mostly), and sure enough a quick search revealed that this sensor has a lower QE in the blue...  I love learning new things in this hobby!

I agree with Ray that downloading a 66% job only to find zero green frames is a bit of a gutshot...  Perhaps we can try making some small adjustments in the way we submit requests to reduce the chances of this happening?  I notice there are 10 filter lines available on the submission form - would splitting the frames over several lines (ie 30xRGBx3 instead of 90xRGB) spread their acquisition out more evenly?  Or perhaps multiple submissions for the same target would be a better way to do it?  Some guidance from those with a better understanding of the backend would be appreciated :)

A better way to do it would be to split narrowband from broadband data; otherwise, lots of smaller jobs are a massive input overhead for the Roboscopes staff, plus zipping up lots of smaller jobs is another huge overhead.

Over the next few weeks I will do a new syndicate crib sheet for members

I do, however, truly believe that lots of fundamental changes in how users input data and work out how much data is "actually" required for each job will all add up to a massive productivity boost for the members.

Anyway, we have digressed enough from the original poster's topic, so I will say no more :)

 

Steve


Please ignore my dylexia wherever possible, just be thankful I can control my Tourettes ;)

Things to do, so little time!

Steve
Roboscopes Tea Boy


10 months ago
·
#6362

This is a great discussion and I believe it could lead to a huge improvement, so thank you for the original post.

We are in a good place right now with pier 14, and I suggest we try something pretty radical and see how things go. The first step would be, before making a submission, to consider whether separate luminance is necessary, together with adjusting the number of RGB subs in line with what is being suggested.

As I have tried to stress, rather monotonously, doing that alone will not make that much of a difference, so I will make a separate post later suggesting a new way forward, purely on pier 14. Steve has already just mentioned the kinds of things we can do to improve productivity. 

Thanks to everyone who has contributed to this post with their suggestions.

Cheers and CS, 

Ray 

Just to be clear, we the members and the weather are the ones creating the problems I describe, it's very little down to Roboscopes. 


Ray
Roboscopes Guinea Pig


10 months ago
·
#6402

I generally don't do L for broadband unless there is a specific reason (someone mentioned IFN as a possible example), basically because you need a good, long luminance session to get additional detail from the targets I am interested in.

I do try and vary the exposures, and somewhere in the last year I found a website that allowed you to calculate exposures for your specific camera/scope combination, target SNR and object. Unfortunately, I have lost the URL. I used it mostly for NB imaging, but the results were OK.

Somewhere out there is a calculator we could use.

I found useful information on CMOS exposure selection from Robin Glover of SharpCap fame: "Deep Sky Astrophotography With CMOS Cameras by Dr Robin Glover" and "Choosing the right gain for Deep Sky imaging with CMOS cameras", both on YouTube.

The thinking could be extended to the different bandpasses and QEs of the filters. All you then need to know is the spectrum of the target!
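For reference, the rule of thumb from those talks is roughly that a sub is long enough once sky noise swamps read noise. A sketch of that calculation (the constant C and the example sky rate are my assumptions, not figures quoted from the videos):

# Sky-limited sub-exposure rule of thumb: expose until sky background
# noise swamps read noise. C ~ 10 corresponds to a small noise penalty.
def sky_limited_exposure(read_noise_e, sky_e_per_px_per_s, c=10.0):
    """Sub length (s) beyond which read noise adds little extra noise."""
    return c * read_noise_e ** 2 / sky_e_per_px_per_s

# e.g. ~1.2 e- read noise and a dark-sky rate of ~0.1 e-/pixel/s:
print(sky_limited_exposure(1.2, 0.1))  # -> 144.0 seconds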

10 months ago
·
#6405

I generally don't do L for broadband unless there is a specific reason (someone mentioned IFN as a possible example), basically because you need a good, long luminance session to get additional detail from the targets I am interested in.

I do try and vary the exposures, and somewhere in the last year I found a website that allowed you to calculate exposures for your specific camera/scope combination, target SNR and object. Unfortunately, I have lost the URL. I used it mostly for NB imaging, but the results were OK.

Somewhere out there is a calculator we could use.

I found useful information on CMOS exposure selection from Robin Glover of SharpCap fame: "Deep Sky Astrophotography With CMOS Cameras by Dr Robin Glover" and "Choosing the right gain for Deep Sky imaging with CMOS cameras", both on YouTube.

The thinking could be extended to the different bandpasses and QEs of the filters. All you then need to know is the spectrum of the target!

Was it this?

 

https://snrcalc.vercel.app/calculators

 

I'm putting a post together with some examples of that. If anyone wants a head start and to play with it, click Utilities and restore backup, then paste this:

 

{"userTargets":{"list":[{"id":0,"name":"M27 Dumbbell Nebula","surfaceBrightness":"20.2","rightAscension":171203100673,"declination":194787672065},{"id":1,"name":"M51 Whirlpool Galaxy","surfaceBrightness":"22.5","rightAscension":115672612865,"declination":405295595521},{"id":2,"name":"NGC 2403 Intermediate Spiral Galaxy","surfaceBrightness":"22.77","rightAscension":65068335105,"declination":563196461057},{"id":3,"name":"M31","surfaceBrightness":"22.29","rightAscension":5727322113,"declination":354351579137},{"id":4,"name":"Thor's Helmet NGC 2359","surfaceBrightness":"24.33","rightAscension":62608375809,"declination":113491574784},{"id":5,"name":"IC 2177","surfaceBrightness":"30.36","rightAscension":60741910529,"declination":89690996736}],"nextid":6},"userTelescopes":{"list":[{"id":2,"name":"VC200L 0.71x","aperture":"200","focalLength":"1278","centralObstruction":"77","totalReflectanceTransmittance":"0.99"},{"id":3,"name":"94EDPH","aperture":"94","focalLength":"414","centralObstruction":"0","totalReflectanceTransmittance":"0.99"},{"id":4,"name":"Askar 200","aperture":"50","focalLength":"200","centralObstruction":"0","totalReflectanceTransmittance":"0.99"},{"id":5,"name":"ASA 12N","aperture":"305","focalLength":"1088","centralObstruction":"120","totalReflectanceTransmittance":"0.99"},{"id":6,"name":"VC200L 1.0x","aperture":"200","focalLength":"1800","centralObstruction":"77","totalReflectanceTransmittance":"0.99"},{"id":7,"name":"Takahashi Epsilon 130","aperture":"130","focalLength":"430","centralObstruction":"63","totalReflectanceTransmittance":"0.99"},{"id":8,"name":"Skywatcher Esprit 120","aperture":"120","focalLength":"840","centralObstruction":"0","totalReflectanceTransmittance":"1"}],"nextid":9},"userCameras":{"list":[{"id":2,"name":"2600MM (IMX 571)","pixelSize":"3.76","readNoise":"1.25","darkCurrent":"0.195","quantumEfficiency":"85"},{"id":3,"name":"6200MM (IMX 455)","pixelSize":"3.76","readNoise":"1.2","darkCurrent":"0.195","quantumEfficiency":"85"}],"nextid":4},"userObservatories":{"list":[{"id":0,"name":"Home","bortleClass":"7","skyBrightness":"19.4","latitude":441685377025,"longitude":1486880769},{"id":1,"name":"Roboscopes","bortleClass":"2.5","skyBrightness":"21.25","latitude":328183316481,"longitude":56618909696}],"nextid":2}}

 


“There are no bad pictures; that's just how your face looks sometimes.”

― Abraham Lincoln


10 months ago
·
#6411

That's the one. 

I recently updated my browser, and since this app stores all the information about your setup in the browser itself, I lost everything.

Thanks for spotting it; I can now check it out again.

