I was curious whether the automation already available in Ekos could be put to use to search for asteroids. The idea was to use the scheduler to run a capture sequence on the same area of sky three times at fixed intervals. The captured sequences could then be stacked to produce three images, which could be compared to see if anything had changed position between them. Additional fields could be imaged during the intervals, in a more or less continuous process. A swath of sky could be scanned moving in the RA direction, imaging several fields at different DEC at each RA before moving on to the next RA, like this:
I started trying to set up a job schedule for this manually, but after a while it was obvious that it was not practical. I have now written a Python script that creates the scheduler file (.esl) to scan a given number of RA "columns" and DEC "rows" starting from a given RA and DEC (the lower-right corner in the schema). It takes into account the field of view of the camera-telescope combination and has parameters for the desired overlap between images. The scheduled jobs execute a capture sequence that is predefined in Ekos. Each field in an RA column is visited three times in the order DEC1, DEC2, ... DECn, DEC1, DEC2, ... DECn, DEC1, DEC2, ... DECn, and then the telescope moves to the next RA. Focusing is called once per RA column, in the first field (at DEC1). I'm attaching the script to this post as a .txt file (it needs to be renamed to .py), in case someone wants to try it. As a result, 3 folders are created for each field (named e.g. Field10, Field10b, and Field10c).
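For anyone curious about the geometry before opening the attachment, the core of the grid calculation can be sketched like this (a minimal standalone sketch, not the attached script itself; the function name, default field sizes, and the choice to widen the RA step by 1/cos(DEC) of the worst row are my assumptions):

```python
import math

def grid_pointings(ra0_deg, dec0_deg, n_ra, n_dec,
                   fov_ra_deg=1.5, fov_dec_deg=1.1, overlap=0.1):
    """Sketch of an n_ra x n_dec mosaic of field centres, starting at the
    lower-right corner (ra0, dec0) and stepping by the field of view minus
    the requested overlap fraction.

    Assumption: the RA step is widened by 1/cos(DEC) of the highest-|DEC|
    row, so neighbouring columns still overlap on the sky away from the
    celestial equator.
    """
    step_dec = fov_dec_deg * (1.0 - overlap)
    worst_dec = max(abs(dec0_deg), abs(dec0_deg + (n_dec - 1) * step_dec))
    step_ra = fov_ra_deg * (1.0 - overlap) / math.cos(math.radians(worst_dec))
    # Outer loop over RA columns, inner loop over DEC rows, matching the
    # column-by-column scan order described above.
    return [((ra0_deg + i * step_ra) % 360.0, dec0_deg + j * step_dec)
            for i in range(n_ra) for j in range(n_dec)]
```

The actual script then has to emit each pointing three times (the DEC1...DECn cycle repeated) as scheduler jobs in the .esl XML, which is the part that is tedious to do by hand.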
I wrote a SiriL script to process and stack the captures from each field and to align the resulting 3 images for comparison.
Finding moving points of light in the images "by eye" proved absurdly difficult, so I wrote a Python script that compares the images and presents the findings visually. This is the output for a couple of known asteroids:
That is (391) Ingeborg, a fairly fast-moving asteroid of about magnitude 15.6. The images were obtained with a 115mm f/5.6 refractor equipped with an ASI1600MM-Pro camera (each image: 3x 60 s Lum subs). The animated GIF is saved by the Python script. The color image is also output by the script (each image becomes one color channel, so moving objects show up as aligned R, G, and B dots).
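The R/G/B trick is simple enough to sketch in a few lines (this is an illustration of the idea, not my attached script; the function name and the percentile-stretch values are made up for the example):

```python
import numpy as np

def rgb_compare(img1, img2, img3, low_pct=5.0, high_pct=99.5):
    """Stack three aligned grayscale frames into one RGB array.

    Stars that stayed put land on top of themselves in all three channels
    and come out white/grey; anything that moved between the frames shows
    up as separated red, green and blue dots.  Each frame is stretched
    independently with a crude percentile clip so differing sky levels
    don't tint the whole result.
    """
    def stretch(img):
        lo, hi = np.percentile(img, [low_pct, high_pct])
        return np.clip((img - lo) / max(hi - lo, 1e-12), 0.0, 1.0)
    # Frame 1 -> red, frame 2 -> green, frame 3 -> blue.
    return np.dstack([stretch(i) for i in (img1, img2, img3)])
```

Feeding the result to any image viewer (or `matplotlib.pyplot.imshow`) gives the kind of color comparison shown above.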
This is another one, at about magnitude 17.75 (I can't remember its name right now; I'll update the info later). It shows that the script does a good job of finding them.
I am attaching the image comparison script here too as a .txt file, for anyone to play with. Just be aware that I'm not a programmer (so some parts of my code are probably laughable...).
Things that don't work well yet:
useless rant --> 1. I run INDI/Ekos/KStars on a Raspberry Pi 4 at the telescope: there has not been a single night where something didn't fail (if it's not a general crash, it's the solver taking forever, or the guider failing in some way).
2. I haven't found a way to make the scheduler wait for the guider to stabilize before starting the capture sequence. Waiting a few seconds after the alignment procedure seems to help. I think changing the capture sequence to 6x 30 s Lum subs might also help with this.
3. Did I mention I use a Raspberry Pi? The scheduler takes a while to load the .esl file, so I wouldn't dare include more than 6 fields in it (6 fields already mean 18 jobs...).
I'm looking forward to hearing whether anyone else is trying something similar and what your approach has been. It would be nice to share ideas.
This is already a very long post. I hope it didn't come out too complicated because of my bad English.