@Joaquin: Thank you for the good explanation. You hit the nail on the head.

The indi_pylibcamera issue #65 is closed. Prof Huster got raw Bayer images, but the tool he used interpreted the data as monochrome. That was also the reason for the checkerboard pattern in the brighter regions of his image.

The header of the FITS image has a keyword "BAYERPAT" which gives the color order of the Bayer pattern. A monochrome FITS image does not have this keyword. As far as I know, this is what distinguishes monochrome from raw Bayer images in FITS format.

KStars has a setting to enable debayering before showing FITS images on the screen. That allows you to see color images in the preview window without affecting the stored image data. CCDCiel likely has a similar feature.

In my opinion, raw Bayer images are preferable to RGB images. All color cameras I know of use a Bayer filter. The RGB images are generated by an image signal processor which does the debayering. But this image signal processor usually calculates with integer numbers, which leads to a loss of information. This is not critical for well-exposed images. But in astrophotography the interesting parts of the image are extremely underexposed: the faint nebulae between bright stars are often hidden by the camera noise. Only special processing makes these details visible:

  • Correction of pixel errors: You should use Dark frames to subtract the individual pixel bias. If you are a professional you can also use Flat frames to compensate for individual pixel sensitivities. I do not have the equipment for making Flat frames, but the HQ camera has a very homogeneous pixel sensitivity and my optics do not produce vignetting, so I don't use Flat frames.
  • Debayering: The best debayer algorithms are nonlinear. That is why you need to do the debayering after the correction of the linear pixel errors.
  • Aligning and stacking: This improves your noisy pixel data: when summing frames, the mean grows with the number of frames while the noise standard deviation only grows with the square root of that number, so the signal-to-noise ratio improves. Stacking hundreds of frames makes details visible which were hidden by noise in the single frames (see the sketch below).
  • Postprocessing: Here you correct colors and contrast and apply filters.

All these steps should be done with floating point arithmetic! There is a lot of good astro software available for doing this. My preferred free software for deep-sky images is Siril.
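
To illustrate the dark subtraction and stacking steps in floating point, here is a minimal Python/numpy sketch. The file names ("master_dark.fits", "light_*.fits", "stacked.fits") are made up, and a real workflow (for example in Siril) would also debayer and align the frames between calibration and stacking.

# Minimal sketch: dark subtraction and stacking in floating point.
# File names are made up; debayering and alignment are omitted here.
import glob
import numpy as np
from astropy.io import fits

dark = fits.getdata("master_dark.fits").astype(np.float64)

calibrated = []
for path in sorted(glob.glob("light_*.fits")):
    frame = fits.getdata(path).astype(np.float64)   # integer ADU -> float
    calibrated.append(frame - dark)                 # remove per-pixel dark/bias

# Averaging N frames keeps the signal constant while the noise standard
# deviation drops with sqrt(N), so faint details climb out of the noise.
stack = np.mean(calibrated, axis=0)
fits.writeto("stacked.fits", stack.astype(np.float32), overwrite=True)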

The 2x binning of the RPi HQ camera is aware of the Bayer pattern and is done in hardware. It only bins pixels of the same Bayer color. The result is still a raw Bayer image.
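
For illustration, here is how such Bayer-aware 2x binning could look in numpy, assuming an RGGB pattern. This is only a sketch of the principle; the HQ camera does it in hardware.

# Conceptual sketch of Bayer-aware 2x2 binning for an RGGB raw frame.
# Only pixels of the same Bayer color are averaged; the result is again
# an RGGB Bayer frame with half the width and height.
import numpy as np

def bin2x2_bayer(raw):
    h, w = raw.shape
    h -= h % 4                      # work on complete 4x4 super-cells
    w -= w % 4
    raw = raw[:h, :w].astype(np.float32)
    out = np.empty((h // 2, w // 2), dtype=np.float32)
    for dy in (0, 1):               # Bayer row offset (R/G or G/B rows)
        for dx in (0, 1):           # Bayer column offset
            plane = raw[dy::2, dx::2]    # all pixels of one Bayer color
            binned = (plane[0::2, 0::2] + plane[0::2, 1::2] +
                      plane[1::2, 0::2] + plane[1::2, 1::2]) / 4.0
            out[dy::2, dx::2] = binned   # write back in Bayer layout
    return out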

The indi_pylibcamera driver has been running in my setup (2 telescopes with 3 cameras: 2x RPi3 and one RPi0) for over a year without any major issues. Especially considering the bad sky conditions (very bad light pollution from a nearby town), I am proud of the pictures I take. See the attached image of M27, made from 68 exposures of 20 seconds each.


Hello Prof Huster,

Please open an issue at github.com/scriptorron/indi_pylibcamera/issues
It would help me a lot if you attach one of your FITS images to the issue.

Best Regards,
Ronald


Hello Mat,

Sorry for the late answer. You give me hope, and I will do more experiments with my Star Discovery mount.

Last week I could prove that the issue is not caused by my guiding camera, my PHD2 installation or my mechanical setup. I used my optics with an equatorial mount and could guide successfully. Before that I was not fully sure whether the camera is good enough for guiding (a Raspberry Pi HQ camera with my self-made driver github.com/scriptorron/indi_pylibcamera). Attached is a picture of my equipment after a frosty observation night. The guiding scope is a Svbony with 190mm focal length, the main OTA is a Newtonian with 750mm focal length. Both have their own Pi with an HQ camera. The Pi on the guiding scope operates the guiding camera, runs PHD2 and sends the guiding commands over WiFi to the Pi on the main OTA. The Pi on the main OTA runs KStars, operates the main camera and drives the mount. That setup sounds strange, but it was the only one that avoids sending camera pictures from one Pi to the other (that would be too slow). Everything is controlled via remote desktop (xrdp) from a Linux laptop in the same WiFi. I can do this from the couch and do not need to stay connected all the time.

As mentioned, guiding works with the same optics, same cameras and same Pis on my equatorial mount, but not on the Star Discovery mount (www.astroshop.de/azimutal-mit-goto/skywa...an-wifi-goto/p,57362). The only differences in the Star Discovery setup are:

  • a different mount (of course),
  • a different mount driver (indi_synscan_telescope and the SynScan app on Android, connected via the same WiFi).

Testing indoors with a terrestrial target is a good idea. I will do that in the next days.

Regards,
Ronald


Hello Mat,


Happy New Year!

Amazing! The diagram of the RA/DEC errors shows immediate reactions to guiding pulses. In my setup the errors increase continuously. It makes no difference whether I enable or disable the guiding pulses in PHD2.

Do I understand correctly that you did this with

  • indi_synscan_telescope driver,
  • SynScan app and
  • AZ-GTi mount?

If so, I can still have hope and will continue to search for the issue in my setup.

I have a different mount, the "Star Discovery AZ SynScan WiFi GoTo" (www.astroshop.de/azimutal-mit-goto/skywa...an-wifi-goto/p,57362). But it is from the same manufacturer and is also controlled by the SynScan app. I cannot imagine that the app allows guiding for the AZ-GTi but not for the Star Discovery. I would not be surprised if they use the same hardware and software in both mounts.

My guiding scope has a longer focal length of 250mm and my camera has a different sensor size. But my field of view (86' x 65') is not much smaller than yours (137' x 103'). In fact my setup should react more sensitively to guiding pulses.

The connection between my guiding scope and the mount was provisional and maybe not mechanically stiff enough. In the meantime I have improved that. I also got a Bahtinov mask for better focusing of my finder scope. I should repeat my tests.

My camera delivers raw Bayer pattern data. Maybe that confuses the calculation of the star center and makes the guiding noisier. But what I see is a slow and large drift, similar to what you see without guiding.


Regards,
Ronald


Hi Mat,

Short answer: I did not have success and I gave it up. :-(

Long answer:
I tried different INDI drivers:

  • indi_azgti_telescope gives many “Serial read error” messages. I am not sure anymore, but I think basic alignment and GOTO did not work.
  • indi_skywatcherAltAzMount, which connects directly to the mount. With that driver basic alignment and GOTO did not work either. Even a GOTO to the alignment point ended up in a completely different direction. I got many “Mount Error: Motor not stopped” messages; maybe the protocol between the driver and my mount does not match.
  • indi_synscan_telescope and indi_synscanlegacy_telescope. For both you still need the SynScan app, because the driver sends commands to the app, which controls the mount. I have used the app for a long time and I like it! After alignment with the app I can connect the driver and control the mount. The slew buttons in the app make fine adjustment easy. But indi_synscanlegacy_telescope does not support guiding, and with indi_synscan_telescope the guiding pulses do strange things. My mount reacts to these guiding pulses very erratically.

I looked into the source code of the indi_synscan_telescope driver. It uses slew commands (like the slew buttons in the app) for guiding: a guiding command starts a slew via the app, and after the guiding pulse length a timer stops the slew again. That sounds simple and straightforward, but for some reason the app does not execute such short slews correctly. Maybe it has some key-debounce logic, or there are other timing restrictions in the app.
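
If I read the code correctly, the idea is roughly the following. This is a simplified Python sketch with made-up placeholder functions, not the actual C++ driver code:

# Simplified sketch of slew-based pulse guiding. start_slew()/stop_slew()
# are placeholders for the commands the driver sends to the SynScan app.
import threading

def start_slew(direction, rate):
    print(f"slew {direction} at {rate} rate")   # placeholder

def stop_slew(direction):
    print(f"stop slew {direction}")             # placeholder

def guide_pulse(direction, duration_ms):
    # Emulate a guide pulse as a short slew: start slewing, then stop
    # again after the pulse length via a timer.
    start_slew(direction, rate="guide")
    threading.Timer(duration_ms / 1000.0, stop_slew, args=(direction,)).start()

guide_pulse("north", 200)   # a 200 ms pulse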

The indi_synscan_telescope driver uses the "SynScan Serial Communication Protocol" (inter-static.skywatcher.com/downloads/sy...otocol_version33.pdf), which does not support guiding pulses like other mount protocols do. The app also supports the "SynScan App Protocol" (inter-static.skywatcher.com/downloads/sy...rotocol_20230902.pdf), which is based on ASCOM's ITelescopeV3 interface (www.ascom-standards.org/Help/Developer/h...ace_ITelescopeV3.htm). This second protocol is prepared for guiding pulses, but I do not know if guiding is implemented in the app (the app should set the CanPulseGuide property to True; I have not tested this). The Sky-Watcher support team did not answer my questions. As far as I know there is no INDI driver using the SynScan App Protocol. I could not find any evidence on the internet that an ASCOM driver can do guiding with the app and this mount.

On the internet you can read that guiding is only possible with an equatorial mount. In my opinion this is a false doctrine! It is only easier with an equatorial mount, and with an Alt/Az mount you will get field rotation, but it is possible and viable. On an Alt/Az mount the software must adjust both motors, but that should not be an issue for modern computers and microcontrollers. Both motors are rotating all the time anyway. For a guiding pulse the software just needs to do a coordinate transformation and adjust the motor speeds for the given time (see the sketch below). That can't be so difficult. I am thinking about writing my own driver or fixing the issues in indi_skywatcherAltAzMount when I have more time.
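
To show that the transformation itself is not the hard part, here is a rough Python sketch using astropy. The observer location and all numbers are made up, and a real driver would additionally have to handle azimuth wrap-around, refraction and the mount-specific rate units:

# Rough sketch: convert a guide pulse given as an RA/Dec offset into the
# extra Alt/Az rates needed during the pulse duration.
import astropy.units as u
from astropy.coordinates import SkyCoord, AltAz, EarthLocation
from astropy.time import Time

def altaz_rates_for_pulse(ra_deg, dec_deg, d_ra_arcsec, d_dec_arcsec,
                          pulse_s, location, when=None):
    when = when or Time.now()
    frame = AltAz(obstime=when, location=location)
    here = SkyCoord(ra=ra_deg * u.deg, dec=dec_deg * u.deg).transform_to(frame)
    there = SkyCoord(ra=(ra_deg + d_ra_arcsec / 3600.0) * u.deg,
                     dec=(dec_deg + d_dec_arcsec / 3600.0) * u.deg).transform_to(frame)
    d_az = (there.az - here.az).to(u.arcsec).value
    d_alt = (there.alt - here.alt).to(u.arcsec).value
    # Extra rates (arcsec/s) to add on top of the normal tracking rates.
    return d_az / pulse_s, d_alt / pulse_s

# Example: a 0.5 s pulse of +2" in RA, observed from a made-up site.
site = EarthLocation(lat=51.0 * u.deg, lon=10.0 * u.deg, height=200 * u.m)
print(altaz_rates_for_pulse(83.8, -5.4, 2.0, 0.0, 0.5, site))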


Merry Xmas and Happy New Year,
Ronald


Ronald Schreiber replied to the topic 'INDI LibCamera Driver' in the forum. 5 months ago

@Anjo
Am I right that you mean the 'indi_pylibcamera' driver when you speak about the Python driver? If so, please open an issue at github.com/scriptorron/indi_pylibcamera and provide more details for debugging.

Thank you.


In my post I stated that guiding worked with the indi_synscanlegacy_telescope driver. I cannot reproduce this with a real camera. Likely I did something wrong during my first test; maybe the guiding interface of the indi_simulator_ccd driver was used.

Asking Sky-Watcher support about the guiding interface of the SynScan app was not helpful.

In the next days I will take a deeper look at the code of the indi_synscan_telescope driver.



I observe a very strange behavior when trying to guide (internal guider and PHD2) my SkyWatcher “Star Discovery AZ SynScan WiFi GoTo” mount. After wasting many nights I did some experiments in daytime with the CCD simulator. My setup for this test was:

  • Mount: “Star Discovery AZ SynScan WiFi GoTo” (Alt-Az mount with a WiFi dongle)
  • SynScan Pro app (latest version) on Android phone, in same WiFi network as the WiFi dongle of the mount
  • Linux laptop with KStars/EKOS and INDI (latest versions), in same WiFi network as SynScan Pro app and mount
  • INDI server runs on laptop, drivers are: indi_simulator_ccd (to simulate guiding camera) and indi_synscan_telescope (to control mount)

SynScan app and mount work fine together. I do the alignment with the SynScan app before connecting the INDI mount driver. When connected to EKOS I can control the mount without problems:
  • “Find telescope” in KStars sky map shows same position as the SynScan Pro app
  • “GOTO” in KStars sky map moves mount as expected to new positions (position in SynScan app also gets updated)
  • Tracking works as expected
  • When taking photos with the CCD simulator I get pictures of a simulated sky
  • When tracking is enabled the simulated stars hold position on the photos
  • When tracking is disabled the simulated stars move on the photos (as expected)
  • When moving mount with SynScan app in North, South, East and West the simulated stars move as expected
  • When moving mount with the “Motion Control” tab of the indi_synscan_telescope driver the simulated stars move as expected

Everything is fine so far. But when I try guiding, the trouble begins. I slewed the mount to a good guiding star in the East at about 30° altitude. During calibration the guider makes movements in RA and DEC. The calibration diagram (and the photos of the simulated stars) show:
  • RA forward pulses move star.
  • RA reverse pulses move the star in the same direction as the forward pulses. The step size is larger and growing.
  • RA and DEC movements are not orthogonal.

I tried guiding with PHD2 and with the internal guider; both show the same strange behavior. The symptoms match the ones I saw at night with a real camera and real sky, so I think it is not caused by the CCD simulator.

The driver has a tab which allows sending guiding pulses in North, South, East and West directions. I tried a number of such guiding pulses in each direction and watched the effect on the photos:
  • Pulsing to North moves the simulated guiding star in one direction
  • Pulsing to South moves the star in the same direction as a pulse to North
  • Pulsing to East moves the star in a direction which is not orthogonal to the North/South movements
  • Pulsing to West moves the star in the opposite direction to the East movement

It is clear that guiding cannot work when the North/South movements are not reversible. When this does not work with a simulated sky, it will also not work with a real sky. I am sure this is an issue in the indi_synscan_telescope driver or in the SynScan app.

I also tried the indi_synscanlegacy_telescope driver. This driver has no tab to send guiding pulses and is not supported by PHD2. But the internal guider seems to support this driver (does it use the normal motion control for guiding?). With the legacy driver the internal guider works nicely: it produces a perfectly orthogonal calibration diagram and guides stably. Of course, on a real sky I will see the imperfections of my mount.

The SynScan App user’s manual (inter-static.skywatcher.com/downloads/sy...nual_en_20201008.pdf) states that 3rd-party software (an INDI driver) has to connect to TCP port 11882. But the command set description (inter-static.skywatcher.com/downloads/sy...and_set_20210824.pdf) specifies port 11881. Both INDI drivers work on port 11882 only. The command set description describes a completely different protocol than the one used by the INDI drivers. I could not find a description of the protocol used by the INDI drivers. Can it be that Sky-Watcher now uses a new protocol and does not bug-fix the old one anymore?

Is there an INDI driver which uses the new communication protocol?
Is there something I need to setup in the indi_synscan_telescope driver to make it work?


Hi Denis,

First of all, congratulations on your great software! Really good!

I developed the indi_pylibcamera driver. It is a new project and certainly not free of bugs; I am still trying to improve it. While reading this discussion I got some questions:

1. What is POLLING_PERIOD used for? I saw it in the indi_simulator_ccd driver but could not find out what it does. When searching the internet I found something related to a focuser, but most cameras do not have a focuser. Is POLLING_PERIOD an important attribute for a camera?

2. In the comment above you wrote that your program expects "len" instead of "size" in oneBLOB. All my half-knowledge comes from github.com/indilib/docs and from analyzing network traffic. As far as I know, github.com/indilib/docs/blob/master/protocol/INDI.pdf specifies "size". Is that outdated, and should I change it to "len"?

Regards,
Ronald


Ronald Schreiber replied to the topic 'INDI LibCamera Driver' in the forum. 1 year ago

@SigvaldS:
I opened issue github.com/scriptorron/indi_pylibcamera/issues/15 in my GitHub project. Can we move the discussion there? It would be boring for the other people here.

When you open the link you will find a "Subscribe" button on the right side. When you press it, you will get an email every time someone (me) writes a comment on this issue.


Ronald Schreiber replied to the topic 'INDI LibCamera Driver' in the forum. 1 year ago

@SigvaldS: There is nothing listed under "Raw sensor modes:"? That is strange. It means that libcamera (and the kernel driver) do not provide raw data for this camera. I can't believe that. Maybe the camera is too new and not yet fully supported. Do you use the latest Raspberry Pi OS Bullseye, and have you done "sudo apt update; sudo apt upgrade" to get the newest versions of libcamera and the kernel? Can you take pictures with "libcamera-raw" (see www.raspberrypi.com/documentation/comput...e.html#libcamera-raw)?

My focus for "indi_pylibcamera" was on raw images, and I am still convinced that raw images perform better for astrophotography than processed RGB images. But I will try to extend the driver for cameras which provide processed frames only. Please give me some days.
