After experiencing problems while testing the polar assistant with the new manual slew feature, I experimented for quite a while yesterday.
Finally, after reading this thread, it occurred to me to check the orientation of the image on my sensor. Indeed, the image is upright (presumably due to the flattener I use).
When I switched on parity detection in the astrometry options, it started to work, and I got a polar alignment accuracy below 1' with the Star Adventurer (the polar scope typically gives me about 3.5').
It might be a good idea to highlight the parity switch. Maybe a new tutorial would be worth writing.
The HFR-based autofocuser will not be affected by this and will stay the default option.
In fact, it would be possible to use the offset calculated by the Bahtinov fitter to drive the autofocuser.
In a fully automatic setup this could be combined with a mechanism that places the mask onto the scope (similar to what some people have for flat-field caps). But we are digressing.
For streaming mode, we will have to see. The fitter works on individual image frames, so in principle it can be done. The question is performance, i.e. at what frame rate we can produce solutions. It will definitely work in framing mode.
Short status report: the algorithm that extracts the diffraction pattern and fits the Bahtinov lines to it, thereby measuring the offset of the center line, is implemented in my local copy of EKOS.
Since I had to switch to GSL for the numerical libraries, I am working through some convergence issues. Next will be work on the graphics (plotting the lines corresponding to the found solution).
I'll post pictures as soon as I have something presentable.
Indeed, as I wrote above, APT is one of the inspirations for this.
I'd be curious to hear suggestions on how we can do better than APT. E.g. the two circles that indicate the focus error in APT: are they really useful? Why use circles here? What is measured is just a single number.
Instead, what I want to provide is an uncertainty of the measurement, so that the user sees when they have reached the sensitivity limit of the method and further tuning becomes pointless.
This can be expressed visually through a "Focus achieved" message.
BTW: is color vision deficiency (CVD) a design consideration in KStars? It might be an issue if we use differently colored indicators.
This is exactly what I was looking for. Many thanks!!
I'll need to add an option to import pictures into EKOS for testing purposes; I guess I can copy how that is done in the alignment module, right?
Also, if you can point me to a piece of code where a FITS picture is read into a 2D array, that'd be helpful.
Canon raw files should be fine. I'll need to learn to work with FITSIO anyhow, which can handle them. For initial testing I can just convert them myself.
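For reference, a plain CFITSIO sketch of reading the primary image into a row-major float buffer could look like this ("input.fits" is a placeholder path; this is the bare library API, not the KStars code path, which wraps FITS loading in its own classes):

```cpp
#include <cstdio>
#include <vector>
#include <fitsio.h> // CFITSIO

int main()
{
    fitsfile *fptr = nullptr;
    int status = 0; // CFITSIO reports errors via this in/out parameter
    long naxes[2] = { 0, 0 };

    // Open the file and query the dimensions of the primary image HDU.
    if (fits_open_file(&fptr, "input.fits", READONLY, &status) ||
        fits_get_img_size(fptr, 2, naxes, &status))
    {
        fits_report_error(stderr, status);
        return status;
    }

    // Read all pixels as floats, starting at pixel (1,1); FITS is 1-based.
    std::vector<float> image(naxes[0] * naxes[1]);
    long fpixel[2] = { 1, 1 };
    fits_read_pix(fptr, TFLOAT, fpixel, image.size(), nullptr,
                  image.data(), nullptr, &status);
    fits_close_file(fptr, &status);

    // Pixel (x, y) is then image[y * naxes[0] + x].
    std::printf("read %ld x %ld image, status %d\n",
                naxes[0], naxes[1], status);
    return status;
}
```
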