All is well here. The world has gone crazy, but I try not to pay much attention. One of my students soloed today and didn't crash, so all is well.
It looks like you have made real progress. I hope SVBONY is responsive to your questions. I look forward to more testing when it is required.
The new AstroDMx_Capture program beta for Pi4 64 bit is out now and the SVBONY camera seems to work very well with the new SDK in that environment. A good time to be into astronomy, I think.
You are correct. My little mind gets confused with too many "S" words. I did mean Skywatcher and not SynScan. It certainly was my intent to never use SynScan again, and I thank you every time I use these drivers for that. I am also pleased to find that my exposure estimates are within the realm of expectations. The image of Jupiter, by the way, was last night's attempt: 300 four-second subs stacked with Siril and finished with GIMP, using the indi_skywatcherAltAzSimple driver. So, apparently, it can track effectively. The other image is my indi control panel motor detail. Apparently, we have different motors, if the numbers mean anything.
I really need to wrap my head around plate-solving as it applies to Kstars (which means I probably should actually READ the instructions). I understand the principle, but have never tried it. I am clear that it solves an image and can give you the coordinates, or hand them to Kstars, so now everyone knows where they are. But if the next GOTO (to Vega, for example) doesn't end up at Vega =exactly= and you plate-solve again, so everyone knows where they are, how do you get to Vega? I need to try it and see, I suppose.
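As I understand it, the answer is iteration: each sync folds the measured pointing error back into the mount's model, so each following GOTO lands closer, and you repeat until the error is small. Here is a toy one-axis sketch of that goto/solve/sync loop. To be clear, this is my own illustration, not actual Kstars/Ekos or INDI code; `ToyMount` and `center_on` are made-up names, and the "plate solve" is just reading the mount's true pointing.

```python
# Toy model of the goto -> plate-solve -> sync loop. The mount carries a
# hidden alignment error; a "sync" tells it where it is *really* pointing,
# which corrects the model so the next goto lands closer to the target.
# Illustrative only -- not the Kstars/Ekos implementation.

class ToyMount:
    def __init__(self, offset):
        self._offset = offset   # hidden alignment error (degrees)
        self._pos = 0.0         # where the mount *thinks* it points

    def goto(self, target):
        self._pos = target      # slew to where the model says target is

    def real_pointing(self):
        # what a plate solve of the camera image would measure
        return self._pos + self._offset

    def sync(self, solved):
        # fold the discrepancy between solved and believed position
        # into the model, so future gotos are corrected
        self._offset -= solved - self._pos
        self._pos = solved

def center_on(mount, target, tolerance=0.01, max_iters=5):
    """Repeat goto -> solve -> sync until the pointing error is small."""
    solved = mount.real_pointing()
    for _ in range(max_iters):
        mount.goto(target)
        solved = mount.real_pointing()   # stand-in for a plate solve
        if abs(solved - target) <= tolerance:
            break
        mount.sync(solved)
    return solved
```

With a 1.5-degree hidden error, the first goto misses, the sync absorbs the error, and the second goto lands on target. The real solver obviously works in two axes with a more elaborate pointing model, but the convergence idea is the same.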
Thank you once again for all your hard work on these drivers and your patience with my questions and concerns. I wish I could assist. I used to be a fair programmer in Java and even have written a fair number of motor and device controllers in assembly language, but I really am no good at all in C or C++. Never had a good reason to do more than just a few simple compiles in C.
No worries here. I got a beta from AstroDMx_Capture for the RaspberryPi4 version of their software with the new SDK for the SV305 to look at and have been banging my head against the wall trying to get either indi_skywatcherAltAzSimple OR indi_skywatcherAltAzMount to work with my Skywatcher 250P properly, so I haven't been idle.
I look forward to seeing your progress. I hear they did a pretty good job with the SDK.
I thought we had similar equipment. Mine is a Skywatcher 250P 10" Dob.
I spent a lot of time last night (we finally got clear skies) working with the scope with a fresh download/compile from Github. I initially tried to use indi_skywatcherAltAzMount, but its tracking seemed to drift quite a bit. I went back to indi_skywatcherAltAzSimple and had good pointing results and reasonable tracking, but I had to play a bit with the timing and step rate to make it less jerky. I was able to shoot some images, but could not rely on an image capture longer than 4 seconds. I did notice that the arrow keys on the Kstars simulated hand controller worked without using STOP, although I am convinced that -sometimes- hitting an arrow key turns tracking off.
My preference is using indi_skywatcherAltAzMount, so I probably should just stop using indi_skywatcherAltAzSimple. Are you getting good results with your scope with indi_skywatcherAltAzMount? If so, what do you think I may be doing incorrectly?
I setup using indi_skywatcherAltAzMount as follows:
* Point the scope directly at Polaris.
* Change the unpark position to point North (it keeps defaulting to South).
* Unpark (the scope remains oriented North but slews up about 25-30 degrees).
* Select Center & Track from the right-click menu with the mouse on Polaris.
* Select Synscan->Goto from the same right-click menu
* Use the Kstars hand unit simulation [T] arrow keys to center the image
* Select Synscan->Sync from the right-click menu
-IF- this is stable, I pick another star, and repeat the process I used for Polaris
-IF- I manage to get Polaris and another star reasonably stable and tracking isn't drifting, I try a deep sky object. No good luck past this point with this driver.
I will say that my camera (SV305) has a VERY narrow FOV, so I am very sensitive to tracking drift and missing a slew to target by even a small amount.
Which driver are you using? Mount or Simple? Also, what scope/mount?
I seem to have better pointing accuracy with Simple (Wedge), but it seems a bit "jerky" on tracking. I played with some of the timing settings and it may have improved a bit.
The largest problem I still have with either driver comes after it slews to a target and begins tracking. If the target isn't centered, I open the KStars [T] icon, which displays a hand unit simulation, hit the stop button (which apparently stops the tracking), and then use the arrow keys to center the image. This is all well and good, but after centering the target, I can find no way to reactivate tracking. I have tried GOTO and SYNC on the simulated hand unit, SynScan GOTO and SYNC from the Kstars map right-click display, and pressing the TRACK button in the indi control panel. No joy.
The only thing I have found that works is doing a quick GOTO to a nearby star, followed by a quick GOTO back to my target. This is tedious...it works, and tracking is re-enabled, but it isn't easy to do before the target becomes off-center once again.
Any suggestions will be greatly appreciated.
I think what I am seeing regarding the stars is vibration or wind, rather than focus. Only one cure: don't shoot with vibration or wind.
The washout could be post-processing. I'd have to see the original image to make a better guess.
Focus looks good. You might get a bit more detail if you had more subs, but fortunately, globular clusters are fairly bright, so getting a good image with 1 sub is possible.
I'd have to be obnoxiously picky to criticize, but if I must, a couple of the stars have a bit of jiggle (not many) and the center of the cluster is ever-so-slightly washed out. Is this raw data or did you process it by some means? Because, if it is raw data, it is really quite good.
I have heard similar things about the SDK. Apparently, it is being widely distributed. Let me know when you need testing.
Your English is not an issue. If you want to experience real pain, I can reply in Spanish or German. I have never studied French and would not attempt it out of consideration for the poor soul who would have to decode my gibberish. I probably know more words in Japanese. I certainly know more "impolite" words in Russian. The owner of the Russian Tango Gyro company used to assemble machines in one of my hangars and was very vocal when dropping wrenches on his feet or scraping his knuckles on something sharp.
Quick story about where I live: I spent most of my life in big cities, so when I decided to move to a rural area (Northwest Georgia), I flew my plane on a dark night, looked for the biggest area of unpopulated dark property I could spot, marked it on my chart, and the next day transferred the coordinates to a land map and told a realtor to "find me something in this red circle." Been here ever since.
Some folks I know in the UK have just received the SVBONY SDK and it is alleged to have all the ARM 32/64-bit support. Very encouraging.
The best default format for the 305 is RGB24. And, it isn't an expensive camera, so the target user is probably not loaded up with filters and filter wheels. So, I'd program to the masses. Keep it color (just an opinion).
Don't worry too much about what our media reports. We have had a rise in cases since reopening businesses, especially in dense population areas, but we have also increased testing so comparisons have to be judged carefully. My "gut" tells me there is at least a spot increase in cases. Time will tell.
For myself, I live in a very tiny town on a farm in the woods. My contact risk is very minimal. I am retired, so I don't HAVE to go out. My greatest exposure is from gyroplane instruction that I do part-time when I feel like it. My students are pretty careful to keep me safe, as there aren't many gyroplane instructors in the US.
I see there has been quite a bit of activity on GITHUB. Anything specific you'd like me to test yet?
...and by the way...thank you for all the effort!
@Jon, you're far from being an inept operator
In France, we talk about "monkey testing". I don't know if you have something equivalent in the USA.
Oh good, I have you fooled! I don't know of a specific term for your "monkey testing," but we do have a few dozen terms that are similar (and probably even less polite). One that comes to mind is the ID-ten-T error (ID10T).
I am pleased to see that you were able to duplicate my results.
Good to know (and clearly, I did not). My primary use of live view is focus and alignment, but toggling the bayer button would not have jumped to mind as a method to enable the frames. I did play with the frames per second settings in the hope of getting an image, but that didn't help.
There is, of course, great value in knowing what a rank amateur or otherwise inept operator would do with a particular piece of software. I am pleased to provide that service.
I'm not a previous user of Ekos LiveView, but I think there may be issues. Using a field spotting scope with the camera in daytime, I got reasonable FITS images in the preview viewer. However, clicking live view gave me a dark gray screen. I know something was going on, because I could point the camera at the daytime sky and get a brighter gray, and complete dark with the cap on the camera. The SER file I received seemed very grainy, and the resolution was not what I had set (1080p), as seen here in a SER header display from SER Player:
* FileId: LUCAM-RECORDER
* LuID: 0
* ColorID: 9 (GRBG)
* LittleEndian: 1
* ImageWidth: 960
* ImageHeight: 540
* PixelDepth: 16
* FrameCount: 37
* Observer: Unknown Observer
* Instrument: Unknown Instrument
* Telescope: Unknown Telescope
* DateTime: 28/05/0120 11:06:31.862589 (0x85de096a193462)
* DateTime_UTC: 28/06/2020 15:06:31.862603 (0x8d81b74d780d4ee)
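For anyone who wants to double-check what SER Player reports, the SER header is a fixed 178-byte block at the start of the file, so it can be read with a few lines of Python. This is a quick sketch based on the published SER format (14-byte FileID, seven int32 fields, three 40-byte strings, two int64 timestamps, all little-endian); the field names just mirror the list above.

```python
import struct

# SER header layout per the published SER format: 178 bytes total.
SER_HEADER = struct.Struct("<14s7i40s40s40s2q")

def read_ser_header(path):
    """Return the fixed SER header fields as a dict."""
    with open(path, "rb") as f:
        raw = f.read(SER_HEADER.size)   # exactly 178 bytes
    fields = SER_HEADER.unpack(raw)
    text = lambda b: b.decode("latin-1").rstrip("\x00 ")
    ints = ("LuID", "ColorID", "LittleEndian", "ImageWidth",
            "ImageHeight", "PixelDepth", "FrameCount")
    info = {"FileId": text(fields[0])}
    info.update(zip(ints, fields[1:8]))
    info["Observer"] = text(fields[8])
    info["Instrument"] = text(fields[9])
    info["Telescope"] = text(fields[10])
    # raw timestamp ticks; SER Player formats these as dates
    info["DateTime"], info["DateTime_UTC"] = fields[11], fields[12]
    return info
```

Running this on the capture above should confirm ColorID 9 (GRBG) and the 960x540 frame size, which is exactly half of 1920x1080 in each axis, so it does look like the stream was being delivered binned or downscaled rather than at the requested resolution.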
I am not used to working with SER files, but may stack them later in Siril to see if anything more encouraging appears.
At this time, I am willing to "guess" that live view is not working properly. I did play with gain and exposure duration with no obvious result, although changes to gain made a significant impact on preview images.
I checked binning yesterday. Seemed good. I'll check "stream enabled" today.
I did look at the v4l2 driver using my SV105 as a test. The indi control panel does have a significant image control (hue, saturation, etc.) screen. That said, I'm not certain that it isn't a bit buggy, though I didn't take time to do a thorough test of v4l2.