A few questions:
If the hand controller is not connected to the RST-135, can EKOS/KStars update the date/time on the mount?
If not, I'd assume the goto would not work. Would the autoguiding be affected?
Can the RST-135 accept a WiFi connection from SkySafari and, at the same time, accept autoguiding commands over USB?
Although I do not own a Rainbow, my Temma2 experiences the same problem of messing up the DEC direction across different imaging & plate-solving sessions. After some testing, I've found that I have to start the mount pointing at the eastern side of the sky and plate-solve in the eastern hemisphere. After a successful plate solve, slew to the target; the RA/DEC movement will then always be correct. If I start plate solving in the western hemisphere, the slew direction of the DEC is reversed (I can't remember whether it's always the case or only sometimes). It's strange that both KStars and SkySafari show the mount slewing one way while the mount is actually slewing in the opposite direction, and that definitely messes up the goto.
I just ran another experiment and it does seem to be slewing the wrong direction, in at least one axis. I reset everything again with the scope pointing west (as close as possible anyway). I manually slewed north and was able to plate solve again, and the mount pointing seemed to show roughly the right location (90 deg DEC and approximately the correct latitude for my location, ~37 deg). I then attempted to slew to an object that was higher in the sky than Polaris and to the northwest (effectively higher and more westward than the current position at that time). Instead, however, it slewed to a location in the northeast. After slewing, the coordinates shown in Ekos seemed right (AZ ~326 deg / ALT ~56 deg). However, the scope definitely slewed east instead of west. The altitude seems like it was probably about right.
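The symptoms described (coordinates reported correctly while the axis physically moves the wrong way) are consistent with a stale pier-side state in the driver. As a purely illustrative sketch, not code from the actual Rainbow or Temma INDI drivers (the function name and sign convention are my assumptions), the direction mapping a German equatorial mount driver has to get right looks like:

```python
# Illustrative only: for a GEM, a commanded "north" DEC motion maps to
# opposite physical motor directions depending on which side of the
# pier the tube sits. If the driver's pier-side flag is wrong (e.g. the
# mount was started pointing the "wrong" way), every DEC move inverts
# while the reported coordinates still look plausible.

def dec_motor_sign(commanded_north: bool, pier_side_west: bool) -> int:
    """Return +1 or -1 for the physical DEC motor direction."""
    sign = 1 if commanded_north else -1
    if not pier_side_west:  # east side of pier: motor runs the other way
        sign = -sign
    return sign

# Same command, opposite pier side, opposite motor direction:
print(dec_motor_sign(True, True), dec_motor_sign(True, False))  # 1 -1
```

This is only meant to show why starting the mount on one particular side of the sky can make all subsequent moves consistent, as observed in the Temma2 post above.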
In my case: I have EKOS installed on an RPI4 (4GB) and I use VNC to connect remotely from my 11" MacBook Air. I set the VNC resolution to 1366x768, which matches the monitor display. My imaging camera is an Atik 490EX, whose image size is 3379x2903, so the image is at a much higher resolution than my monitor.
During my imaging sessions, I usually have to zoom in and out to check a few things:
At low resolution, I want to see the overall image, e.g. that the object position and orientation are good.
At high resolution, I'd like to check alignment (i.e. any star trailing), possible focus shift, or other artifacts.
In my case, loading the full image into memory is in fact a waste of resources: my monitor cannot display all the pixels anyway.
If I want to check for focus shift or alignment problems, I only need to look at part of the image.
I wonder whether a magnifying-glass approach would be worth looking at. The FITS viewer would initially show a low-resolution preview, which can be pre-generated, and display that in the viewer. Then, on mouse-over (or maybe a mouse click is better), it would load just that part of the image from disk, say a 50x50 area, and show it 1:1 (or maybe magnified 2:1) in a "magnifier" image box.
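The magnifier idea above can be sketched roughly as follows. This is just an assumption of how it could work, not the FITS viewer's actual code: a NumPy array stands in for the frame, and a real implementation would read only the needed HDU section from disk (e.g. via astropy) instead of holding the full image in memory.

```python
# Sketch of the "magnifying glass" idea: keep a pre-generated
# low-resolution preview on screen, and on mouse-over extract only a
# small window of the full-resolution frame for 1:1 display.
import numpy as np

def magnifier_window(image: np.ndarray, x: int, y: int, size: int = 50) -> np.ndarray:
    """Return a size x size crop centred on (x, y), clamped to the frame."""
    h, w = image.shape
    half = size // 2
    x0 = min(max(x - half, 0), max(w - size, 0))
    y0 = min(max(y - half, 0), max(h - size, 0))
    return image[y0:y0 + size, x0:x0 + size]

frame = np.zeros((2903, 3379), dtype=np.uint16)  # Atik 490EX-sized frame
crop = magnifier_window(frame, x=100, y=25)       # cursor near the top edge
print(crop.shape)  # (50, 50): the window shifts to stay inside the frame
```

The clamping means the magnifier box always shows a full 50x50 patch even near the image edges, which matters when checking corner stars for trailing.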
Did you check the one suggested earlier? There's a 5V and a 3.3V feed from the RTC which you can connect to the fan, but I don't think it's PWM-controlled...
In my case the one with the supercapacitor was not really good; it may only hold a charge for a few days. So I switched to the one using the CR battery.
Here's the version I've made with a DC breakout board.
The focuser I have is the same model as the Orion AccuFocus.
But I'm sure the same idea can be applied to a stepper version with a similar setup.
What I did:
1. Added a DC jack for feeding in 12V, powering both the Pi and the motor driver breakout, since the DC motor requires 12V to operate.
2. I got a tiny 12V-to-5V step-down breakout for an integrated power source. The 5V feeds the GPIO pins to power up the RPI4, while I split one wire from the 12V input to the +ve power feed of the motor driver, letting it supply 12V to the motor.
3. I have a TB6612FNG for driving the motor. I checked the pins and soldered Dupont headers on one side, allowing the breakout to plug directly into the GPIO pins.
4. The original focuser uses an RJ10 cable and socket to connect the controller to the motor. I reused the same approach, so I got an RJ10 jack and connected it to the output pins of the DC driver board.
5. You can also see I have an RTC breakout that gives the RPi a real-time clock.
6. Source code of the driver here: github.com/sywong2000/indi-gpio_pwm_focus
This setup gives me the advantages of:
1. A compact package connecting directly to the motor; no intermediate motor-controller box is needed.
2. A single 12V supply that feeds both the Pi box and the motor.
However, the heat on the breakout for a stepper driver would require some attention.
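For anyone adapting the setup above, the core of what the driver has to do with the TB6612FNG is map each focuser command to a pair of logic levels on the IN1/IN2 pins (with speed set via PWM). The sketch below follows the TB6612FNG datasheet truth table; the "forward"/"reverse" labels are arbitrary and depend on how the motor leads are wired to the RJ10 jack, and the actual pin writes in the linked indi-gpio_pwm_focus driver may differ.

```python
# Minimal sketch of TB6612FNG direction control logic.
# (IN1, IN2) per the datasheet: (1,0) and (0,1) spin the motor in
# opposite directions, (1,1) is a short brake, (0,0) is stop/coast.

def tb6612_pins(command: str):
    """Map a focuser command to (IN1, IN2) logic levels."""
    table = {
        "forward": (1, 0),
        "reverse": (0, 1),
        "brake":   (1, 1),
        "stop":    (0, 0),
    }
    return table[command]

# In a real driver these levels are written to GPIO (e.g. via RPi.GPIO
# or libgpiod) and motor speed is set with a PWM duty cycle on PWMA.
```

A nice property of this mapping is that reversing direction is just swapping the two pin states, so backlash-compensation moves need no special handling at the pin level.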
Use this one
The battery is not rechargeable, but in theory it lasts up to 10 years...
My guess is that it would live for 3-4 years, which is good enough.
Did you check the driver development page?
LucaR post=66997 wrote: Thanks g_gagnon for your advice, I'll try it this way!
Moreover, I'm looking for a driver like the Astroberry focuser with which to control a stepper motor using the BCM pins of the ODROID; do you know of any driver like this?
I thought the RPI4 would only manage about 6-7 fps. I tried oaCapture and the best I can get is about 7 fps.
rbarberac post=61145 wrote: I've been using FireCapture 2.6 on Astroberry on a Raspberry Pi 4 with great success, with KStars/Ekos running in the background guiding the mount (running OnStep). My camera is a ZWO ASI120MM-S on USB 3.0. Taking videos of Mars and Jupiter I was able to capture at 70 fps with a ROI of 400x400 px. When I increase the frame size I must reduce the frame rate.
On this video you can see a screen recording from the VNC client running on an iPad connected via WiFi to Astroberry on the RPI4.
And the results
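The frame rates quoted above are easy to sanity-check with some back-of-the-envelope arithmetic. Assuming the ASI120MM-S frames are transferred as 8-bit mono (16-bit roughly doubles the figures), the data rate scales with the pixel count, which is why the frame rate must drop as the ROI grows:

```python
# Rough data-rate estimate for video capture at a given ROI and fps.

def data_rate_mb_s(width: int, height: int, fps: float, bytes_per_px: int = 1) -> float:
    """Uncompressed video bandwidth in MB/s."""
    return width * height * fps * bytes_per_px / 1e6

roi = data_rate_mb_s(400, 400, 70)     # 400x400 ROI at 70 fps
full = data_rate_mb_s(1280, 960, 70)   # full 1280x960 frame at 70 fps
print(roi, full)  # ~11.2 MB/s vs ~86 MB/s
```

So the 70 fps figure at a 400x400 ROI is a modest ~11 MB/s, well within USB 3.0 throughput; sustaining the same rate at full frame would need nearly 8x the bandwidth plus the Pi-side processing, which matches the observation that larger frames force a lower frame rate.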