Understood. I've been playing with INDI for a number of years for personal projects, mostly taking drivers apart to match command sets for Arduino controllers (e.g., filter wheels).

The current plan is to set up an INDI driver and a web page for control (so no driver is needed for "manual" control), then to write a scheduler or to mod EKOS for spectroscopy. The issue with EKOS is that the sequence needed is more complex than for imaging, so it takes a lot of typing and scripting: move to a target, check alignment or ADC, check temperature, set up the instrument (grating, filters, etc.), image cal lamps, image target, image cal lamp, image target (xN?), move to a reference star and repeat the previous, then move to the next target and repeat. All this while maintaining guiding and plate-solve capabilities and a tie into SExtractor.
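The per-target sequence described above is really just data, which is part of why a dedicated client appeals more than scripting it into EKOS. A minimal sketch (step names and repeat counts are hypothetical, not any EKOS or INDI API):

```python
# Hypothetical per-target spectroscopy sequence expressed as plain data.
# Step names are illustrative only -- not an actual EKOS or INDI API.
def build_sequence(target, ref_star, target_repeats=3):
    def per_object(obj):
        return [
            ("slew", obj),
            ("check_alignment_adc", obj),
            ("check_temperature", obj),
            ("setup_instrument", obj),   # grating, filters, etc.
            ("image_cal_lamp", obj),
            *[("image", obj) for _ in range(target_repeats)],
            ("image_cal_lamp", obj),
        ]
    # science target first, then the same recipe on a reference star
    return per_object(target) + per_object(ref_star)

seq = build_sequence("HD 12345", "RefStar A")
```

A scheduler would then just walk the list, which also makes it easy to resume after a guiding or weather interruption.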

I could probably force EKOS to do the job, but a new client could also process the data as it comes in and could make it much easier to set up observing sessions.

Don't hold your breath, BTW... kind of slow...



Thank you again. Yes, I had looked at the list for a spectroscopy-specific client and/or a small widget like the one Shelyak has for the SPOX in Windows/ASCOM.

In our case we need something more sophisticated that can change the grating angle, control temperature and a few other things, so I'm just looking at what was there.



Simon, thank you. I just wondered how this driver is used? I haven't used it, but I will load up a copy and see what I can get to.



Hey, I know this is old, but there is interest in automating a 3D-printed spectrograph and an Introspec (Harvard) clone. I'm assuming this is mostly for cal and flat lamps, but I would like to add some other functions as well.

I think I found Sh87's driver on GitHub last night, so I can use it. First real question: is there an IndiLib client for this?

Thank you.



Can I take exception to a few of the statements made?

I care about bandwidth... If I'm at the observatory, or it's in the back yard, maybe not so much, and remote desktop may work. If I am running to a remote observatory serviced by a microwave link at $30/month vs a broadband link at $150/month, I care. I do mostly speckle, so each data cube is about 1 GB, and I can do one every 6-10 minutes... I almost have to use a NAS to store raw data at the observatory and reduce the data there (a 1000:1 reduction in size or more, already programmed into the GPU on a Jetson TK1). To arbitrarily say bandwidth/data usage doesn't matter is simply wrong. It even matters for places like Kitt Peak, and I have the data on a jump drive to prove it.
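The arithmetic behind that claim is easy to check. A rough sketch using only the figures above (the 8-minute cadence and 8-hour night are my assumptions for the estimate):

```python
# Rough data-rate arithmetic for the speckle case described above.
cube_gb = 1.0                  # one raw speckle data cube, ~1 GB
cubes_per_hour = 60 / 8        # one every 6-10 minutes; call it 8
hours_per_night = 8            # assumed observing night

raw_gb_per_night = cube_gb * cubes_per_hour * hours_per_night
reduced_gb_per_night = raw_gb_per_night / 1000   # on-site 1000:1 reduction

print(f"raw: {raw_gb_per_night:.0f} GB/night, "
      f"reduced: {reduced_gb_per_night * 1000:.0f} MB/night")
```

Tens of gigabytes a night over a metered microwave link versus tens of megabytes after on-site reduction is exactly the difference the post is pointing at.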

ASCOM: let's face it, it's legacy from the RS-232 world. The world has changed: plug and play, configuration apps, posted "open" interfaces and so on. I can plug my GigE Flea3 camera in anyplace on my network, run a config app and it finds it. One click and the software is set up... Auto-configuration and/or configuration help would be, to my mind, the minimum needed for a new system. Yes, this can be left to the instrument makers, but shouldn't it be called out as an expectation?

From what I can tell, adding a data field to ASCOM will cause the known universe to implode (just making a point). The word "anarchy" was applied when I asked about self-declaration. I don't think any developer or user can predict what will be of interest or needed in 5 years; apparently others do. So there is, IMHO, a need within the interface to add instrument types, data types and commands. I wanted to add an instrument selector, which I did by calling it a filter wheel, though I see there is one in INDI. This worked, but it is confusing to those looking on. Now I want an automated ADC (Atmospheric Dispersion Corrector)... how do I do that: via scripting with POTH, or easily in INDI (via snoop)?

The added data field was for my observatory controller, and you have no idea what I want to report from my observatory or when... by declaring the data type, units, name, limits and actions, the field could be added on the fly to any application with minimal effort, the application not even knowing the relevance: real, Dec current, mA, -300, 300, warn. This doesn't feel like anarchy :-).
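A self-declared field like that only needs a generic schema the client already understands. A sketch of the idea (this is not an existing ASCOM or INDI structure, just an illustration of the "real, Dec current, mA, -300, 300, warn" example):

```python
# Sketch of a self-describing telemetry field. The client needs no prior
# knowledge of the field itself, only of the generic schema: every field
# declares its type, name, units, limits, and an action on violation.
field = {
    "type": "real",
    "name": "Dec current",
    "units": "mA",
    "min": -300,
    "max": 300,
    "action": "warn",   # what the client should do on a limit violation
}

def check(field, value):
    """Return the field's declared action if value is out of range, else 'ok'."""
    if not (field["min"] <= value <= field["max"]):
        return field["action"]
    return "ok"
```

A client can display and range-check any such field without knowing what "Dec current" means, which is the point of self-declaration.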

There's one other minor fly in the ointment. Yes, it's minor. Reusing the ASCOM driver on an Alpaca device means running Windows on the device... anyone see a problem with that? It just means the interface will need to be rewritten for the embedded web server, and the ASCOM example code will be helpful. Just not the slam dunk initially thought.

So I like the fact that Alpaca will give the instruments a network-based interface that is OS-agnostic; a huge plus. It's a bit too archaic and closed for my tastes, but a major step forward for ASCOM. It will be interesting to see how the instrument makers respond.

Anyone seen IBM's "Cognitive Telescope Network" information? I'm probably not a fan, but there are some interesting points.


Stash, No flack from me, I agree.



I went through the published Alpaca spec and, without trying to be catty, it still follows the ASCOM protocol restrictions, which I'm having a hard time buying into in this day and age. Other than that, if the instrument manufacturers buy into Alpaca (I see no reason they would not; I would in a heartbeat), then we finally have OS-independent instruments.

I haven't used it, but I understand there is a Windows server that uses ASCOM drivers and talks to INDI? If so, that should be a good start on an Alpaca interface for INDI(?).

Back in about 2011 I floated (on this forum, I think) an idea for embedded INDI, which is the same basic concept. I'll show my lack of INDI depth here, but I thought INDI servers needed to be daisy-chained. If that isn't true, or if there could be a "combiner" (that is, a way to talk to multiple servers from a single point), the INDI server could move into the instrument now... hint, hint.

Anyway, I've been on board with the web based instrument concept for a very long time... glad to see it finally happening.

2 cents from the cheap seats...


Greg Jones created a new topic ' Adding a new device category?' in the forum. 4 years ago

I did a quick search, but didn't get a hit.

What's involved with adding a new device category? I can fool the system into supporting an instrument selector by calling it a filter wheel, or by writing a script in EKOS. It would be nice to add it as an actual device though, as it would be less confusing.
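The filter-wheel workaround amounts to mapping instrument-selector positions onto wheel slots so a stock client can drive them. A trivial sketch of that mapping (instrument names are made up):

```python
# Hypothetical instrument selector disguised as a filter wheel: each
# selector position is presented to the client as a wheel slot number.
INSTRUMENTS = {1: "imaging camera", 2: "spectrograph", 3: "eyepiece port"}

def slot_for(instrument):
    """Invert the mapping: instrument name -> 'filter wheel' slot number."""
    for slot, name in INSTRUMENTS.items():
        if name == instrument:
            return slot
    raise KeyError(instrument)
```

It works, but the client UI still says "filter", which is exactly the confusion a real device category would remove.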

Yes, I know none of the clients would initially support a new category, but some might over time.

There are two other instruments that would be nice to automate: an ADC and a spectrometer. The spectrometer might be difficult as they can have more moving parts, but a few simple commands might be enough to get moving.



Greg Jones replied to the topic 'EKOS imaging pipeline?' in the forum. 5 years ago


That may be the best option. In fact, the more I think about it, running the scheduler on the observatory hardware seems like a more robust solution (it survives a WAN disconnect). The nice thing about INDI is that there are a number of options here. The local file processor was the initial plan, but for several reasons, doing it as part of the capture process looks like a better option.

I have some homework to do... Thank you again.


Greg Jones replied to the topic 'EKOS imaging pipeline?' in the forum. 5 years ago

knro, I can make that work in theory, but it doesn't do what I am looking for.

Think remote observatory with a slow data connection. We shoot about 5000 frames per target and 10+ targets an hour. We can fill a terabyte drive in 3 nights. So the goal is to collect the raw frames in the observatory and do the speckle reduction (Fourier-based) on the fly (GPU), returning just the speckle results. That will give us close to a 5000:1 reduction in data needing to be transferred. We also need to put the files into a reasonable directory structure and ideally store each target in a FITS cube instead of a frame per file.
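The frame-per-file versus FITS-cube part is straightforward with astropy, independent of how the capture side is wired up. A sketch (the header keywords and function name are my own; only the cube idea comes from the post):

```python
import numpy as np
from astropy.io import fits

def frames_to_cube(frames, out_path, target_name):
    """Stack a list of 2-D frames into one 3-D FITS cube (n_frames, ny, nx)."""
    cube = np.stack(frames).astype(np.float32)
    hdu = fits.PrimaryHDU(data=cube)
    hdu.header["OBJECT"] = target_name    # assumed keyword choice
    hdu.header["NFRAMES"] = len(frames)
    hdu.writeto(out_path, overwrite=True)

# e.g. 5000 frames of 512x512 would make one (5000, 512, 512) cube,
# replacing 5000 small files with a single file per target.
```

One cube per target also gives the "reasonable directory structure" almost for free: one file named after the target, instead of thousands of frame files.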

I know that's a lot and I'll take a shot at it. Since the first note I've looked at the code and have an idea what to do.

It looks like EKOS does the actual file save, correct? That means that EKOS would need to be running on the observatory computer. If all that is correct, then splitting the scheduler from the capture module would be the ideal way forward. That is, let the capture module run under the server, and direct it with the scheduler via the server.

Hope that makes sense?

I'm hoping to get the CUDA/GPU portion working, probably with Open Astro, but CCDCiel is also a possibility as it uses INDI for the cameras, from what I can tell.

Thank you,


Greg Jones created a new topic ' EKOS imaging pipeline?' in the forum. 5 years ago

I'm looking at setting up a pipeline for Speckle Interferometry. In a nutshell, I would like to pre-process images as the frames are taken. Conceptually this would be from a remote site. We take as many as 10,000 frames per target and while small, that's a lot of data.

Not sure how to even search for what I am asking for, but I want the save-to-disk function in the observatory and the scheduling to be done in my office. That is, the EKOS scheduler is running on my system in the office, but the data is being collected and reduced at the observatory. Once processed, the result is about the size of a single frame. This is of interest to anyone doing high-cadence imaging.

Can this be done now? If so, a hint on how would be nice. Also a pointer to where you would recommend hacking in the CUDA code for the processing :-)

Thank you,


Greg Jones replied to the topic 'Astroberry PiFace' in the forum. 6 years ago

The motor has a spindle with an O-ring around it for friction drive to the filter wheel. Small 1/16"-diameter magnets are glued into holes in the filter wheel, using a homemade jig for location. The Hall sensors sense the magnetic fields, so there is no physical connection between the wheel and the sensors.

The motor can slip under the posts, and the spring attached to the fixed posts supplies tension/force on the friction drive.

The wheel portion (that holds the filters) is actually flipped upside down from its original configuration, and the detent balls and springs were removed so the wheel is free to spin.

The two boards are screw-mounted to the case using tapped holes in the case (small holes, like #1 screws). The sensors are mounted over holes in the case (holes probably not needed in the aluminum), with slots milled to allow the sensors to get closer to the wheel (may also not be needed). Once it's working I'll "pot" the sensors in epoxy or similar.

Connection to the RPI (or other) is via USB to the Arduino Nano (skinny board shown).