At the moment it's offline, so I'm processing the images on a Mac.
However, there are smaller embedded systems that can process the data (at a slower rate). My thinking is possibly:
A C2 Odroid (64-bit, 2 GB) performing the control, focus, etc., but then cross-mounting the images and streaming them over WiFi to a second system, which takes the feed and processes the images. The difference is that it's running on a dedicated embedded system (12 V).
There are a couple of deep-learning ideas I want to try, including noise removal and processing. The idea is that you can see both the raw images and the processed ones.
I've done a lot of OpenCL too, so I may be tempted to make a GPU library for the pipeline, but as always the GPU works best with its own memory, and you need a couple of GB to keep the images in memory or do FFTs, etc.
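To put rough numbers on that "couple of GB" claim, here's some back-of-the-envelope sizing (the sensor resolution is an assumption for illustration, not from my actual setup):

```python
# Rough GPU memory sizing for an image pipeline.
# Assumed sensor: ~24 MP mono; substitute your own resolution.
width, height = 6000, 4000

raw_bytes = width * height * 2       # 16-bit frame
fft_bytes = width * height * 16      # complex128 workspace for an FFT

print(f"raw frame:     {raw_bytes / 1e6:.0f} MB")   # 48 MB
print(f"FFT workspace: {fft_bytes / 1e6:.0f} MB")   # 384 MB
```

A handful of frames held resident, plus FFT workspace and intermediates, adds up to gigabytes quickly, which is why a GPU with its own dedicated memory helps.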
I've started playing with Octave.
I run a 64-bit version of INDI (compiled myself) and use the 32-bit versions of the drivers.
So one issue with the 64-bit version and the latest numpy is that the astrometry script trips over numpy's stricter casting rules for in-place operations:
TypeError: Cannot cast ufunc add output from dtype('int32') to dtype('uint16') with casting rule 'same_kind'
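I don't have the script in front of me, but that error is the classic symptom of an in-place add between mismatched integer dtypes (newer numpy enforces 'same_kind' casting for in-place ufuncs, and int32 → uint16 isn't same-kind). A minimal reproduction and two possible fixes, with hypothetical data standing in for the script's arrays:

```python
import numpy as np

# Hypothetical stand-ins for the script's arrays: a 16-bit camera
# frame and a signed 32-bit correction term.
img = np.zeros(4, dtype=np.uint16)
offset = np.arange(4, dtype=np.int32)

# img += offset   # raises the TypeError under recent numpy

# Fix 1: add out-of-place, then cast back explicitly.
img = (img + offset).astype(np.uint16)

# Fix 2: stay in-place but override the default casting rule
# (only safe if you know the values fit in uint16).
np.add(img, offset, out=img, casting="unsafe")
```

Either way the result stays uint16; fix 1 is the more readable change if the script doesn't need the in-place update.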
Ok I'll have a look at that.
OpenPHD2 can store guider frames (the raw FITS) for the entire run.
Perhaps not many people use this, but I use it to create the point-spread function for each long-exposure frame.
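For anyone curious, building an empirical PSF from a run of guider frames can be sketched roughly like this, a pure-numpy sketch where the function name, parameters, and the brightest-pixel centring are my assumptions, not how PHD2 or any real tool does it:

```python
import numpy as np

def estimate_psf(frames, half=8):
    """Average centred cutouts around the brightest pixel of each frame.

    frames: iterable of 2-D arrays (e.g. guider frames loaded from FITS).
    Returns a (2*half+1, 2*half+1) PSF normalised to unit flux.
    """
    stack = []
    for frame in frames:
        y, x = np.unravel_index(np.argmax(frame), frame.shape)
        # Skip stars too close to the edge for a full cutout.
        if (y < half or x < half or
                y + half >= frame.shape[0] or x + half >= frame.shape[1]):
            continue
        cut = frame[y - half:y + half + 1,
                    x - half:x + half + 1].astype(np.float64)
        cut -= np.median(frame)          # crude background subtraction
        stack.append(cut)
    psf = np.mean(stack, axis=0)
    psf /= psf.sum()                     # normalise to unit flux
    return psf

# Synthetic demo: two frames with a Gaussian "star" at different spots.
yy, xx = np.mgrid[0:64, 0:64]
frames = [np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 4.0)
          for cy, cx in [(20, 30), (40, 25)]]
psf = estimate_psf(frames, half=8)       # 17x17 normalised kernel
```

In practice you'd load the raw FITS guider frames (e.g. with astropy.io.fits) and do proper centroiding rather than taking the brightest pixel, but the stacking idea is the same.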
I've had a look but couldn't see how to do this in Ekos... can you?
Just got a new iPhone for my birthday (the old one was a 2009 3GS!)... a VNC client app and hey presto!