Yes, thank you canalrun. Great ideas to consider. I have several sensors but the phone only sees them as one. The embedded controller manages all of the sensors, does a bunch of maths on the data, and creates data packets to send. It also has to respond to some simple commands.
The data consists of one 10- or 12-bit sample per degree of rotation from each of two sensors. 720 FIR filters are instantiated and each sample is fed to its own filter. Some timing is also computed from an angle sensor, and a motor/solenoid is driven. All of this is packaged into a payload, with some security bits added, ready for sending.
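To make that concrete, here is a minimal C++ sketch of the filter-bank idea. The tap count, fixed-point scaling and names are all assumed for illustration rather than taken from the real design:

```cpp
// Hypothetical sketch: one small integer FIR filter per degree, per sensor.
// Tap count and coefficient format are placeholders, not the real design.
#include <array>
#include <cstdint>

constexpr int kTaps = 8;                    // assumed filter length
constexpr int kDegrees = 360;
constexpr int kSensors = 2;

struct FirFilter {
    std::array<int32_t, kTaps> coeff{};     // fixed-point coefficients (e.g. Q15)
    std::array<int32_t, kTaps> history{};
    int head = 0;

    int32_t update(int32_t sample) {
        history[head] = sample;
        int64_t acc = 0;
        for (int i = 0; i < kTaps; ++i)
            acc += int64_t(coeff[i]) * history[(head + kTaps - i) % kTaps];
        head = (head + 1) % kTaps;
        return int32_t(acc >> 15);          // scale back from Q15
    }
};

// One filter per degree of rotation, per sensor: 2 x 360 = 720 filters.
std::array<std::array<FirFilter, kDegrees>, kSensors> filters;

// Called once per degree: feed the new ADC sample to the filter for that angle.
int32_t process(int sensor, int degree, int32_t adcSample) {
    return filters[sensor][degree].update(adcSample);
}
```

If I have the arrangement right, each individual filter only sees one new sample per revolution, which keeps the per-sample arithmetic cheap even with 720 of them.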
This is all working fine using a USB connection and an 8-bit controller, sans the FIR filters. The filters are handled by the i7 on the PC, running an app written in BC++. I expect the planned 32-bitter will eat the job, filters and all.
Because there are only two sensors and data from both is in the one packet, I planned to identify each sensor by the order of the data: say, two 1-dim arrays or a single 2-dim array.
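As a sketch of what I mean by identifying the sensor purely by position (field names and sizes are just illustrative, not the actual packet format):

```cpp
// Hypothetical payload layout: the two sensors are distinguished only by
// their position in the packet, not by an ID field.
#include <array>
#include <cstdint>

constexpr int kDegrees = 360;

struct BalancerPacket {
    // samples[0] = first sensor, samples[1] = second sensor, indexed by degree
    std::array<std::array<uint16_t, kDegrees>, 2> samples;
    uint16_t phaseAngle;   // timing value from the angle sensor (assumed field)
    uint16_t checksum;     // stand-in for the "security bits"
};
```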
The phone app is essentially complete. Data structures are in place to hold the packet data, and a simulation mode fills these with simulated data. The app works and shows the data the way it should. All that is needed now is to get the real data, stuff it into those data structures, and add a little logic to send commands to the controller.
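The "stuff it into the data structures" step then reduces to something like the following, building on the hypothetical BalancerPacket sketch above. A real version would check the security bits and worry about byte order rather than just copying bytes:

```cpp
// Sketch only: copy a complete received payload into the same structure the
// simulation mode already fills, so the rest of the app is unchanged.
#include <cstdint>
#include <cstring>
#include <vector>

bool fillFromPayload(const std::vector<uint8_t>& payload, BalancerPacket& out) {
    if (payload.size() < sizeof(BalancerPacket))
        return false;                        // not a whole packet yet
    std::memcpy(&out, payload.data(), sizeof(BalancerPacket));
    return true;                             // caller can now refresh the display
}
```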
The hardest part of the project was getting the real-time response I desired. It seems the graphics are quite fast but the maths can bog down. I forced integer arithmetic wherever I could and now achieve my desired 50 Hz update rate. There were a couple of tasks I would have liked to hand to a different core, but that seemed like too big a "can of worms" for a modest performance gain, if any.
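By "forced integer arithmetic" I mean the usual fixed-point trick, roughly like this (Q16.16 chosen purely for illustration):

```cpp
// Illustration of replacing floating point with fixed point in per-frame maths.
#include <cstdint>

using fix16 = int32_t;                          // Q16.16: 16 integer, 16 fraction bits

constexpr fix16 toFix(double v)    { return fix16(v * 65536.0); }
constexpr double toDouble(fix16 v) { return v / 65536.0; }

inline fix16 fixMul(fix16 a, fix16 b) {
    return fix16((int64_t(a) * int64_t(b)) >> 16);  // widen, multiply, rescale
}

// Example: apply a calibration factor to an amplitude with no floats at runtime.
// The calibration value would be converted once at startup, not every frame.
fix16 scaledAmplitude(fix16 amplitude, fix16 calibration) {
    return fixMul(amplitude, calibration);
}
```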
I have been reading this forum about the async streams and they look very promising for this app. I will have a go at getting this working this afternoon.
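Since a stream delivers bytes rather than packets, I expect to need some framing on top of it. A generic sketch of the idea in C++ (not any particular framework's API), using a two-byte little-endian length prefix:

```cpp
// Reassemble length-prefixed packets from arbitrary stream chunks.
#include <cstdint>
#include <functional>
#include <vector>

class FrameAssembler {
public:
    explicit FrameAssembler(std::function<void(const std::vector<uint8_t>&)> onPacket)
        : onPacket_(std::move(onPacket)) {}

    // Feed whatever chunk the stream delivered; emit complete packets only.
    void onNewData(const uint8_t* data, size_t len) {
        buffer_.insert(buffer_.end(), data, data + len);
        while (buffer_.size() >= 2) {
            uint16_t packetLen = uint16_t(buffer_[0]) | uint16_t(buffer_[1]) << 8;
            if (buffer_.size() < 2u + packetLen) break;   // wait for the rest
            std::vector<uint8_t> packet(buffer_.begin() + 2,
                                        buffer_.begin() + 2 + packetLen);
            buffer_.erase(buffer_.begin(), buffer_.begin() + 2 + packetLen);
            onPacket_(packet);
        }
    }

private:
    std::vector<uint8_t> buffer_;
    std::function<void(const std::vector<uint8_t>&)> onPacket_;
};
```

The controller would prepend the length when it builds the payload, and the phone side just feeds onNewData with whatever arrives.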
Your other ideas are good to try. I want this to be transparent to the users, if possible. I do not want the user to have to be a network engineer to get the app to work: start the app, find the controller, and do its stuff, without the user being aware of what's under the hood.
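One possible way to get the "find the controller" step without user involvement, assuming the link ends up being WiFi/UDP (which may not match the final transport), is a broadcast probe along these lines. The port number and probe text are made up:

```cpp
// Phone side: broadcast a probe and take the controller's address from whoever
// answers. POSIX sockets; purely a sketch of the discovery idea.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/time.h>
#include <unistd.h>
#include <cstdint>
#include <string>

// Returns the controller's IP as text, or an empty string if nothing answered.
std::string discoverController(uint16_t port = 48123) {    // port is hypothetical
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return "";

    int yes = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));
    timeval tv{2, 0};                                       // 2 s reply timeout
    setsockopt(sock, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof(tv));

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(port);
    dest.sin_addr.s_addr = htonl(INADDR_BROADCAST);

    const char probe[] = "BALANCER?";                       // probe text is made up
    sendto(sock, probe, sizeof(probe), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));

    char reply[64];
    sockaddr_in from{};
    socklen_t fromLen = sizeof(from);
    ssize_t n = recvfrom(sock, reply, sizeof(reply), 0,
                         reinterpret_cast<sockaddr*>(&from), &fromLen);
    close(sock);
    if (n <= 0) return "";

    char addr[INET_ADDRSTRLEN];
    inet_ntop(AF_INET, &from.sin_addr, addr, sizeof(addr));
    return addr;
}
```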
For interest, the app is the smarts for a small gas turbine balancer, as used in radio-controlled aircraft. My friend and I have developed the hardware, which is all CNC-machined aluminium construction and is unique in that it accepts the entire engine. There is no need to disassemble the engine to remove the rotor for balancing. The rotor is spun up using the (electric) starter motor or compressed air via a solenoid valve. We plan to supply it as a kit: a box full of machined parts that the user screws together.
The problem we faced was that an integrated solution required tooling for a case, an expensive display, user input hardware and a complex circuit, among other things, and would still be unlikely to offer all of the features we can offer through a phone app. The target market is reasonably technically astute people, so we assume they have a phone modern enough to run the app.
Techniques learned from this app will be applied directly to my Uni research project. I need to get engineering data off a drone being used for geophysical survey. There are other telemetry links on the drone, but this one is for ad hoc development data sent to a phone: the "what do I need to know right now, let's implement it in 5 minutes" kind of thing.
Thanks for your help. This discussion is good because I am working almost completely in a vacuum - my research is new and unique and no one I know has the slightest idea what I am talking about.