Type: User Story
Priority: P3: Somewhat important
Resolution: Out of scope
Affects Version/s: None
Fix Version/s: None
Data rate (the hardware-specific value) is metadata at best and a control knob at worst.
There are really a few concepts in play here.
What is the LATENCY that the app can tolerate? The app should set this as large as it can tolerate so the system can optimize delivery.
How much data does the app want? An app with a latency of 100ms (10Hz) may want readings at 10Hz or faster (100Hz data == 10 samples per latency window).
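The latency/rate arithmetic above can be sketched directly. This is an illustrative helper, not part of any proposed API; the function name is hypothetical.

```python
def samples_per_latency_window(latency_ms: float, rate_hz: float) -> float:
    """Number of samples delivered within one latency window.

    latency_ms: the maximum delivery delay the app tolerates.
    rate_hz: the sensor's sampling rate.
    """
    return rate_hz * (latency_ms / 1000.0)

# A 100 ms latency with 100 Hz data yields 10 samples per delivery.
print(samples_per_latency_window(100, 100))  # → 10.0
```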
I'm going to assume that we can't (directly) determine the hardware rate, so there is going to be a fixed maximum rate. If we want a slower rate, either the hardware will sample at a lower rate or the data will be sub-sampled. The maximum rate should be made available to the app because it determines the minimum latency.
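A minimal sketch of the sub-sampling fallback described above, assuming a fixed maximum hardware rate; the helper names are hypothetical.

```python
def decimation_factor(max_rate_hz: float, requested_hz: float) -> int:
    """Keep every Nth sample to approximate the requested rate.

    Assumes the hardware runs at a fixed maximum rate and slower
    rates are achieved by dropping intermediate samples.
    """
    if requested_hz >= max_rate_hz:
        return 1  # can't exceed the hardware maximum
    return max(1, round(max_rate_hz / requested_hz))

def subsample(samples, max_rate_hz, requested_hz):
    return samples[::decimation_factor(max_rate_hz, requested_hz)]

# 200 Hz hardware, app wants 50 Hz: keep every 4th sample.
print(subsample(list(range(8)), 200, 50))  # → [0, 4]
```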
Also, since an app can legitimately request rates slower than 1Hz, we need either a different data type (float) or a different measurement. Possibly even a few measurements (e.g. an enum for coarse, device-agnostic control; hertz for precise control of high-speed data; and milliseconds to allow for longer sampling periods).
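One illustrative encoding of the three measurement styles floated above (coarse enum, hertz, milliseconds). All names and preset values here are hypothetical, not a proposed API.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class CoarseRate(Enum):
    """Device-agnostic presets (hypothetical values)."""
    SLOW = "slow"
    NORMAL = "normal"
    FAST = "fast"

@dataclass(frozen=True)
class RateRequest:
    """Exactly one field is expected to be set by the caller."""
    coarse: Optional[CoarseRate] = None
    hertz: Optional[float] = None      # precise control for high-speed data
    period_ms: Optional[float] = None  # allows periods longer than 1 s

    def to_period_ms(self, presets_ms: dict) -> float:
        """Normalize any of the three forms to a sampling period."""
        if self.coarse is not None:
            return presets_ms[self.coarse]
        if self.hertz is not None:
            return 1000.0 / self.hertz
        return self.period_ms

# A float rate makes 0.5 Hz (one sample every 2 s) expressible.
print(RateRequest(hertz=0.5).to_period_ms({}))  # → 2000.0
```

Normalizing everything to a period in milliseconds is one way to keep sub-1Hz requests representable without a float-hertz special case.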
- A solid understanding of what applications should be able to request and how we'll provide it.
- User Stories for implementation.