We need to create user stories to narrow the scope of the analytics tracker and further specify the implementation. Where we integrate it into the software stack depends on what the actual outcome of the analytics tracker should be.
Some of the core ideas are similar to QSignalSpy.
- Real-time analytics? Send event data directly over the network in real time, or cache data for a certain amount of time and then send it in chunks?
- Is 3D support needed (UV mapping)?
- Granularity of the analytics (only QQC2 components, all objects, "screens")?
- Integration with tracing?
- Can we rely on tracing being available?
- Generation of a "heat-map" of events (e.g. clicked, pressed). This is global and can be turned on and off.
- Collection of specific data, usually connected to a signal and defined in QML. Similar to the prototype.
- No real-time required
- Data is cached on disk
- The cadence should be configurable (hours, days, ...)
- Abstraction of the transport protocol would be nice, but for now TCP/IP/HTTPS is sufficient for the MVP
- 3D support, as in "mapping events back to 3D space", is important but not part of the scope for the MVP
- C++ integration should be part of the final product but not MVP
- Collect unaccepted user input events and map them to the 2D screen. The architecture should take 3D into account.
- Track custom elements/components using signals
- Leave screen changes to the user: they define the correct signal that sends an event when a screen becomes active.
- Allow tracking QML state changes and sending them
- 3D support for mapping unaccepted events
- Collect mouse movement
- Dump the frame buffer. Visualize the frame buffer together with the heat-map in the cloud.
- Replay events on the actual application/prototype
- Custom arguments for events (e.g. add coordinates in infinite scroll scenario)
- Reduce the amount of data (edge filtering)
- Constraints on the amount of data (ring buffer): if exceeded, the oldest data is lost; configurable