Mar 21, 2017, 3:57:35 PM via Website
I'm developing an Android app that retrieves data from a device in real time via BLE.
As stated in the device's datasheet, BLE uses a 20 ms connection interval. Twenty user-data bytes (equal to 2 samples for each channel plus a 2-byte running counter) are sent in each GATT notification. Data on the device is ping-pong buffered, and up to six BLE notification packets are sent every 14 ms based on an OSAL timer. The sample rate is set to 160 samples/s; each sample is 3 bytes, and 3 channels are sent.
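For context, here is a quick sanity check on the throughput those figures imply. The numbers come from the datasheet values above; the class name is just illustrative:

```java
// Sanity check on the expected notification rate (figures quoted above).
public class ThroughputCheck {
    public static void main(String[] args) {
        final int sampleRate = 160;     // samples/s per channel
        final int samplesPerPacket = 2; // two 9-byte samples per 20-byte notification
        double packetsPerSecond = (double) sampleRate / samplesPerPacket; // 80 packets/s
        double packetIntervalMs = 1000.0 / packetsPerSecond;              // 12.5 ms
        System.out.printf("Need %.0f notifications/s (one every %.1f ms)%n",
                packetsPerSecond, packetIntervalMs);
        // With a 20 ms connection interval, more than one notification must
        // be delivered per connection event to sustain this rate.
    }
}
```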
Each notification packet consists of 20 bytes, laid out as follows:
Measurement Sample 1 (raw ADC data)
- Channel 1 (3 bytes)
- Channel 2 (3 bytes)
- Channel 3 (3 bytes)
Measurement Sample 2 (raw ADC data)
- Channel 1 (3 bytes)
- Channel 2 (3 bytes)
- Channel 3 (3 bytes)
Running counter (2 bytes)
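Assuming that layout, a decoder for one notification might look like the sketch below. The class and method names are illustrative (not from the linked paste), and little-endian byte order and the counter's position at the end of the packet are assumptions to check against the datasheet. The counter check is a convenient way to confirm whether packets are actually being dropped:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hypothetical decoder for one 20-byte notification as laid out above.
// Signedness, endianness, and field order are assumptions; verify them
// against the device's datasheet.
public class NotificationParser {
    private int lastCounter = -1;

    public void onNotification(byte[] value) {
        if (value == null || value.length != 20) return;
        ByteBuffer buf = ByteBuffer.wrap(value).order(ByteOrder.LITTLE_ENDIAN);
        for (int sample = 0; sample < 2; sample++) {
            for (int channel = 0; channel < 3; channel++) {
                int raw = read24BitSigned(buf);
                // handleSample(sample, channel, raw); // hypothetical sink
            }
        }
        int counter = buf.getShort() & 0xFFFF; // 2-byte running counter
        if (lastCounter >= 0 && counter != ((lastCounter + 1) & 0xFFFF)) {
            // A gap here means notifications were lost somewhere in the chain.
        }
        lastCounter = counter;
    }

    // Reads a 3-byte little-endian value and sign-extends it to 32 bits.
    private static int read24BitSigned(ByteBuffer buf) {
        int b0 = buf.get() & 0xFF;
        int b1 = buf.get() & 0xFF;
        int b2 = buf.get() & 0xFF;
        int v = (b2 << 16) | (b1 << 8) | b0;
        return (v << 8) >> 8; // sign-extend from 24 bits
    }
}
```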
Afterwards I plot this data, but it looks like I am only getting a sample rate of around 105 samples/s, while there should be 160. Judging from the signal, some samples appear to be missing.
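To rule out the plotting code, the effective rate can be measured directly in the GATT callback. This is a minimal sketch with illustrative names, not the code from the paste below:

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.bluetooth.BluetoothGattCharacteristic;
import android.util.Log;

// Minimal rate probe: counts decoded samples per second inside the GATT
// callback so the ~105 samples/s figure can be verified independently of
// the plotting code.
public class RateProbeCallback extends BluetoothGattCallback {
    private long windowStartNs = System.nanoTime();
    private int samplesInWindow = 0;

    @Override
    public void onCharacteristicChanged(BluetoothGatt gatt,
                                        BluetoothGattCharacteristic characteristic) {
        byte[] value = characteristic.getValue();
        if (value == null || value.length != 20) return;
        samplesInWindow += 2; // each 20-byte notification carries 2 samples
        long now = System.nanoTime();
        if (now - windowStartNs >= 1_000_000_000L) {
            Log.d("RateProbe", "Effective rate: " + samplesInWindow + " samples/s");
            samplesInWindow = 0;
            windowStartNs = now;
        }
        // Keep this callback cheap: heavy work (parsing for the plot, UI
        // updates) is best handed off to another thread, or notifications
        // can be delayed or dropped upstream.
    }
}
```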
I was wondering what could be causing this. Is there a bug or a design flaw in my code? Are there any alternative methods to retrieve the data?
Here's the code that I use: codepaste.net/gqqgab