I have created an app that shows me my phone's acceleration over time, with timestamps. But when I change the sensor's minimum interval by a factor of 10, the data density stays the same. I thought the accelerometer's minimum interval limited how often I get input from the sensor, but changing it has no effect. For example, with the minimum interval set to 1000 ms, I still get input from the sensor once per second or more often.
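To make clear what I expected the setting to do: I assumed the minimum interval acts as a throttle, so that readings arriving sooner than the interval after the last accepted one are dropped. Here is a small sketch of that logic (Python, just as pseudocode for what I thought happens inside the event handler; the function name and timestamps are my own, not from any sensor API):

```python
def throttle(timestamps_ms, min_interval_ms):
    """Keep only readings spaced at least min_interval_ms apart.

    This mirrors the filtering I expected the minimum-interval
    setting to apply: remember the time of the last accepted
    reading and ignore events that arrive sooner than the
    minimum interval after it.
    """
    kept = []
    last = None  # timestamp of the last accepted reading
    for t in timestamps_ms:
        if last is None or t - last >= min_interval_ms:
            kept.append(t)
            last = t
    return kept

# Sensor firing roughly every 200 ms; with a 1000 ms minimum
# interval I expected at most one reading per second.
events = list(range(0, 3001, 200))   # 0, 200, 400, ..., 3000
print(throttle(events, 1000))        # [0, 1000, 2000, 3000]
```

In my app, though, the output still looks like the raw 200 ms stream no matter what interval I set.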
These are the blocks that make it all work; maybe my mistake is somewhere in there.