Abstract:
In some implementations, a mobile device can analyze motion sensor data and proximity sensor data during a voice call to determine whether the mobile device is resting on a stationary object or carried on a user's body (e.g., in the user's lap or pocket). The mobile device can adjust the transmit power level of its telephony transceiver during the voice call based on that determination.
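A minimal sketch of one way such a determination and power adjustment could be combined, assuming a variance threshold on accelerometer magnitudes, a binary proximity reading, and illustrative power levels; the function names and all numeric values are assumptions, not the disclosed implementation.

    # Hypothetical sketch: classify "on a stationary object" vs. "on the body"
    # from accelerometer variance plus a proximity reading, then pick a
    # transmit power level. Thresholds and power values are illustrative only.
    import statistics

    def on_body(accel_magnitudes, proximity_covered, variance_threshold=0.02):
        """Return True if the device appears to be carried on the user's body."""
        # A device on a desk shows almost no accelerometer variance; a device in
        # a pocket or lap picks up small body movements even when the user is still.
        return proximity_covered and statistics.pvariance(accel_magnitudes) > variance_threshold

    def select_transmit_power_dbm(accel_magnitudes, proximity_covered):
        # Reduce transmit power when the device is against the body; use the
        # nominal level when it rests on a stationary object.
        return 10.0 if on_body(accel_magnitudes, proximity_covered) else 20.0

    # Example: small vibrations plus a covered proximity sensor -> reduced power.
    samples = [0.98, 1.05, 0.91, 1.12, 0.95, 1.08]
    print(select_transmit_power_dbm(samples, proximity_covered=True))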
Abstract:
In some implementations, a mobile device can be configured with virtual motion fences that delineate domains of motion detectable by the mobile device. In some implementations, the mobile device can be configured to invoke an application or function when the mobile device enters or exits a motion domain (by crossing a motion fence). In some implementations, entering or exiting a motion domain can cause components of the mobile device to power on or off (or awaken or sleep) in an incremental manner.
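One way to picture the fence-crossing behavior is sketched below, modeling a motion fence as a threshold on a scalar motion level and invoking a callback when the level crosses into a new motion domain; the domain names, thresholds, and callback API are assumptions rather than the disclosed design.

    # Illustrative sketch only: "motion fences" as lower bounds on a motion level,
    # with a callback fired whenever the device enters a new motion domain.
    FENCES = [(0.0, "idle"), (0.2, "walking"), (2.0, "running")]  # lower bounds

    def domain_for(motion_level):
        current = FENCES[0][1]
        for lower_bound, name in FENCES:
            if motion_level >= lower_bound:
                current = name
        return current

    class MotionFenceMonitor:
        def __init__(self, on_enter):
            self.on_enter = on_enter      # called with the new domain name
            self.domain = None

        def update(self, motion_level):
            new_domain = domain_for(motion_level)
            if new_domain != self.domain:
                self.domain = new_domain
                self.on_enter(new_domain)  # e.g., power components on/off here

    # Example: components could be awakened incrementally as domains are entered.
    monitor = MotionFenceMonitor(on_enter=lambda d: print("entered domain:", d))
    for level in [0.05, 0.5, 2.5, 0.1]:
        monitor.update(level)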
Abstract:
In one aspect, the present disclosure relates to a method including: obtaining, by a fitness tracking device, motion data of a user over a period of time, wherein the motion data can include a first plurality of motion measurements from a first motion sensor of the fitness tracking device; determining, by the fitness tracking device, using the motion data, an angle of the fitness tracking device relative to a plane during the period of time; estimating, by the fitness tracking device, using the motion data, a range of linear motion of the fitness tracking device through space during the period of time; and comparing, by the fitness tracking device, the angle of the fitness tracking device to a threshold angle and comparing the range of linear motion of the fitness tracking device to a threshold range of linear motion to determine whether the user is sitting or standing.
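A minimal sketch of the comparison step, assuming the device angle is derived from the gravity component of the accelerometer and the range of linear motion is supplied separately; the threshold values, the sitting/standing mapping, and the helper names are hypothetical.

    import math

    def device_angle_deg(gravity_xyz):
        """Angle of the device's z-axis relative to the horizontal plane,
        computed from the mean gravity vector in device coordinates."""
        gx, gy, gz = gravity_xyz
        return math.degrees(math.atan2(gz, math.hypot(gx, gy)))

    def classify_posture(mean_gravity_xyz, linear_motion_range_m,
                         angle_threshold_deg=30.0, range_threshold_m=0.15):
        angle = device_angle_deg(mean_gravity_xyz)
        # Assumed mapping: a wrist-worn device sits flatter and moves through a
        # smaller range of space while the user is seated.
        if abs(angle) < angle_threshold_deg and linear_motion_range_m < range_threshold_m:
            return "sitting"
        return "standing"

    print(classify_posture((0.9, 0.1, 0.3), linear_motion_range_m=0.05))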
Abstract:
An electronic device may include a motion sensor for detecting movement of the electronic device and a pressure sensor for detecting changes in elevation of the electronic device. Applications that run on the electronic device such as health and fitness applications may use motion sensor and pressure sensor data to track a user's physical activity. For example, processing circuitry in the electronic device may use the motion sensor to track a user's steps and the pressure sensor to track changes in the user's elevation. The processing circuitry may determine whether the user is climbing stairs based on the user's step rate and the user's changes in elevation. When the processing circuitry determines that the user is climbing stairs, the processing circuitry may use the pressure sensor and motion sensor to track and store the number of flights of stairs climbed by the user.
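The stair-climbing decision described above can be sketched as a simple per-minute rule combining step rate and barometric elevation gain; the thresholds, the meters-per-flight figure, and the function names below are illustrative assumptions.

    # Hedged sketch: decide whether the user is climbing stairs from step rate
    # and elevation change, then count flights climbed.
    METERS_PER_FLIGHT = 3.0

    def is_climbing_stairs(steps_per_minute, elevation_gain_m,
                           min_step_rate=60.0, min_climb_m=3.0):
        # Walking on flat ground gives a high step rate with little elevation
        # gain; an elevator gives elevation gain with almost no steps.
        return steps_per_minute >= min_step_rate and elevation_gain_m >= min_climb_m

    def count_flights(minute_summaries):
        """minute_summaries: iterable of (steps_per_minute, elevation_gain_m)."""
        climbed_m = sum(gain for rate, gain in minute_summaries
                        if is_climbing_stairs(rate, gain))
        return int(climbed_m // METERS_PER_FLIGHT)

    print(count_flights([(90, 4.0), (95, 3.5), (30, 0.2)]))  # -> 2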
Abstract:
Embodiments are disclosed for crash detection on one or more mobile devices (e.g., a smartwatch and/or a smartphone). In some embodiments, a method comprises: detecting a crash event on a crash device; extracting multimodal features from sensor data generated by multiple sensing modalities of the crash device; computing a plurality of crash decisions based on a plurality of machine learning models applied to the multimodal features, wherein at least one multimodal feature is a rotation rate about a mean axis of rotation; and determining that a severe vehicle crash has occurred involving the crash device based on the plurality of crash decisions and a severity model.
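The feature named above (rotation rate about a mean axis of rotation) can be sketched directly from gyroscope samples; the decision-fusion step shown afterward is only a placeholder, since the actual machine learning models and severity model are not described here.

    import numpy as np

    def rotation_rate_about_mean_axis(gyro_samples):
        """gyro_samples: (N, 3) array of angular-velocity vectors (rad/s)."""
        gyro = np.asarray(gyro_samples, dtype=float)
        mean_axis = gyro.mean(axis=0)
        mean_axis /= np.linalg.norm(mean_axis)   # unit mean axis of rotation
        return gyro @ mean_axis                  # component of each rate along that axis

    def severe_crash(crash_decisions, severity_score, severity_threshold=0.8):
        # Hypothetical fusion: majority of per-model crash decisions plus a
        # severity score above a threshold.
        return sum(crash_decisions) > len(crash_decisions) / 2 and \
               severity_score >= severity_threshold

    rates = rotation_rate_about_mean_axis([[0.1, 2.0, 0.0], [0.0, 2.2, 0.1]])
    print(rates, severe_crash([True, True, False], 0.9))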
Abstract:
Embodiments are disclosed for head dimension estimation for spatial audio applications. In an embodiment, a method comprises: obtaining, using one or more processors of an audio headset worn on a user's head, acceleration samples and rotation rate samples over a specified time window while the user rotates their head, the acceleration samples and rotation rate samples measured using motion sensors in the headset; determining a function that relates the acceleration samples to the rotation rate samples; comparing the function to a plurality of reference functions, where each reference function corresponds to a different head dimension in a nominal range of head dimensions; and estimating a dimension of the user's head based on the comparing.
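One simple reading of the relation between acceleration and rotation rate is centripetal: at the ear, |acceleration| scales with the squared rotation rate times the distance from the rotation axis, so a least-squares slope gives an effective radius that can be matched against a nominal range of head dimensions. The sketch below follows that reading; the reference radii and numbers are assumptions, not the disclosed reference functions.

    import numpy as np

    def estimate_head_radius_m(accel_mags, rotation_rates):
        """Fit |a| ~= r * w^2 and return the effective radius r (meters)."""
        w2 = np.asarray(rotation_rates, dtype=float) ** 2
        a = np.asarray(accel_mags, dtype=float)
        return float(np.dot(w2, a) / np.dot(w2, w2))   # least-squares slope

    def match_reference(radius_m, reference_radii_m=(0.07, 0.08, 0.09, 0.10)):
        # Pick the reference head dimension whose predicted relation is closest.
        return min(reference_radii_m, key=lambda r: abs(r - radius_m))

    radius = estimate_head_radius_m([0.32, 0.71, 1.26], [2.0, 3.0, 4.0])
    print(radius, match_reference(radius))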
Abstract:
In an example method, a mobile device obtains sample data generated by one or more sensors over a period of time, where the one or more sensors are worn by a user. The mobile device determines that the user has fallen based on the sample data, and determines, based on the sample data, a severity of an injury suffered by the user. The mobile device generates one or more notifications based on the determination that the user has fallen and the determined severity of the injury.
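A rough sketch of that decision flow, with made-up feature names, thresholds, and severity levels; it is not the patented logic, only a shape for how fall detection, severity estimation, and escalating notifications could fit together.

    def detect_fall(impact_g, post_impact_motion):
        # Assumed rule: a large impact followed by little movement is treated as a fall.
        return impact_g > 3.0 and post_impact_motion < 0.1

    def estimate_severity(impact_g, seconds_without_motion):
        if impact_g > 6.0 or seconds_without_motion > 60:
            return "severe"
        return "moderate"

    def notifications_for(fell, severity):
        if not fell:
            return []
        alerts = ["Ask user: 'It looks like you fell. Are you OK?'"]
        if severity == "severe":
            alerts.append("Escalate to emergency contact / emergency services")
        return alerts

    fell = detect_fall(impact_g=7.2, post_impact_motion=0.02)
    print(notifications_for(fell, estimate_severity(7.2, seconds_without_motion=90)))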
Abstract:
The present disclosure relates to methods and systems of determining swimming metrics of a user during a swimming session. The method can include receiving, by a processor circuit of a user device, motion information from one or more motion sensors of the user device; determining, by the processor circuit using the motion information, a first set of rotational data of the user device, wherein the first set of rotational data is expressed in a first frame of reference; converting, by the processor circuit, the first set of rotational data into a second set of rotational data, wherein the second set of rotational data is expressed in a second frame of reference; determining, by the processor circuit, one or more swimming metrics of the user; and outputting the one or more swimming metrics.
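The frame-conversion step can be sketched as rotating body-frame angular velocity into a fixed reference frame using the device orientation; the stroke-count metric shown afterward is a placeholder, since the specific swimming metrics are not detailed here, and all names and thresholds are assumptions.

    import numpy as np

    def to_reference_frame(body_frame_rates, orientation_matrices):
        """Rotate each body-frame angular-velocity vector into the fixed frame."""
        return np.array([R @ w for R, w in zip(orientation_matrices, body_frame_rates)])

    def count_strokes(fixed_frame_rates, axis=2, threshold=1.0):
        # Hypothetical metric: count sign changes of the rotation rate about one
        # fixed-frame axis while its magnitude exceeds a threshold.
        signal = fixed_frame_rates[:, axis]
        active = np.abs(signal) > threshold
        crossings = np.diff(np.sign(signal[active]))
        return int(np.count_nonzero(crossings)) // 2

    R = np.eye(3)   # identity orientation, for illustration only
    rates = [[0.0, 0.0, 1.5], [0.0, 0.0, -1.4], [0.0, 0.0, 1.6]]
    print(count_strokes(to_reference_frame(rates, [R, R, R])))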
Abstract:
In an example method, a mobile device obtains a signal indicating an acceleration measured by a sensor over a time period. The mobile device determines an impact experienced by a user based on the signal. The mobile device also determines, based on the signal, one or more first motion characteristics of the user during a time prior to the impact, and one or more second motion characteristics of the user during a time after the impact. The mobile device determines that the user has fallen based on the impact, the one or more first motion characteristics of the user, and the one or more second motion characteristics of the user, and in response, generates a notification indicating that the user has fallen.
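A sketch of that structure under assumed feature definitions and thresholds: a peak-based impact estimate from the acceleration signal, a crude motion characteristic computed before and after the impact, and a combined fall decision that triggers a notification. Illustrative only.

    def detect_impact(accel_mags, impact_threshold_g=3.0):
        peak = max(accel_mags)
        return (peak, accel_mags.index(peak)) if peak > impact_threshold_g else (None, None)

    def motion_energy(accel_mags):
        # Crude motion characteristic: mean deviation from 1 g.
        return sum(abs(a - 1.0) for a in accel_mags) / max(len(accel_mags), 1)

    def fall_detected(accel_mags):
        peak, idx = detect_impact(accel_mags)
        if peak is None:
            return False
        pre = motion_energy(accel_mags[:idx])       # e.g., walking or tripping motion
        post = motion_energy(accel_mags[idx + 1:])  # e.g., lying still afterwards
        return pre > 0.05 and post < 0.05

    signal = [1.0, 1.2, 1.4, 6.5, 1.02, 1.01, 1.0]
    if fall_detected(signal):
        print("Notification: possible fall detected")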
Abstract:
Ear buds may have optical proximity sensors and accelerometers. Control circuitry may analyze output from the optical proximity sensors and the accelerometers to identify a current operational state for the ear buds. The control circuitry may also analyze the accelerometer output to identify tap input such as double taps made by a user on ear bud housings. Samples in the accelerometer output may be analyzed to determine whether the samples associated with a tap have been clipped. If the samples have been clipped, a curve may be fit to the samples. Optical sensor data may be analyzed in conjunction with potential tap input data from the accelerometer. If the optical sensor data is ordered, a tap input may be confirmed. If the optical sensor data is disordered, the control circuitry can conclude that accelerometer data corresponds to false tap input associated with unintentional contact with the housing.
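Two of the steps above can be sketched as follows, with assumed thresholds: recovering a clipped accelerometer peak by fitting a parabola to the samples around the clip, and treating monotonically changing optical-proximity samples as "ordered" when accepting or rejecting a candidate tap. This is not the actual firmware logic.

    import numpy as np

    def recover_clipped_peak(samples, clip_level=8.0):
        samples = np.asarray(samples, dtype=float)
        if samples.max() < clip_level:
            return float(samples.max())             # nothing was clipped
        t = np.arange(len(samples))
        coeffs = np.polyfit(t, samples, 2)          # fit a parabola through the tap
        vertex_t = -coeffs[1] / (2 * coeffs[0])
        return float(np.polyval(coeffs, vertex_t))  # estimated true peak

    def optical_data_ordered(prox_samples):
        diffs = np.diff(prox_samples)
        return bool(np.all(diffs >= 0) or np.all(diffs <= 0))

    def confirm_tap(accel_samples, prox_samples, tap_threshold=5.0):
        peak = recover_clipped_peak(accel_samples)
        return peak > tap_threshold and optical_data_ordered(prox_samples)

    print(confirm_tap([1.0, 4.0, 8.0, 8.0, 8.0, 3.0], [10, 12, 15, 18]))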