Abstract:
Disclosed are methods, devices, systems, apparatus, servers, computer-/processor-readable media, and other implementations, including a method of estimating a range between a first wireless device and a second wireless device that includes obtaining, at the first wireless device, first information related to a first broadcast message transmitted by the first wireless device, and obtaining, at the first wireless device, second information related to a second broadcast message transmitted by the second wireless device, with the second broadcast message including at least some of the first information. The method also includes determining the range between the first wireless device and the second wireless device based, at least in part, on the first information and the second information.
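The abstract does not fix the contents of the two broadcast messages, but the scheme reads naturally as two-way time-of-flight ranging. The sketch below is a minimal illustration under that assumption: the "first information" is taken to be device A's transmit timestamp, and the "second information" is taken to be device B's receive and reply timestamps echoed back in the second broadcast. All names and values are illustrative, not claimed.

```python
# Minimal two-way ranging sketch (assumed interpretation of the abstract above).
# t1: first broadcast sent by device A, t4: second broadcast received at A,
# t2/t3: receive and reply times at device B, echoed back in the second broadcast.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def estimate_range_m(t1_s, t4_s, t2_s, t3_s):
    """Two-way time-of-flight range estimate from one broadcast in each direction."""
    round_trip_s = (t4_s - t1_s) - (t3_s - t2_s)   # remove device B's turnaround time
    one_way_s = round_trip_s / 2.0
    return one_way_s * SPEED_OF_LIGHT_M_PER_S

# Example: a 100 ns one-way flight time corresponds to roughly 30 m.
print(estimate_range_m(0.0, 100.1e-6, 100e-9, 100.0e-6))
```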
Abstract:
A range between a first wireless device and a second wireless device is estimated using a first mechanism based on messages transmitted over a first communication channel. The first communication channel is associated with a first radio access technology capability of the wireless devices. One or more metrics indicative of an accuracy of the range estimates provided by the first mechanism are obtained. A second mechanism to estimate the range between the first wireless device and the second wireless device may be implemented in favor of the first mechanism when the one or more metrics fail to satisfy a criterion. The second mechanism is based on unicast messages transmitted over a second communication channel. The second communication channel is associated with a second radio access technology capability of the wireless devices and may be the same as, or different from, the first communication channel.
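The switching logic itself is simple to picture. The sketch below is only an illustration of the fallback decision: the metric is assumed to be a range standard deviation in metres, the threshold is arbitrary, and the two mechanisms are placeholder callables.

```python
# Illustrative fallback between two ranging mechanisms (assumed metric: std dev in metres).

def estimate_range_with_fallback(first_mechanism, second_mechanism, max_std_dev_m=5.0):
    range_m, std_dev_m = first_mechanism()        # e.g. ranging over the first channel
    if std_dev_m <= max_std_dev_m:                # accuracy criterion satisfied
        return range_m
    # Criterion not satisfied: fall back to unicast ranging over the second channel.
    range_m, _ = second_mechanism()
    return range_m
```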
Abstract:
Various embodiments may include methods, devices, and non-transitory processor-readable media for performing data stream encoding by identifying a first data chunk and calculating a first hash value for the first data chunk. A device may determine whether the calculated first hash value is located within a hash table. If so, the device may encode the first data chunk as the first hash value; if the first hash value is not stored in the hash table, a new entry for the hash value may be added to the hash table. A second data chunk may then be identified and a second hash value calculated for it. The device compares the second hash value to the next hash value stored in the hash table. If the second hash value matches that next hash value, the device encodes the second data chunk as a flag indicating that a predicted pattern of data chunks is being followed.
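A minimal sketch of this chunk-encoding flow is given below. The chunk size, the hash function, and the notion of the "next" hash value (here: the entry that follows the last matched entry in insertion order) are assumptions made for illustration only.

```python
# Sketch of hash-table-based chunk encoding with a "predicted pattern" flag.

import hashlib

CHUNK_SIZE = 64
PREDICTED_FLAG = b"\x01"   # placeholder flag meaning "predicted pattern followed"

def chunk_hash(chunk: bytes) -> bytes:
    return hashlib.sha256(chunk).digest()[:8]

def encode_stream(data: bytes):
    hash_table = []          # insertion-ordered list of chunk hashes seen so far
    encoded = []
    last_match_index = None
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        h = chunk_hash(chunk)
        if (last_match_index is not None
                and last_match_index + 1 < len(hash_table)
                and hash_table[last_match_index + 1] == h):
            encoded.append(PREDICTED_FLAG)       # chunk follows the predicted pattern
            last_match_index += 1
        elif h in hash_table:
            encoded.append(h)                    # encode the chunk as its hash value
            last_match_index = hash_table.index(h)
        else:
            hash_table.append(h)                 # new entry; emit the chunk literally
            encoded.append(chunk)
            last_match_index = None
    return encoded
```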
Abstract:
Systems, methods, and apparatus are described that facilitate transmission/reception of data over a multi-line parallel bus. In an example, the apparatus selects, from a sequential series of data bits, a plurality of data bits for transmission over a plurality of parallel bus lines. For each bus line of the plurality of parallel bus lines, the apparatus compares a state of a current data bit selected for transmission on a current bus line during a current clock cycle with one or more conditions related to the current bus line or at least one bus line adjacent to the current bus line, wherein the one or more conditions include a state of two data bits respectively transmitted on two bus lines adjacent to the current bus line during a previous clock cycle, and determines whether to transmit the current data bit on the current bus line based on the comparison.
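The abstract leaves the exact per-line condition open, so the sketch below uses one plausible rule purely for illustration: a bit is held back when both neighbouring lines carried the opposite value on the previous clock cycle (a crude crosstalk guard). This is not the claimed rule, only an example of the comparison-then-decide structure.

```python
# Illustrative per-line transmit decision based on the neighbours' previous-cycle bits.

def transmit_decisions(current_bits, prev_bits):
    """current_bits[i]: bit selected for line i this cycle.
    prev_bits[i]: bit driven on line i during the previous cycle.
    Returns True/False per line: transmit the bit on that line now or defer it."""
    decisions = []
    n = len(current_bits)
    for line, bit in enumerate(current_bits):
        left = prev_bits[line - 1] if line > 0 else None
        right = prev_bits[line + 1] if line + 1 < n else None
        blocked = (left is not None and right is not None
                   and left == right and left != bit)   # assumed example condition
        decisions.append(not blocked)
    return decisions

print(transmit_decisions([1, 0, 1, 1], [0, 1, 0, 1]))   # [True, False, True, True]
```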
Abstract:
Disclosed are techniques for radar-aided single-image three-dimensional (3D) depth reconstruction. In an aspect, at least one processor of an on-board computer of an ego vehicle receives, from a radar sensor of the ego vehicle, at least one radar image of an environment of the ego vehicle, receives, from a camera sensor of the ego vehicle, at least one camera image of the environment of the ego vehicle, receives, from a light detection and ranging (LiDAR) sensor of the ego vehicle, at least one LiDAR image of the environment of the ego vehicle, and generates a depth image of the environment of the ego vehicle based on the at least one radar image, the at least one LiDAR image, and the at least one camera image.
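The abstract does not specify how the three modalities are combined, so the following is only a hypothetical fusion skeleton: the camera, radar, and LiDAR images are concatenated along the channel axis and a small convolutional network regresses a dense depth image. The network shape, channel counts, and the use of PyTorch are assumptions for illustration.

```python
# Hypothetical camera + radar + LiDAR depth-fusion skeleton (not the claimed architecture).

import torch
import torch.nn as nn

class DepthFusionNet(nn.Module):
    def __init__(self, cam_ch=3, radar_ch=1, lidar_ch=1):
        super().__init__()
        in_ch = cam_ch + radar_ch + lidar_ch
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, 1, 3, padding=1)        # one depth value per pixel

    def forward(self, camera, radar, lidar):
        x = torch.cat([camera, radar, lidar], dim=1)      # fuse along the channel axis
        return self.head(self.encoder(x))                 # dense depth image

# Example with dummy 128x256 sensor frames.
net = DepthFusionNet()
depth = net(torch.rand(1, 3, 128, 256), torch.rand(1, 1, 128, 256),
            torch.rand(1, 1, 128, 256))
print(depth.shape)   # torch.Size([1, 1, 128, 256])
```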
Abstract:
Methods, systems, computer-readable media, and apparatuses for radar or LIDAR measurement are presented. Some configurations include transmitting, via a transceiver, a first beam having a first frequency characteristic; calculating a distance between the transceiver and a moving object based on information from at least one reflection of the first beam; transmitting, via the transceiver, a second beam having a second frequency characteristic that is different than the first frequency characteristic, wherein the second beam is directed such that an axis of the second beam intersects a ground plane; and calculating an ego-velocity of the transceiver based on information from at least one reflection of the second beam. Applications relating to road vehicular (e.g., automobile) use are described.
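The two calculations in this abstract correspond to standard relations: range from round-trip time of flight, and ego-velocity from the Doppler shift of a beam tilted toward the ground plane. The sketch below works those relations through with illustrative variable names and values; the beam angle is measured from the direction of travel.

```python
# Worked sketch: time-of-flight distance and Doppler-based ego-velocity.

import math

C = 299_792_458.0          # speed of light, m/s

def distance_from_tof(round_trip_time_s):
    return C * round_trip_time_s / 2.0

def ego_velocity_from_doppler(doppler_shift_hz, carrier_freq_hz, beam_angle_rad):
    # Radial velocity along the ground-directed beam, projected back onto
    # the direction of travel using the beam's tilt angle.
    radial_velocity = doppler_shift_hz * C / (2.0 * carrier_freq_hz)
    return radial_velocity / math.cos(beam_angle_rad)

print(distance_from_tof(0.5e-6))                                  # ~75 m
print(ego_velocity_from_doppler(4.6e3, 77e9, math.radians(30)))   # ~10.3 m/s
```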
Abstract:
Various embodiments disclose a device with one or more processors that may be configured to translate a radio detection and ranging (RADAR) reference depth map into depth information in at least one image plane of at least one camera, to form a three-dimensional (3D) RADAR depth image. The 3D RADAR depth image includes a depth estimate for each pixel. The one or more processors may also be configured to initialize a RADAR-aided visual inertial odometer, based on the depth estimates from the 3D RADAR depth image, to track the device position.
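Translating a RADAR depth map into a camera image plane amounts to projecting the radar returns through the camera model. The sketch below assumes a pinhole camera with known intrinsics and a known radar-to-camera rigid transform; all matrices and names are illustrative, not taken from the abstract.

```python
# Sketch: project radar returns into the camera image plane to get per-pixel depth.

import numpy as np

def radar_to_depth_image(radar_points_xyz, K, T_cam_from_radar, image_shape):
    """radar_points_xyz: (N, 3) points in the radar frame.
    K: (3, 3) camera intrinsic matrix.  T_cam_from_radar: (4, 4) extrinsics.
    Returns an (H, W) depth image, 0 where no radar return projects."""
    h, w = image_shape
    depth = np.zeros((h, w), dtype=np.float32)
    pts_h = np.hstack([radar_points_xyz, np.ones((len(radar_points_xyz), 1))])
    pts_cam = (T_cam_from_radar @ pts_h.T).T[:, :3]      # radar frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]                 # keep points in front of the camera
    uvw = (K @ pts_cam.T).T                              # pinhole projection
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    depth[v[valid], u[valid]] = pts_cam[valid, 2]        # depth = camera-frame Z
    return depth
```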
Abstract:
Various embodiments disclose a device with one or more processors that may be configured to translate a RADAR velocity map into at least one image plane of at least one camera, to form a three-dimensional (3D) RADAR velocity image. The 3D RADAR velocity image includes a relative velocity for each pixel in the one or more images, and the relative velocity of each pixel is based on a RADAR velocity estimate in the RADAR velocity map. The one or more processors may be configured to determine whether visual features correspond to a moving object based on the relative velocity determined for each pixel, and may be configured to remove the visual features that correspond to a moving object before providing the remaining visual features as an input to a state updater in a RADAR-aided visual inertial odometer.
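The moving-object gate can be pictured as a simple per-pixel lookup: each tracked feature is kept only if the RADAR velocity image reports a near-zero relative velocity at its pixel. The threshold and the feature representation below are assumptions for illustration.

```python
# Sketch: drop visual features whose pixels show significant relative velocity.

import numpy as np

def remove_moving_features(features_uv, velocity_image, max_abs_velocity_mps=0.5):
    """features_uv: (N, 2) pixel coordinates (u, v) of tracked visual features.
    velocity_image: (H, W) relative velocity of each pixel, in m/s.
    Returns only the features that look static, for the VIO state updater."""
    u = features_uv[:, 0].astype(int)
    v = features_uv[:, 1].astype(int)
    static = np.abs(velocity_image[v, u]) <= max_abs_velocity_mps
    return features_uv[static]
```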
Abstract:
Disclosed embodiments pertain to a method on a user equipment (UE), which may comprise determining a first absolute position of the UE at a first time based on Global Navigation Satellite System (GNSS) measurements from a set of satellites. At a second time subsequent to the first time, the UE may determine a first estimate of displacement of the UE relative to the first absolute position using non-GNSS measurements. Further, at the second time, the UE may also determine a second estimate of displacement relative to the first absolute position, and/or a second absolute position of the UE, based, in part, on: GNSS carrier phase measurements at the first time from the set of satellites; GNSS carrier phase measurements at the second time from a subset comprising two or more satellites of the set; and the first estimate of displacement of the UE.
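The carrier-phase portion of this can be sketched as a time-differenced carrier phase solution: for each satellite tracked at both epochs, the change in carrier phase times the wavelength approximates the change in receiver-satellite range, and stacking these against the line-of-sight unit vectors gives a least-squares displacement. The sketch below omits satellite motion, clock drift, and ambiguity handling, assumes the GPS L1 wavelength, and needs at least three line-of-sight directions for a standalone 3D solution (the abstract allows fewer because the result is combined with the first, non-GNSS displacement estimate).

```python
# Sketch: displacement between two epochs from time-differenced carrier phase.

import numpy as np

def displacement_from_carrier_phase(phase_t1_cycles, phase_t2_cycles,
                                    unit_vectors, wavelength_m=0.1903):
    """phase_*_cycles: (M,) carrier phase per satellite at each epoch.
    unit_vectors: (M, 3) receiver-to-satellite line-of-sight unit vectors.
    Returns a (3,) least-squares displacement estimate between the epochs."""
    delta_range_m = (phase_t2_cycles - phase_t1_cycles) * wavelength_m
    # Moving toward a satellite shortens the range, so project with -u.
    A = -unit_vectors
    displacement, *_ = np.linalg.lstsq(A, delta_range_m, rcond=None)
    return displacement
```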