Abstract:
Enables coupling or retrofitting a golf club with battery-powered active motion capture electronics, passive or active shot count components, for example a passive RFID, and/or a visual marker on the cap for use with visual motion capture cameras. Does not require modifying the golf club. Electronics package and battery can be easily removed and replaced, for example without any tools. May utilize a weight element that is removed when inserting the electronics package in the mount, wherein the weight element may have the same weight as the electronics package, for no net change or minimal change in club weight. May be implemented with a shaft enclosure and expander that may be coupled with a screw aligned along an axis parallel to the axis of the shaft. May utilize non-permanent and/or friction coupling between the mount and the shaft. Cap may include a visual marker and/or logo.
Abstract:
A system that analyzes data from sensors and video cameras to generate synchronized event videos and to automatically select or generate tags for an event. Enables creating, transferring, obtaining, and storing concise event videos generally without non-event video. Analysis of events stored in the database identifies trends, correlations, models, and patterns in event data. Tags may represent for example activity types, players, performance levels, or scoring results. The system may analyze social media postings to confirm or augment event tags. Users may filter and analyze saved events based on the assigned tags. The system may create highlight and fail reels filtered by metrics and by tags.
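As an illustrative aside, a minimal Python sketch of tag-and-metric filtering over stored events, assuming a simple in-memory event store; the Event fields, tag names, and metric names are hypothetical, not the claimed system:

    from dataclasses import dataclass, field
    from typing import Dict, List, Set

    @dataclass
    class Event:
        event_id: str
        tags: Set[str] = field(default_factory=set)               # e.g. {"golf", "swing"}
        metrics: Dict[str, float] = field(default_factory=dict)   # e.g. {"swing_speed_mph": 92.0}

    def filter_events(events: List[Event],
                      required_tags: Set[str],
                      metric: str,
                      min_value: float) -> List[Event]:
        """Return events carrying all required tags whose metric meets a threshold,
        sorted by that metric so the best events lead the highlight reel."""
        selected = [e for e in events
                    if required_tags <= e.tags and e.metrics.get(metric, 0.0) >= min_value]
        return sorted(selected, key=lambda e: e.metrics.get(metric, 0.0), reverse=True)

    if __name__ == "__main__":
        events = [
            Event("e1", {"golf", "swing"}, {"swing_speed_mph": 95.0}),
            Event("e2", {"golf", "putt"}, {"swing_speed_mph": 12.0}),
            Event("e3", {"golf", "swing"}, {"swing_speed_mph": 88.0}),
        ]
        reel = filter_events(events, {"golf", "swing"}, "swing_speed_mph", 90.0)
        print([e.event_id for e in reel])   # ['e1']

The same tag sets could be inverted (e.g. low metric values) to build the fail reels mentioned above.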
Abstract:
A method that integrates sensor data and video analysis to analyze object motion. Motion capture elements generate motion sensor data for objects of interest, and cameras generate video of these objects. Sensor data and video data are synchronized in time and aligned in space on a common coordinate system. Sensor fusion is used to generate motion metrics from the combined and integrated sensor data and video data. Integration of sensor data and video data supports robust detection of events, generation of video highlight reels or epic fail reels augmented with metrics that show interesting activity, and calculation of metrics that exceed the individual capabilities of either sensors or video analysis alone.
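A minimal sketch of the time-alignment step, assuming sensor samples and video frames carry timestamps on a shared clock after synchronization; the array names and sampling rates are illustrative, not the claimed fusion method:

    import numpy as np

    def align_sensor_to_frames(sensor_t: np.ndarray,
                               sensor_accel: np.ndarray,
                               frame_t: np.ndarray) -> np.ndarray:
        """Interpolate sensor acceleration magnitude onto video frame timestamps,
        so each frame can be annotated with a fused motion metric."""
        accel_mag = np.linalg.norm(sensor_accel, axis=1)   # |a| per sensor sample
        return np.interp(frame_t, sensor_t, accel_mag)     # one value per video frame

    if __name__ == "__main__":
        sensor_t = np.linspace(0.0, 1.0, 1000)             # 1 kHz sensor clock
        sensor_accel = np.random.randn(1000, 3)            # x, y, z acceleration
        frame_t = np.linspace(0.0, 1.0, 30)                # 30 fps video clock
        per_frame = align_sensor_to_frames(sensor_t, sensor_accel, frame_t)
        print(per_frame.shape)                             # (30,)

Interpolating the higher-rate sensor stream onto the video frame clock gives each frame a motion value without resampling the video itself.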
Abstract:
Enables recognition of events within motion data obtained from portable wireless motion capture elements and synchronization of the events with video, as the events occur or at a later time, based on the location and/or time of the event. May use a camera integrated with the mobile device or external cameras to automatically generate generally smaller event videos of the event on the mobile device or server. Also enables analysis or comparison of movement associated with the same user, other users, historical users, or groups of users. Provides low memory and power utilization and greatly reduces storage for video data that corresponds to events such as a shot, move, or swing of a player, a concussion of a player or other medically related events, or events such as the first steps of a child or falling events.
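A minimal sketch of threshold-based event detection and clip windowing, assuming acceleration magnitude as the event signal; the threshold, refractory window, and clip padding values are illustrative only:

    from typing import List, Tuple

    def detect_events(timestamps: List[float],
                      accel_mag: List[float],
                      threshold: float = 30.0,
                      refractory_s: float = 1.0) -> List[float]:
        """Return event times where acceleration magnitude crosses a threshold,
        ignoring crossings within a refractory window of the previous event."""
        events, last = [], float("-inf")
        for t, a in zip(timestamps, accel_mag):
            if a >= threshold and t - last >= refractory_s:
                events.append(t)
                last = t
        return events

    def clip_window(event_t: float, pre_s: float = 2.0, post_s: float = 3.0) -> Tuple[float, float]:
        """Start/end times of the short event video to keep, discarding the rest."""
        return event_t - pre_s, event_t + post_s

    if __name__ == "__main__":
        ts = [i * 0.01 for i in range(1000)]               # 100 Hz samples over 10 s
        mag = [50.0 if 400 <= i <= 405 else 1.0 for i in range(1000)]
        for e in detect_events(ts, mag):
            print("event at", e, "keep video", clip_window(e))

Keeping only the clip window around each detected event is what yields the storage and power savings described above.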
Abstract:
Enables intelligent synchronization and transfer of generally concise event videos synchronized with motion data from motion capture sensor(s) coupled with a user or piece of equipment. Greatly saves storage and increases upload speed by uploading event videos and avoiding upload of non-pertinent portions of large videos. Provides intelligent selection of multiple videos from multiple cameras covering an event at a given time, for example selecting the video with the least shake. Enables near real-time alteration of camera parameters during an event as determined by the motion capture sensor, and alteration of playback parameters and special effects for synchronized event videos. Creates highlight reels filtered by metrics and can sort by metric. Integrates with multiple sensors to save event data even if other sensors do not detect the event. Also enables analysis or comparison of movement associated with the same user, other users, historical users, or groups of users.
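A minimal sketch of selecting the steadiest of several candidate event videos, assuming each video is available as a stack of grayscale frames; the shake score used here (mean absolute frame-to-frame pixel difference) is an illustrative proxy, not the claimed selection criterion:

    from typing import Dict
    import numpy as np

    def shake_score(frames: np.ndarray) -> float:
        """Average absolute pixel change between consecutive frames; lower is steadier."""
        diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
        return float(diffs.mean())

    def pick_steadiest(videos: Dict[str, np.ndarray]) -> str:
        """Return the camera id whose event video shows the least shake."""
        return min(videos, key=lambda cam: shake_score(videos[cam]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        steady = np.tile(rng.integers(0, 255, (1, 48, 64)), (30, 1, 1))   # static scene
        shaky = steady + rng.integers(-40, 40, (30, 48, 64))              # jittered scene
        print(pick_steadiest({"cam_a": steady, "cam_b": shaky}))          # cam_a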
Abstract:
A broadcasting system that broadcasts images augmented with motion data and includes at least one camera, a computer, and a wireless communication interface. The system obtains data from motion capture elements, analyzes the data, and optionally stores the data in a database for use in broadcasting applications, virtual reality applications, and/or data mining. The system also recognizes at least one motion capture element associated with a user or piece of equipment, and receives data associated with the motion capture element via the wireless communication interface. The system also enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance, and ball flight path data can be calculated and displayed, for example on a map or timeline or both. Furthermore, the system enables performance-related equipment fitting and purchase.
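A minimal sketch of overlaying received motion metrics onto a broadcast frame, assuming OpenCV is available; the 2D text overlay and metric names are illustrative stand-ins for the 3D overlays described above:

    import cv2
    import numpy as np

    def overlay_metrics(frame: np.ndarray, metrics: dict) -> np.ndarray:
        """Draw each metric as a text line in the upper-left corner of the frame."""
        out = frame.copy()
        for i, (name, value) in enumerate(metrics.items()):
            cv2.putText(out, f"{name}: {value}", (10, 30 + 30 * i),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
        return out

    if __name__ == "__main__":
        frame = np.zeros((360, 640, 3), dtype=np.uint8)    # stand-in broadcast frame
        annotated = overlay_metrics(frame, {"swing_speed_mph": 92.4, "rating": "A"})
        cv2.imwrite("annotated_frame.png", annotated)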