Abstract:
A method comprises receiving, in a governor device, from a plurality of data owner devices, metadata for one or more datasets maintained by the plurality of data owner devices; registering the metadata for the one or more datasets with the governor device; in response to a request from an aggregator, providing at least a portion of the metadata for the one or more datasets to the aggregator; receiving, from the aggregator, a compute plan to be implemented by the plurality of data owner devices; distributing at least a portion of the compute plan to the plurality of data owner devices; and, in response to receiving, from the plurality of data owner devices, a verification report and a certification for an enclave, binding the enclave to a host device and providing the compute plan to the plurality of data owner devices.
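The workflow above can be illustrated with a short sketch. The class and method names below (Governor, register_metadata, on_attestation, and so on) are assumptions made for illustration, not terms from the abstract, and the attestation check is reduced to two boolean flags:

from dataclasses import dataclass, field


@dataclass
class Governor:
    registry: dict = field(default_factory=dict)        # dataset metadata keyed by data owner
    bound_enclaves: dict = field(default_factory=dict)  # enclave id -> host device id

    def register_metadata(self, owner_id, metadata):
        # Metadata received from a data owner device is registered with the governor.
        self.registry[owner_id] = metadata

    def metadata_for(self, requested_owners):
        # Provide only the portion of the metadata the aggregator asked for.
        return {o: m for o, m in self.registry.items() if o in requested_owners}

    def on_attestation(self, enclave_id, host_id, report_ok, cert_ok, compute_plan, owners):
        # Bind the enclave to its host only after the verification report and
        # certification check out, then release the compute plan to the data owners.
        if report_ok and cert_ok:
            self.bound_enclaves[enclave_id] = host_id
            return {owner: compute_plan for owner in owners}
        return None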
Abstract:
The present invention discloses a secure ML pipeline that improves the robustness of ML models against poisoning attacks, using data provenance as a tool. Two components are added to the ML pipeline: a data quality pre-processor, which filters out untrusted training data based on provenance-derived features, and an audit post-processor, which localizes the malicious source by analyzing the training dataset using data provenance.
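A minimal sketch of the two added components follows, assuming each training record carries a provenance-derived "source" field; the trust rule (a source allow-list) and the poison detector are placeholders, not the pipeline's actual features:

def quality_preprocess(records, trusted_sources):
    # Data quality pre-processor: drop training records whose provenance looks untrusted.
    return [r for r in records if r["source"] in trusted_sources]


def audit_postprocess(records, looks_poisoned):
    # Audit post-processor: localize the malicious source by grouping flagged records
    # by their provenance.
    counts = {}
    for r in records:
        if looks_poisoned(r):
            counts[r["source"]] = counts.get(r["source"], 0) + 1
    return max(counts, key=counts.get) if counts else None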
Abstract:
An apparatus includes a data interface to obtain first sensor data from a first sensor and second sensor data from a second sensor of a monitored system; a data analyzer to extract a feature based on analyzing the first and second sensor data using a model, the model trained based on historical sensor data, the model to determine the feature as a deviation between the first and second sensor data to predict a future malfunction of the monitored system; an anomaly detector to detect an anomaly in at least one of the first sensor data or the second sensor data based on the feature, the anomaly corresponding to the future malfunction of the monitored system; and a system applicator to modify operation of the monitored system based on the anomaly.
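The deviation feature and the threshold-based check below are a simplified sketch of that idea; the 3-sigma threshold and the use of a plain z-score in place of a trained model are assumptions for illustration:

import statistics


def deviation_feature(first, second):
    # Per-sample absolute deviation between two aligned sensor streams.
    return [abs(a - b) for a, b in zip(first, second)]


def detect_anomaly(first, second, sigma=3.0):
    # Flag an anomaly when any deviation drifts beyond `sigma` standard deviations
    # of the deviation history, a crude stand-in for the trained model.
    dev = deviation_feature(first, second)
    mean, stdev = statistics.mean(dev), statistics.pstdev(dev) or 1.0
    return any((d - mean) / stdev > sigma for d in dev)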
Abstract:
Methods, systems, articles of manufacture and apparatus to detect process hijacking are disclosed herein. An example apparatus to detect control flow anomalies includes a parsing engine to compare a target instruction pointer (TIP) address to a dynamic link library (DLL) module list, and in response to detecting a match of the TIP address to a DLL in the DLL module list, set a first portion of a normalized TIP address to a value equal to an identifier of the DLL. The example apparatus disclosed herein also includes a DLL entry point analyzer to set a second portion of the normalized TIP address based on a comparison between the TIP address and an entry point of the DLL, and a model compliance engine to generate a flow validity decision based on a comparison between (a) the first and second portions of the normalized TIP address and (b) a control flow integrity model.
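The normalization step can be sketched as follows; the module-list layout, the 16-bit packing of the two portions, and the entry-point flag are assumptions, not the patented encoding:

def normalize_tip(tip, dll_modules):
    # dll_modules: list of dicts with 'id', 'base', 'size', and 'entry_point' addresses.
    for dll in dll_modules:
        if dll["base"] <= tip < dll["base"] + dll["size"]:
            first_portion = dll["id"]                               # identifier of the matched DLL
            second_portion = 1 if tip == dll["entry_point"] else 0  # at the DLL entry point?
            return (first_portion << 16) | second_portion
    return None


def flow_is_valid(normalized_tip, cfi_model):
    # Compare the normalized TIP against a control flow integrity model (a set of
    # allowed normalized addresses in this sketch).
    return normalized_tip is not None and normalized_tip in cfi_model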
Abstract:
Technologies for analyzing a Uniform Resource Locator (URL) include a multi-stage URL analysis system. The multi-stage URL analysis system analyzes the URL using a multi-stage analysis. In the first stage, the multi-stage URL analysis system analyzes the URL using an ensemble lexical analysis. In the second stage, the multi-stage URL analysis system analyzes the URL based on third-party detection results. In the third stage, the multi-stage URL analysis system analyzes the URL based on metadata related to the URL. The multi-stage URL analysis system advances to the next stage of analysis if the malicious classification score determined by the current stage does not satisfy a confidence threshold. The URL may also be selected for additional, more rigorous analysis using selection criteria not used by the analysis stages.
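The staged escalation can be sketched as below; the three stage functions and the 0.9 confidence threshold are placeholders standing in for the lexical, third-party, and metadata analyses:

def analyze_url(url, lexical_stage, third_party_stage, metadata_stage, confidence=0.9):
    # Each stage returns a malicious classification score in [0, 1]; advance to the
    # next stage only while the score is not confidently malicious or benign.
    score = 0.5
    for stage in (lexical_stage, third_party_stage, metadata_stage):
        score = stage(url)
        if score >= confidence or score <= 1.0 - confidence:
            return score
    return score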
Abstract:
In an embodiment, a security engine of a processor includes identity provider logic to generate a first key pair of a key pairing associating a system user and a service provider, the service provider providing a web service and having a second system coupled to the system via a network; to perform a secure communication with the second system to enable the second system to verify that the identity provider logic is executing in a trusted execution environment; and, responsive to the verification, to send a first key of the first key pair to the second system. This key may enable the second system to verify an assertion, communicated by the identity provider logic, that the user has been authenticated to the system according to a multi-factor authentication. Other embodiments are described and claimed.
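A minimal sketch of the assertion flow, using the cryptography package's Ed25519 API as a stand-in for the key pairing; the class name, the assertion format, and the two-factor check are assumptions made for illustration:

from cryptography.hazmat.primitives.asymmetric import ed25519


class IdentityProviderLogic:
    # Would execute inside the security engine's trusted execution environment.

    def __init__(self):
        self._signing_key = ed25519.Ed25519PrivateKey.generate()
        # The verification key stands in for the "first key" sent to the second
        # system after it has verified the TEE.
        self.verification_key = self._signing_key.public_key()

    def assert_mfa(self, user_id, factors_verified):
        if factors_verified < 2:
            raise ValueError("multi-factor authentication not satisfied")
        assertion = f"{user_id}:authenticated:mfa".encode()
        return assertion, self._signing_key.sign(assertion)


# Second system side: verify the assertion with the received verification key.
idp = IdentityProviderLogic()
assertion, signature = idp.assert_mfa("alice", factors_verified=2)
idp.verification_key.verify(signature, assertion)  # raises InvalidSignature if tampered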
Abstract:
Systems and methods for providing a trusted time service for the off-line mode of operation of a processing system. An example processing system comprises: a first processing device communicatively coupled to a real-time clock, the first processing device to modify an epoch value associated with the real-time clock responsive to detecting a reset of the real-time clock; and a second processing device to execute, in a first trusted execution environment, a first application to receive, from the first processing device, a first time value outputted by the real-time clock and a first epoch value associated with the real-time clock.
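The epoch mechanism can be illustrated with a small sketch; the class and field names are assumptions, and the hardware clock is simulated by a plain counter:

class TrustedClock:
    def __init__(self):
        self.rtc_seconds = 0   # stand-in for the real-time clock reading
        self.epoch = 0         # epoch value associated with the real-time clock

    def on_rtc_reset(self):
        # The first processing device modifies the epoch when it detects a reset,
        # so a consumer can tell pre-reset and post-reset readings apart.
        self.epoch += 1
        self.rtc_seconds = 0

    def read(self):
        # The trusted application receives both the time value and the epoch value.
        return self.rtc_seconds, self.epoch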
Abstract:
Technologies are provided in embodiments to manage an authentication confidence score. Embodiments are configured to identify, in absolute session time, a beginning time and an ending time of an interval of an active user session on a client. Embodiments are also configured to determine a first value representing a first subset of a set of prior user sessions, where the prior user sessions of the first subset were active for at least as long as the beginning time. Embodiments can also determine a second value representing a second subset of the set of prior user sessions, where the prior user sessions of the second subset were active for at least as long as the ending time. Embodiments also determine, based on the first and second values, a decay rate for the authentication confidence score of the active user session. In some embodiments, the set is based on context attributes.
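One way to read the two values is as a session-survival ratio; the sketch below derives a decay rate from them, with the exact formula being an assumption for illustration:

def decay_rate(prior_session_durations, begin, end):
    # First value: prior sessions active for at least as long as the beginning time.
    first = sum(1 for d in prior_session_durations if d >= begin)
    # Second value: prior sessions active for at least as long as the ending time.
    second = sum(1 for d in prior_session_durations if d >= end)
    if first == 0:
        return 1.0                    # no comparable history: decay confidence fully
    return 1.0 - (second / first)     # larger drop-off across the interval -> faster decay


# Example: few prior sessions last past 30 minutes, so confidence decays quickly there.
print(decay_rate([10, 20, 35, 60, 90], begin=30, end=45))  # 1 - 2/3 ~= 0.33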
Abstract:
Embodiments of an invention for virtualizing a hardware monotonic counter are disclosed. In one embodiment, an apparatus includes a hardware monotonic counter, virtualization logic, a first non-volatile storage location, and a second non-volatile storage location. The virtualization logic is to create a virtual monotonic counter from the hardware monotonic counter. The first non-volatile storage location is to store an indicator that the count of the hardware monotonic counter has changed. The second non-volatile storage location is to store an indicator that the count of the virtual monotonic counter has changed.
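A sketch of the virtualization idea follows; the structure, the per-counter indicator, and the write ordering are assumptions, with non-volatile storage simulated by ordinary fields:

class VirtualCounters:
    def __init__(self):
        self.hw_count = 0           # hardware monotonic counter
        self.hw_changed = False     # first non-volatile indicator: hardware count changed
        self.virtual = {}           # virtual monotonic counter values by id
        self.virtual_changed = {}   # second non-volatile indicator, per virtual counter

    def create(self, vid):
        # Create a virtual monotonic counter backed by the hardware counter.
        self.virtual[vid] = 0
        self.virtual_changed[vid] = False

    def increment(self, vid):
        # Record the change indicators before updating so an interrupted update
        # can be detected and recovered on the next start.
        self.hw_changed = True
        self.hw_count += 1
        self.virtual_changed[vid] = True
        self.virtual[vid] += 1
        self.virtual_changed[vid] = False
        self.hw_changed = False
        return self.virtual[vid]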
Abstract:
Sensor data may be filtered in a secure environment. The filtering may limit distribution of the sensor data. Filtering may modify the sensor data, for example, to prevent identification of a person depicted in a captured image or to prevent acquiring a user's precise location. Filtering may also add or require other data use controls before access to the data is granted. Attestation of whether a filter policy is being applied and working properly may be provided as well.
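A filter policy of this kind can be sketched as a transformation applied before the data leaves the secure environment; the policy fields and the rounding granularity below are assumptions for illustration:

def apply_filter_policy(sensor_data, policy):
    filtered = dict(sensor_data)
    if policy.get("coarsen_location") and "gps" in filtered:
        lat, lon = filtered["gps"]
        filtered["gps"] = (round(lat, 1), round(lon, 1))  # roughly 11 km precision
    if policy.get("redact_faces") and "faces" in filtered:
        filtered["faces"] = ["REDACTED"] * len(filtered["faces"])  # prevent identification
    if policy.get("usage_controls"):
        filtered["usage_controls"] = policy["usage_controls"]  # data use controls required for access
    return filtered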