Abstract:
Generally, this disclosure describes a continuous authentication confidence module. A system may include a user device including processor circuitry configured to determine presence data; a confidence factor including at least one of a sensor configured to capture sensor input and a system monitoring module configured to monitor activity of the user device; memory configured to store a confidence score and an operating system; and a continuous authentication confidence module configured to determine the confidence score in response to an initial authentication of a specific user, update the confidence score based, at least in part, on an expectation of user presence and/or selected presence data, and notify the operating system that the authentication is no longer valid if the updated confidence score is within a tolerance of a session close threshold; the initial authentication is configured to open a session, and the confidence score is configured to indicate a current strength of authentication during the session.
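As a rough illustration of the idea (not the disclosed implementation), a confidence score can decay over time and be replenished by presence signals, with the operating system notified once the score approaches the session close threshold. The decay rate, thresholds, and signal weighting below are assumptions made for the sketch.

```python
import time

SESSION_CLOSE_THRESHOLD = 0.2   # assumed value
CLOSE_TOLERANCE = 0.05          # assumed value

class ConfidenceModule:
    """Minimal sketch of a continuous authentication confidence score."""

    def __init__(self):
        self.score = 0.0
        self.last_update = time.monotonic()

    def on_initial_authentication(self):
        # A successful explicit login opens the session at full confidence.
        self.score = 1.0
        self.last_update = time.monotonic()

    def update(self, presence_signals):
        # Decay confidence with elapsed time, then restore it when presence
        # data (e.g. sensor input or monitored device activity) supports the
        # expectation that the authenticated user is still present.
        now = time.monotonic()
        elapsed = now - self.last_update
        self.last_update = now
        self.score = max(0.0, self.score - 0.01 * elapsed)
        for weight in presence_signals:            # each weight in 0.0..1.0
            self.score = min(1.0, self.score + 0.1 * weight)
        if self.score <= SESSION_CLOSE_THRESHOLD + CLOSE_TOLERANCE:
            self.notify_os_session_invalid()

    def notify_os_session_invalid(self):
        print("authentication no longer valid; session should be closed")

module = ConfidenceModule()
module.on_initial_authentication()
module.update([0.8])   # a sensor reports likely user presence
```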
Abstract:
In an embodiment, a security engine of a processor includes identity provider logic to generate a first key pair of a key pairing associating a system user and a service provider that provides a web service and has a second system coupled to the system via a network, to perform a secure communication with the second system to enable the second system to verify that the identity provider logic is executing in a trusted execution environment, and, responsive to the verification, to send a first key of the first key pair to the second system. This key may enable the second system to verify an assertion communicated by the identity provider logic that the user has been authenticated to the system according to a multi-factor authentication. Other embodiments are described and claimed.
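A rough sketch of the assertion flow, using Ed25519 signatures from the third-party `cryptography` package as a stand-in: the message format is an assumption, and the attestation that the key was generated inside a trusted execution environment is reduced to an out-of-band comment.

```python
# pip install cryptography  -- illustrative stand-in, not the patented protocol
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Identity provider side (running in the security engine / TEE in the abstract).
identity_key = Ed25519PrivateKey.generate()     # "first key pair"
public_key = identity_key.public_key()          # "first key" sent to the second system

# After multi-factor authentication succeeds, the identity provider signs an assertion.
assertion = b"user=alice;mfa=ok;audience=service-provider.example"
signature = identity_key.sign(assertion)

# Service provider side: having verified (out of band) that the key came from a
# trusted execution environment, it checks the assertion against the public key.
public_key.verify(signature, assertion)   # raises InvalidSignature if tampered
print("assertion accepted")
```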
Abstract:
Generally, this disclosure describes technologies for securely storing and using biometric authentication information, such as biometric reference templates. In some embodiments, the technologies include a client device that stores one or more biometric reference templates in a memory thereof. The client device may transfer such templates to an authentication device. The transfer may be conditioned on verification that the authentication device includes a suitable protected environment for the templates and will execute an acceptable temporary storage policy. The technologies may also include an authentication device that is configured to temporarily store biometric reference templates received from a client device in a protected environment thereof. Upon completion of biometric authentication or the occurrence of a termination event, the authentication device may delete the biometric reference templates from the protected environment.
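A minimal sketch, assuming a simple policy shaped as a maximum holding time, of an authentication device that keeps a reference template only until matching completes or a termination event occurs; the class and method names are illustrative.

```python
import time

class ProtectedTemplateStore:
    """Sketch of temporary template storage on the authentication device."""

    def __init__(self, max_age_seconds):
        self.max_age_seconds = max_age_seconds   # temporary-storage policy
        self.template = None
        self.received_at = None

    def receive(self, template, environment_verified):
        # The client transfers the template only after the protected
        # environment and storage policy have been verified.
        if not environment_verified:
            raise PermissionError("protected environment not verified")
        self.template = template
        self.received_at = time.monotonic()

    def authenticate(self, sample, matcher):
        try:
            return matcher(self.template, sample)
        finally:
            self.delete()        # delete upon completion of authentication

    def expired(self):
        # A policy timeout is one kind of termination event.
        return (self.received_at is not None and
                time.monotonic() - self.received_at > self.max_age_seconds)

    def delete(self):
        self.template = None
        self.received_at = None

store = ProtectedTemplateStore(max_age_seconds=60)
store.receive(template=[0.1, 0.4, 0.9], environment_verified=True)
print(store.authenticate(sample=[0.1, 0.4, 0.9], matcher=lambda t, s: t == s))
```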
Abstract:
Methods, apparatus, systems and articles of manufacture for federated training of a neural network using trusted edge devices are disclosed. An example system includes an aggregator device to aggregate model updates provided by one or more edge devices. The one or more edge devices implement respective neural networks and provide the model updates to the aggregator device. At least one of the edge devices implements the neural network within a trusted execution environment.
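The aggregation step can be illustrated with plain federated averaging; the update format below (a flat list of per-parameter deltas from each edge device) is an assumption made for the sketch, not the disclosed scheme.

```python
# Minimal federated-averaging sketch at the aggregator device.
def aggregate(model_updates):
    """Average the per-parameter updates reported by the edge devices."""
    n = len(model_updates)
    return [sum(values) / n for values in zip(*model_updates)]

# Each edge device trains its local neural network (ideally inside a trusted
# execution environment) and reports only the resulting update.
updates_from_edges = [
    [0.10, -0.20, 0.05],   # edge device A
    [0.12, -0.18, 0.07],   # edge device B
]
print(aggregate(updates_from_edges))   # approximately [0.11, -0.19, 0.06]
```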
Abstract:
Methods, systems, articles of manufacture and apparatus to detect process hijacking are disclosed herein. An example apparatus to detect control flow anomalies includes a parsing engine to compare a target instruction pointer (TIP) address to a dynamic link library (DLL) module list and, in response to detecting a match of the TIP address to a DLL in the DLL module list, set a first portion of a normalized TIP address to a value equal to an identifier of the DLL. The example apparatus disclosed herein also includes a DLL entry point analyzer to set a second portion of the normalized TIP address based on a comparison between the TIP address and an entry point of the DLL, and a model compliance engine to generate a flow validity decision based on a comparison between (a) the first and second portions of the normalized TIP address and (b) a control flow integrity model.
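A compact sketch of the normalization and compliance check; the exact encoding of the normalized address (module identifier in the first portion, an entry-point flag in the second) and the set-based control flow integrity model are assumptions for illustration.

```python
# Illustrative module list: (dll_id, base_address, size, entry_point)
DLL_MODULE_LIST = [
    (1, 0x7FF800000000, 0x100000, 0x7FF800001000),
    (2, 0x7FF900000000, 0x200000, 0x7FF900002000),
]

def normalize_tip(tip):
    for dll_id, base, size, entry in DLL_MODULE_LIST:
        if base <= tip < base + size:                  # TIP matches this DLL
            first_portion = dll_id                     # identifier of the DLL
            second_portion = 0 if tip == entry else 1  # relation to the entry point
            return (first_portion, second_portion)
    return None                                        # TIP not in any known module

def is_flow_valid(normalized_tip, cfi_model):
    # The control flow integrity model is reduced here to a set of allowed tuples.
    return normalized_tip in cfi_model

cfi_model = {(1, 0), (2, 0)}   # only transfers to module entry points are allowed
print(is_flow_valid(normalize_tip(0x7FF800001000), cfi_model))   # True
print(is_flow_valid(normalize_tip(0x7FF800005000), cfi_model))   # False: mid-module TIP
```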
Abstract:
Various embodiments are generally directed to an apparatus, method, and other techniques to maintain user authentications with common trusted devices. If a user is in possession of a first trusted device (e.g., a smartphone), its unlocked state is maintained if the user has used a nearby trusted device (e.g., a computer) within a certain amount of time. If the first trusted device is in a pocket or other container, a longer span of time is granted to the user to register an on-body state.
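A minimal sketch of the timeout logic described above; the window lengths are assumed values, not those of the disclosure.

```python
import time

NEARBY_USE_WINDOW = 5 * 60    # seconds allowed since the nearby trusted device was used
ON_BODY_WINDOW = 30 * 60      # longer window when the device is pocketed / on-body

def should_stay_unlocked(last_nearby_use, on_body, now=None):
    """Keep the first trusted device unlocked while the relevant window holds."""
    now = time.monotonic() if now is None else now
    window = ON_BODY_WINDOW if on_body else NEARBY_USE_WINDOW
    return (now - last_nearby_use) <= window

print(should_stay_unlocked(last_nearby_use=0, on_body=False, now=120))    # True
print(should_stay_unlocked(last_nearby_use=0, on_body=False, now=600))    # False
print(should_stay_unlocked(last_nearby_use=0, on_body=True, now=600))     # True
```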
Abstract:
Methods, apparatus, systems and articles of manufacture are disclosed that provide an apparatus to analyze vehicle perspectives, the apparatus comprising a profile generator to generate a first profile of an environment based on a profile template and first data generated by a first vehicle; a data analyzer to: determine a difference between the first profile and a second profile obtained from a first one of one or more nodes in the environment; and in response to a trigger event, update the first profile based on the difference; and a vehicle control system to: in response to the trigger event, update a first perspective of the environment based on one or more of second data from the first one of the one or more nodes or the updated first profile; update a path plan for the first vehicle based on the updated first perspective; and execute the updated path plan.
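As a rough illustration of the profile reconciliation, a vehicle can diff its own environment profile against one obtained from a node and, on a trigger event, fold the difference into its perspective and path plan. The flat-dictionary profile layout below is an assumption for the sketch.

```python
def profile_difference(first_profile, second_profile):
    """Entries the node reports that the vehicle's own profile lacks or disagrees on."""
    return {k: v for k, v in second_profile.items() if first_profile.get(k) != v}

def on_trigger_event(first_profile, node_profile, replan):
    diff = profile_difference(first_profile, node_profile)
    if diff:
        first_profile.update(diff)   # update the first profile / perspective
        replan(first_profile)        # update and execute the path plan

vehicle_profile = {"lane_closed": False, "pedestrian_at_crossing": False}
node_profile = {"lane_closed": True, "pedestrian_at_crossing": False}
on_trigger_event(vehicle_profile, node_profile,
                 replan=lambda p: print("replanning with", p))
```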
Abstract:
Sensor data may be filtered in a secure environment. The filtering may limit distribution of the sensor data. Filtering may modify the sensor data, for example, to prevent identification of a person depicted in a captured image or to prevent acquiring a user's precise location. Filtering may also add or require other data use controls to access the data. Attestation of whether a filter policy is being applied and working properly may be provided as well.
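A small sketch of one such filter, coarsening a location sample and attaching data-use controls before release; the policy fields and the rounding rule are assumptions, not the disclosed filter policy.

```python
def filter_location(sample, policy):
    """Apply a location filter and attach data-use controls to the output."""
    filtered = dict(sample)
    if policy.get("coarse_location"):
        # Round coordinates so the precise position cannot be recovered.
        filtered["lat"] = round(sample["lat"], 2)
        filtered["lon"] = round(sample["lon"], 2)
    filtered["use_controls"] = policy.get("use_controls", [])
    return filtered

sample = {"lat": 45.523064, "lon": -122.676483}
policy = {"coarse_location": True, "use_controls": ["no-third-party-sharing"]}
print(filter_location(sample, policy))
```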
Abstract:
An embodiment includes a method executed by at least one processor comprising: initializing first and second secure enclaves each comprising a trusted software execution environment that prevents software executing outside the first and second secure enclaves from having access to software and data inside the first and second secure enclaves; the first secure enclave (a)(i) inspecting a policy, (a)(ii) authenticating the second secure enclave in response to inspecting the policy; and (a)(iii) communicating encrypted content to the second secure enclave in response to authenticating the second secure enclave; and the second secure enclave (b)(i) decrypting the encrypted content to produce decrypted content, and (b)(ii) inspecting the decrypted content. Other embodiments are described herein.
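A condensed sketch of that flow, with symmetric encryption from the third-party `cryptography` package standing in for the enclaves' protected channel; the policy contents, peer authentication, and key distribution are assumptions for illustration.

```python
# pip install cryptography  -- stand-in for the enclaves' own protected channel
from cryptography.fernet import Fernet

policy = {"allowed_peers": {"enclave-2"}}   # assumed policy shape

def first_enclave_send(peer_id, content, key):
    # (a)(i)/(a)(ii): inspect the policy and authenticate the second enclave.
    if peer_id not in policy["allowed_peers"]:
        raise PermissionError("peer not authorized by policy")
    # (a)(iii): communicate encrypted content in response to authentication.
    return Fernet(key).encrypt(content)

def second_enclave_receive(ciphertext, key, inspect):
    content = Fernet(key).decrypt(ciphertext)   # (b)(i): decrypt the content
    return inspect(content)                     # (b)(ii): inspect the decrypted content

shared_key = Fernet.generate_key()
blob = first_enclave_send("enclave-2", b"document to scan", shared_key)
print(second_enclave_receive(blob, shared_key, inspect=lambda c: c.decode()))
```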