1.
Publication Number: EP3283997A1
Publication Date: 2018-02-21
Application Number: EP16780420.2
Filing Date: 2016-03-03
Applicant: Intel Corporation
Inventors: MARTIN, Jason; GHOSH, Rahuldeva; CORNELIUS, Cory; OLIVER, Ian R.; NAGISETTY, Ramune; MCGOWAN, Steven B.
Abstract: In one embodiment, a first device includes: a first logic to generate a first token when a user adapts the first device in approximate contact to the user, the first token including a first timestamp; a storage to store the first token and a second token, the second token obtained from an authenticator and associated with an authentication of the user to a second device, the second token including a second timestamp; and a communication module to communicate the first and second tokens to the second device to cause the second device to authenticate the user based at least in part on the first and second tokens. Other embodiments are described and claimed.
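The two-token flow described in the abstract can be sketched roughly as follows. The Token fields, the on_body/auth labels, the 300-second freshness window, and all class and method names are illustrative assumptions, not the claimed implementation.

```python
# Minimal sketch of the two-token flow described in the abstract.
# Token contents, the freshness window, and all names below are
# illustrative assumptions, not the claimed implementation.
import time
from dataclasses import dataclass


@dataclass
class Token:
    kind: str         # "on_body" or "auth"
    timestamp: float  # when the token was generated


class WearableDevice:
    """First device: issues an on-body token and stores an auth token."""

    def __init__(self):
        self.tokens = {}

    def on_worn(self):
        # First logic: generate a token when the device is placed on the user.
        self.tokens["on_body"] = Token("on_body", time.time())

    def store_auth_token(self, token: Token):
        # Second token obtained from an authenticator (e.g., after a login).
        self.tokens["auth"] = token

    def send_tokens(self):
        # Communication module: hand both tokens to the second device.
        return self.tokens.get("on_body"), self.tokens.get("auth")


class SecondDevice:
    """Second device: authenticates based at least in part on both tokens."""

    MAX_AGE_S = 300  # assumed freshness window

    def authenticate(self, on_body: Token, auth: Token) -> bool:
        now = time.time()
        return (
            on_body is not None and auth is not None
            and now - on_body.timestamp < self.MAX_AGE_S
            and now - auth.timestamp < self.MAX_AGE_S
        )


# Example exchange
wearable = WearableDevice()
wearable.on_worn()
wearable.store_auth_token(Token("auth", time.time()))
print(SecondDevice().authenticate(*wearable.send_tokens()))  # True while fresh
```

A real deployment would bind the tokens cryptographically to the user and the devices; the timestamp check here only illustrates the freshness role the abstract assigns to the two timestamps.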
-
2.
Publication Number: EP3712822A1
Publication Date: 2020-09-23
Application Number: EP20157749.1
Filing Date: 2020-02-17
Applicant: Intel Corporation
Inventors: KOUNAVIS, Michael; PAPADIMITRIOU, Antonios; SHELLER, Micah J.; CORNELIUS, Cory; CHEN, Li; PAUL, Anindya; EDWARDS, Brandon
Abstract: In one example an apparatus comprises a memory and a processor to create, from a first deep neural network (DNN) model, a first plurality of DNN models, generate a first set of adversarial examples that are misclassified by the first plurality of deep neural network (DNN) models, determine a first set of activation path differentials between the first plurality of adversarial examples, generate, from the first set of activation path differentials, at least one composite adversarial example which incorporates at least one intersecting critical path that is shared between at least two adversarial examples in the first set of adversarial examples, and use the at least one composite adversarial example to generate a set of inputs for a subsequent training iteration of the DNN model. Other examples may be described.
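A toy, numpy-only sketch of the pipeline the abstract outlines is given below. The tiny linear stand-in "models", the treatment of an activation path as the set of units above a threshold, the FGSM-style perturbation, and the averaging used to form the composite example are all simplifying assumptions for illustration, not the claimed method.

```python
# Toy sketch of the abstract's pipeline: model variants -> adversarial
# examples -> activation path differentials -> composite example -> inputs
# for the next training iteration. All modeling choices are assumptions.
import numpy as np

rng = np.random.default_rng(0)


def make_model_variants(base_weights, n=3, noise=0.05):
    """Create several model variants from one base model (step 1)."""
    return [base_weights + noise * rng.standard_normal(base_weights.shape)
            for _ in range(n)]


def activation_path(weights, x, threshold=0.0):
    """Which hidden units fire for input x (a crude 'activation path')."""
    return set(np.flatnonzero(weights @ x > threshold))


def adversarial_example(weights, x, eps=0.1):
    """FGSM-style perturbation against a toy linear score (step 2)."""
    grad = weights.sum(axis=0)  # stand-in for a loss gradient
    return x + eps * np.sign(grad)


base = rng.standard_normal((8, 4))  # 8 hidden units, 4 input features
variants = make_model_variants(base)
x = rng.standard_normal(4)

# Generate adversarial examples against each model variant.
adv_examples = [adversarial_example(w, x) for w in variants]

# Activation path differentials: units used by the adversarial input
# but not by the clean input, per model (step 3).
differentials = [activation_path(w, a) - activation_path(w, x)
                 for w, a in zip(variants, adv_examples)]

# Composite example built from an intersecting critical path shared by at
# least two adversarial examples (step 4): here, averaging the examples
# whose differentials overlap.
shared = differentials[0] & differentials[1]
composite = np.mean(adv_examples[:2], axis=0) if shared else adv_examples[0]

# Inputs for a subsequent (adversarial-training) iteration (step 5).
next_training_inputs = adv_examples + [composite]
print(len(next_training_inputs), "inputs for the next training round")
```

In practice the variants would be full DNNs and the perturbations would come from gradients of the training loss; the sketch only mirrors the data flow of the five steps named in the abstract.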
-