Description
We believe that formal methods in security should be leveraged in all standardisations of security protocols in order to strengthen their guarantees. To be effective, such analyses should be:

- maintainable: the security analysis should be performed at every step of the way, i.e. on each iteration of the draft;
- pessimistic: all possible threat models, notably all sorts of compromise, should be considered;
- precise: the analysis should account for as many real-life weaknesses of the concrete cryptographic primitives specified as possible.

In this talk, we illustrate how such a goal may be approached by detailing our analysis of the current IETF draft standard of the EDHOC protocol, as well as our subsequent interactions with its LAKE working group. We proceed in three steps. First, we introduce the Sapic+ platform, which allows a single model of a protocol to benefit from the capabilities of multiple automated verification tools (ProVerif, Tamarin, DeepSec). We then present several recent advances in modelling cryptographic primitives and their real-life weaknesses. Finally, we show how we leveraged Sapic+, together with these advanced primitive models, to analyze the EDHOC protocol and provide feedback to the LAKE working group, feedback that has been integrated into later drafts.
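To make concrete the kind of reasoning such tools automate, here is a toy Python sketch of Dolev-Yao-style attacker deduction: the attacker's knowledge is saturated under a symbolic decryption rule, and secrecy is the question of whether the secret becomes derivable. The term encoding and all names are illustrative assumptions of ours, not Sapic+ or ProVerif syntax; the real tools work on far richer models (applied pi calculus processes, equational theories, unbounded sessions).

```python
# Toy Dolev-Yao deduction: can the attacker derive a secret from the
# messages it observed on the network? A minimal sketch of the symbolic
# reasoning that tools like ProVerif, Tamarin, and DeepSec automate.
# Term encoding (illustrative): strings are atoms; a tuple
# ("senc", message, key) is a symmetric encryption.

def saturate(knowledge):
    """Close attacker knowledge under the rule sdec(senc(m, k), k) = m."""
    knowledge = set(knowledge)
    changed = True
    while changed:
        changed = False
        for term in list(knowledge):
            if isinstance(term, tuple) and term[0] == "senc":
                _, message, key = term
                if key in knowledge and message not in knowledge:
                    knowledge.add(message)  # attacker decrypts with a known key
                    changed = True
    return knowledge

# Run 1: the secret travels encrypted under a key the attacker never sees.
print("secret" in saturate({("senc", "secret", "k")}))      # False: secrecy holds
# Run 2: the key itself also leaks (e.g. long-term key compromise).
print("secret" in saturate({("senc", "secret", "k"), "k"}))  # True: secret derived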
Upcoming talks
- [CANCELLED] Black-Box Collision Attacks on Widely Deployed Perceptual Hash Functions and Their Consequences
  Speaker: Diane Leblanc-Albarel - KU Leuven
  Perceptual hash functions identify multimedia content by mapping similar inputs to similar outputs. They are widely used for detecting copyright violations and illegal content but lack transparency, as their design details are typically kept secret. Governments are considering extending the application of these functions to Client-Side Scanning (CSS) for end-to-end encrypted services: […]
  Tags: Cryptography, SoSysec, Protocols
- A non-comparison oblivious sort and its application to private k-NN
  Speaker: Sofiane Azogagh - UQÀM
  Sorting is a fundamental subroutine of many algorithms and as such has been studied for decades. A well-known result is the lower bound theorem, which states that no comparison-based sorting algorithm can do better than O(n log n) in the worst case. However, in the fifties, new sorting algorithms that do not rely on comparisons were introduced, such as counting sort, which can run in linear time […] (a short counting sort sketch follows after this entry)
  Tags: Cryptography, SoSysec, Privacy, Databases, Secure storage
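To illustrate the linear-time, non-comparison sorting mentioned in the abstract above, here is a minimal counting sort sketch; the function name and the assumption of non-negative integers with a known maximum value are ours, for illustration.

```python
# Minimal counting sort: sorts non-negative integers without ever
# comparing two elements, by tallying how often each value occurs.

def counting_sort(values, max_value):
    """Sort non-negative ints in O(n + max_value) time, no comparisons."""
    counts = [0] * (max_value + 1)
    for v in values:                 # tally each value: O(n)
        counts[v] += 1
    out = []
    for v, c in enumerate(counts):   # emit values in order: O(max_value + n)
        out.extend([v] * c)
    return out

print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], max_value=9))
# [1, 1, 2, 3, 4, 5, 6, 9]
```

Since the running time is O(n + max_value), the sort is linear whenever the value range is bounded, which is how it sidesteps the Ω(n log n) lower bound for comparison-based sorting.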