Saarixx Labs

Displaying 1-40 of 41 results

  • White Papers // Apr 2014

    Main Memory Adaptive Indexing for Multi-core Systems

    Adaptive indexing is a concept that considers index creation in databases as a by-product of query processing, as opposed to traditional full index creation, where the indexing effort is performed up front before answering any queries. Adaptive indexing has received a considerable amount of attention, and several algorithms have been...

    Provided By Saarixx Labs
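    The cracking idea the abstract rests on can be sketched compactly: each range query partitions (cracks) the column it touches, so the index emerges as a by-product of query processing. A minimal illustrative Python sketch, not the paper's multi-core algorithm; the function name and data are hypothetical:

```python
def crack(column, low, high):
    """Answer `SELECT * WHERE low <= v < high` and, as a side effect,
    partition `column` in place into [< low | low..high | >= high]."""
    left = [v for v in column if v < low]
    mid = [v for v in column if low <= v < high]
    right = [v for v in column if v >= high]
    column[:] = left + mid + right                 # future queries scan less
    return mid, (len(left), len(left) + len(mid))  # result + piece bounds

col = [13, 4, 55, 9, 38, 21, 2, 47]
result, (lo, hi) = crack(col, 10, 40)
# The qualifying values now sit contiguously in col[lo:hi]; repeated
# queries refine the partitioning until the column is effectively sorted.
```

    Subsequent queries would crack only the piece their predicate falls into, which is where the multi-core coordination studied in the paper becomes non-trivial.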

  • White Papers // Feb 2014

    Reverse Engineering of Cache Replacement Policies in Intel Microprocessors and Their Evaluation

    Performance modeling techniques need accurate cache models to produce useful estimates. However, properties required for building such models, like the replacement policy, are often not documented. In this paper, using a set of carefully designed microbenchmarks, the authors reverse engineer a precise model of caches found in recent Intel processors...

    Provided By Saarixx Labs
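    The measurement idea can be illustrated in software: a probe sequence is chosen so that the hit/miss pattern it produces differs between candidate replacement policies, and the observed pattern identifies the policy. An illustrative Python sketch assuming a tiny fully-associative cache (real microbenchmarks time actual memory accesses; all names here are hypothetical):

```python
def run(policy, ways, accesses):
    """Simulate a tiny fully-associative cache; return the hit/miss pattern."""
    cache, pattern = [], []          # cache[0] is the eviction candidate
    for a in accesses:
        hit = a in cache
        pattern.append(hit)
        if hit:
            if policy == "LRU":      # LRU refreshes on a hit; FIFO does not
                cache.remove(a)
                cache.append(a)
        else:
            if len(cache) == ways:
                cache.pop(0)
            cache.append(a)
    return pattern

probe = ["A", "B", "A", "C", "A"]    # fill, touch A, force an eviction, probe A
assert run("LRU", 2, probe)[-1] is True    # LRU kept A (refreshed on the hit)
assert run("FIFO", 2, probe)[-1] is False  # FIFO evicted A (oldest insertion)
```

    Scaling this up to real set-associative hardware, with timing instead of a simulator as the observable, is the paper's contribution.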

  • White Papers // Jan 2014

    Architecture-Parametric Timing Analysis

    Platforms are families of microarchitectures that implement the same instruction set architecture but that differ in architectural parameters, such as frequency, memory latencies, or memory sizes. The choice of these parameters influences execution time, implementation cost, and energy consumption. In this paper, the authors introduce the first general framework for...

    Provided By Saarixx Labs

  • White Papers // Jan 2014

    Selfish-LRU: Preemption-Aware Caching for Predictability and Performance

    The authors introduce Selfish-LRU, a variant of the LRU (Least Recently Used) cache replacement policy that improves performance and predictability in preemptive scheduling scenarios. In multitasking systems with conventional caches, a single memory access by a preempting task can trigger a chain reaction leading to a large number of additional...

    Provided By Saarixx Labs

  • White Papers // Dec 2013

    Differential Indistinguishability for Cryptographic Primitives with Imperfect Randomness

    Indistinguishability-based definitions of cryptographic primitives such as encryption, commitments, and zero-knowledge proofs are proven to be impossible to realize in scenarios where parties only have access to non-extractable sources of randomness. In this paper, the authors demonstrate that it is, nevertheless, possible to quantify this secrecy loss for non-extractable...

    Provided By Saarixx Labs

  • White Papers // Nov 2013

    Towards Compositionality in Execution Time Analysis - Definition and Challenges

    For hard real-time systems, timeliness of operations has to be guaranteed. Static timing analysis is therefore employed to compute upper bounds on the execution times of a program. Analysis results at high precision are required to avoid over-provisioning of resources. For current processors, timing analysis is a complex task mainly...

    Provided By Saarixx Labs

  • White Papers // Oct 2013

    TUC: Time-Sensitive and Modular Analysis of Anonymous Communication

    The Anonymous Communication (AC) protocol Tor constitutes the most widely deployed technology for providing anonymity for user communication over the internet. Tor has been subject to several analyses, which have shown strong anonymity guarantees for Tor. However, all previous analyses ignore time-sensitive leakage: timing patterns in web traffic allow for...

    Provided By Saarixx Labs

  • White Papers // Jun 2013

    Flexible and Fine-Grained Mandatory Access Control on Android for Diverse Security and Privacy Policies

    In this paper, the authors tackle the challenge of providing a generic security architecture for the Android OS that can serve as a flexible and effective ecosystem to instantiate different security solutions. In contrast to prior work, their security architecture, termed FlaskDroid, provides mandatory access control simultaneously on both Android's...

    Provided By Saarixx Labs

  • White Papers // Jun 2013

    Automated Security Proofs for Almost-Universal Hash for MAC Verification

    Message Authentication Codes (MACs) are an essential primitive in cryptography. They are used to ensure the integrity and authenticity of a message, and can also be used as a building block for larger schemes, such as chosen-ciphertext secure encryption, or identity-based encryption. MACs are often built in two steps: first,...

    Provided By Saarixx Labs

  • White Papers // May 2013

    Impact of Resource Sharing on Performance and Performance Prediction: A Survey

    Multi-core processors are increasingly considered as execution platforms for embedded systems because of their good performance/energy ratio. However, the interference on shared resources poses several problems. It may severely reduce the performance of tasks executed on the cores, and it increases the complexity of timing analysis and/or decreases the precision...

    Provided By Saarixx Labs

  • White Papers // May 2013

    Towards a Minimal Cost of Change Approach for Inductive Reference Process Model Development

    Business Process Management (BPM) has advanced to be one of the most intensely discussed topics in Information Systems (IS) research. Based on the growing maturity of its concepts, methods, and techniques, BPM has, furthermore, gained tremendous importance in organizational practice. More and more organizations develop and use individual business process...

    Provided By Saarixx Labs

  • White Papers // May 2013

    Towards Empirically Validated Ubiquitous Information Systems: Results From a Pretest and Three Empirical Studies

    Research on Ubiquitous Information Systems (UIS) has recently gained attention in the IS community. But dedicated empirical instruments that are robust and capture individual characteristics of UIS are still missing. A rather new empirical construct derived from Task-Technology Fit theory was proposed and denoted as Situation-Service Fit (SSF) (Maass et...

    Provided By Saarixx Labs

  • White Papers // Mar 2013

    Limitations of the Meta-Reduction Technique: The Case of Schnorr Signatures

    The authors revisit the security of Fiat-Shamir signatures in the non-programmable random oracle model. The well-known proof by Pointcheval and Stern for such signature schemes (Journal of Cryptology, 2000) relies on the ability to re-program the random oracle, and it has been unknown if this property is inherent. The researchers...

    Provided By Saarixx Labs

  • White Papers // Feb 2013

    Embedded Systems: Many Cores - Many Problems

    The embedded-systems industry is introducing multicore architectures for their good performance-energy ratio. This trend coincides with the transition from federated to integrated system architectures in the automotive and aeronautics industries. The step to multi-core platforms is a highly risky one, as...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    AppGuard - Enforcing User Requirements on Android Apps

    The success of Android phones makes them a prominent target for malicious software, in particular since the Android permission system turned out to be inadequate to protect the user against security and privacy threats. This paper presents AppGuard, a powerful and flexible system for the enforcement of user-customizable security policies...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    Measurement-Based Modeling of the Cache Replacement Policy

    Modern microarchitectures employ memory hierarchies involving one or more levels of cache memory to hide the large latency gap between the processor and main memory. Cycle-accurate simulators, self-optimizing software systems, and platform-aware compilers need accurate models of the memory hierarchy to produce useful results. Similarly, worst-case execution time analyzers require...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    Affine Refinement Types for Authentication and Authorization

    Refinement type systems have proved very effective for security policy verification in distributed authorization systems. In earlier work, the authors have proposed an extension of existing refinement typing techniques to exploit sub-structural logics and affine typing in the analysis of resource aware authorization, with policies predicating over access counts, usage...

    Provided By Saarixx Labs

  • White Papers // Dec 2012

    Preventing Side-Channel Leaks in Web Traffic: A Formal Approach

    Internet traffic is exposed to potential eavesdroppers. Standard encryption mechanisms do not provide sufficient protection: Features such as packet sizes and numbers remain visible, opening the door to so-called side-channel attacks against web traffic. This paper develops a framework for the derivation of formal guarantees against traffic side-channels. The authors...

    Provided By Saarixx Labs
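    One countermeasure such a framework can reason about is padding every message up to a fixed size bucket, which collapses size-based distinguishability between pages at a bandwidth cost. A minimal illustrative Python sketch; the bucket size is an assumption for illustration, not a value from the paper:

```python
BUCKET = 512  # hypothetical padding granularity in bytes

def pad_length(n, bucket=BUCKET):
    """Smallest multiple of `bucket` that fits an n-byte payload."""
    return ((n + bucket - 1) // bucket) * bucket

# Two pages of different true sizes become indistinguishable by length:
assert pad_length(300) == pad_length(480) == 512
# ...while an observer still learns the bucket of a larger transfer:
assert pad_length(513) == 1024
```

    Deriving formal guarantees means quantifying exactly what an eavesdropper can still infer from the remaining bucketed sizes and packet counts.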

  • White Papers // Oct 2012

    Logical Foundations of Secure Resource Management in Protocol Implementations

    Recent research has shown that it is possible to leverage general purpose theorem proving techniques to develop powerful type systems for the verification of a wide range of security properties on application code. Although successful in many respects, these type systems fall short of capturing resource-conscious properties that are crucial...

    Provided By Saarixx Labs

  • White Papers // Sep 2012

    Automatic Cache Modeling by Measurements

    Modern microarchitectures employ memory hierarchies involving one or more levels of cache memory to hide the large latency gap between the processor and main memory. Cycle-accurate simulators need to accurately model such memory hierarchies to produce useful results. Similarly, worst-case execution time analyzers require faithful models, both for soundness and...

    Provided By Saarixx Labs

  • White Papers // Sep 2012

    Generating Test Suites With Augmented Dynamic Symbolic Execution

    Unit test generation tools typically aim at one of two objectives: to explore the program behaviour in order to exercise automated oracles, or to produce a representative test set that can be used to manually add oracles or to use as a regression test set. Dynamic Symbolic Execution (DSE) can...

    Provided By Saarixx Labs

  • White Papers // Aug 2012

    Building and Maintaining Halls of Fame Over a Database

    Halls of Fame are fascinating constructs. They represent the elite of an often very large number of entities - persons, companies, products, countries, etc. Beyond their practical use as static rankings, changes to them are particularly interesting - for decision-making processes, as input to common media or novel narrative...

    Provided By Saarixx Labs

  • White Papers // Jul 2012

    AppGuard - Real-Time Policy Enforcement for Third-Party Applications

    Android has become the most popular operating system for mobile devices, which makes it a prominent target for malicious software. The security concept of Android is based on app isolation and access control for critical system resources. However, users can only review and accept permission requests at install time, or...

    Provided By Saarixx Labs

  • White Papers // Jun 2012

    Quantifying Information Flow in Cryptographic Systems

    The authors provide a novel definition of quantitative information flow, called transmissible information, that is suitable for reasoning about information-theoretically secure (or non-cryptographic) systems, as well as about cryptographic systems with their polynomially bounded adversaries, error probabilities, etc. Transmissible information captures deliberate communication between two processes, and it safely over-approximates...

    Provided By Saarixx Labs

  • White Papers // Jun 2012

    Fuzzing with Code Fragments

    Fuzz testing is an automated technique that provides random data as input to a software system in the hope of exposing a vulnerability. In order to be effective, the fuzzed input must be common enough to pass elementary consistency checks; a JavaScript interpreter, for instance, would only accept a semantically valid...

    Provided By Saarixx Labs
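    The fragment idea can be sketched directly: instead of emitting random bytes, the fuzzer recombines fragments mined from a corpus of valid programs, so candidates routinely pass parser-level checks. An illustrative Python sketch with whole statements as fragments (the actual approach works on parse trees; the corpus here is hypothetical):

```python
import random

# Fragment pool mined from valid programs; here, whole statements.
pool = [
    "var x = 1;",
    "if (x) { x = x + 1; }",
    "while (x < 10) { x = x * 2; }",
]

def fuzz(pool, n_stmts, rng):
    """Recombine randomly chosen fragments into a new candidate input."""
    return "\n".join(rng.choice(pool) for _ in range(n_stmts))

candidate = fuzz(pool, 3, random.Random(0))
# Every line of `candidate` is a syntactically valid statement, so the
# candidate survives elementary consistency checks far more often than
# random bytes would.
```

    The interesting engineering lies in mining fragments at the right granularity and splicing them back into host programs type-consistently.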

  • White Papers // Apr 2012

    Relational Cache Analysis for Static Timing Analysis

    In hard real-time systems, operations are subject to timing constraints, i.e., there are operational deadlines from events to system responses. To show the correctness of such systems, one must therefore derive guarantees on the timeliness of reactions. Here, one fundamental problem is to characterize the execution time of programs. The...

    Provided By Saarixx Labs

  • White Papers // Nov 2011

    Enhancement of Traditional Business Process Management With Reflection - A New Perspective for Organisational Learning?

    The successful management of learning and knowledge has become a critical success factor for organizations in today's knowledge-intensive business world. However, the question remains how an organization should act and react in order to fulfill this management task. A common answer to the question is that organizations need employees who...

    Provided By Saarixx Labs

  • White Papers // Aug 2011

    Combining Search-Based and Constraint-Based Testing

    Many modern automated test generators are based on either meta-heuristic search techniques or use constraint solvers. Both approaches have their advantages, but they also have specific drawbacks: Search-based methods get stuck in local optima and degrade when the search landscape offers no guidance; constraint-based approaches, on the other hand, can...

    Provided By Saarixx Labs

  • White Papers // Jun 2011

    On Parameter Tuning in Search Based Software Engineering

    When applying Search-Based Software Engineering (SBSE) techniques one is confronted with a multitude of different parameters that need to be chosen: Which population size for a genetic algorithm? Which selection mechanism to use? What settings to use for dozens of other parameters? This problem not only troubles users who want...

    Provided By Saarixx Labs

  • White Papers // Jan 2011

    Exploiting Common Object Usage in Test Case Generation

    Generated test cases are good at systematically exploring paths and conditions in software. However, generated test cases often do not make sense. The authors adapt test case generation to follow patterns of common object usage, as mined from code examples. Their experiments show that generated tests thus reuse familiar usage...

    Provided By Saarixx Labs

  • White Papers // Dec 2010

    Evolutionary Generation of Whole Test Suites

    Recent advances in software testing allow automatic derivation of tests that reach almost any desired point in the source code. There is, however, a fundamental problem with the general idea of targeting one distinct test coverage goal at a time: Coverage goals are neither independent of each other, nor is...

    Provided By Saarixx Labs

  • White Papers // Oct 2010

    It Is Not the Length That Matters, It Is How You Control It

    The length of test cases is a little investigated topic in search-based test generation for object oriented software, where test cases are sequences of method calls. While intuitively longer tests can achieve higher overall code coverage, there is always the threat of bloat - a complex phenomenon in evolutionary computation,...

    Provided By Saarixx Labs

  • White Papers // Jun 2010

    Branch Target Buffers: WCET Analysis Framework and Timing Predictability

    One step in the verification of hard real-time systems is to determine upper bounds on the Worst Case Execution Times (WCET) of tasks. To obtain tight bounds, a WCET analysis has to consider micro-architectural features like caches, branch prediction, and Branch Target Buffers (BTB). The authors propose a modular WCET...

    Provided By Saarixx Labs

  • White Papers // May 2010

    Mutation-Driven Generation of Unit Tests and Oracles

    To assess the quality of test suites, mutation analysis seeds artificial defects (mutations) into programs; a non-detected mutation indicates a weakness in the test suite. The authors present an automated approach to generate unit tests that detect these mutations for object-oriented classes. This has two advantages: First, the resulting test...

    Provided By Saarixx Labs
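    The underlying mutation-analysis loop is easy to sketch: seed a small artificial defect, run the suite, and count the mutant as killed if some test fails. Illustrative Python with hand-written mutants (real tools derive mutants automatically; every name here is hypothetical):

```python
def clamp(v, lo, hi):
    return max(lo, min(v, hi))

mutants = [
    lambda v, lo, hi: max(lo, min(v, lo)),   # seeded defect: hi -> lo
    lambda v, lo, hi: min(lo, min(v, hi)),   # seeded defect: max -> min
]

def suite(f):
    """A unit-test suite as a predicate: True iff all assertions pass."""
    return f(5, 0, 10) == 5 and f(-3, 0, 10) == 0 and f(42, 0, 10) == 10

assert suite(clamp)                               # the original passes
killed = sum(1 for m in mutants if not suite(m))  # detected mutants
score = killed / len(mutants)                     # 1.0: no surviving mutants
```

    A surviving mutant is exactly the signal the paper's generator uses: it indicates where a new test, and its oracle, is needed.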

  • White Papers // Apr 2010

    Precise and Efficient FIFO-Replacement Analysis Based on Static Phase Detection

    Schedulability analysis for hard real-time systems requires bounds on the execution times of its tasks. To obtain useful bounds in the presence of caches, static timing analyses must predict cache hits and misses with high precision. For caches with Least-Recently-Used (LRU) replacement policy, precise and efficient cache analyses exist. However,...

    Provided By Saarixx Labs

  • White Papers // Sep 2008

    Limits of Constructive Security Proofs

    The collision-resistance of hash functions is an important foundation of many cryptographic protocols. Formally, collision-resistance can only be expected if the hash function in fact constitutes a parametrized family of functions, since for a single function, the adversary could simply know a single hard-coded collision. In practical applications, however, unkeyed...

    Provided By Saarixx Labs

  • White Papers // Sep 2008

    OAEP is Secure Under Key-Dependent Messages

    Key-Dependent Message (KDM) security was introduced by the researchers to address the case where key cycles occur among encryptions, e.g., a key is encrypted with itself. The authors extend this definition to include the cases of adaptive corruptions and arbitrary active attacks, called adKDM security, incorporating several novel design choices...

    Provided By Saarixx Labs

  • White Papers // Apr 2008

    Estimating the Performance of Cache Replacement Policies

    Caches are commonly employed to hide the latency gap between memory and the CPU by exploiting locality in memory accesses. The cache performance strongly influences a system's overall performance, as this gap is large and ever-increasing. The efficiency of a given cache architecture - usually measured by its miss ratio...

    Provided By Saarixx Labs
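    The miss ratio the abstract refers to can be estimated by trace-driven simulation: replay an access trace through a model of the cache and count misses. A minimal illustrative Python sketch for LRU (the cache size and trace are hypothetical):

```python
def miss_ratio(trace, ways):
    """Fraction of accesses that miss in a `ways`-entry LRU cache."""
    cache, misses = [], 0
    for a in trace:
        if a in cache:
            cache.remove(a)
            cache.append(a)        # LRU: refresh to most-recent position
        else:
            misses += 1
            if len(cache) == ways:
                cache.pop(0)       # evict the least recently used entry
            cache.append(a)
    return misses / len(trace)

# A cyclic trace over 3 blocks thrashes a 2-entry LRU cache entirely:
assert miss_ratio(["A", "B", "C"] * 4, 2) == 1.0
# ...while one more entry leaves only the 3 cold misses:
assert miss_ratio(["A", "B", "C"] * 4, 3) == 0.25
```

    Full-trace simulation like this is exact but slow; the paper is concerned with estimating such ratios more cheaply.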

  • White Papers // Aug 2007

    Towards Enhanced Business Process Models Based on Fuzzy Attributes and Rules

    In business process management, decision situations are often characterized by fuzziness. This means that the decision premises are not available in the form of mathematical models or numeric values, but rather as fuzzy conditions, such as "Low processing time" or "High quality". This paper will show how fuzzy conditions and...

    Provided By Saarixx Labs

  • White Papers // Aug 2007

    Timing Predictability of Cache Replacement Policies

    Hard real-time systems must obey strict timing constraints. Therefore, one needs to derive guarantees on the Worst-Case Execution Times (WCETs) of a system's tasks. In this context, predictable behavior of system components is crucial for the derivation of tight and thus useful bounds. This paper presents results about the predictability...

    Provided By Saarixx Labs

  • White Papers // Jun 2011

    On Parameter Tuning in Search Based Software Engineering

    When applying Search-Based Software Engineering (SBSE) techniques one is confronted with a multitude of different parameters that need to be chosen: Which population size for a genetic algorithm? Which selection mechanism to use? What settings to use for dozens of other parameters? This problem not only troubles users who want...

    Provided By Saarixx Labs

  • White Papers // Aug 2011

    Combining Search-Based and Constraint-Based Testing

    Many modern automated test generators are based on either meta-heuristic search techniques or use constraint solvers. Both approaches have their advantages, but they also have specific drawbacks: Search-based methods get stuck in local optima and degrade when the search landscape offers no guidance; constraint-based approaches, on the other hand, can...

    Provided By Saarixx Labs

  • White Papers // Dec 2010

    Evolutionary Generation of Whole Test Suites

    Recent advances in software testing allow automatic derivation of tests that reach almost any desired point in the source code. There is, however, a fundamental problem with the general idea of targeting one distinct test coverage goal at a time: Coverage goals are neither independent of each other, nor is...

    Provided By Saarixx Labs

  • White Papers // Oct 2010

    It Is Not the Length That Matters, It Is How You Control It

    The length of test cases is a little investigated topic in search-based test generation for object oriented software, where test cases are sequences of method calls. While intuitively longer tests can achieve higher overall code coverage, there is always the threat of bloat - a complex phenomenon in evolutionary computation,...

    Provided By Saarixx Labs

  • White Papers // Jan 2011

    Exploiting Common Object Usage in Test Case Generation

    Generated test cases are good at systematically exploring paths and conditions in software. However, generated test cases often do not make sense. The authors adapt test case generation to follow patterns of common object usage, as mined from code examples. Their experiments show that generated tests thus reuse familiar usage...

    Provided By Saarixx Labs

  • White Papers // May 2010

    Mutation-Driven Generation of Unit Tests and Oracles

    To assess the quality of test suites, mutation analysis seeds artificial defects (mutations) into programs; a non-detected mutation indicates a weakness in the test suite. The authors present an automated approach to generate unit tests that detect these mutations for object-oriented classes. This has two advantages: First, the resulting test...

    Provided By Saarixx Labs

  • White Papers // Sep 2012

    Generating Test Suites With Augmented Dynamic Symbolic Execution

    Unit test generation tools typically aim at one of two objectives: to explore the program behaviour in order to exercise automated oracles, or to produce a representative test set that can be used to manually add oracles or to use as a regression test set. Dynamic Symbolic Execution (DSE) can...

    Provided By Saarixx Labs

  • White Papers // Aug 2012

    Building and Maintaining Halls of Fame Over a Database

    Halls of Fame are fascinating constructs. They represent the elite of an often very large amount of entities - persons, companies, products, countries etc. Beyond their practical use as static rankings, changes to them are particularly interesting - for decision making processes, as input to common media or novel narrative...

    Provided By Saarixx Labs

  • White Papers // Nov 2011

    Enhancement of Traditional Business Process Management With Reflection - A New Perspective for Organisational Learning?

    The successful management of learning and knowledge has become a critical success factor for organizations in today's knowledge-intensive business world. However, the question remains how an organization should act and react in order to fulfill this management task. A common answer to the question is that organizations need employees who...

    Provided By Saarixx Labs

  • White Papers // Dec 2012

    Preventing Side-Channel Leaks in Web Traffic: A Formal Approach

    Internet traffic is exposed to potential eavesdroppers. Standard encryption mechanisms do not provide sufficient protection: Features such as packet sizes and numbers remain visible, opening the door to so-called side-channel attacks against web traffic. This paper develops a framework for the derivation of formal guarantees against traffic side-channels. The authors...

    Provided By Saarixx Labs

  • White Papers // Jul 2012

    AppGuard - Real-Time Policy Enforcement for Third-Party Applications

    Android has become the most popular operating system for mobile devices, which makes it a prominent target for malicious software. The security concept of Android is based on app isolation and access control for critical system resources. However, users can only review and accept permission requests at install time, or...

    Provided By Saarixx Labs

  • White Papers // Oct 2012

    Logical Foundations of Secure Resource Management in Protocol Implementations

    Recent research has shown that it is possible to leverage general purpose theorem proving techniques to develop powerful type systems for the verification of a wide range of security properties on application code. Although successful in many respects, these type systems fall short of capturing resource-conscious properties that are crucial...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    Affine Refinement Types for Authentication and Authorization

    Refinement type systems have proved very effective for security policy verification in distributed authorization systems. In earlier work, the authors have proposed an extension of existing refinement typing techniques to exploit sub-structural logics and affine typing in the analysis of resource aware authorization, with policies predicating over access counts, usage...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    AppGuard - Enforcing User Requirements on Android Apps

    The success of Android phones makes them a prominent target for malicious software, in particular since the Android permission system turned out to be inadequate to protect the user against security and privacy threats. This paper presents AppGuard, a powerful and flexible system for the enforcement of user-customizable security policies...

    Provided By Saarixx Labs

  • White Papers // May 2013

    Towards a Minimal Cost of Change Approach for Inductive Reference Process Model Development

    Business Process Management (BPM) has advanced to be one of the most intensely discussed topics in Information Systems (IS) research. Based on the growing maturity of its concepts, methods and techniques BPM has, furthermore, gained tremendous importance in organizational practice. More and more organizations develop and use individual business process...

    Provided By Saarixx Labs

  • White Papers // Nov 2013

    Towards Compositionality in Execution Time Analysis - Definition and Challenges

    For hard real-time systems, timeliness of operations has to be guaranteed. Static timing analysis is therefore employed to compute upper bounds on the execution times of a program. Analysis results at high precision are required to avoid over-provisioning of resources. For current processors, timing analysis is a complex task mainly...

    Provided By Saarixx Labs

  • White Papers // Apr 2014

    Main Memory Adaptive Indexing for Multi-core Systems

    Adaptive indexing is a concept that considers index creation in databases as a by-product of query processing; as opposed to traditional full index creation where the indexing effort is performed up front before answering any queries. Adaptive indexing has received a consider-able amount of attention, and several algorithms have been...

    Provided By Saarixx Labs

  • White Papers // Jun 2010

    Branch Target Buffers: WCET Analysis Framework and Timing Predictability

    One step in the verification of hard real-time systems is to determine upper bounds on the Worst Case Execution Times (WCET) of tasks. To obtain tight bounds, a WCET analysis has to consider micro-architectural features like caches, branch prediction, and Branch Target Buffers (BTB). The authors propose a modular WCET...

    Provided By Saarixx Labs

  • White Papers // Aug 2007

    Timing Predictability of Cache Replacement Policies

    Hard real-time systems must obey strict timing constraints. Therefore, one needs to derive guarantees on the Worst-Case Execution Times (WCETs) of a system's tasks. In this context, predictable behavior of system components is crucial for the derivation of tight and thus useful bounds. This paper presents results about the predictability...

    Provided By Saarixx Labs

  • White Papers // Apr 2010

    Precise and Efficient FIFO-Replacement Analysis Based on Static Phase Detection

    Schedulability analysis for hard real-time systems requires bounds on the execution times of its tasks. To obtain useful bounds in the presence of caches, static timing analyses must predict cache hits and misses with high precision. For caches with Least-Recently-Used (LRU) replacement policy, precise and efficient cache analyses exist. However,...

    Provided By Saarixx Labs

  • White Papers // Apr 2008

    Estimating the Performance of Cache Replacement Policies

    Caches are commonly employed to hide the latency gap between memory and the CPU by exploiting locality in memory accesses. The cache performance strongly influences a system's overall performance, as this gap is large and ever-increasing. The efficiency of a given cache architecture - usually measured by its miss ratio...

    Provided By Saarixx Labs

  • White Papers // Jun 2012

    Fuzzing with Code Fragments

    Fuzz testing is an automated technique providing random data as input to a software system in the hope to expose a vulnerability. In order to be effective, the fuzzed input must be common enough to pass elementary consistency checks; a JavaScript interpreter, for instance, would only accept a semantically valid...

    Provided By Saarixx Labs

  • White Papers // Aug 2007

    Towards Enhanced Business Process Models Based on Fuzzy Attributes and Rules

    In business process management, decision situations are often characterized by fuzziness. This means that the decision premises are not available in the form of mathematic models or numeric values, but rather as fuzzy conditions, such as \"Low processing time\" or \"High quality\". This paper will show how fuzzy conditions and...

    Provided By Saarixx Labs

  • White Papers // May 2013

    Towards Empirically Validated Ubiquitous Information Systems: Results From a Pretest and Three Empirical Studies

    Research on Ubiquitous Information Systems (UIS) has recently gained attention in the IS community. But dedicated empirical instruments that are robust and capture individual characteristics of UIS are still missing. A rather new empirical construct derived from Task-Technology Fit theory was proposed and denoted as Situation-Service Fit (SSF) (Maass et...

    Provided By Saarixx Labs

  • White Papers // Jun 2012

    Quantifying Information Flow in Cryptographic Systems

    The authors provide a novel definition of quantitative information flow, called transmissible information, that is suitable for reasoning about informational-theoretically secure (or non-cryptographic) systems, as well as about cryptographic systems with their polynomially bounded adversaries, error probabilities, etc. Transmissible information captures deliberate communication between two processes, and it safely over-approximates...

    Provided By Saarixx Labs

  • White Papers // Jun 2013

    Automated Security Proofs for Almost-Universal Hash for MAC Verification

    Message Authentication Codes (MACs) are an essential primitive in cryptography. They are used to ensure the integrity and authenticity of a message, and can also be used as a building block for larger schemes, such as chosen-ciphertext secure encryption, or identity-based encryption. MACs are often built in two steps: first,...

    Provided By Saarixx Labs

  • White Papers // Jun 2013

    Flexible and Fine-Grained Mandatory Access Control on Android for Diverse Security and Privacy Policies

    In this paper the authors tackle the challenge of providing a generic security architecture for the android OS that can serve as a flexible and effective ecosystem to instantiate different security solutions. In contrast to prior paper their security architecture, termed FlaskDroid, provides mandatory access control simultaneously on both android's...

    Provided By Saarixx Labs

  • White Papers // Dec 2013

    Differential Indistinguishability for Cryptographic Primitives with Imperfect Randomness

    Indistinguishability-based definitions of cryptographic primitives such as encryption, commitments, and zero-knowledge proofs are proven to be impossible to realize in scenarios where parties only have access to non-extractable sources of randomness. In this paper, the authors demonstrate that it is, nevertheless, possible to quantify this secrecy loss for non-extractable...

    Provided By Saarixx Labs

  • White Papers // Oct 2013

    TUC: Time-Sensitive and Modular Analysis of Anonymous Communication

    The Anonymous Communication (AC) protocol Tor constitutes the most widely deployed technology for providing anonymity for user communication over the internet. Tor has been subject to several analyses, which have shown strong anonymity guarantees for Tor. However, all previous analyses ignore time-sensitive leakage: timing patterns in web traffic allow for...

    Provided By Saarixx Labs

  • White Papers // Mar 2013

    Limitations of the Meta-Reduction Technique: The Case of Schnorr Signatures

    The authors revisit the security of Fiat-Shamir signatures in the non-programmable random oracle model. The well-known proof by Pointcheval and Stern for such signature schemes (Journal of Cryptology, 2000) relies on the ability to re-program the random oracle, and it has been unknown whether this property is inherent. The researchers...

    Provided By Saarixx Labs

  • White Papers // Sep 2008

    Limits of Constructive Security Proofs

    The collision-resistance of hash functions is an important foundation of many cryptographic protocols. Formally, collision-resistance can only be expected if the hash function in fact constitutes a parametrized family of functions, since for a single function, the adversary could simply know a single hard-coded collision. In practical applications, however, unkeyed...

    Provided By Saarixx Labs
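
    The point made above, that collision-resistance is only formally meaningful for a parametrized family of functions, can be made concrete with a toy keyed family (an illustrative construction, not one from the paper):

```python
import hashlib

def hash_family(key: bytes, msg: bytes) -> str:
    """Member H_k of a keyed hash family: H_k(m) = SHA-256(k || m)."""
    return hashlib.sha256(key + msg).hexdigest()

# A collision hard-coded against one member of the family says nothing
# about the member selected by a fresh, randomly chosen key:
d1 = hash_family(b"key-1", b"message")
d2 = hash_family(b"key-2", b"message")
print(d1 == d2)  # False: each key selects a different function
```

    For a single unkeyed function, an adversary could ship one precomputed collision; against a keyed family, it would have to find a collision for whichever member the key selects.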

  • White Papers // Sep 2008

    OAEP is Secure Under Key-Dependent Messages

    Key-Dependent Message (KDM) security was introduced by the researchers to address the case where key cycles occur among encryptions, e.g., a key is encrypted with itself. The authors extend this definition to include the cases of adaptive corruptions and arbitrary active attacks, called adKDM security, incorporating several novel design choices...

    Provided By Saarixx Labs

  • White Papers // Aug 2006

    Joint Reference Modeling: Collaboration Support through Version Management

    The derivation of specific models from reference models corresponds with the creation of reference model variants. Research on the design of such variant constructions generally assumes an unchangeable stock of reference models. The potential inherent in the management of these variant constructions which reflect the changes in jointly designed reference...

    Provided By Saarixx Labs

  • White Papers // Apr 2012

    Relational Cache Analysis for Static Timing Analysis

    In hard real-time systems, operations are subject to timing constraints, i.e., there are operational deadlines from events to system responses. To show the correctness of such systems, one must therefore derive guarantees on the timeliness of reactions. One fundamental problem here is to characterize the execution time of programs. The...

    Provided By Saarixx Labs

  • White Papers // Jan 2013

    Measurement-Based Modeling of the Cache Replacement Policy

    Modern microarchitectures employ memory hierarchies involving one or more levels of cache memory to hide the large latency gap between the processor and main memory. Cycle-accurate simulators, self-optimizing software systems, and platform-aware compilers need accurate models of the memory hierarchy to produce useful results. Similarly, worst-case execution time analyzers require...

    Provided By Saarixx Labs
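
    As a toy illustration of the measurement-based approach (not the authors' actual tooling), the following Python sketch simulates a single cache set under LRU and FIFO replacement and uses an access sequence whose final hit/miss outcome differs between the two policies:

```python
def simulate(policy: str, assoc: int, accesses: list) -> list:
    """Simulate one cache set; return True (hit) / False (miss) per access."""
    cache = []  # front = next eviction victim
    hits = []
    for block in accesses:
        if block in cache:
            hits.append(True)
            if policy == "LRU":        # LRU refreshes a block's age on a hit;
                cache.remove(block)    # FIFO keeps insertion order unchanged
                cache.append(block)
        else:
            hits.append(False)
            if len(cache) == assoc:
                cache.pop(0)           # evict the current victim
            cache.append(block)
    return hits

# In a 2-way set, the hit on 'a' protects it from eviction under LRU but
# not under FIFO, so the final access to 'a' distinguishes the policies:
seq = ["a", "b", "a", "c", "a"]
print(simulate("LRU", 2, seq))   # last access hits
print(simulate("FIFO", 2, seq))  # last access misses
```

    Running such distinguishing sequences on real hardware and observing the hit/miss pattern (e.g., via performance counters) is the general idea behind inferring an undocumented replacement policy from measurements.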

  • White Papers // May 2013

    Impact of Resource Sharing on Performance and Performance Prediction: A Survey

    Multi-core processors are increasingly considered as execution platforms for embedded systems because of their good performance/energy ratio. However, the interference on shared resources poses several problems. It may severely reduce the performance of tasks executed on the cores, and it increases the complexity of timing analysis and/or decreases the precision...

    Provided By Saarixx Labs

  • White Papers // Sep 2012

    Automatic Cache Modeling by Measurements

    Modern microarchitectures employ memory hierarchies involving one or more levels of cache memory to hide the large latency gap between the processor and main memory. Cycle-accurate simulators need to accurately model such memory hierarchies to produce useful results. Similarly, worst-case execution time analyzers require faithful models, both for soundness and...

    Provided By Saarixx Labs

  • White Papers // Feb 2013

    Embedded Systems: Many Cores - Many Problems

    The embedded-systems industry is introducing multi-core architectures for their good performance/energy ratio. This trend coincides with the transition from federated to integrated system architectures in the automotive and aeronautics industries. The move to multi-core platforms is a highly risky step as...

    Provided By Saarixx Labs
