University of Texas

  • White Papers // Dec 2010

    Hybrid Partial Evaluation

    The authors present Hybrid Partial Evaluation (HPE), a pragmatic approach to partial evaluation that borrows ideas from both online and offline partial evaluation. HPE performs offline-style specialization using an online approach without static binding time analysis. The goal of HPE is to provide a practical and predictable level of optimization...

    Provided By University of Texas
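
    As a generic illustration of the specialization that partial evaluation performs (a minimal Python sketch under our own assumptions, not the authors' HPE algorithm), the fragment below specializes a power function once its exponent is statically known; the function names are hypothetical.

        # Online-style partial evaluation in miniature: when the exponent is a
        # known ("static") value, unfold the recursion at specialization time
        # and return a residual function of the remaining ("dynamic") input.
        def specialize_power(n):
            """Return a function equivalent to lambda x: x ** n."""
            if n == 0:
                return lambda x: 1
            inner = specialize_power(n - 1)
            return lambda x: x * inner(x)

        square = specialize_power(2)   # residual program for x ** 2
        print(square(7))               # -> 49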

  • White Papers // Dec 2010

    A Customizable Two-Step Framework for General Equipment Provisioning in Optical Transport Networks

    Optical Transport Network (OTN) is a standard approach to offering transport support to a variety of existing service technologies, e.g., ESCON, HDTV, GE, etc. Multiple service technologies can be concurrently multiplexed onto one common transport network, which offers hierarchical transmission rate wrappers physically supported by Wavelength Division Multiplexing (WDM) lambda...

    Provided By University of Texas

  • White Papers // Dec 2010

    It's on Me! The Benefit of Altruism in BAR Environments

    Cooperation, a necessity for any Peer-To-Peer (P2P) cooperative service, is often achieved by rewarding good behavior now with the promise of future benefits. However, in most cases, interactions with a particular peer or the service itself eventually end, resulting in some last exchange in which departing participants have no incentive...

    Provided By University of Texas

  • White Papers // Dec 2010

    Design of the FutureGrid Experiment Management Framework

    FutureGrid provides novel computing capabilities that enable reproducible experiments while simultaneously supporting dynamic provisioning. This paper describes the FutureGrid experiment management framework to create and execute large scale scientific experiments for researchers around the globe. The experiments executed are performed by the various users of FutureGrid ranging from administrators, software...

    Provided By University of Texas

  • White Papers // Dec 2010

    Parallel Graph Partitioning on Multicore Architectures

    Graph partitioning is a common and frequent preprocessing step in many high-performance parallel applications on distributed and shared-memory architectures. It is used to distribute graphs across memory and to improve spatial locality. There are several parallel implementations of graph partitioning for distributed-memory architectures. In this paper, the authors present a...

    Provided By University of Texas

  • White Papers // Dec 2010

    ASSERT: A Wireless Networking Testbed

    As wireless networking is becoming more pervasive, there has been a greater desire to develop communication hardware and protocol stacks that have a number of desirable properties like increased throughput, reduced latency, reduced energy consumption, quality of service, security, etc. Consequently, several academic and industrial research groups are actively working...

    Provided By University of Texas

  • White Papers // Dec 2010

    Vertical Specialization, Intermediate Tariffs, And The Pattern Of Trade: Assessing The Role Of Tariff Liberalization To U.S.

    How important are intermediate tariffs in determining trade patterns? Empirical work measuring the impact of tariff liberalization most commonly focuses on the effects of barriers imposed by importers, but exporter trade policy should also matter when exports are produced with imported intermediates. Guided by extensions of the Eaton and Kortum...

    Provided By University of Texas

  • White Papers // Nov 2010

    Corporate Leverage, Debt Maturity And Credit Default Swaps: The Role Of Credit Supply

    Does the ability of suppliers of debt to hedge risk through Credit Default Swap (CDS) contracts impact firms' capital structures? This paper uses CDS markets as a proxy for a relaxation of firms' credit supply constraints and tests whether supply frictions impact capital structure and debt maturity. The authors find...

    Provided By University of Texas

  • White Papers // Nov 2010

    Transmission Capacity of Carrier Sensing Ad Hoc Networks With Multiple Antennas

    Multiple antennas have become a common component of wireless networks, improving range, throughput, and spatial reuse, both at the link and network levels. At the same time, carrier sensing is a widely used method of improving spatial reuse in distributed wireless networks, especially when there is limited coordination among non-communicating...

    Provided By University of Texas

  • White Papers // Nov 2010

    The Impact Of Bank Health On The Investment Of Its Corporate Borrowers

    This paper investigates the impact of changes in a bank's health on the investment behavior of its current borrowers for a panel of U.S. firms. The author finds that, after controlling for aggregate credit availability and the condition of outside banks, firms reduce their investment when the health of their...

    Provided By University of Texas

  • White Papers // Nov 2010

    How to Leak on Key Updates

    In the continual memory leakage model, security against attackers who can repeatedly obtain leakage is achieved by periodically updating the secret key. This is an appealing model which captures a wide class of side-channel attacks, but all previous constructions in this model provide only a very minimal amount of leakage...

    Provided By University of Texas

  • White Papers // Nov 2010

    Cooperative Spectral Covariance Sensing Under Correlated Shadowing

    This paper investigates the theoretical limits of white space sensing in a Cognitive Radio (CR) network limited by channel correlation. In a log-normal shadowing channel, the received signal power is correlated based on the distance between the sensors and this makes sensing the presence of a signal difficult, even with...

    Provided By University of Texas

  • White Papers // Oct 2010

    User Admission in MIMO Interference Alignment Networks

    In this paper, the authors consider an interference channel where a set of primary active users are cooperating through interference alignment over a constant multiple-input-multiple-output channel while a set of secondary users desire access to the channel. They present the conditions under which a secondary user can be admitted to...

    Provided By University of Texas

  • White Papers // Sep 2010

    Depot: Cloud Storage With Minimal Trust

    The paper describes the design, implementation, and evaluation of Depot, a cloud storage system that minimizes trust assumptions. Depot tolerates buggy or malicious behavior by any number of clients or servers, yet it provides safety and liveness guarantees to correct clients. Depot provides these guarantees using a two-layer architecture. First,...

    Provided By University of Texas

  • White Papers // Sep 2010

    Identity-Based Encryption Secure Against Selective Opening Attack

    The authors present the first Identity-Based Encryption (IBE) schemes that are proven secure against Selective Opening Attack (SOA). This means that if an adversary, given a vector of ciphertexts, adaptively corrupts some fraction of the senders, exposing not only their messages but also their coins, the privacy of the unopened...

    Provided By University of Texas

  • White Papers // Sep 2010

    Challenges and Future Directions of Software Technology: Secure Software Development

    Developing large scale software systems poses major security challenges. This paper describes the issues involved and then addresses two topics: formal methods for emerging secure systems and secure services modeling. Large scale software development is one of the biggest challenges faced by corporations. Incorporating security into the software development process...

    Provided By University of Texas

  • White Papers // Sep 2010

    A Virtual Network Approach for Testing Wireless Mesh in Industrial Process Control

    Unlike wired networks, the configuration and the behavior of wireless networks are heavily dependent on many environmental factors. While this makes it more difficult to build a wireless network testbed whose behavior is controllable, it also makes the availability of such a testbed all the more desirable. Indeed, the difficulties...

    Provided By University of Texas

  • White Papers // Sep 2010

    The Click Convergence Layer: Putting a Modular Router Under DTN2

    The Bundle Protocol shows great promise as a general purpose application-layer protocol for Delay-Tolerant Networks (DTNs) and has found many adopters within the research community. As an application layer protocol, a convergence layer (UDP/TCP sockets or some other domain-specific transport mechanism) is required to deliver bundles between nodes. The authors...

    Provided By University of Texas

  • White Papers // Sep 2010

    On the Accuracy of the Wyner Model in Downlink Cellular Networks

    Compared to real cellular systems where users are spatially distributed and interference levels vary by several orders of magnitude over a cell, in the Wyner model user locations are fixed and the interference intensity is characterized by a single fixed parameter. Although it is a fairly extreme simplification, the Wyner...

    Provided By University of Texas

  • White Papers // Sep 2010

    The Effect of Interference Cancellation on Spectrum-Sharing Transmission Capacity

    The efficiency of spectrum sharing can be improved by interference suppression and/or cancellation. This paper analyzes the performance of spectrum sharing networks with Interference Cancellation (IC) based on the Spectrum-sharing Transmission Capacity (S-TC), defined as the number of successful transmissions per unit area while guaranteeing all target outage probabilities of...

    Provided By University of Texas

  • White Papers // Sep 2010

    Comparison of Fractional Frequency Reuse Approaches in the OFDMA Cellular Downlink

    Fractional Frequency Reuse (FFR) is an interference coordination technique well-suited to OFDMA based wireless networks wherein cells are partitioned into spatial regions with different frequency reuse factors. This paper focuses on evaluating the two main types of FFR deployments: strict FFR and Soft Frequency Reuse (SFR). Relevant metrics are discussed,...

    Provided By University of Texas

  • White Papers // Sep 2010

    Two-Way Transmission Capacity of Wireless Ad-Hoc Networks

    The transmission capacity of an ad-hoc network is the maximum density of active transmitters per unit area, given an outage constraint at each receiver for a fixed rate of transmission. Most prior work on finding the transmission capacity of ad-hoc networks has focused only on one-way communication where a source...

    Provided By University of Texas
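
    For reference, the (one-way) transmission capacity sketched in the abstract above is commonly written as follows; this is the standard textbook form, stated in our own notation rather than quoted from the paper:

        c(\epsilon) \;=\; (1 - \epsilon)\,\max\{\lambda :\ \Pr[\text{outage at a typical receiver}] \le \epsilon\},

    i.e., the largest density \lambda of attempted transmissions per unit area meeting the outage constraint \epsilon, thinned by the fraction of transmissions that succeed.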

  • White Papers // Sep 2010

    A New Tractable Model for Cellular Coverage

    Cellular networks are usually modeled by placing the base stations according to a regular geometry such as a grid, with the mobile users scattered around the network either as a Poisson point process (i.e. uniform distribution) or deterministically. These models have been used extensively for cellular design and analysis but...

    Provided By University of Texas
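
    The Poisson-based modelling idea mentioned above can be illustrated with a minimal Monte Carlo sketch (our own simplified setup with arbitrary parameters, not the authors' analysis): drop base stations as a Poisson point process, serve the user at the origin from the nearest one, and estimate the coverage probability empirically.

        import numpy as np

        rng = np.random.default_rng(0)

        def coverage_probability(lam=1.0, alpha=4.0, threshold=1.0,
                                 region=10.0, trials=2000):
            """Empirical P[SIR > threshold] at the origin for base stations
            forming a Poisson point process of density lam, path-loss
            exponent alpha, no noise or fading (arbitrary simplifications)."""
            covered = 0
            area = (2 * region) ** 2
            for _ in range(trials):
                n = rng.poisson(lam * area)
                if n == 0:
                    continue
                xy = rng.uniform(-region, region, size=(n, 2))
                power = np.hypot(xy[:, 0], xy[:, 1]) ** (-alpha)
                signal = power.max()                 # nearest = strongest BS
                interference = power.sum() - signal
                if interference == 0 or signal / interference > threshold:
                    covered += 1
            return covered / trials

        print(coverage_probability())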

  • White Papers // Aug 2010

    POET: A Scripting Language for Applying Parameterized Source-to-Source Program Transformations

    The authors present POET, a scripting language designed for applying advanced program transformations to code in arbitrary programming languages as well as building ad-hoc translators between these languages. The authors have used POET to support a large number of compiler optimizations, including loop interchange, parallelization, blocking, fusion/fission, strength reduction,...

    Provided By University of Texas

  • White Papers // Aug 2010

    CDMA Uplink Capacity in Both Open and Closed Access Two-Tier Femtocell Networks

    Femtocells are assuming an increasingly important role in cellular coverage. Femtocells can be configured to be either open access or closed access. Seemingly, the network operator would prefer an open access deployment since this provides an inexpensive way to expand their network capabilities, whereas the femtocell owner would prefer closed...

    Provided By University of Texas

  • White Papers // Aug 2010

    The Case for End-User Programming of Ubiquitous Computing Environments

    Gone are the days when computers were used only by select users sitting at a desk with a mouse and keyboard. The next wave of computing, ubiquitous computing, is upon us. With smart phones, tablet computers, and embedded sensors/actuators flourishing, users are already interacting with dozens of computers per...

    Provided By University of Texas

  • White Papers // Aug 2010

    Reducing Configurations to Monitor in a Software Product Line

    A software product line is a family of programs where each program is defined by a unique combination of features. Product lines, like conventional programs, can be checked for safety properties through execution monitoring. However, because a product line induces a number of programs that is potentially exponential in the...

    Provided By University of Texas
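
    To make the "potentially exponential" growth concrete (a generic illustration, not the authors' reduction technique): with n independent optional features there are up to 2^n products, each of which would naively need its own monitor. The feature names below are hypothetical.

        from itertools import product

        features = ["logging", "encryption", "compression"]

        # Every on/off assignment to the optional features is one product,
        # so 3 features already induce 2**3 = 8 configurations to monitor.
        configurations = [dict(zip(features, choice))
                          for choice in product([False, True], repeat=len(features))]
        print(len(configurations))   # -> 8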

  • White Papers // Aug 2010

    Minimum-Delay Service Provisioning in Opportunistic Networks

    Opportunistic networks are created dynamically by exploiting contacts between pairs of mobile devices that come within communication range. While forwarding in opportunistic networking has been explored, investigations into asynchronous service provisioning on top of opportunistic networks are unique contributions of this paper. Mobile devices are typically heterogeneous, possess disparate physical...

    Provided By University of Texas

  • White Papers // Aug 2010

    Template-Based Reconstruction of Complex Refactorings

    Knowing which types of refactoring occurred between two program versions can help programmers better understand code changes. The authors' survey of refactoring identification techniques found that existing techniques cannot easily identify complex refactorings, such as a replace-conditional-with-polymorphism refactoring, which consist of a set of atomic refactorings. This...

    Provided By University of Texas

  • White Papers // Aug 2010

    Objective Risk Evaluation for Automated Security Management

    Network security depends on a number of factors, and a common characteristic of these factors is that they are dynamic in nature. Such factors include new vulnerabilities and threats, the network policy structure, and traffic. These factors can be divided into two broad categories: network risk and service risk. As...

    Provided By University of Texas

  • White Papers // Aug 2010

    Greedy Distance Vector Routing

    Greedy Distance Vector (GDV) is the first geographic routing protocol designed to optimize end-to-end path costs using any additive routing metric, such as: hop count, latency, ETX, ETT, etc. GDV requires no node location information. Instead, GDV uses estimated routing costs to destinations which are locally computed from node positions...

    Provided By University of Texas
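
    A schematic of the greedy forwarding rule with an additive metric (our simplification for illustration, not the GDV protocol itself): forward to the neighbor that minimizes the cost of the direct link plus that neighbor's estimated remaining cost to the destination.

        def greedy_next_hop(neighbors, link_cost, est_cost_to_dest):
            """Pick the neighbor minimizing link cost plus its estimated
            remaining cost to the destination (hop count, latency, ETX, ...).
            Returns None when there is no neighbor."""
            return min(neighbors,
                       key=lambda n: link_cost[n] + est_cost_to_dest[n],
                       default=None)

        # Hypothetical example: B wins because 1 + 4 < 2 + 5.
        print(greedy_next_hop(["A", "B"],
                              {"A": 2, "B": 1},
                              {"A": 5, "B": 4}))   # -> B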

  • White Papers // Aug 2010

    A Super-Logarithmic Lower Bound for Shuffle-Unshuffle Sorting Networks

    A variety of different classes of sorting networks have been described in the literature. Of particular interest here are the so-called AKS network discovered by Ajtai, Komlos, and Szemeredi, and the sorting networks proposed by Batcher. While the AKS network is the only known sorting network with O(lg n) depth,...

    Provided By University of Texas

  • White Papers // Aug 2010

    Global Scheduling Based Reliability-Aware Power Management for Multiprocessor Real-Time Systems

    Reliability-Aware Power Management (RAPM) has been a recent research focus due to the negative effects of the popular power management technique Dynamic Voltage and Frequency Scaling (DVFS) on system reliability. As a result, several RAPM schemes have been proposed for uniprocessor real-time systems. In this paper, for a set of frame-based...

    Provided By University of Texas

  • White Papers // Aug 2010

    Characterizing Link and Path Reliability in Large-Scale Wireless Sensor Networks

    Reliable Data Transfer (RDT) is one of the key issues in Wireless Sensor Networks (WSNs) and can be achieved by using link-level re-transmissions and multipath routing. Another key issue is the scalability of WSNs. In this paper, the authors try to better understand and characterize/quantify the relationships between reliability and...

    Provided By University of Texas

  • White Papers // Aug 2010

    Identifying Unnecessary Bounds Checks Through Block-Qualified Variable Elimination

    Java's memory-safety relies on the Java Virtual Machine checking that each access to an array does not exceed the bounds of that array. When static analysis can determine that some array access instruction will never use an out of bounds index, the cost of dynamically checking the index each time...

    Provided By University of Texas

  • White Papers // Aug 2010

    Functional Specification and Verification of Object-Oriented Programs

    One weakness of Hoare-style verification techniques based on first-order predicate logic is that reasoning is backward from post-conditions to preconditions. A natural, forward reasoning is possible by viewing a program as a mathematical function that maps one program state to another. This functional program verification technique requires a minimal mathematical...

    Provided By University of Texas

  • White Papers // Jul 2010

    Source Capture Time Analysis of Privacy Communication Protocols for Wireless Sensor Networks

    Communication privacy techniques that protect the locations of source sensor nodes and sink nodes from either global or local adversaries have received significant attention recently. The improvement in capture time, which is the time it takes for an adversary to identify the location of the source, is often estimated using...

    Provided By University of Texas

  • White Papers // Jul 2010

    Fast Neighbor Discovery With Lightweight Termination Detection in Heterogeneous Cognitive Radio Networks

    An important step in the initialization of a wireless ad hoc network is neighbor discovery in which every node attempts to determine the set of nodes it can communicate with in one wireless hop. In recent years, cognitive radio technology has gained attention as an attractive approach to alleviate the...

    Provided By University of Texas

  • White Papers // Jul 2010

    Femtocell Access Control in the TDMA/OFDMA Uplink

    This paper investigates open vs. closed access in the uplink of femtocell networks, for the particular case of orthogonal multiple access protocols like TDMA or OFDMA. Open access reduces near-far interference and provides an inexpensive way to expand the capacity of the operator's network. Additionally, with a cap on the...

    Provided By University of Texas

  • White Papers // Jul 2010

    On the Insecurity of Parallel Repetition for Leakage Resilience

    A fundamental question in leakage-resilient cryptography is: Can leakage resilience always be amplified by parallel repetition? It is natural to expect that if people have a leakage-resilient primitive tolerating l bits of leakage, they can take n copies of it to form a system tolerating nl bits of leakage. In this...

    Provided By University of Texas

  • White Papers // Jan 2010

    Energy Consumption in Wireless Sensor Networks Using GSP

    One of the main design issues for a sensor network is conservation of the energy available at each sensor node. The authors propose to deploy multiple, mobile base stations to prolong the lifetime of the sensor network. They split the lifetime of the sensor network into equal periods of time...

    Provided By University of Texas

  • White Papers // Sep 2011

    Building Malware Infection Trees

    "Dynamic analysis of malware is an ever evolving and challenging task. A Malware infection Tree (MiT) can assist in analysis by identifying processes and files related to a specific malware sample. In this paper, the authors propose an abstract approach to building a comprehensive MiT based on rules describing execution...

    Provided By University of Texas

  • White Papers // Nov 2011

    Leakage - Delay Tradeoff in Wide-Bit Nanoscale CMOS Adders

    The scaling of nanometer technology has had a major impact on the power dissipation of CMOS circuits. As transistor size decreases, it has become apparent that leakage power is becoming a dominant obstacle for future technology. In this paper, the importance of static power consumption on the design of...

    Provided By University of Texas

  • White Papers // Jun 2009

    Switching Activity Calculation of VLSI Adders

    Using exact switching activity rates at all internal nodes when calculating energy of digital circuits is believed to result in improved accuracy over the use of average switching activity. The authors compare the two approaches in the case of the Kogge-Stone adder implemented with Weinberger and Ling addition recurrences....

    Provided By University of Texas

  • White Papers // Oct 2009

    Energy-Delay Space Exploration of Clocked Storage Elements Using Circuit Sizing

    A rapid energy-delay exploration methodology based on circuit sizing, as applied to clocked storage elements, is presented. Circuit delay and energy are modeled using an improved RC delay model of a transistor. The accuracy of the model is increased by using a logical effort setup accounting for input signal slope and extraction of...

    Provided By University of Texas

  • White Papers // Jan 2015

    KLOS: A High Performance Kernel-Less Operating System

    Operating systems provide services that are accessed by processes via mechanisms that involve a ring transition to transfer control to the kernel where the required function is performed. This has one significant drawback: every service call involves the overhead of a context switch in which processor state is saved and...

    Provided By University of Texas

  • White Papers // Nov 2008

    WebOS: Operating System Services for Wide Area Applications

    In this paper, the authors demonstrate the power of providing a common set of operating system services to wide-area applications, including mechanisms for naming, persistent storage, remote process execution, resource management, authentication, and security. On a single machine, application developers can rely on the local operating system to provide these...

    Provided By University of Texas

  • White Papers // Nov 2014

    Encoded Archival Description: Data Quality and Analysis

    In order to authenticate the meaning of collections and to preserve their evidentiary value, archivists create documents (finding aids) that describe the provenance and original order of the records. Metadata standards such as Encoded Archival Description (EAD) enable finding aids to be encoded, searched, and displayed online. However, recent research...

    Provided By University of Texas

  • White Papers // Jan 2012

    A Lightweight Algorithm for Causal Message Ordering in Mobile Computing Systems

    Causally ordered message delivery is a required property for several distributed applications particularly those that involve human interactions (such as teleconferencing and collaborative work). In this paper, the authors present an efficient protocol for causal ordering in mobile computing systems. This protocol requires minimal resources on mobile hosts and wireless...

    Provided By University of Texas
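
    For context, the classical vector-clock delivery condition for causal ordering is sketched below; this is the textbook rule, not the authors' lightweight protocol for mobile hosts.

        def can_deliver(msg_vc, local_vc, sender):
            """A message m from `sender` may be delivered iff it is the next
            message expected from the sender and everything m causally
            depends on from other processes has already been delivered."""
            if msg_vc[sender] != local_vc[sender] + 1:
                return False
            return all(msg_vc[k] <= local_vc[k]
                       for k in range(len(msg_vc)) if k != sender)

        # Hypothetical 3-process example: the first message from process 0,
        # depending on nothing else, is deliverable.
        print(can_deliver([1, 0, 0], [0, 0, 0], sender=0))   # -> True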

  • White Papers // Nov 2011

    Heterogeneous Cellular Networks: From Theory to Practice

    The proliferation of internet-connected mobile devices will continue to drive growth in data traffic in an exponential fashion, forcing network operators to dramatically increase the capacity of their networks. To do this cost-effectively, a paradigm shift in cellular network infrastructure deployment is occurring away from traditional (expensive) high-power tower-mounted base...

    Provided By University of Texas

  • White Papers // Sep 2009

    Random Access Transport Capacity

    The authors develop a new metric for quantifying end-to-end throughput in multihop wireless networks, which they term random access transport capacity since the interference model presumes uncoordinated transmissions. The metric quantifies the average maximum rate of successful end-to-end transmissions, multiplied by the communication distance, and normalized by the network area....

    Provided By University of Texas
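
    Rendered schematically in our own notation (the paper's formal definition may differ), the metric described above reads

        C_{\mathrm{ra}} \;=\; \frac{\bar{R}_{\mathrm{succ}} \cdot d}{A},

    where \bar{R}_{\mathrm{succ}} is the average maximum rate of successful end-to-end transmissions, d the communication distance, and A the network area.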

  • White Papers // Mar 2010

    A New Eclipse-Based JML Compiler Built Using AST Merging

    The Java Modeling Language (JML) is a formal interface specification language to document the behavior of Java program modules and has been used in many research and industrial projects. However, its inability to support Java 5 features such as generics is reducing its user base significantly. Besides, the JML compiler...

    Provided By University of Texas

  • White Papers // Sep 2011

    Supervised Learning for Insider Threat Detection Using Stream Mining

    Insider threat detection requires the identification of rare anomalies in contexts where evolving behaviors tend to mask such anomalies. This paper proposes and tests an ensemble-based stream mining algorithm based on supervised learning that addresses this challenge by maintaining an evolving collection of multiple models to classify dynamic data streams...

    Provided By University of Texas
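
    A minimal sketch of the general ensemble-over-chunks pattern the abstract alludes to (a generic illustration, not the authors' algorithm): train one model per data chunk, retain the K most recent models, and classify new instances by vote. The model interface below is hypothetical.

        from collections import deque

        class ChunkEnsemble:
            """Keep at most k models, one per recent data chunk, and classify
            by majority vote. `train_model(X, y)` is any callable returning an
            object with a .predict(x) method."""
            def __init__(self, train_model, k=5):
                self.train_model = train_model
                self.models = deque(maxlen=k)     # oldest model evicted first

            def update(self, chunk_X, chunk_y):
                self.models.append(self.train_model(chunk_X, chunk_y))

            def predict(self, x):
                votes = [m.predict(x) for m in self.models]
                return max(set(votes), key=votes.count) if votes else None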

  • White Papers // Nov 2011

    Insider Threat Detection Using Stream Mining and Graph Mining

    Evidence of malicious insider activity is often buried within large data streams, such as system logs accumulated over months or years. Ensemble-based stream mining leverages multiple classification models to achieve highly accurate anomaly detection in such streams even when the stream is unbounded, evolving, and unlabeled. This makes the approach...

    Provided By University of Texas

  • White Papers // Apr 2011

    Towards Security-Aware Program Visualization for Analyzing In-Lined Reference Monitors

    In-lined Reference Monitoring frameworks are an emerging technology for enforcing security policies over untrusted, mobile, binary code. However, formulating correct policy specifications for such frameworks to enforce remains a daunting undertaking with few supporting tools. A visualization approach is proposed to aid in this task; preliminary results are presented in...

    Provided By University of Texas

  • White Papers // Oct 2009

    Model-Checking In-Lined Reference Monitors

    A technique for elegantly expressing In-lined Reference Monitor (IRM) certification as model-checking is presented and implemented. In-lined Reference Monitors (IRMs) enforce software security policies by in-lining dynamic security guards into untrusted binary code. Certifying IRM systems provide strong formal guarantees for such systems by verifying that the instrumented code produced...

    Provided By University of Texas

  • White Papers // Aug 2011

    Remote Batch Invocation for SQL Databases

    Batch services are a new approach to distributed computation in which clients send batches of operations for execution on a server and receive hierarchical results sets in response. In this paper, the authors show how batch services provide a simple and powerful interface to relational databases, with support for arbitrary...

    Provided By University of Texas

  • White Papers // Sep 2011

    Toward the Verification of a Simple Hypervisor

    Virtualization promises significant benefits in security, efficiency, dependability, and cost. Achieving these benefits depends upon the reliability of the underlying virtual machine monitors (hypervisors). This paper describes an ongoing project to develop and verify MinVisor, a simple but functional Type-I x86 hypervisor, proving protection properties at the assembly level using...

    Provided By University of Texas

  • White Papers // Jun 2011

    Enhancing the Role of Inlining in Effective Interprocedural Parallelization

    The emergence of multi-core architectures makes it essential for optimizing compilers to automatically extract parallelism for large scientific applications composed of many subroutines residing in different files. Inlining is a well-known technique which can be used to erase procedural boundaries and enable more aggressive loop parallelization. However, conventional inlining cannot...

    Provided By University of Texas

  • White Papers // Dec 2009

    Adaptive Limited Feedback for Sum-Rate Maximizing Beamforming in Cooperative Multicell Systems

    Base station cooperation improves the sum-rates that can be achieved in cellular systems. Conventional cooperation techniques require sharing large amounts of information over finite-capacity backhaul links and assume that base stations have full Channel State Information (CSI) of all the active users in the system. In this paper, a new...

    Provided By University of Texas

  • White Papers // Jul 2010

    User Partitioning for Less Overhead in MIMO Interference Channels

    This paper presents a study on multiple-antenna interference channels, accounting for general overhead as a function of the number of users and antennas in the network. The model includes both perfect and imperfect channel state information based on channel estimation in the presence of noise. Three low-complexity methods are proposed...

    Provided By University of Texas

  • White Papers // Jan 2012

    Identifying Failure-Inducing Combinations in a Combinatorial Test Set

    A t-way combinatorial test set is designed to detect failures that are triggered by combinations involving no more than t parameters. Assume that the authors have executed a t-way test set and some tests have failed. A natural question to ask is: what combinations have caused these failures? Identifying such...

    Provided By University of Texas
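
    One simple, deliberately naive way to shortlist candidates, sketched as a generic illustration rather than the authors' method: collect every t-way parameter-value combination that occurs in some failing test but in no passing test.

        from itertools import combinations

        def suspicious_combinations(passing, failing, t=2):
            """Each test is a dict of parameter -> value. Return the t-way
            combinations that appear only in failing tests."""
            def combos(test):
                return {frozenset(c) for c in combinations(sorted(test.items()), t)}

            in_passing = set().union(*(combos(p) for p in passing)) if passing else set()
            in_failing = set().union(*(combos(f) for f in failing)) if failing else set()
            return in_failing - in_passing

        # Hypothetical example: only the pair (os=linux, browser=ie) is suspicious.
        print(suspicious_combinations(
            passing=[{"os": "win", "browser": "ie"}, {"os": "linux", "browser": "ff"}],
            failing=[{"os": "linux", "browser": "ie"}]))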

  • White Papers // Jan 2010

    Logical Concurrency Control From Sequential Proofs

    The authors are interested in identifying and enforcing the isolation requirements of a concurrent program, i.e., concurrency control that ensures that the program meets its specification. This can be done systematically starting from a sequential proof, i.e., a proof of correctness of the program in the absence of concurrent interleavings....

    Provided By University of Texas

  • White Papers // Dec 2011

    Making Argument Systems for Outsourced Computation Practical (Sometimes)

    This paper describes the design, implementation, and evaluation of a system for performing verifiable outsourced computation. It has long been known that this problem can be solved in theory using Probabilistically Checkable Proofs (PCPs) coupled with modern cryptographic tools, but these solutions have wholly impractical performance, according to the conventional...

    Provided By University of Texas

  • White Papers // Apr 2011

    Toward Practical and Unconditional Verification of Remote Computations

    This paper revisits a classic question: how can a machine specify a computation to another one and then, without executing the computation, check that the other machine carried it out correctly? The applications of such a primitive include cloud computing (a computationally limited device offloads processing to the cloud but...

    Provided By University of Texas

  • White Papers // Apr 2011

    Repair From a Chair: Computer Repair as an Untrusted Cloud Service

    Today, when people need their computers repaired, their process is not very different from hiring someone to fix a television: they either bring the computer to a repair service, or they call a technician (or family member) and ask for a house call. This process is inconvenient. It also risks...

    Provided By University of Texas

  • White Papers // Jan 2012

    XML Query Routing in Structured P2P Systems

    This paper addresses the problem of data placement, indexing, and querying large XML data repositories distributed over an existing P2P service infrastructure. The authors' architecture scales gracefully to the network and data sizes, is fully distributed, fault tolerant and self-organizing, and handles complex queries efficiently, even those queries that use...

    Provided By University of Texas

  • White Papers // Jan 2011

    A Fully Pipelined XQuery Processor

    The authors present a high-performance, pull-based streaming processor for XQuery, called XQPull, that can handle many essential features of the language, including general predicates, recursive queries, backward axis steps, and function calls, using a very small amount of caching. Their framework is based on a new type of event streams,...

    Provided By University of Texas

  • White Papers // May 2010

    Runtime Constraint Checking Approaches for OCL, a Critical Comparison

    There are many benefits of checking design constraints at run-time - for example, automatic detection of design drift or corrosion. However, there is no comparative analysis of different approaches although such an analysis could provide a sound basis for determining the appropriateness of one approach over the others. In this...

    Provided By University of Texas

  • White Papers // Aug 2011

    The CleanJava Language for Functional Program Verification

    Unlike Hoare-style program verification, functional program verification supports forward reasoning by viewing a program as a mathematical function from one program state to another and proving its correctness by essentially comparing two mathematical functions, the function computed by the program and its specification. Since it requires a minimal mathematical background...

    Provided By University of Texas

  • White Papers // Nov 2011

    Functional Verification of Class Invariants in CleanJava

    In Cleanroom-style functional program verification, a program is viewed as a mathematical function from one program state to another, and the program is verified by comparing two functions, the implemented and the expected behaviors of a program. The technique requires a minimal mathematical background and supports forward reasoning, but it...

    Provided By University of Texas

  • White Papers // Aug 2011

    A Tutorial on Functional Program Verification

    This paper gives a quick tutorial introduction to functional program verification. In functional program verification, a program is viewed as a mathematical function from one program state to another, and proving its correctness is essentially comparing two mathematical functions, the function computed by the program and the specification of the...

    Provided By University of Texas
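
    A toy rendering of the "compare two functions" idea in Python (an informal illustration only; functional verification proceeds by proof, not by testing): the state-to-state function computed by the code is compared against the intended function.

        # A program state is modelled as a dict of variable names to values.
        def intended(state):
            """Specification: the intended function [x := x + 2]."""
            return {**state, "x": state["x"] + 2}

        def program(state):
            """Implementation: two increments, composed as state-to-state steps."""
            s1 = {**state, "x": state["x"] + 1}
            return {**s1, "x": s1["x"] + 1}

        # Verification would prove program == intended for all states;
        # here we merely spot-check the equality on one sample state.
        print(program({"x": 5}) == intended({"x": 5}))   # -> True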

  • White Papers // Apr 2010

    Access Control Contracts for Java Program Modules

    Application-level security has become an issue in recent years; for example, errors, discrepancies and omissions in the specification of access control constraints of security-sensitive software components are recognized as an important source of security vulnerabilities. The authors propose to formally specify access control assumptions or constraints of a program module...

    Provided By University of Texas