University of Texas

  • White Papers // Mar 2011

    A Generic Handover Decision Management Framework for Next Generation Networks

    Next generation networks are defined as packet-based networks that provide telecommunication services to users over different transport technologies, both wired and wireless. In this paper, the authors provide a generic framework for handover decision management in next generation networks. They show that any handover decision algorithm can utilize...

    Provided By University of Texas
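
    The entry above describes handover decision management only at a high level. As a purely illustrative sketch (not the paper's framework), the snippet below shows a conventional hysteresis-plus-time-to-trigger handover rule that such a framework could wrap; the margin and timer values are hypothetical.

```python
# Illustrative hysteresis + time-to-trigger handover rule (not the paper's framework).
# A handover is triggered only if the candidate cell's signal exceeds the serving
# cell's by `margin_db` for `time_to_trigger` consecutive measurement intervals.

def should_handover(serving_rss_dbm, candidate_rss_dbm, state,
                    margin_db=3.0, time_to_trigger=4):
    """Return (decision, new_state). `state` counts consecutive intervals
    in which the candidate beat the serving cell by the margin."""
    if candidate_rss_dbm >= serving_rss_dbm + margin_db:
        state += 1
    else:
        state = 0
    return state >= time_to_trigger, state


if __name__ == "__main__":
    state = 0
    serving = [-80, -82, -85, -88, -90, -92]      # dBm measurements (made up)
    candidate = [-85, -84, -83, -82, -81, -80]
    for s, c in zip(serving, candidate):
        decide, state = should_handover(s, c, state)
        print(f"serving={s} candidate={c} handover={decide}")
```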

  • White Papers // Feb 2011

    A New Model for Coverage With Fractional Frequency Reuse in OFDMA Cellular Networks

    Fractional Frequency Reuse (FFR) is an interference management technique well-suited to OFDMA-based cellular networks, wherein the cells are partitioned into spatial regions with different frequency reuse factors. To date, FFR techniques have typically been evaluated through system-level simulations using a hexagonal grid for the base station locations. This paper...

    Provided By University of Texas

  • White Papers // Feb 2011

    Is That You? Authentication in a Network Without Identities

    Most networks require that their users have "Identities", i.e. names that are fixed for a relatively long time, unique, and approved by a central authority (in order to guarantee their uniqueness). Unfortunately, this requirement, which was introduced to simplify the design of networks, has its own drawbacks....

    Provided By University of Texas

  • White Papers // Feb 2011

    On Secure and Resilient Telesurgery Communications Over Unreliable Networks

    Telesurgical Robot Systems (TRSs) address mission critical operations emerging in extreme fields such as battlefields, underwater, and disaster territories. The lack of wire-lined communication infrastructure in such fields makes the use of wireless technologies including satellite and ad-hoc networks inevitable. TRSs over wireless environments pose unique challenges such as preserving...

    Provided By University of Texas

  • White Papers // Feb 2011

    Synthesizing Concurrent Schedulers for Irregular Algorithms

    Scheduling is the assignment of tasks or activities to processors for execution, and it is an important concern in parallel programming. Most prior work on scheduling has focused either on static scheduling of applications in which the dependence graph is known at compile-time or on dynamic scheduling of independent loop...

    Provided By University of Texas

  • White Papers // Feb 2011

    A Tractable Approach to Coverage and Rate in Cellular Networks

    Cellular networks are usually modeled by placing the base stations on a grid, with mobile users either randomly scattered or placed deterministically. These models have been used extensively but suffer from being both highly idealized and not very tractable, so complex system-level simulations are used to evaluate coverage/outage probability and...

    Provided By University of Texas
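
    The entry above contrasts grid-based cellular models with a tractable random spatial model. As a minimal illustration of that modeling idea (parameters here are assumptions, not the paper's), the sketch below estimates downlink SINR coverage by Monte Carlo when base stations form a Poisson point process with nearest-base-station association and Rayleigh fading.

```python
# Minimal Monte Carlo sketch: downlink SINR coverage with base stations drawn
# from a Poisson point process (PPP), Rayleigh fading, path-loss exponent 4,
# interference-limited (no noise). All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(sinr_threshold_db=0.0, density=1.0, alpha=4.0,
                         radius=30.0, trials=2000):
    thr = 10 ** (sinr_threshold_db / 10)
    covered = 0
    for _ in range(trials):
        n = rng.poisson(density * np.pi * radius ** 2)
        if n == 0:
            continue
        r = radius * np.sqrt(rng.uniform(size=n))     # BS distances, uniform in a disk
        fading = rng.exponential(size=n)              # Rayleigh power fading
        power = fading * r ** (-alpha)                # received powers at the origin
        serving = np.argmin(r)                        # nearest-BS association
        interference = power.sum() - power[serving]
        if interference == 0:
            covered += 1                              # lone base station in the disk
            continue
        covered += power[serving] / interference > thr
    return covered / trials

print(coverage_probability())
```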

  • White Papers // Feb 2011

    Ergodic Spatial Throughput of Wireless Ad Hoc Networks With Markovian Fading Channels

    Most work on wireless network throughput ignores the temporal correlation inherent to wireless channels because of tractability concerns. In order to better capture the temporal variations of wireless network throughput, this paper introduces the metric of Ergodic Spatial Throughput (EST), which includes spatial and temporal ergodicity. All transmitters in...

    Provided By University of Texas
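
    The entry above emphasizes temporal channel correlation. A standard way to capture it is a finite-state Markov (Gilbert-Elliott style) channel; the sketch below simulates such a two-state channel and its time-averaged throughput. The transition probabilities and per-state rates are assumptions for illustration, not the paper's model.

```python
# Two-state (Gilbert-Elliott style) Markov fading channel: the link alternates
# between a "good" and a "bad" state, so successive slots are correlated.
# All numbers are illustrative assumptions.
import random

P = {"good": {"good": 0.9, "bad": 0.1},    # transition probabilities
     "bad":  {"good": 0.3, "bad": 0.7}}
RATE = {"good": 2.0, "bad": 0.2}            # bits/s/Hz achievable per state

def simulate(slots=100000, seed=1):
    random.seed(seed)
    state, total = "good", 0.0
    for _ in range(slots):
        total += RATE[state]
        state = "good" if random.random() < P[state]["good"] else "bad"
    return total / slots                    # time-averaged throughput

print(f"time-averaged throughput ~ {simulate():.3f} bits/s/Hz")
```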

  • White Papers // Feb 2011

    Innovating To Optimize Supply Chain Management

    As an innovator dedicated to collaboration with its network of vendors and customers, Advanced Micro Devices (AMD) considers the supply chain to be a key component of its success and one of its most important competitive assets. How important? Just one change in the way the authors handle the supply chain...

    Provided By University of Texas

  • White Papers // Jan 2011

    Corporate Taxes And Investment: The Cash Flow Channel

    Existing literature focuses on how corporate taxation affects firms' investment decisions by altering after-tax returns. This paper instead examines how corporate taxation affects investment by reducing the cash flow a firm has available to invest in the current period. The author uses a sharp nonlinearity in the mapping from pre-tax...

    Provided By University of Texas

  • White Papers // Jan 2011

    Decentralizing Attribute-Based Encryption

    The authors propose a Multi-Authority Attribute-Based Encryption (ABE) system. In the system, any party can become an authority and there is no requirement for any global coordination other than the creation of an initial set of common reference parameters. A party can simply act as an ABE authority by creating...

    Provided By University of Texas

  • White Papers // Jan 2011

    Pseudorandom Financial Derivatives

    Arora, Barak, Brunnermeier, and Ge [ABBG] showed that taking computational complexity into account, a dishonest seller could dramatically increase the lemon costs of a family of financial derivatives. The authors show that if the seller is required to construct derivatives of a certain form, then this phenomenon disappears. In particular,...

    Provided By University of Texas

  • White Papers // Jan 2011

    Programming Many-Core Architectures - A Case Study: Dense Matrix Computations on the Intel SCC Processor

    A message passing, distributed-memory parallel computer on a chip is one possible design for future, many-core architectures. The authors discuss initial experiences with the Intel Single-chip Cloud Computer research processor, which is a prototype architecture that incorporates 48 cores on a single die that can communicate via a small, shared,...

    Provided By University of Texas

  • White Papers // Jan 2011

    Correlation of Link Outages in Low-Mobility Spatial Wireless Networks

    In this paper, the authors consider a network where the nodes' locations are modeled by a realization of a Poisson point process and remain fixed or change very slowly over time. Most of the literature focuses on the spatial average of the link outage probabilities. But each link in the...

    Provided By University of Texas

  • White Papers // Jan 2011

    Using Valgrind to Detect Undefined Value Errors With Bit-Precision

    The authors present Memcheck, a tool that has been implemented with the dynamic binary instrumentation framework Valgrind. Memcheck detects a wide range of memory errors in programs as they run. This paper focuses on one kind of error that Memcheck detects: undefined value errors. Such errors are common, and often...

    Provided By University of Texas

  • White Papers // Jan 2011

    DRES: Dynamic Range Encoding Scheme for TCAM Coprocessors

    One of the most critical resource management issues in the use of Ternary Content Addressable Memory (TCAM) for packet classification/filtering is how to effectively support filtering rules with ranges, known as range matching. In this paper, a Dynamic Range Encoding Scheme (DRES) is proposed to significantly improve TCAM storage efficiency...

    Provided By University of Texas
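
    The entry above concerns range matching in TCAMs. The storage problem it targets stems from the standard prefix expansion of a range rule: a single port range can expand into many ternary prefixes. The sketch below shows that baseline expansion (background only, not DRES itself), which makes the storage cost that encoding schemes try to reduce concrete.

```python
# Baseline range-to-prefix expansion for a 16-bit port field. Each range rule
# becomes several ternary prefixes, which is the TCAM storage blow-up that
# range-encoding schemes such as DRES aim to reduce. (Standard expansion
# sketch, not DRES.)

def range_to_prefixes(lo, hi, bits=16):
    """Split [lo, hi] into the minimal set of aligned power-of-two blocks,
    returned as (value, prefix_length) pairs."""
    prefixes = []
    while lo <= hi:
        # Largest aligned block starting at `lo` that still fits in [lo, hi].
        size = lo & -lo if lo else 1 << bits
        while size > hi - lo + 1:
            size //= 2
        prefixes.append((lo, bits - size.bit_length() + 1))
        lo += size
    return prefixes

# The classic example: ports 1024-65535 expand into 6 ternary prefixes.
for value, plen in range_to_prefixes(1024, 65535):
    print(f"{value:016b}"[:plen] + "*" * (16 - plen))
```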

  • White Papers // Jan 2011

    Vertical Specialization, Intermediate Tariffs, And The Pattern Of Trade: Assessing The Role Of Tariff Liberalization To U.S. Bilateral Trade 1989-2001

    How important are intermediate tariffs in determining trade patterns? Empirical work measuring the impact of tariff liberalization most commonly focuses on the effects of barriers imposed by importers, but exporter trade policy should also matter when exports are produced with imported intermediates. Guided by extensions of the Eaton and Kortum...

    Provided By University of Texas

  • White Papers // Jan 2011

    Toward a Practical Packet Marking Approach for IP Traceback

    IP traceback is an important step in defending against Denial-of-Service (DoS) attacks. Probabilistic Packet Marking (PPM) has been studied as a promising approach to realize IP traceback. In this paper, the authors propose a new PPM approach that improves the current state of the art in two practical directions: it...

    Provided By University of Texas
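
    The entry above builds on probabilistic packet marking, in which each router overwrites a small marking field in a forwarded packet with some probability so that the victim can reconstruct the attack path from many packets. The sketch below is generic node-sampling PPM background under made-up router names; it is not the authors' improved scheme.

```python
# Generic probabilistic packet marking (node sampling) sketch: each router on
# the path overwrites the packet's single marking slot with probability p, so
# packets arriving at the victim carry samples of upstream routers. Textbook
# PPM background, not the paper's approach.
import random
from collections import Counter

def forward(path, p=0.04):
    """Simulate one packet traversing `path` (attacker-side first);
    return the marking it carries on arrival (or None)."""
    mark = None
    for router in path:
        if random.random() < p:
            mark = router           # router overwrites the marking field
    return mark

path = ["R1", "R2", "R3", "R4", "R5"]    # hypothetical attack path
marks = Counter(forward(path) for _ in range(50000))
print(marks)   # routers nearer the victim are sampled more often, which is
               # why path reconstruction needs many marked packets
```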

  • White Papers // Jan 2011

    The Impact Of Financial Constraints On The Relation Between Shareholder Taxes And The Cost Of Equity Capital

    Using both the Tax Relief Act of 1997 (TRA) and the Jobs and Growth Tax Relief and Reconciliation Act of 2003 (JGTRRA), the authors conduct the first empirical investigation on how the tax cuts on dividends and/or capital gains affect the cost of equity differently for firms facing different degrees...

    Provided By University of Texas

  • White Papers // Jan 2011

    A Parametric Model for the Distribution of the Angle of Arrival and the Associated Correlation Function and Power Spectrum at the Mobile Station

    One of the main assumptions in Clarke's classic channel model is isotropic scattering, i.e. a uniform distribution for the angle of arrival of multipath components at the mobile station. However, in many mobile radio channels the authors encounter non-isotropic scattering, which strongly affects the correlation function and power spectrum of...

    Provided By University of Texas
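
    A widely used parametric model for non-isotropic angle of arrival is the von Mises density; the excerpt does not confirm that this is the model used in the paper, so the sketch below is only a generic illustration of how a concentration parameter reshapes the temporal correlation relative to the isotropic (Clarke) case.

```python
# Illustrative only: empirical temporal correlation of the channel under
# isotropic (uniform AoA) scattering vs. a concentrated von Mises AoA
# distribution. The von Mises choice and all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fd = 100.0                         # maximum Doppler frequency (Hz), assumed
taus = np.linspace(0.0, 0.02, 41)  # time lags (s)

def correlation(angles):
    # R(tau) = E[exp(j * 2*pi*fd*tau * cos(theta))] over the AoA samples.
    return np.array([np.abs(np.mean(np.exp(1j * 2 * np.pi * fd * t *
                                           np.cos(angles)))) for t in taus])

uniform_aoa = rng.uniform(-np.pi, np.pi, 200000)
vonmises_aoa = rng.vonmises(mu=0.0, kappa=3.0, size=200000)

for t, ru, rv in zip(taus[::10], correlation(uniform_aoa)[::10],
                     correlation(vonmises_aoa)[::10]):
    print(f"tau={t:.3f}s  |R_iso|={ru:.3f}  |R_vonMises|={rv:.3f}")
```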

  • White Papers // Jan 2011

    Pilot Designs for Consistent Frequency Offset Estimation in OFDM Systems

    This paper presents pilot designs for consistent frequency offset estimation of OFDM systems in frequency-selective fading channels. The authors describe two design approaches, namely consistency in the probabilistic sense and absolute consistency. Existing preambles and pilot designs in the literature do not guarantee the absolute consistency. They derive general criteria...

    Provided By University of Texas

  • White Papers // Jan 2011

    A Fully Pipelined XQuery Processor

    The authors present a high-performance, pull-based streaming processor for XQuery, called XQPull, that can handle many essential features of the language, including general predicates, recursive queries, backward axis steps, and function calls, using a very small amount of caching. Their framework is based on a new type of event streams,...

    Provided By University of Texas

  • White Papers // Jan 2011

    Energy Efficient Schemes for Wireless Sensor Networks With Multiple Mobile Base Stations

    One of the main design issues for a sensor network is conservation of the energy available at each sensor node. The authors propose to deploy multiple, mobile base stations to prolong the lifetime of the sensor network. They split the lifetime of the sensor network into equal periods of time...

    Provided By University of Texas

  • White Papers // Jan 2011

    Toward a More Practical Marking Scheme for IP Traceback

    Probabilistic Packet Marking (PPM) has been studied as a promising approach to realize IP traceback. In this paper, the authors propose a new PPM approach that improves the current state of the art in two practical directions: it improves the efficiency and accuracy of IP traceback and it provides incentives...

    Provided By University of Texas

  • White Papers // Jan 2011

    FONet : A Federated Overlay Network for DoS Defense in the Internet (A Position Paper)

    The authors propose a novel service architecture to provide DoS resistant communication services in the Internet. The architecture consists of a large scale federated overlay network with DoS protected tunnels established between overlay nodes. Individual overlay nodes are deployed and maintained by the domains hosting them. The overlay network as...

    Provided By University of Texas

  • White Papers // Jan 2011

    Variable Power Broadcasting in Ad Hoc Networks

    Network-wide broadcast is a frequently used operation in ad hoc networks and consumes a significant amount of energy. Reducing the overall power consumption is extremely important for increasing the longevity of ad hoc networks. As a result, developing energy-efficient broadcast operations becomes an important issue in ad hoc networks....

    Provided By University of Texas

  • White Papers // Jan 2011

    Intersection Characteristics of End-to-End Internet Paths and Trees

    This paper focuses on understanding the scale and the distribution of "State overhead" (briefly load) that is incurred on the routers by various value-added network services, e.g., IP multicast and IP traceback. This understanding is essential to developing appropriate mechanisms and provisioning resources so that the Internet can support such...

    Provided By University of Texas

  • White Papers // Jan 2011

    Discovering Substructures in the Chemical Toxicity Domain

    The researcher's ability to interpret the data and discover interesting patterns within it is of great importance, as it helps in obtaining relevant SARs [Srinivasan et al.] for the cause of chemical cancers (e.g., Progol identified a primary amine group as a relevant SAR for the cause of chemical...

    Provided By University of Texas

  • White Papers // Jan 2011

    Structure Discovery in Sequentially-Connected Data Streams

    Much of current data mining research is focused on discovering sets of attributes that discriminate data entities into classes, such as shopping trends for a particular demographic group. In contrast, the authors are working to develop data mining techniques to discover patterns consisting of complex relationships between entities. Their research...

    Provided By University of Texas

  • White Papers // Jan 2011

    Using a Graph-Based Data Mining System to Perform Web Search

    The World Wide Web provides an immense source of information. Accessing information of interest presents a challenge to scientists and analysts, particularly if the desired information is structural in nature. The authors' goal is to design a structural search engine that uses the hyperlink structure of the Web, in addition...

    Provided By University of Texas

  • White Papers // Jan 2011

    Qualitative Comparison of Graph-Based and Logic-Based Multi-Relational Data Mining: A Case Study

    The goal of this paper is to generate insights about the differences between graph-based and logic-based approaches to multi-relational data mining by performing a case study of graph-based system, Subdue and the inductive logic programming system, CProgol. The authors identify three key factors for comparing graph-based and logic-based multi-relational data...

    Provided By University of Texas

  • White Papers // Jan 2011

    Lightweight Distributed Selective Re-Execution and Its Implications for Value Speculation

    In this paper, the authors describe a lightweight protocol to support selective re-execution on the TRIPS processor. The protocol permits multiple waves of speculation to be traversing a dataflow graph simultaneously and in any order, with a cleanup "Commit" wave propagating as well to determine completion of a group of...

    Provided By University of Texas

  • White Papers // Jan 2011

    Practical Utilities for Monitoring Multicast Service Availability

    Monitoring has become one of the key issues for the successful deployment of IP multicast in the Internet. During the last decade, several tools and systems have been developed to monitor several different characteristics of IP multicast. In this paper, the authors focus on one specific monitoring task: monitoring end-to-end...

    Provided By University of Texas

  • White Papers // Jan 2011

    Enhancing Both Network and User Performance for Networks Supporting Best Effort Traffic

    With a view on improving user perceived performance on networks supporting best effort flows, e.g., multimedia/data file transfers, the authors propose a family of bandwidth allocation criteria that depends on the residual work of on-going transfers. Analysis and simulations show that allocating bandwidth in this fashion can significantly improve the...

    Provided By University of Texas

  • White Papers // Jan 2011

    Operating System Support for Massive Replication

    The increasing number of devices used by each user to access data and services and the increasing importance of the data and services available electronically both favor "Access-anywhere" network-delivered services. Unfortunately, making such services highly available is difficult. For example, even though end servers or service hosting sites advertise an...

    Provided By University of Texas

  • White Papers // Dec 2010

    Just-in-Time Analytics on Large File Systems

    As file systems reach the petabyte scale, users and administrators are increasingly interested in acquiring high-level analytical information for file management and analysis. Two particularly important tasks are the processing of aggregate and top-k queries which, unfortunately, cannot be quickly answered by hierarchical file systems such as ext3 and NTFS....

    Provided By University of Texas

  • White Papers // Dec 2010

    Hybrid Partial Evaluation

    The authors present Hybrid Partial Evaluation (HPE), a pragmatic approach to partial evaluation that borrows ideas from both online and offline partial evaluation. HPE performs offline-style specialization using an online approach without static binding time analysis. The goal of HPE is to provide a practical and predictable level of optimization...

    Provided By University of Texas
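
    The entry above is about partial evaluation. As plain background (not HPE's online/offline hybrid machinery), the sketch below specializes a generic power function with respect to a statically known exponent, which is the kind of specialization a partial evaluator performs when enough inputs are known.

```python
# Classic partial-evaluation example: specialize power(x, n) for a known n.
# This hand-written specializer only illustrates the idea of producing a
# residual program; it does not reproduce HPE's mechanism.

def power(x, n):
    return 1 if n == 0 else x * power(x, n - 1)

def specialize_power(n):
    """Residual program for power(_, n): unfold the recursion on the static n."""
    expr = " * ".join(["x"] * n) or "1"
    code = f"def power_{n}(x):\n    return {expr}\n"
    namespace = {}
    exec(code, namespace)              # materialize the residual function
    return namespace[f"power_{n}"], code

power_5, source = specialize_power(5)
print(source)                          # residual code: return x * x * x * x * x
print(power_5(2), power(2, 5))         # both print 32
```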

  • White Papers // Dec 2010

    A Customizable Two-Step Framework for General Equipment Provisioning in Optical Transport Networks

    Optical Transport Network (OTN) is a standard approach to offering transport support to a variety of existing service technologies, e.g., ESCON, HDTV, GE, etc. Multiple service technologies can be concurrently multiplexed onto one common transport network, which offers hierarchical transmission rate wrappers physically supported by Wavelength Division Multiplexing (WDM) lambda...

    Provided By University of Texas

  • White Papers // Dec 2010

    It's on Me! The Benefit of Altruism in BAR Environments

    Cooperation, a necessity for any Peer-To-Peer (P2P) cooperative service, is often achieved by rewarding good behavior now with the promise of future benefits. However, in most cases, interactions with a particular peer or the service itself eventually end, resulting in some last exchange in which departing participants have no incentive...

    Provided By University of Texas

  • White Papers // Dec 2010

    Design of the FutureGrid Experiment Management Framework

    FutureGrid provides novel computing capabilities that enable reproducible experiments while simultaneously supporting dynamic provisioning. This paper describes the FutureGrid experiment management framework used to create and execute large-scale scientific experiments for researchers around the globe. The experiments executed are performed by the various users of FutureGrid, ranging from administrators, software...

    Provided By University of Texas

  • White Papers // Dec 2010

    Parallel Graph Partitioning on Multicore Architectures

    Graph partitioning is a common and frequent preprocessing step in many high-performance parallel applications on distributed and shared-memory architectures. It is used to distribute graphs across memory and to improve spatial locality. There are several parallel implementations of graph partitioning for distributed-memory architectures. In this paper, the authors present a...

    Provided By University of Texas

  • White Papers // Sep 2010

    A New Tractable Model for Cellular Coverage

    Cellular networks are usually modeled by placing the base stations according to a regular geometry such as a grid, with the mobile users scattered around the network either as a Poisson point process (i.e. uniform distribution) or deterministically. These models have been used extensively for cellular design and analysis but...

    Provided By University of Texas

  • White Papers // Dec 2009

    Capacity Scaling of MIMO Broadcast Channels With Random User Distribution

    A novel capacity scaling law for Multiple-Input Multiple-Output (MIMO) broadcast channels is derived considering a random user distribution. The random locations cause unequal average SNRs amongst the users, whereas prior work typically assumes that all users have the same average SNR. Most centralized wireless networks, e.g. cellular systems, are more...

    Provided By University of Texas

  • White Papers // Jan 2010

    A New Method for Computing the Transmission Capacity of Non-Poisson Wireless Networks

    The relative locations of concurrent transmitting nodes play an important role in the performance of wireless networks because it largely determines their mutual interference. In most prior work the set of interfering transmitters has been modeled by a homogeneous Poisson distribution, which assumes independence in the transmitting node positions, and...

    Provided By University of Texas

  • White Papers // Dec 2009

    Multicast Capacity Scaling of Wireless Networks With Multicast Outage

    Multicast transmission has several distinctive traits as opposed to more commonly studied unicast networks. Specifically, these include: identical packets must be delivered successfully to several nodes, outage could simultaneously happen at different receivers, and the multicast rate is dominated by the receiver with the weakest link in order to...

    Provided By University of Texas

  • White Papers // Jan 2010

    Spectral Covariance for Spectrum Sensing, With Application to IEEE 802.22

    Despite the shortage of available frequency spectrum, recent studies have shown that the actual usage of the allocated spectrum is scarce. The IEEE is developing the 802.22 standard for spectral reuse in TV bands that uses Cognitive Radio (CR) technology. One of the essential and challenging features of CR is...

    Provided By University of Texas

  • White Papers // Jul 2009

    Block Diagonalization in the MIMO Broadcast Channel With Delayed CSIT

    This paper investigates the impact of delayed Channel State Information at the Transmitter (CSIT) on the MIMO broadcast channel with Block Diagonalization (BD) pre-coding. First, an upper bound for the achievable throughput is provided, which shows that BD is more robust to imperfect CSIT than zero-forcing pre-coding as it has...

    Provided By University of Texas
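
    The entry above assumes block diagonalization precoding. As standard background (not the paper's delayed-CSIT analysis), the sketch below computes BD precoders by taking each user's precoder from the null space of the other users' stacked channels, so inter-user interference vanishes when the CSI is perfect.

```python
# Standard block diagonalization (BD) precoding sketch with perfect CSI:
# user k's precoder lies in the null space of the other users' stacked channels,
# so H_j @ V_k ~= 0 for j != k. Background only; the paper's contribution is the
# analysis under *delayed* CSIT, which this sketch does not model.
import numpy as np

rng = np.random.default_rng(0)
K, Nr, Nt = 2, 2, 4                       # users, rx antennas each, tx antennas
H = [rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))
     for _ in range(K)]

def bd_precoder(k):
    others = np.vstack([H[j] for j in range(K) if j != k])
    # Rows of vh beyond the rank of `others` span its right null space;
    # generically `others` has full row rank, so that rank equals len(s).
    _, s, vh = np.linalg.svd(others)
    return vh.conj().T[:, len(s):]

V = [bd_precoder(k) for k in range(K)]
for k in range(K):
    for j in range(K):
        leak = np.linalg.norm(H[j] @ V[k])
        print(f"||H_{j} V_{k}|| = {leak:.2e}")   # ~0 for j != k
```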

  • White Papers // Nov 2009

    A Simple Upper Bound on Random Access Transport Capacity

    The authors attempt to quantify end-to-end throughput in multi-hop wireless networks using a metric that measures the maximum density of source-destination pairs that can successfully communicate over a specified distance at a certain data rate. They term this metric the random access transport capacity, since it is similar to transport capacity...

    Provided By University of Texas

  • White Papers // Dec 2008

    Achievable Throughput of Multi-Mode Multiuser MIMO With Imperfect CSI Constraints

    For the Multiple-Input Multiple-Output (MIMO) broadcast channel with imperfect Channel State Information (CSI), neither the capacity nor the optimal transmission technique has been fully characterized. In this paper, the authors derive achievable ergodic rates for a MIMO fading broadcast channel when CSI is delayed and quantized. It is shown that...

    Provided By University of Texas

  • White Papers // Feb 2009

    Throughput Scaling Laws for Wireless Ad Hoc Networks With Relay Selection

    Determining the capacity of uncoordinated wireless ad hoc networks is one of the most general and challenging problems in network information theory. The current mainstream approach consists of deriving sum rate bounds and asymptotic capacity scaling laws, which describe important aspects of the capacity region. In their seminal work, Gupta...

    Provided By University of Texas

  • White Papers // Dec 2008

    Spectrum Allocation in Two-Tier Networks

    Two-tier networks, comprising a conventional cellular network overlaid with shorter range hotspots (e.g. femtocells, distributed antennas, or wired relays), offer an economically viable way to improve cellular system capacity. The capacity-limiting factor in such networks is interference. The cross-tier interference between macrocells and femtocells can suffocate the capacity due to...

    Provided By University of Texas

  • White Papers // Jun 2011

    Average Rate Achievable in K-Tier Downlink Heterogeneous Cellular Networks

    Cellular networks are becoming increasingly heterogeneous due to the co-deployment of many disparate infrastructure elements, including micro, pico and femtocells, and distributed antennas. This introduces new challenges in the modeling, analysis, and design of these networks. While grid-based models have been quite popular in modeling classical macrocell networks, they are...

    Provided By University of Texas

  • White Papers // Mar 2011

    Cooperative Spectral Covariance Sensing: Properties and Analysis

    This paper investigates the theoretical limits of white space sensing in a Cognitive Radio (CR) network limited by channel correlation. In a log-normal shadowing channel, the received signal power is correlated based on the distance among the sensors and this makes sensing the presence of a signal difficult, even with...

    Provided By University of Texas

  • White Papers // Mar 2010

    Dynamic Connectivity in ALOHA Ad Hoc Networks

    In a wireless network the set of transmitting nodes changes frequently because of the MAC scheduler and the traffic load. Previously, connectivity in wireless networks was analyzed using static geometric graphs, which, as the authors show, leads to an overly constrained design criterion. The dynamic nature of the transmitting set...

    Provided By University of Texas

  • White Papers // Apr 2011

    Combating Channel Impairments and Impulsive Noise in Local Utility Powerline Communications

    A smart grid intelligently monitors and controls energy flow in order to improve the efficiency and reliability of power delivery. This monitoring and control requires low-delay, real-time, highly reliable communications between customers, local utilities and regional utilities. This paper focuses on PowerLine Communications (PLC) from the customer power meters to...

    Provided By University of Texas

  • White Papers // Feb 2009

    Dynamic Coexistence of Frequency Hopping Networks Using Parallel and Gaussian Allocations

    This paper studies the coexistence of several independent and dynamic wireless networks using the frequency hopping technique in the unlicensed radio band. The authors propose a new hopping scheme that allows more networks to collocate effectively, but does not violate federal restrictions regarding frequency constraint (related to the minimum number...

    Provided By University of Texas

  • White Papers // May 2011

    Architecture and Abstractions for Environment and Traffic Aware System-Level Coordination of Wireless Networks

    This paper presents a system level approach to interference management in an infrastructure based wireless network with full frequency reuse. The key idea is to use loose base station coordination that is tailored to the spatial load distribution and the propagation environment to exploit the diversity in a user population's...

    Provided By University of Texas

  • White Papers // May 2011

    Practical Adaptive User Association Policies for Wireless Systems With Dynamic Interference

    The authors study the impact of user association policies on flow-level performance in interference-limited wireless networks. Most research in this area has used static interference models (neighboring base stations are always active) and resorted to intuitive objectives such as load balancing. In this paper, they show that this can be...

    Provided By University of Texas

  • White Papers // Nov 2009

    Opportunistic Routing for Interactive Traffic in Wireless Networks

    To take advantage of the broadcast nature of wireless communication, a number of opportunistic routing techniques have recently been proposed. In order to manage the extra signaling overhead associated with operation of the opportunistic routing, these schemes work in terms of 'Batches' consisting of multiple packets. While these opportunistic techniques...

    Provided By University of Texas

  • White Papers // Jan 2010

    Logical Concurrency Control From Sequential Proofs

    The authors are interested in identifying and enforcing the isolation requirements of a concurrent program, i.e., concurrency control that ensures that the program meets its specification. This can be done systematically starting from a sequential proof, i.e., a proof of correctness of the program in the absence of concurrent inter-leavings....

    Provided By University of Texas

  • White Papers // Jan 2012

    Identifying Failure-Inducing Combinations in a Combinatorial Test Set

    A t-way combinatorial test set is designed to detect failures that are triggered by combinations involving no more than t parameters. Assume that the authors have executed a t-way test set and some tests have failed. A natural question to ask is: what combinations have caused these failures? Identifying such...

    Provided By University of Texas
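
    The entry above asks which parameter combinations caused the observed failures. A naive first cut, sketched below under made-up test data, flags every t-way combination that appears only in failing tests; the fact that this baseline still returns spurious candidates is exactly why a more careful identification method is needed. This is not the paper's algorithm.

```python
# Toy baseline for locating failure-inducing combinations: flag every t-way
# value combination that appears in at least one failing test but in no
# passing test. Test data and parameter names are made up.
from itertools import combinations

def tway_combos(test, t):
    """All t-way (parameter, value) combinations contained in one test."""
    return set(combinations(sorted(test.items()), t))

def suspicious_combinations(passing, failing, t=2):
    seen_passing = set().union(*(tway_combos(p, t) for p in passing))
    suspects = set().union(*(tway_combos(f, t) for f in failing))
    return suspects - seen_passing

passing = [{"os": "linux", "db": "mysql",    "browser": "firefox"},
           {"os": "mac",   "db": "postgres", "browser": "chrome"}]
failing = [{"os": "linux", "db": "postgres", "browser": "chrome"},
           {"os": "linux", "db": "postgres", "browser": "firefox"}]

# Prints the true culprit ('db','postgres')+('os','linux') along with spurious
# candidates that only more tests or a smarter method could rule out.
for combo in sorted(suspicious_combinations(passing, failing)):
    print(combo)
```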

  • White Papers // Aug 2011

    The CleanJava Language for Functional Program Verification

    Unlike Hoare-style program verification, functional program verification supports forward reasoning by viewing a program as a mathematical function from one program state to another and proving its correctness by essentially comparing two mathematical functions, the function computed by the program and its specification. Since it requires a minimal mathematical background...

    Provided By University of Texas

  • White Papers // Nov 2011

    Functional Verification of Class Invariants in CleanJava

    In Cleanroom-style functional program verification, a program is viewed as a mathematical function from one program state to another, and the program is verified by comparing two functions, the implemented and the expected behaviors of a program. The technique requires a minimal mathematical background and supports forward reasoning, but it...

    Provided By University of Texas

  • White Papers // Aug 2011

    A Tutorial on Functional Program Verification

    This paper gives a quick tutorial introduction to functional program verification. In functional program verification, a program is viewed as a mathematical function from one program state to another, and proving its correctness is essentially comparing two mathematical functions, the function computed by the program and the specification of the...

    Provided By University of Texas
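
    The entry above describes viewing a program as a state-to-state function. A one-line worked example of that viewpoint, written here in generic notation rather than CleanJava's, is the three-assignment swap:

```latex
% Functional view of the swap  t = x; x = y; y = t;  (generic notation, not
% CleanJava syntax): compose the state functions of the three assignments and
% compare the result with the intended specification [x, y := y, x].
\[
  f_1(x,y,t) = (x,y,x), \qquad
  f_2(x,y,t) = (y,y,t), \qquad
  f_3(x,y,t) = (x,t,t)
\]
\[
  (f_3 \circ f_2 \circ f_1)(x,y,t) = f_3(f_2(x,y,x)) = f_3(y,y,x) = (y,x,x),
\]
% which agrees with the specification [x, y := y, x] on the variables of interest.
```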

  • White Papers // Aug 2010

    Functional Specification and Verification of Object-Oriented Programs

    One weakness of Hoare-style verification techniques based on first-order predicate logic is that reasoning is backward from post-conditions to preconditions. A natural, forward reasoning is possible by viewing a program as a mathematical function that maps one program state to another. This functional program verification technique requires a minimal mathematical...

    Provided By University of Texas

  • White Papers // Apr 2010

    Access Control Contracts for Java Program Modules

    Application-level security has become an issue in recent years; for example, errors, discrepancies and omissions in the specification of access control constraints of security-sensitive software components are recognized as an important source for security vulnerabilities. The authors propose to formally specify access control assumptions or constraints of a program module...

    Provided By University of Texas

  • White Papers // May 2010

    Runtime Constraint Checking Approaches for OCL, a Critical Comparison

    There are many benefits of checking design constraints at run-time - for example, automatic detection of design drift or corrosion. However, there is no comparative analysis of different approaches although such an analysis could provide a sound basis for determining the appropriateness of one approach over the others. In this...

    Provided By University of Texas

  • White Papers // Sep 2011

    Toward the Verification of a Simple Hypervisor

    Virtualization promises significant benefits in security, efficiency, dependability, and cost. Achieving these benefits depends upon the reliability of the underlying virtual machine monitors (hypervisors). This paper describes an ongoing project to develop and verify MinVisor, a simple but functional Type-I x86 hypervisor, proving protection properties at the assembly level using...

    Provided By University of Texas

  • White Papers // Jun 2011

    Enhancing the Role of Inlining in Effective Interprocedural Parallelization

    The emergence of multi-core architectures makes it essential for optimizing compilers to automatically extract parallelism for large scientific applications composed of many subroutines residing in different files. Inlining is a well-known technique which can be used to erase procedural boundaries and enable more aggressive loop parallelization. However, conventional inlining cannot...

    Provided By University of Texas

  • White Papers // Jan 2009

    Credit Risk Price Discovery In Equity, Debt, And Credit Derivative Markets

    Credit risk pricing is perhaps an understudied topic in comparison to its profound impact on the United States' financial markets and economy. This paper uses established price discovery techniques to develop a method of price discovery for credit risk in three financial markets: equity, debt, and credit derivatives. Using weekly...

    Provided By University of Texas

  • White Papers // Jun 2010

    Palmtree: An IP Alias Resolution Algorithm With Linear Probing Complexity

    Internet topology mapping studies utilize large scale topology maps to analyze various characteristics of the Internet. IP alias resolution, the task of mapping IP addresses to their corresponding routers, is an important task in building such topology maps. In this paper, the authors present a new probe-based IP alias resolution...

    Provided By University of Texas

  • White Papers // Apr 2010

    SKAIT: A Parameterized Key Assignment Scheme For Wireless Networks

    In this paper, the authors propose SKAIT, a parameterized symmetric key pre-distribution scheme that guarantees a secure and confidential channel between every pair of nodes in a wireless network. Parameterization enables control over the number of keys assigned to a node, and allows users to trade increased key space complexity...

    Provided By University of Texas

  • White Papers // Aug 2010

    Characterizing Link and Path Reliability in Large-Scale Wireless Sensor Networks

    Reliable Data Transfer (RDT) is one of the key issues in Wireless Sensor Networks (WSNs) and can be achieved by using link-level re-transmissions and multipath routing. Another key issue is the scalability of WSNs. In this paper, the authors try to better understand and characterize/quantify the relationships between reliability and...

    Provided By University of Texas

  • White Papers // Mar 2009

    Religious Beliefs, Gambling Attitudes, And Financial Market Outcomes

    The authors use religion as a proxy for gambling and investigate whether geographical variation in religion-induced gambling norms affects aggregate market outcomes. Motivated by evidence from gambling studies, they conjecture that gambling propensity would be higher in regions with a higher concentration of Catholics relative to Protestants. They consider four...

    Provided By University of Texas

  • White Papers // Oct 2010

    User Admission in MIMO Interference Alignment Networks

    In this paper, the authors consider an interference channel where a set of primary active users are cooperating through interference alignment over a constant multiple-input-multiple-output channel while a set of secondary users desire access to the channel. They present the conditions under which a secondary user can be admitted to...

    Provided By University of Texas

  • White Papers // Sep 2010

    The Effect of Interference Cancellation on Spectrum-Sharing Transmission Capacity

    The efficiency of spectrum sharing can be improved by interference suppression and/or cancellation. This paper analyzes the performance of spectrum sharing networks with Interference Cancellation (IC) based on the Spectrum-sharing Transmission Capacity (S-TC), defined as the number of successful transmissions per unit area while guaranteeing all target outage probabilities of...

    Provided By University of Texas