University of Texas

  • White Papers // Mar 2011

    Downlink SDMA With Limited Feedback in Interference-Limited Wireless Networks

    The tremendous capacity gains promised by Space Division Multiple Access (SDMA) depend critically on the accuracy of the transmit channel state information. In the broadcast channel, even without any network interference, it is known that such gains collapse due to interstream interference if the feedback is delayed or low rate....

    Provided By University of Texas

  • White Papers // Mar 2011

    A Generic Handover Decision Management Framework for Next Generation Networks

    Next generation networks are defined to be packet based networks that provide telecommunication services to users by utilizing different transport technologies, wired and wireless. In this paper the authors provide a generic framework for handover decision management in next generation networks. They show that any handover decision algorithm can utilize...

    Provided By University of Texas

  • White Papers // Mar 2011

    Cooperative Spectral Covariance Sensing: Properties and Analysis

    This paper investigates the theoretical limits of white space sensing in a Cognitive Radio (CR) network limited by channel correlation. In a log-normal shadowing channel, the received signal power is correlated based on the distance among the sensors and this makes sensing the presence of a signal difficult, even with...

    Provided By University of Texas

  • White Papers // Feb 2011

    A New Model for Coverage With Fractional Frequency Reuse in OFDMA Cellular Networks

    Fractional Frequency Reuse (FFR) is an interference management technique well-suited to OFDMA-based cellular networks wherein the cells are partitioned into spatial regions with different frequency reuse factors. To date, FFR techniques have typically been evaluated through system-level simulations using a hexagonal grid for the base station locations. This paper...

    Provided By University of Texas

  • White Papers // Feb 2011

    Is That You? Authentication in a Network Without Identities

    Most networks require that their users have "Identities", i.e. have names that are fixed for a relatively long time, unique, and have been approved by a central authority (in order to guarantee their uniqueness). Unfortunately, this requirement, which was introduced to simplify the design of networks, has its own drawbacks....

    Provided By University of Texas

  • White Papers // Feb 2011

    On Secure and Resilient Telesurgery Communications Over Unreliable Networks

    Telesurgical Robot Systems (TRSs) address mission critical operations emerging in extreme fields such as battlefields, underwater, and disaster territories. The lack of wire-lined communication infrastructure in such fields makes the use of wireless technologies including satellite and ad-hoc networks inevitable. TRSs over wireless environments pose unique challenges such as preserving...

    Provided By University of Texas

  • White Papers // Feb 2011

    Synthesizing Concurrent Schedulers for Irregular Algorithms

    Scheduling is the assignment of tasks or activities to processors for execution, and it is an important concern in parallel programming. Most prior work on scheduling has focused either on static scheduling of applications in which the dependence graph is known at compile-time or on dynamic scheduling of independent loop...

    Provided By University of Texas

  • White Papers // Feb 2011

    A Tractable Approach to Coverage and Rate in Cellular Networks

    Cellular networks are usually modeled by placing the base stations on a grid, with mobile users either randomly scattered or placed deterministically. These models have been used extensively but suffer from being both highly idealized and not very tractable, so complex system-level simulations are used to evaluate coverage/outage probability and...

    Provided By University of Texas
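The abstract above contrasts grid models with random spatial models of base station locations. As a rough illustration of that modeling style (a sketch under assumed parameters, not the paper's own code), the snippet below Monte-Carlo-estimates downlink SINR coverage for a user at the origin when base stations form a Poisson point process with nearest-BS association and Rayleigh fading; the density, path-loss exponent, and threshold values are illustrative placeholders.

```python
import math
import random

def coverage_probability(lam=1e-5, alpha=4.0, thresh=1.0,
                         half_side=3000.0, trials=1500, seed=7):
    """Monte Carlo estimate of SINR coverage for a typical user at the origin.

    Base stations are dropped as a Poisson point process of density `lam`
    (per square metre) in a square window; the user attaches to the nearest
    one. Unit-mean Rayleigh fading, path-loss exponent `alpha`, and an
    interference-limited regime (thermal noise ignored). All numeric values
    are illustrative, not parameters taken from the paper.
    """
    random.seed(seed)
    mean_pts = lam * (2.0 * half_side) ** 2
    covered = 0
    for _ in range(trials):
        # Poisson count approximated by a rounded Gaussian (mean is large).
        n = max(1, round(random.gauss(mean_pts, math.sqrt(mean_pts))))
        dists = sorted(math.hypot(random.uniform(-half_side, half_side),
                                  random.uniform(-half_side, half_side))
                       for _ in range(n))
        signal = random.expovariate(1.0) * dists[0] ** (-alpha)
        interference = sum(random.expovariate(1.0) * d ** (-alpha)
                           for d in dists[1:])
        if signal > thresh * interference:
            covered += 1
    return covered / trials
```

For thresh = 1 (0 dB) and alpha = 4 with noise ignored, the known closed form for this setup is 1/(1 + pi/4), roughly 0.56, so the simulated estimate should land in that neighborhood.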

  • White Papers // Feb 2011

    Ergodic Spatial Throughput of Wireless Ad Hoc Networks With Markovian Fading Channels

    Most work on wireless network throughput ignores the temporal correlation inherent to wireless channels because it complicates tractable analysis. In order to better capture the temporal variations of wireless network throughput, this paper introduces the metric of Ergodic Spatial Throughput (EST), which includes spatial and temporal ergodicity. All transmitters in...

    Provided By University of Texas

  • White Papers // Feb 2011

    Innovating To Optimize Supply Chain Management

    As an innovator dedicated to collaboration with its network of vendors and customers, Advanced Micro Devices (AMD) considers the supply chain a key component of its success and one of its most important competitive assets. How important? Just one change in the way the authors handle the supply chain...

    Provided By University of Texas

  • White Papers // Jan 2011

    Corporate Taxes And Investment: The Cash Flow Channel

    Existing literature focuses on how corporate taxation affects firms' investment decisions by altering after-tax returns. This paper instead examines how corporate taxation affects investment by reducing the cash flow a firm has available to invest in the current period. The author uses a sharp nonlinearity in the mapping from pre-tax...

    Provided By University of Texas

  • White Papers // Jan 2011

    Decentralizing Attribute-Based Encryption

    The authors propose a Multi-Authority Attribute-Based Encryption (ABE) system. In the system, any party can become an authority and there is no requirement for any global coordination other than the creation of an initial set of common reference parameters. A party can simply act as an ABE authority by creating...

    Provided By University of Texas

  • White Papers // Jan 2011

    Pseudorandom Financial Derivatives

    Arora, Barak, Brunnermeier, and Ge [ABBG] showed that taking computational complexity into account, a dishonest seller could dramatically increase the lemon costs of a family of financial derivatives. The authors show that if the seller is required to construct derivatives of a certain form, then this phenomenon disappears. In particular,...

    Provided By University of Texas

  • White Papers // Jan 2011

    Programming Many-Core Architectures - A Case Study: Dense Matrix Computations on the Intel SCC Processor

    A message passing, distributed-memory parallel computer on a chip is one possible design for future, many-core architectures. The authors discuss initial experiences with the Intel Single-chip Cloud Computer research processor, which is a prototype architecture that incorporates 48 cores on a single die that can communicate via a small, shared,...

    Provided By University of Texas

  • White Papers // Jan 2011

    Correlation of Link Outages in Low-Mobility Spatial Wireless Networks

    In this paper, the authors consider a network where the nodes' locations are modeled by a realization of a Poisson point process and remain fixed or change very slowly over time. Most of the literature focuses on the spatial average of the link outage probabilities. But each link in the...

    Provided By University of Texas

  • White Papers // Jan 2011

    Discovering Substructures in the Chemical Toxicity Domain

    The researcher's ability to interpret the data and discover interesting patterns within it is of great importance, as it helps in obtaining relevant SARs [Srinivasan et al.] for the causes of chemical cancers (e.g., Progol identified a primary amine group as a relevant SAR for the cause of chemical...

    Provided By University of Texas

  • White Papers // Jan 2011

    Structure Discovery in Sequentially-Connected Data Streams

    Much of current data mining research is focused on discovering sets of attributes that discriminate data entities into classes, such as shopping trends for a particular demographic group. In contrast, the authors are working to develop data mining techniques to discover patterns consisting of complex relationships between entities. Their research...

    Provided By University of Texas

  • White Papers // Jan 2011

    Using a Graph-Based Data Mining System to Perform Web Search

    The World Wide Web provides an immense source of information. Accessing information of interest presents a challenge to scientists and analysts, particularly if the desired information is structural in nature. The authors' goal is to design a structural search engine that uses the hyperlink structure of the Web, in addition...

    Provided By University of Texas

  • White Papers // Jan 2011

    Qualitative Comparison of Graph-Based and Logic-Based Multi-Relational Data Mining: A Case Study

    The goal of this paper is to generate insights about the differences between graph-based and logic-based approaches to multi-relational data mining by performing a case study of the graph-based system Subdue and the inductive logic programming system CProgol. The authors identify three key factors for comparing graph-based and logic-based multi-relational data...

    Provided By University of Texas

  • White Papers // Jan 2011

    Lightweight Distributed Selective Re-Execution and Its Implications for Value Speculation

    In this paper, the authors describe a lightweight protocol to support selective re-execution on the TRIPS processor. The protocol permits multiple waves of speculation to be traversing a dataflow graph simultaneously and in any order, with a cleanup "Commit" wave propagating as well to determine completion of a group of...

    Provided By University of Texas

  • White Papers // Jan 2011

    Practical Utilities for Monitoring Multicast Service Availability

    Monitoring has become one of the key issues for the successful deployment of IP multicast in the Internet. During the last decade, several tools and systems have been developed to monitor various characteristics of IP multicast. In this paper, the authors focus on one specific monitoring task: monitoring end-to-end...

    Provided By University of Texas

  • White Papers // Jan 2011

    Enhancing Both Network and User Performance for Networks Supporting Best Effort Traffic

    With a view on improving user perceived performance on networks supporting best effort flows, e.g., multimedia/data file transfers, the authors propose a family of bandwidth allocation criteria that depends on the residual work of on-going transfers. Analysis and simulations show that allocating bandwidth in this fashion can significantly improve the...

    Provided By University of Texas

  • White Papers // Jan 2011

    Operating System Support for Massive Replication

    The increasing number of devices used by each user to access data and services and the increasing importance of the data and services available electronically both favor "Access-anywhere" network-delivered services. Unfortunately, making such services highly available is difficult. For example, even though end servers or service hosting sites advertise an...

    Provided By University of Texas

  • White Papers // Jan 2011

    Using Valgrind to Detect Undefined Value Errors With Bit-Precision

    The authors present Memcheck, a tool that has been implemented with the dynamic binary instrumentation framework Valgrind. Memcheck detects a wide range of memory errors in programs as they run. This paper focuses on one kind of error that Memcheck detects: undefined value errors. Such errors are common, and often...

    Provided By University of Texas

  • White Papers // Jan 2011

    DRES: Dynamic Range Encoding Scheme for TCAM Coprocessors

    One of the most critical resource management issues in the use of Ternary Content Addressable Memory (TCAM) for packet classification/filtering is how to effectively support filtering rules with ranges, known as range matching. In this paper, a Dynamic Range Encoding Scheme (DRES) is proposed to significantly improve TCAM storage efficiency...

    Provided By University of Texas
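For context on why range matching strains TCAM storage: a single range rule classically has to be expanded into several ternary prefixes, and encoding schemes such as DRES aim to beat that baseline. The sketch below (an illustration of the classic prefix expansion, not DRES itself) computes the minimal prefix cover of an integer range.

```python
def range_to_prefixes(lo, hi, width=16):
    """Expand the integer range [lo, hi] into ternary prefixes of `width`
    bits -- the classic baseline whose blow-up range-encoding schemes such
    as DRES try to avoid. Each prefix is a string over {0, 1, *}.
    """
    prefixes = []
    while lo <= hi:
        # Largest power-of-two block that is aligned at `lo` and fits.
        size = lo & -lo if lo else 1 << width
        while size > hi - lo + 1:
            size //= 2
        bits = size.bit_length() - 1          # trailing wildcard bits
        head = format(lo >> bits, '0%db' % (width - bits)) if bits < width else ''
        prefixes.append(head + '*' * bits)
        lo += size
    return prefixes
```

The classic worst case [1, 2^w - 2] needs 2w - 2 prefixes; for w = 4, range_to_prefixes(1, 14, 4) yields six entries, while range_to_prefixes(0, 15, 4) collapses to the single prefix '****'.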

  • White Papers // Jan 2011

    Vertical Specialization, Intermediate Tariffs, And The Pattern Of Trade: Assessing The Role Of Tariff Liberalization To U.S. Bilateral Trade 1989-2001

    How important are intermediate tariffs in determining trade patterns? Empirical work measuring the impact of tariff liberalization most commonly focuses on the effects of barriers imposed by importers, but exporter trade policy should also matter when exports are produced with imported intermediates. Guided by extensions of the Eaton and Kortum...

    Provided By University of Texas

  • White Papers // Jan 2011

    Toward a Practical Packet Marking Approach for IP Traceback

    IP traceback is an important step in defending against Denial-of-Service (DoS) attacks. Probabilistic Packet Marking (PPM) has been studied as a promising approach to realize IP traceback. In this paper, the authors propose a new PPM approach that improves the current state of the art in two practical directions: it...

    Provided By University of Texas
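As background for the PPM idea, the sketch below simulates the classic node-sampling baseline (in the spirit of Savage et al., not the authors' improved scheme): each router on the attack path overwrites a single mark field with probability p, and for p > 0.5 the victim can recover the path order by sorting routers by mark frequency. Router names and parameters are illustrative.

```python
import random

def node_sampling_traceback(path, p=0.6, packets=20000, seed=3):
    """Simulate the classic node-sampling PPM baseline.

    `path` lists routers from attacker to victim. Each router overwrites
    the packet's single mark field with probability `p`; routers nearer
    the victim mark later, so their marks survive more often, and for
    p > 0.5 sorting routers by mark count recovers the path order
    (nearest-to-victim first).
    """
    random.seed(seed)
    counts = {r: 0 for r in path}
    for _ in range(packets):
        mark = None
        for router in path:            # traversal order: attacker -> victim
            if random.random() < p:
                mark = router          # overwrite any earlier mark
        if mark is not None:
            counts[mark] += 1
    return sorted(path, key=counts.get, reverse=True)
```

Note the reconstructed list comes out nearest-to-victim first; a router at distance d from the victim is the surviving marker with probability p(1 - p)^(d-1), which is strictly decreasing in d when p > 0.5.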

  • White Papers // Jan 2011

    Energy Efficient Schemes for Wireless Sensor Networks With Multiple Mobile Base Stations

    One of the main design issues for a sensor network is conservation of the energy available at each sensor node. The authors propose to deploy multiple, mobile base stations to prolong the lifetime of the sensor network. They split the lifetime of the sensor network into equal periods of time...

    Provided By University of Texas

  • White Papers // Jan 2011

    A Parametric Model for the Distribution of the Angle of Arrival and the Associated Correlation Function and Power Spectrum at the Mobile Station

    One of the main assumptions in Clarke's classic channel model is isotropic scattering, i.e. uniform distribution of the angle of arrival of multipath components at the mobile station. However, in many mobile radio channels the authors encounter non-isotropic scattering, which strongly affects the correlation function and power spectrum of...

    Provided By University of Texas

  • White Papers // Jan 2011

    Pilot Designs for Consistent Frequency Offset Estimation in OFDM Systems

    This paper presents pilot designs for consistent frequency offset estimation in OFDM systems over frequency-selective fading channels. The authors describe two design approaches, namely consistency in the probabilistic sense and absolute consistency. Existing preamble and pilot designs in the literature do not guarantee absolute consistency. They derive general criteria...

    Provided By University of Texas

  • White Papers // Jan 2011

    The Impact Of Financial Constraints On The Relation Between Shareholder Taxes And The Cost Of Equity Capital

    Using both the Tax Relief Act of 1997 (TRA) and the Jobs and Growth Tax Relief and Reconciliation Act of 2003 (JGTRRA), the authors conduct the first empirical investigation on how the tax cuts on dividends and/or capital gains affect the cost of equity differently for firms facing different degrees...

    Provided By University of Texas

  • White Papers // Jan 2011

    Toward a More Practical Marking Scheme for IP Traceback

    Probabilistic Packet Marking (PPM) has been studied as a promising approach to realize IP traceback. In this paper, the authors propose a new PPM approach that improves the current state of the art in two practical directions: it improves the efficiency and accuracy of IP traceback and it provides incentives...

    Provided By University of Texas

  • White Papers // Jan 2011

    FONet: A Federated Overlay Network for DoS Defense in the Internet (A Position Paper)

    The authors propose a novel service architecture to provide DoS resistant communication services in the Internet. The architecture consists of a large scale federated overlay network with DoS protected tunnels established between overlay nodes. Individual overlay nodes are deployed and maintained by the domains hosting them. The overlay network as...

    Provided By University of Texas

  • White Papers // Jan 2011

    Variable Power Broadcasting in Ad Hoc Networks

    Network-wide broadcast is a frequently used operation in ad hoc networks and consumes a significant amount of energy. Reducing the overall power consumption is extremely important for increasing the longevity of ad hoc networks. As a result, developing energy-efficient broadcast operations becomes an important issue in ad hoc networks....

    Provided By University of Texas

  • White Papers // Jan 2011

    Intersection Characteristics of End-to-End Internet Paths and Trees

    This paper focuses on understanding the scale and the distribution of "State overhead" (briefly load) that is incurred on the routers by various value-added network services, e.g., IP multicast and IP traceback. This understanding is essential to developing appropriate mechanisms and provisioning resources so that the Internet can support such...

    Provided By University of Texas

  • White Papers // Jan 2011

    A Fully Pipelined XQuery Processor

    The authors present a high-performance, pull-based streaming processor for XQuery, called XQPull, that can handle many essential features of the language, including general predicates, recursive queries, backward axis steps, and function calls, using a very small amount of caching. Their framework is based on a new type of event streams,...

    Provided By University of Texas

  • White Papers // Dec 2010

    Just-in-Time Analytics on Large File Systems

    As file systems reach the petabyte scale, users and administrators are increasingly interested in acquiring high-level analytical information for file management and analysis. Two particularly important tasks are the processing of aggregate and top-k queries which, unfortunately, cannot be quickly answered by hierarchical file systems such as ext3 and NTFS....

    Provided By University of Texas
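One way to answer aggregate queries without crawling the whole namespace, in the just-in-time spirit of the abstract, is a random descent of the directory tree with inverse-probability (Horvitz-Thompson style) weighting. The toy sketch below applies that idea to an in-memory dict standing in for a file system; it illustrates the general sampling technique, not the paper's specific algorithm.

```python
import random

def estimate_file_count(tree, descents=5000, seed=11):
    """Estimate the number of files in a directory tree without a full scan.

    Each descent picks one subdirectory uniformly at each level and weights
    the files it sees by the inverse probability of having reached their
    directory, giving an unbiased estimate of the total. `tree` is a toy
    stand-in for a file system: dicts are directories, anything else is a
    file.
    """
    random.seed(seed)
    total = 0.0
    for _ in range(descents):
        node, inv_p = tree, 1.0
        while isinstance(node, dict):
            if not node:                      # empty directory
                break
            # Files directly here, scaled by 1 / P(reaching this directory).
            files_here = sum(1 for v in node.values()
                             if not isinstance(v, dict))
            total += inv_p * files_here
            subdirs = [v for v in node.values() if isinstance(v, dict)]
            if not subdirs:
                break
            node = random.choice(subdirs)
            inv_p *= len(subdirs)
    return total / descents
```

On a toy tree with five files the estimate converges to 5 as the number of descents grows; the appeal is that each descent touches only one root-to-leaf chain of directories.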

  • White Papers // Dec 2010

    Hybrid Partial Evaluation

    The authors present Hybrid Partial Evaluation (HPE), a pragmatic approach to partial evaluation that borrows ideas from both online and offline partial evaluation. HPE performs offline-style specialization using an online approach without static binding time analysis. The goal of HPE is to provide a practical and predictable level of optimization...

    Provided By University of Texas

  • White Papers // Dec 2010

    A Customizable Two-Step Framework for General Equipment Provisioning in Optical Transport Networks

    Optical Transport Network (OTN) is a standard approach to offering transport support to a variety of existing service technologies, e.g., ESCON, HDTV, GE, etc. Multiple service technologies can be concurrently multiplexed onto one common transport network, which offers hierarchical transmission rate wrappers physically supported by Wavelength Division Multiplexing (WDM) lambda...

    Provided By University of Texas

  • White Papers // Dec 2010

    It's on Me! The Benefit of Altruism in BAR Environments

    Cooperation, a necessity for any Peer-To-Peer (P2P) cooperative service, is often achieved by rewarding good behavior now with the promise of future benefits. However, in most cases, interactions with a particular peer or the service itself eventually end, resulting in some last exchange in which departing participants have no incentive...

    Provided By University of Texas

  • White Papers // Sep 2009

    Modeling Delivery Delay for Flooding in Mobile Ad Hoc Networks

    Mobile Ad hoc NETworks (MANETs) can have widely varying characteristics under different deployments, and previous studies show that the characteristics impact the behavior of routing protocols for MANETs. To deploy applications successfully in MANETs, application developers need to comprehend the potential behavior of any underlying protocol used. In mobile networks,...

    Provided By University of Texas

  • White Papers // Sep 2009

    Blurring Snapshots: Temporal Inference of Missing and Uncertain Data

    Many pervasive computing applications continuously monitor state changes in the environment by acquiring, interpreting and responding to information from sensors embedded in the environment. However, it is extremely difficult and expensive to obtain a continuous, complete, and consistent picture of a continuously evolving operating environment. One standard technique to mitigate...

    Provided By University of Texas

  • White Papers // Oct 2009

    Pharos: An Application-Oriented Testbed for Heterogeneous Wireless Networking Environments

    This paper presents the Pharos mobile computing testbed which focuses on application-driven validation of pervasive computing research using real hardware and real mobility. Pharos enables meaningful and reproducible validation at all levels of the network stack, including mobility modeling, routing protocols, coordination, application support, and system configuration. At the same...

    Provided By University of Texas

  • White Papers // Oct 2009

    Design of a Reliable Communication System for Grid-Style Traffic Control Networks

    This paper designs and analyzes the performance of a reliable communication scheme for the traffic control system built upon a wireless process control protocol, aiming at enhancing the robustness and timeliness of the safety critical control applications. Focusing on the slot-based predictable access and the grid topology of urban area...

    Provided By University of Texas

  • White Papers // Jun 2009

    A Location-Determination Application in WirelessHART

    WirelessHART is an emerging wireless communication standard that is targeted at the real-time process control industry. An example application of wireless communication in an industrial process control plant is the location of field engineers. The capability to locate personnel is a safety critical issue in process control plants because of...

    Provided By University of Texas

  • White Papers // Sep 2010

    A Virtual Network Approach for Testing Wireless Mesh in Industrial Process Control

    Unlike wired networks, the configuration and the behavior of wireless networks are heavily dependent on many environmental factors. While this makes it more difficult to build a wireless network testbed whose behavior is controllable, it also makes the availability of such a testbed all the more desirable. Indeed, the difficulties...

    Provided By University of Texas

  • White Papers // Sep 2009

    Online Scheduling Switch for Maintaining Data Freshness in Flexible Real-Time Systems

    Maintaining the temporal validity of real-time data is one of the crucial issues in a real-time database system. Past studies focus on designing algorithms to minimize imposed workload by a fixed set of update transactions while maintaining data freshness within validity intervals. In this paper the authors revisit this problem...

    Provided By University of Texas

  • White Papers // Mar 2011

    Modeling and Analysis of K-Tier Downlink Heterogeneous Cellular Networks

    Cellular networks are in a major transition from a carefully planned set of large tower-mounted BaseStations (BSs) to an irregular deployment of heterogeneous infrastructure elements that often additionally includes micro, pico, and femtocells, as well as distributed antennas. In this paper, the authors develop a tractable, flexible, and accurate model...

    Provided By University of Texas

  • White Papers // May 2009

    Coverage in Multi-Antenna Two-Tier Networks

    In two-tier networks comprising a conventional cellular network overlaid with shorter range hotspots (e.g. femtocells, distributed antennas, or wired relays) with universal frequency reuse, the near-far effect from cross-tier interference creates dead spots where reliable coverage cannot be guaranteed to users in either tier. Equipping the macrocell and femtocells with multiple...

    Provided By University of Texas

  • White Papers // Sep 2010

    Two-Way Transmission Capacity of Wireless Ad-Hoc Networks

    The transmission capacity of an ad-hoc network is the maximum density of active transmitters per unit area, given an outage constraint at each receiver for a fixed rate of transmission. Most prior work on finding the transmission capacity of ad-hoc networks has focused only on one-way communication where a source...

    Provided By University of Texas

  • White Papers // Sep 2009

    Diversity-Multiplexing Tradeoff of Network Coding With Bidirectional Random Relaying

    This paper develops a Diversity-Multiplexing Tradeoff (DMT) over a bidirectional random relay set in a wireless network where the distribution of all nodes is a stationary Poisson point process. This is a nontrivial extension of the DMT because it requires consideration of the cooperation (or lack thereof) of relay nodes,...

    Provided By University of Texas

  • White Papers // May 2009

    On the Secrecy Rate of Interference Networks Using Structured Codes

    This paper shows that structured transmission schemes are a good choice for secret communication over interference networks with an eavesdropper. Structured transmission is shown to exploit channel asymmetries and thus perform better than randomly generated codebooks for such channels. For a class of interference channels, the authors show that an...

    Provided By University of Texas

  • White Papers // Sep 2010

    Identity-Based Encryption Secure Against Selective Opening Attack

    The authors present the first Identity-Based Encryption (IBE) schemes that are proven secure against Selective Opening Attack (SOA). This means that if an adversary, given a vector of ciphertexts, adaptively corrupts some fraction of the senders, exposing not only their messages but also their coins, the privacy of the unopened...

    Provided By University of Texas

  • White Papers // Aug 2010

    Identifying Unnecessary Bounds Checks Through Block-Qualified Variable Elimination

    Java's memory safety relies on the Java Virtual Machine checking that each access to an array does not exceed the bounds of that array. When static analysis can determine that some array access instruction will never use an out-of-bounds index, the cost of dynamically checking the index each time...

    Provided By University of Texas

  • White Papers // Apr 2010

    Toward Practical Authorization-Dependent User Obligation Systems

    Many authorization system models include some notion of obligation. Little attention has been given to user obligations that depend on and affect authorizations. However, to be usable, the system must ensure users have the authorizations they need when their obligations must be performed. Prior work in this area introduced accountability...

    Provided By University of Texas

  • White Papers // Jul 2009

    An Optimal Boundary-Fair Scheduling Algorithm for Multiprocessor Real-Time Systems

    Although the scheduling problem for multiprocessor real-time systems has been studied for decades, it is still an evolving research field with many open problems. In this paper, focusing on periodic real-time tasks, the authors propose a novel optimal scheduling algorithm, namely Boundary fair (Bfair), which follows the same line of...

    Provided By University of Texas

  • White Papers // Mar 2011

    Secure Friend Discovery in Mobile Social Networks

    Mobile social networks extend social networks in the cyberspace into the real world by allowing mobile users to discover and interact with existing and potential friends who happen to be in their physical vicinity. Despite their promise to enable many exciting applications, serious security and privacy concerns have hindered wide...

    Provided By University of Texas

  • White Papers // Jun 2011

    XML Query Optimization in Map-Reduce

    The authors present a novel query language for large-scale analysis of XML data on a map-reduce environment, called MRQL, that is expressive enough to capture most common data analysis tasks and at the same time is amenable to optimization. The authors' evaluation plans are constructed using a small number of...

    Provided By University of Texas

  • White Papers // Mar 2011

    TPP: The Two-Way Password Protocol

    The need for secure communication in the Internet has led to the widespread deployment of secure application-level protocols. The current state-of-the-art is to use TLS, in conjunction with a password protocol. The password protocol, which the authors call a One-way Password Protocol (OPP), authenticates the client to the server, using...

    Provided By University of Texas

  • White Papers // Apr 2011

    The K-Observer Problem in Computer Networks

    For any non-negative integer K, a K-observer P of a network N is a set of nodes in N such that each message that travels at least K hops in N is handled (and so observed) by at least one node in P. A K-observer P of a network N...

    Provided By University of Texas

  • White Papers // Apr 2011

    Mechanizing the Expert Dense Linear Algebra Developer

    Sustained high performance on the fastest computers in the world has traditionally been accomplished by experts who carefully hand-code key routines. This quickly becomes unmanageable for large bodies of software and/or as architectures change enough that entire libraries need to be rewritten. The authors believe the problem is that a...

    Provided By University of Texas

  • White Papers // May 2011

    Regret-Freedom Isn't Free

    Cooperative, Peer-to-Peer (P2P) services, distributed systems consisting of participants from Multiple Administrative Domains (MAD), must deal with the threat of arbitrary (Byzantine) failures while incentivizing the cooperation of the potentially selfish (rational) nodes that such services rely on to function. Although previous work has generally agreed that these types of participants need to...

    Provided By University of Texas

  • White Papers // Nov 2011

    Efficient Similarity Search Over Encrypted Data

    In recent years, due to the appealing features of cloud computing, large amounts of data have been stored in the cloud. Although cloud-based services offer many advantages, the privacy and security of sensitive data are a major concern. To mitigate these concerns, it is desirable to outsource sensitive data...

    Provided By University of Texas

  • White Papers // Apr 2009

    Social Network Classification Incorporating Link Type Values

    Classification of nodes in a social network and its applications to security informatics have been extensively studied in the past. However, previous work generally does not consider the types of links (e.g., whether a person is a friend or a close friend) that connect social network members for classification purposes. Here,...

    Provided By University of Texas

  • White Papers // Nov 2008

    Using Anonymized Data for Classification

    In recent years, anonymization methods have emerged as an important tool to preserve individual privacy when releasing privacy sensitive data sets. This interest in anonymization techniques has resulted in a plethora of methods for anonymizing data under different privacy and utility assumptions. At the same time, there has been little...

    Provided By University of Texas

  • White Papers // Apr 2007

    Enforcing Honesty in Assured Information Sharing within a Distributed System

    The growing number of distributed information systems such as the Internet has created a need for security in data sharing. When several autonomous parties attempt to share data, there is not necessarily any guarantee that the participants will share data truthfully. In fact, there is often a large incentive to...

    Provided By University of Texas

  • White Papers // Feb 2006

    Sovereign Joins

    The authors present a secure network service for sovereign information sharing whose only trusted component is an off-the-shelf secure coprocessor. The participating data providers send encrypted relations to the service, which sends the encrypted results to the recipients. The technical challenge in implementing such a service arises from the...

    Provided By University of Texas

  • White Papers // Sep 2011

    Building Malware Infection Trees

    Dynamic analysis of malware is an ever-evolving and challenging task. A Malware infection Tree (MiT) can assist in analysis by identifying processes and files related to a specific malware sample. In this paper, the authors propose an abstract approach to building a comprehensive MiT based on rules describing execution...

    Provided By University of Texas

  • White Papers // Jul 2001

    Product Costing And Pricing Under Long Term Capacity Commitment

    This white paper discusses a model developed to analyze optimal product costing and pricing decisions when a firm must make long-term commitments to some activity resource capacities. The problem is complex because of interactions between the initial capacity choices and adjustments in product costs and prices...

    Provided By University of Texas

  • White Papers // Feb 2010

    Open Vs. Closed Access Femtocells in the Uplink

    Femtocells are assuming an increasingly important role in the coverage and capacity of cellular networks. In contrast to existing cellular systems, femtocells are end-user deployed and controlled, randomly located, and rely on third party backhaul (e.g., DSL or cable modem). Femtocells can be configured to be either open access or...

    Provided By University of Texas

  • White Papers // Sep 2011

    AOVis: A Model-Driven Multiple-Graph Approach to Program Fact Extraction for AspectJ/Java Source Code

    AspectJ reverse engineering and visualization remains a challenge at the architectural and design levels, with fewer tools available for reverse engineers compared to other languages such as Java. Prior work on AspectJ modeling focused on forward engineering or detailed-design reverse engineering, or required special instrumentation to identify cross-cutting relationships. Effective...

    Provided By University of Texas

  • White Papers // Jan 2010

    An Adaptive Algorithm for Dynamic Tuning of MAC Parameters in IEEE 802.15.4/ZigBee Sensor Networks

    Recent studies have highlighted that IEEE 802.15.4 based Wireless Sensor Networks (WSNs) suffer from a severe unreliability problem due to the default MAC parameter settings suggested by the standard, although with a more appropriate choice it is possible to achieve the desired reliability and better energy efficiency. However, such a setting...

    Provided By University of Texas

  • White Papers // Jul 2010

    Fast Neighbor Discovery With Lightweight Termination Detection in Heterogeneous Cognitive Radio Networks

    An important step in the initialization of a wireless ad hoc network is neighbor discovery in which every node attempts to determine the set of nodes it can communicate with in one wireless hop. In recent years, cognitive radio technology has gained attention as an attractive approach to alleviate the...

    Provided By University of Texas

  • White Papers // Dec 2010

    ASSERT: A Wireless Networking Testbed

    As wireless networking is becoming more pervasive, there has been a greater desire to develop communication hardware and protocol stacks that have a number of desirable properties like increased throughput, reduced latency, reduced energy consumption, quality of service, security, etc. Consequently, several academic and industrial research groups are actively working...

    Provided By University of Texas