Harvard University

  • White Papers // Aug 2010

    Social Networks, Personalized Advertising, And Privacy Controls

    This paper investigates how internet users' perception of control over their personal information affects how likely they are to click on online advertising. The paper uses data from a randomized field experiment that examined the relative effectiveness of personalizing ad copy to mesh with existing personal information on a social...

    Provided By Harvard University

  • White Papers // Jul 2010

    Unstable Equity? Combining Banking With Private Equity Investing

    Theoretical work suggests that banks can be driven by market mispricing to undertake activity in a highly cyclical manner, accelerating activity during periods when securities can be readily sold to other parties. While financial economists have largely focused on bank lending, banks are active in a variety of arenas, with...

    Provided By Harvard University

  • White Papers // Jul 2010

    Popularity Is Everything: A New Approach to Protecting Passwords From Statistical-Guessing Attacks

    The authors propose to strengthen user-selected passwords against statistical-guessing attacks by allowing users of Internet-scale systems to choose any password they want-so long as it's not already too popular with other users. They create an oracle to identify undesirably popular passwords using an existing data structure known as a count-min...

    Provided By Harvard University
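
The count-min sketch mentioned in the abstract above lends itself to a compact popularity oracle. A minimal sketch of the idea, not the paper's implementation; the class layout, hashing scheme, and popularity threshold below are illustrative assumptions:

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counter: estimates may overcount, never undercount."""
    def __init__(self, width=2**20, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _cells(self, item):
        # One hashed column per row; sha256 with a row salt stands in
        # for the pairwise-independent hash family (an assumption here).
        for row in range(self.depth):
            digest = hashlib.sha256(f"{row}:{item}".encode()).digest()
            yield row, int.from_bytes(digest[:8], "big") % self.width

    def add(self, item):
        for row, col in self._cells(item):
            self.table[row][col] += 1

    def estimate(self, item):
        # Minimum over rows bounds the true count from above.
        return min(self.table[row][col] for row, col in self._cells(item))

def password_allowed(sketch, password, threshold=5):
    """Reject a password once its estimated popularity reaches the threshold."""
    if sketch.estimate(password) >= threshold:
        return False
    sketch.add(password)
    return True
```

Because the sketch only overcounts, a too-popular password is never admitted by mistake; the cost is an occasional false rejection of a rare password.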

  • White Papers // Jul 2010

    The Task And Temporal Microstructure Of Productivity: Evidence From Japanese Financial Services

    Sustaining workers' productivity is critical to organizations' operational success. Yet, comparatively little attention has been given to how managers can effectively allocate work across tasks and time to improve workers' performance. In this paper, the author uses the learning curve framework to investigate how productivity varies within task and within...

    Provided By Harvard University

  • White Papers // Jul 2010

    Financing Risk And Bubbles Of Innovation

    Investors in risky startups who stage their investments face financing risk - that is, the risk that later-stage investors will not fund the startup, even if the fundamentals of the firm are still sound. The authors show that financing risk is part of a rational equilibrium where investors can flip...

    Provided By Harvard University

  • White Papers // Jul 2010

    A Comparative-Advantage Approach To Government Debt Maturity

    This paper studies optimal government debt maturity in a model where investors derive monetary services from holding riskless short-term securities. In a simple setting where the government is the only issuer of such riskless paper, it trades off the monetary premium associated with short-term debt against the refinancing risk implied...

    Provided By Harvard University

  • White Papers // Jul 2010

    A Macroprudential Approach To Financial Regulation

    Bank capital requirements were the cornerstone of financial regulation up until the global financial crisis. In the wake of the crisis there have been many calls for reform of financial regulation, with a first step having been recently passed in the United States. In this paper, the authors offer a...

    Provided By Harvard University

  • White Papers // May 2010

    Cyclicality Of Credit Supply: Firm Level Evidence

    The author estimates the effect of bank loan supply through the business cycle using firm-level data from 1990 to 2009. The contribution of the paper is to address two of the main empirical challenges in identifying the effects of bank credit supply. First, the analysis focuses on firms' choice between...

    Provided By Harvard University

  • White Papers // May 2010

    Demand Response Pricing In Organized Wholesale Markets

    The Federal Energy Regulatory Commission's Notice of Proposed Rulemaking (NOPR) addresses the question of proper compensation for demand response in organized wholesale electricity markets. The Commission proposes to pay the full Locational Marginal Price (LMP) for all types of demand response acting as a resource in the energy market. The...

    Provided By Harvard University

  • White Papers // May 2010

    Required Information Release

    Many computer systems have a functional requirement to release information. Such requirements are an important part of a system's information security requirements. Current information-flow control techniques are able to reason about permitted information flows, but not required information flows. In this paper, the authors introduce and explore the specification and...

    Provided By Harvard University

  • White Papers // May 2010

    Wireless Sensor Networks for Healthcare

    Driven by the confluence between the need to collect data about people's physical, physiological, psychological, cognitive, and behavioral processes in spaces ranging from personal to urban and the recent availability of the technologies that enable this data collection, wireless sensor networks for healthcare have emerged in recent years. In...

    Provided By Harvard University

  • White Papers // May 2010

    How Far Have Public Financial Management Reforms Come In Africa?

    This paper asks how strong African Public Financial Management (PFM) has become, after a decade and more of reform. How well do African PFM systems in place now facilitate effective public financial management? Where are the next challenges and how can they be met? It analyzes recent PFM assessments in...

    Provided By Harvard University

  • White Papers // May 2010

    The Intensive Margin Of Technology Adoption

    The authors present a tractable model for analyzing the relationship between economic growth and the intensive and extensive margins of technology adoption. The "Extensive" margin refers to the timing of a country's adoption of a new technology; the "Intensive" margin refers to how many units are adopted (for a given...

    Provided By Harvard University

  • White Papers // Apr 2010

    Improved Delegation of Computation Using Fully Homomorphic Encryption

    Following Gennaro, Gentry, and Parno (Cryptology ePrint Archive 2009/547), the authors use fully homomorphic encryption to design improved schemes for delegating computation. In such schemes, a delegator outsources the computation of a function F on many, dynamically chosen inputs x_i to a worker in such a way that it is...

    Provided By Harvard University

  • White Papers // Apr 2010

    Capacity Bounds on Multi-Pair Two-Way Communication With a Base-Station Aided by a Relay

    The simplest bi-directional relay network consists of a pair of terminal nodes that wish to exchange messages through the use of a single relay. While the capacity of this channel is still unknown in general, it has been of great recent interest (see the references therein) due to its relevance...

    Provided By Harvard University

  • White Papers // Apr 2010

    Arbitrage Capital And Real Investment

    The authors study the relationship between the supply of arbitrage capital and real investment. The investment of firms that depend on convertible debt for financing responds positively to flows into convertible arbitrage hedge funds. An extra $1 of fund flows increases capital expenditures of convertible dependent firms by $0.49. At...

    Provided By Harvard University

  • White Papers // Mar 2010

    Rewarding Calculated Risk-Taking: Evidence From A Series Of Experiments With Commercial Bank Loan Officers

    The authors analyze the underwriting process of small-business loans to entrepreneurs in an emerging market, using data obtained from a large commercial lender in India. Using a series of randomized experiments involving loan officers from several banks, this paper provides the first rigorous test of theories of loan officer decision...

    Provided By Harvard University

  • White Papers // Mar 2010

    Conceptual Foundations Of The Balanced Scorecard

    The paper was based on a multi-company research project to study performance measurement in companies whose intangible assets played a central role in value creation. The author believed that if companies were to improve the management of their intangible assets, they had to integrate the measurement of intangible assets into...

    Provided By Harvard University

  • White Papers // Mar 2010

    Ensuring Strong Dominance of the Leading Eigenvalues for Cluster Ensembles

    Spectral analysis is a popular mathematical tool in analyzing a variety of network and distributed systems. For a special class of networks, called cluster ensembles, which are made of interconnected clusters, the authors can characterize those which exhibit strong dominance of the leading eigenvalues in terms of the cluster structure....

    Provided By Harvard University
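
The eigenvalue dominance described above can be observed numerically. A toy illustration under assumed parameters, not the authors' characterization: an ensemble of k dense clusters joined by sparse links has k leading adjacency eigenvalues well separated from the bulk of the spectrum.

```python
import numpy as np

def cluster_ensemble_adjacency(k=3, n=20, p_in=0.9, p_out=0.02, seed=0):
    """Random graph of k dense n-node clusters joined by sparse inter-cluster links."""
    rng = np.random.default_rng(seed)
    labels = np.repeat(np.arange(k), n)
    probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
    A = (rng.random((k * n, k * n)) < probs).astype(float)
    A = np.triu(A, 1)
    return A + A.T              # symmetric adjacency, zero diagonal

A = cluster_ensemble_adjacency()
eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
# The k = 3 leading eigenvalues sit near n * p_in, well above the bulk.
```

With these (assumed) parameters the third eigenvalue is several times the fourth, which is the "strong dominance" the paper characterizes structurally.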

  • White Papers // Mar 2010

    The Impact Of Private Equity Ownership On Portfolio Firms' Corporate Tax Planning

    This paper investigates whether private equity (PE) firms influence the tax practices of their portfolio firms. Prior research documents that PE firms create economic value in portfolio firms through effective governance, financial, and operational engineering. Given PE firms' focus on value creation, the authors examine whether PE firms influence the...

    Provided By Harvard University

  • White Papers // Mar 2010

    Development As Leadership-Led Change - A Report For The Global Leadership Initiative And The World Bank Institute (WBI)

    Development involves change. But many development initiatives produce unimpressive levels of change in targeted countries, organizations and outcomes. This is the case in social sector initiatives, core public management reforms, and even macroeconomic adjustment operations. Change is often limited even when countries adopt proposed solutions in their proposed forms, in...

    Provided By Harvard University

  • White Papers // Mar 2010

    Population Aging And Economic Growth In China

    According to current UN projections, the population of the world age 60 or older will be 2 billion by 2050. With populations aging in nearly all countries, there has been widespread concern about the possible effects on economic growth and on the ability of countries to provide support for their...

    Provided By Harvard University

  • White Papers // Mar 2010

    Sign-Based Spectral Clustering

    Sign-based spectral clustering performs data grouping based on signs of components in the eigenvectors of the input. This paper introduces the concept of sign-based clustering, proves some of its basic properties and describes its use in applications. It is shown that for certain applications where a relatively small number of...

    Provided By Harvard University
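
The core step above is easy to state concretely: compute the leading eigenvectors of the input matrix and group rows by the sign pattern of their entries. A minimal sketch of that idea, my own illustration rather than the paper's algorithm:

```python
import numpy as np

def sign_based_clusters(W, m=2):
    """Group items by the sign pattern of their entries in the m leading eigenvectors."""
    _, vecs = np.linalg.eigh(W)      # eigenvalues returned in ascending order
    top = vecs[:, -m:]               # the m leading eigenvectors
    # Each eigenvector's global sign is arbitrary (+v vs -v), so the label
    # values are arbitrary, but the induced partition is stable.
    seen, labels = {}, []
    for pattern in map(tuple, (top >= 0).astype(int)):
        labels.append(seen.setdefault(pattern, len(seen)))
    return np.array(labels)
```

With m eigenvectors there are at most 2^m sign patterns, so this groups the data into at most 2^m clusters without any iterative assignment step.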

  • White Papers // Feb 2010

    How Hard Can It Be to Measure Phishing?

    Measuring cybercrime might be thought to be easy; if only the criminals and victims would cooperate and provide the data. This short paper explains, by reference to 'phishing' criminality, how even when a great deal of data is available it is far from clear quite what should be measured. Examples...

    Provided By Harvard University

  • White Papers // Feb 2010

    Towards Query Interoperability: PASSing PLUS

    The authors describe their experiences importing PASS provenance into PLUS. Although both systems import and export provenance that conforms to the Open Provenance Model (OPM), the two systems vary greatly with respect to the granularity of provenance captured, how much semantic knowledge the system contributes, and the completeness of provenance...

    Provided By Harvard University

  • White Papers // Feb 2010

    Graption: Graph-Based P2P Traffic Classification at the Internet Backbone

    Monitoring network traffic and classifying applications are essential functions for network administrators. Current traffic classification methods can be grouped in three categories: flow-based (e.g., packet sizing/timing features), payload-based, and host-based. Methods from all three categories have limitations, especially when it comes to detecting new applications, and classifying traffic at the...

    Provided By Harvard University

  • White Papers // Feb 2010

    Self Control, Risk Aversion, And The Allais Paradox

    This paper develops a dual-self model that is compatible with modern dynamic macroeconomic theory and evidence, and shows how it leads to a wide range of behavioral anomalies concerning risk, including the Allais paradox. The authors calibrate the model to obtain a quantitative fit, by extending the simpler "Nightclub" model...

    Provided By Harvard University

  • White Papers // Feb 2010

    Tracking Back References in a Write-Anywhere File System

    Many file systems reorganize data on disk, for example to defragment storage, shrink volumes, or migrate data between different classes of storage. Advanced file system features such as snapshots, writable clones, and deduplication make these tasks complicated, as moving a single block may require finding and updating dozens, or even...

    Provided By Harvard University

  • White Papers // Feb 2010

    Mediation In Business-Related Human Rights Disputes: Objections, Opportunities And Challenges

    In his 2008 report to the United Nations Human Rights Council, Prof. John Ruggie, the Special Representative of the UN Secretary-General (SRSG) for Business and Human Rights, set out a three-part framework to advance a shared understanding of the complex interactions between companies and human rights. The framework, subsequently endorsed...

    Provided By Harvard University

  • White Papers // Feb 2010

    Will I Stay Or Will I Go? Cooperative And Competitive Effects Of Workgroup Sex And Race Composition On Turnover

    The authors develop an integrated theory of the social identity mechanisms linking workgroup sex and race composition across levels with individual turnover. Analyzing longitudinal human resource data on professionals employed in a large up-or-out knowledge organization, they assess the distinct effects of demographic match with superiors and demographic match with...

    Provided By Harvard University

  • White Papers // Jan 2010

    Green Computing: Can Computation Advance Sustainability? An Exploration

    The Green Computing Project was launched in January 2009 as a development and networking effort intended to bring the IIC's multidisciplinary approach to computational science and engineering to bear on the challenges of environmental sustainability. Over the longer term, the project aimed to catalyze faculty efforts around two main goals:...

    Provided By Harvard University

  • White Papers // Jan 2010

    Does Product Market Competition Lead Firms To Decentralize?

    In this paper the authors investigate whether greater product market competition increases decentralization. To tackle the issues, they collected detailed information on the internal organization of firms across nations. The few datasets that exist are either from a single industry or (at best) across many firms in a single country....

    Provided By Harvard University

  • White Papers // Jan 2010

    The Drawdown Of Personal Retirement Assets

    How households draw down the balances that they accumulate in retirement saving accounts such as 401(k) plans and Individual Retirement Accounts can have an important effect on the contribution of these accounts to retirement income security. This paper presents evidence on the pattern of withdrawals at different ages. The authors...

    Provided By Harvard University

  • White Papers // Jan 2010

    The Effect Of Financial Development On The Investment-Cash Flow Relationship: Cross-Country Evidence From Europe

    The authors investigate if financial development eases firm level financing constraints in a cross-country data set covering much of the European economy. The cash flow sensitivity of investment is lower in countries with better-developed financial markets. To deal with potentially serious biases, they employ a difference-in-difference methodology. Subsidiaries of other...

    Provided By Harvard University

  • White Papers // Dec 2009

    Estimating the Compression Fraction of an Index using Sampling

    Data compression techniques such as null suppression and dictionary compression are commonly used in today's database systems. In order to effectively leverage compression, it is necessary to have the ability to efficiently and accurately estimate the size of an index if it were to be compressed. Such an analysis is...

    Provided By Harvard University
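
The sampling idea above can be sketched in a few lines. This is a stand-in illustration: it compresses a random sample of pages with zlib and extrapolates the ratio, whereas the paper targets database index compression schemes such as null suppression and dictionary coding; the sample size and use of zlib are my assumptions.

```python
import random
import zlib

def estimate_compressed_fraction(pages, sample_size=50, seed=0):
    """Estimate compressed/uncompressed size ratio from a random sample of pages."""
    rng = random.Random(seed)
    sample = rng.sample(pages, min(sample_size, len(pages)))
    raw = sum(len(p) for p in sample)
    packed = sum(len(zlib.compress(p)) for p in sample)
    return packed / raw   # extrapolated to the whole index
```

Compressing only a sample avoids the cost of compressing the entire index just to decide whether compression is worthwhile, at the price of sampling error when page compressibility varies widely.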

  • White Papers // Dec 2009

    P2P Trading in Social Networks: The Value of Staying Connected

    The success of future P2P applications ultimately depends on whether users will contribute their bandwidth, CPU and storage resources to a larger community. In this paper, the authors propose a new incentive paradigm, Networked Asynchronous Bilateral Trading (NABT), which can be applied to a broad range of P2P applications. In...

    Provided By Harvard University

  • White Papers // Dec 2009

    Improving QoS in BitTorrent-Like VoD Systems

    In recent years a number of research efforts have focused on effective use of P2P-based systems in providing large scale video streaming services. In particular, live streaming and Video-on-Demand (VoD) systems have attracted much interest. While previous efforts mainly focused on the common challenges faced by both types of applications,...

    Provided By Harvard University

  • White Papers // Dec 2009

    Policy Bundling To Overcome Loss Aversion: A Method For Improving Legislative Outcomes

    Policies that would create net benefits for society but would also involve costs frequently lack the necessary support to be enacted because losses loom larger than gains psychologically. To reduce the harmful consequence of loss aversion, the authors propose a new type of policy bundling technique in which related bills...

    Provided By Harvard University

  • White Papers // Dec 2009

    Climate Finance: Key Concepts And Ways Forward

    Climate finance is fundamental to curbing anthropogenic climate change. Compared, however, to the negotiations over emissions reduction timetables, commitments, and architectures, climate finance issues have received only limited and belated attention. Assuring delivery and appropriate use of the financial resources needed to achieve emissions reductions and secure adaptation to climate...

    Provided By Harvard University

  • White Papers // Nov 2009

    The Phish Market Protocol: Securely Sharing Attack Data Between Competitors

    A key way in which banks mitigate the effects of phishing is to remove fraudulent websites or suspend abusive domain names. This 'take-down' is often subcontracted to specialist firms. Prior work has shown that these take-down companies refuse to share 'Feeds' of phishing website URLs with each other, and consequently,...

    Provided By Harvard University

  • White Papers // Oct 2009

    An Adaptive Rate Assignment Strategy for CDMA2000 IS-856 Subject to RAB Delay

    CDMA2000 1xEV-DO (IS-856) is designed for high-performance wireless networks to provide high-speed service at low cost. IS-856 uses the same bandwidth as traditional CDMA2000 but provides higher throughput; this translates to a very high data rate on the downlink channel. In this paper, the problem of resource allocation in IS-856 uplink in...

    Provided By Harvard University

  • White Papers // Feb 2012

    Modeling Internet-Scale Policies for Cleaning Up Malware

    The results of the authors' experiments using ASIM indicate that when filtering wicked traffic, the best targets for intervention are a small group of the largest ASes. Specifically, they find that intervention by the top 0.2% of ASes (in terms of size) is more effective than intervention by a randomly...

    Provided By Harvard University

  • White Papers // Mar 2013

    A Highly Scalable Key Pre-Distribution Scheme for Wireless Sensor Networks

    Given the sensitivity of the potential WSN applications and because of resource limitations, key management emerges as a challenging issue for WSNs. One of the main concerns when designing a key management scheme is the network scalability. Indeed, the protocol should support a large number of nodes to enable a...

    Provided By Harvard University

  • White Papers // Feb 2011

    Evaluating Value-Graph Translation Validation for LLVM

    Translation validators are static analyzers that attempt to verify that program transformations preserve semantics. Normalizing translation validators do so by trying to match the value-graphs of an original function and its transformed counterpart. In this paper, the authors present the design of such a validator for LLVM's intra-procedural optimizations, a...

    Provided By Harvard University

  • White Papers // May 2012

    Verifiable Computation With Massively Parallel Interactive Proofs

    As the cloud computing paradigm has gained prominence, the need for verifiable computation has grown increasingly urgent. Protocols for verifiable computation enable a weak client to outsource difficult computations to a powerful, but untrusted server, in a way that provides the client with a guarantee that the server performed the...

    Provided By Harvard University

  • White Papers // Aug 2012

    Compressive Sensing Medium Access Control for Wireless LANs

    The authors propose a Medium Access Control (MAC) protocol for wireless Local Area Networks (LANs) that leverages the theory of compressive sensing. The proposed Compressive Sensing MAC (CS-MAC) exploits the sparse property that, at a given time, only a few hosts are expected to request radio channel access. Under...

    Provided By Harvard University

  • White Papers // Apr 2012

    Performance Gains in Conjugate Gradient Computation With Linearly Connected GPU Multiprocessors

    Conjugate gradient is an important iterative method used for solving least squares problems. It is compute-bound and generally involves only simple matrix computations. One would expect that the authors could fully parallelize such computation on the GPU architecture with multiple Stream Multiprocessors (SMs), each consisting of many SIMD processing units....

    Provided By Harvard University
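
For reference, the serial algorithm being parallelized above is short; each iteration is dominated by one matrix-vector product, which is the part that maps onto the GPU's stream multiprocessors. A plain NumPy sketch of the classical method, not the authors' GPU implementation:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A by conjugate gradient."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p           # the dominant cost per iteration: one matvec
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

Everything outside the matvec is a handful of vector operations and two dot products, which is why distributing the matvec across multiprocessors is the natural parallelization target.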

  • White Papers // Aug 2012

    Determining RF Angle of Arrival Using COTS Antenna Arrays: A Field Evaluation

    The authors are interested in estimating the angle of arrival of an RF signal by using Commercial-Off-The-Shelf (COTS) Software-Defined Radios (SDRs). The proposed COTS-based approach has the advantages of flexibility, low cost and ease of deployment, but - unlike traditional phased antenna arrays in which elements are already phase-aligned -...

    Provided By Harvard University

  • White Papers // Dec 2011

    Compressive Sensing Based Channel Feedback Protocols for Spatially-Correlated Massive Antenna Arrays

    Incorporating wireless transceivers with numerous antennas (such as Massive-MIMO) is a prospective way to increase the link capacity or enhance the energy efficiency of future communication systems. However, the benefits of such approach can be realized only when proper channel information is available at the transmitter. Since the amount of...

    Provided By Harvard University

  • White Papers // Jul 2012

    A Chip Architecture for Compressive Sensing Based Detection of IC Trojans

    The authors present a chip architecture for a compressive sensing based method that can be used in conjunction with the JTAG standard to detect IC Trojans. The proposed architecture compresses chip output resulting from a large number of test vectors applied to a Circuit Under Test (CUT). They describe their...

    Provided By Harvard University

  • White Papers // Aug 2012

    Output Compression for IC Fault Detection Using Compressive Sensing

    The process of detecting logical faults in Integrated Circuits (ICs) due to manufacturing variations is bottlenecked by the I/O cost of scanning in test vectors and offloading test results. Traditionally, the output bottleneck is alleviated by reducing the number of bits in output responses using XOR networks, or computing signatures...

    Provided By Harvard University
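
The XOR-network baseline the abstract mentions can be illustrated directly: fold an n-bit output response into a few parity bits, at the cost of aliasing when an even number of flipped bits land in the same parity group. A toy sketch, where the `i % width` group assignment is my assumption:

```python
def xor_compact(response_bits, width):
    """Fold an n-bit test response into `width` parity bits (an XOR network)."""
    out = [0] * width
    for i, bit in enumerate(response_bits):
        out[i % width] ^= bit   # output pin i feeds parity group i mod width
    return out
```

A single flipped output bit always flips exactly one parity bit and is detected; two flips that land in the same group cancel and go unnoticed, which is the aliasing problem compressive approaches aim to reduce.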

  • White Papers // Aug 2012

    Compressive Sensing With Optimal Sparsifying Basis and Applications in Spectrum Sensing

    The authors describe a method of integrating Karhunen-Loeve Transform (KLT) into compressive sensing, which can as a result improve the compression ratio without affecting the accuracy of decoding. They present two complementary results: by using KLT to find an optimal basis for decoding they can drastically reduce the number of...

    Provided By Harvard University

  • White Papers // Sep 2012

    Compressed Statistical Testing and Application to Radar

    The authors present Compressed Statistical Testing (CST) with an illustrative application to radar target detection. They characterize an optimality condition for a compressed domain test to yield the same result as the corresponding test in the uncompressed domain. They demonstrate by simulation that under high SNR, a likelihood ratio test...

    Provided By Harvard University

  • White Papers // Aug 2011

    Partitioned Compressive Sensing With Neighbor-Weighted Decoding

    Compressive sensing has gained momentum in recent years as an exciting new theory in signal processing with several useful applications. It states that signals known to have a sparse representation may be encoded and later reconstructed using a small number of measurements, approximately proportional to the signal's sparsity rather than...

    Provided By Harvard University
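
The reconstruction step underlying these compressive sensing abstracts can be sketched with a standard decoder. Orthogonal Matching Pursuit is one common choice, shown here as a generic illustration rather than the decoder used in the paper:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = Phi @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column of Phi most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit on the chosen support and update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x
```

The number of rows of Phi (measurements) can be far below the signal dimension, roughly proportional to the sparsity k times a log factor, which is the sub-Nyquist property these papers build on.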

  • White Papers // Aug 2011

    Measurement Combining and Progressive Reconstruction in Compressive Sensing

    Compressive sensing has emerged as an important new technique in signal acquisition due to the surprising property that a sparse signal can be captured from measurements obtained at a sub-Nyquist rate. The decoding cost of compressive sensing, however, grows superlinearly with the problem size. In distributed sensor systems, the aggregate...

    Provided By Harvard University

  • White Papers // May 2011

    Separation-Based Joint Decoding in Compressive Sensing

    The authors introduce a joint decoding method for compressive sensing that can simultaneously exploit sparsity of individual components of a composite signal. Their method can significantly reduce the total number of variables decoded jointly by separating variables of large magnitudes in one domain and using only these variables to represent...

    Provided By Harvard University

  • White Papers // Jul 2011

    DISTROY: Detecting Integrated Circuit Trojans With Compressive Measurements

    Detecting Trojans in an Integrated Circuit (IC) is an important but hard problem. A Trojan is malicious hardware - it can be extremely small in size and dormant until triggered by some unknown circuit state. To allow wake-up, a Trojan could draw a minimal amount of power, for example, to...

    Provided By Harvard University

  • White Papers // Jun 2012

    Designing Informative Securities

    The authors create a formal framework for the design of informative securities in prediction markets. These securities allow a market organizer to infer the likelihood of events of interest as well as if he knew all of the traders' private signals. They consider the design of markets that are always...

    Provided By Harvard University

  • White Papers // Jan 2011

    Abstract Predicates and Mutable ADTs in Hoare Type Theory

    Hoare Type Theory (HTT) combines a dependently typed, higher-order language with monadically-encapsulated, stateful computations. The type system incorporates pre- and post-conditions, in a fashion similar to Hoare and Separation Logic, so that programmers can modularly specify the requirements and effects of computations within types. This paper extends HTT with quantification...

    Provided By Harvard University

  • White Papers // Nov 2010

    A Study on the Optimal Degree-of-Freedoms of Cellular Networks: Opportunistic Interference Mitigation

    The authors introduce an Opportunistic Interference Mitigation (OIM) protocol for cellular networks, where a user scheduling strategy is utilized in uplink K-cell environments with time-invariant channel coefficients and Base Stations (BSs) having M receive antennas. In the OIM scheme, each BS opportunistically selects a set of users who generate the...

    Provided By Harvard University

  • White Papers // Apr 2011

    Benchmarking File System Benchmarking: It is Rocket Science

    The quality of file system benchmarking has not improved in over a decade of intense research spanning hundreds of publications. Researchers repeatedly use a wide range of poorly designed benchmarks, and in most cases, develop their own ad-hoc benchmarks. The authors' community lacks a definition of what they want to...

    Provided By Harvard University

  • White Papers // Apr 2011

    Multicore OSes: Looking Forward From 1991, er, 2011

    Upcoming multicore processors, with hundreds of cores or more in a single chip, require a degree of parallel scalability that is not currently available in today's system software. Based on prior experience in the supercomputing sector, the likely trend for multicore processors is away from shared memory and toward shared...

    Provided By Harvard University

  • White Papers // May 2011

    Provenance Integration Requires Reconciliation

    While there has been a great deal of research on provenance systems, there has been little discussion about challenges that arise when making different provenance systems interoperate. In fact, most of the literature focuses on provenance systems in isolation and does not discuss interoperability - what it means, its requirements,...

    Provided By Harvard University

  • White Papers // May 2011

    Provenance Map Orbiter: Interactive Exploration of Large Provenance Graphs

    Provenance systems can produce enormous provenance graphs that can be used for a variety of tasks from determining the inputs to a particular process to debugging entire workflow executions or tracking difficult-to-find dependencies. Visualization can be a useful tool to support such tasks, but graphs of such scale (thousands to...

    Provided By Harvard University

  • White Papers // May 2011

    Collecting Provenance Via the Xen Hypervisor

    The Provenance Aware Storage Systems project (PASS) currently collects system-level provenance by intercepting system calls in the Linux kernel and storing the provenance in a stackable filesystem. While this approach is reasonably efficient, it suffers from two significant drawbacks: each new revision of the kernel requires reintegration of PASS changes,...

    Provided By Harvard University

  • White Papers // Jan 2011

    A Non-Work-Conserving Operating System Scheduler for SMT Processors

    Simultaneous Multi-Threading (SMT) processors run multiple threads simultaneously on a single processing core. Because concurrent threads compete for the processor's shared resources, non-work-conserving scheduling, i.e., running fewer threads than the processor allows even if there are threads ready to run, can often improve performance. Nevertheless, conventional operating systems do not...

    Provided By Harvard University

  • White Papers // Oct 2011

    Information-Theoretic Limits of Dense Underwater Networks

    Information-theoretic throughput scaling laws are analyzed in an underwater acoustic network with n regularly located nodes on a unit square, in which both bandwidth and received signal power can be severely limited. A narrow-band model is assumed where the carrier frequency is allowed to scale as a function of n....

    Provided By Harvard University

  • White Papers // Jan 2011

    On the Zero-Error Capacity Threshold for Deletion Channels

    The authors consider the zero-error capacity of deletion channels. Specifically, they consider the setting where they choose a codebook C consisting of strings of n bits, and their model of the channel corresponds to an adversary who may delete up to pn of these bits for a constant p. Their...
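The zero-error criterion in this setting has a clean combinatorial form (phrased here in common notation, not necessarily the paper's): the adversary can confuse two codewords exactly when both can be deleted down to the same string, so a zero-error code must keep all pairwise longest common subsequences short.

\[
x, y \in C,\; x \neq y \text{ are confusable} \iff \mathrm{LCS}(x, y) \ge (1 - p)n,
\]

hence zero-error communication requires \(\mathrm{LCS}(x, y) < (1 - p)n\) for every pair of distinct codewords.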

    Provided By Harvard University

  • White Papers // Sep 2009

    An Improved Analysis of the Lossy Difference Aggregator

    The authors provide a detailed analysis of the Lossy Difference Aggregator, a recently developed data structure for measuring latency in a router environment where packet losses can occur. Their analysis provides stronger performance bounds than those given originally and leads to a model for how to optimize the parameters...
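The basic mechanism behind a Lossy Difference Aggregator can be sketched in a few lines. This is a hedged illustration of the idea, not the paper's analysis: sender and receiver hash packets into buckets of (timestamp-sum, count); a bucket is usable only if its counts agree on both sides (no loss landed in it), and mean one-way delay is estimated from the usable buckets alone. The bucket count and hash here are illustrative, not the tuned parameters the analysis optimizes.

```python
# Hypothetical single-bank LDA sketch: buckets of (timestamp-sum, count) on
# sender and receiver; buckets hit by a loss have mismatched counts and are
# discarded, so surviving buckets yield an unbiased delay estimate.
NUM_BUCKETS = 16

def bucket(pkt_id):
    return pkt_id % NUM_BUCKETS  # illustrative hash; real LDAs use a seeded hash

def run_interval(packets, delay, lost):
    send = [[0.0, 0] for _ in range(NUM_BUCKETS)]
    recv = [[0.0, 0] for _ in range(NUM_BUCKETS)]
    for pkt_id, t_send in packets:
        b = bucket(pkt_id)
        send[b][0] += t_send
        send[b][1] += 1
        if pkt_id not in lost:              # lost packets never update recv
            recv[b][0] += t_send + delay
            recv[b][1] += 1
    # Estimate delay from buckets whose counts match on both sides.
    num = den = 0.0
    for (s_sum, s_cnt), (r_sum, r_cnt) in zip(send, recv):
        if s_cnt == r_cnt and s_cnt > 0:
            num += r_sum - s_sum
            den += s_cnt
    return num / den if den else None

packets = [(i, float(i)) for i in range(1000)]
est = run_interval(packets, delay=0.5, lost={3, 77})
print(est)
```

Losing packets 3 and 77 poisons only their two buckets; the other fourteen still agree on both sides and recover the true 0.5 delay.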

    Provided By Harvard University

  • White Papers // Jun 2009

    Some Open Questions Related to Cuckoo Hashing

    Hash-based data structures and algorithms are currently a booming industry in the Internet, particularly for applications related to measurement, monitoring, and security. Hash tables and related structures, such as Bloom filters and their derivatives, are used billions of times a day, and new uses keep proliferating. Indeed, one of the...
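For readers unfamiliar with the structure the survey centers on, a cuckoo hash table gives each key two candidate slots and resolves collisions by evicting and re-placing occupants. The sketch below is only to illustrate that mechanism; the table size, integer-only keys, and eviction bound are arbitrary choices for the example, not recommendations.

```python
# Minimal cuckoo hash table with two tables and two hash functions.
# Keys are ints here so the two slot functions are simple and deterministic.

class CuckooHash:
    def __init__(self, size=11):
        self.size = size
        self.tables = [[None] * size, [None] * size]

    def _slots(self, key):
        # Two candidate positions, one per table (toy hash functions).
        return [key % self.size, (key // self.size) % self.size]

    def lookup(self, key):
        return any(self.tables[i][s] == key
                   for i, s in enumerate(self._slots(key)))

    def insert(self, key, max_kicks=32):
        if self.lookup(key):
            return True
        i = 0
        for _ in range(max_kicks):
            s = self._slots(key)[i]
            if self.tables[i][s] is None:
                self.tables[i][s] = key
                return True
            # Slot occupied: evict the occupant and re-place it in the
            # other table, continuing the chain of displacements.
            key, self.tables[i][s] = self.tables[i][s], key
            i = 1 - i
        return False  # eviction chain too long; a real table would rehash

t = CuckooHash()
for k in [1, 2, 3, 14]:
    assert t.insert(k)
print(t.lookup(3), t.lookup(99))  # True False
```

Inserting 14 evicts 3 from its first-table slot and re-places it in the second table, which is the constant-worst-case-lookup behavior that makes cuckoo hashing attractive for measurement and monitoring workloads.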

    Provided By Harvard University

  • White Papers // Aug 2010

    Speculative Pipelining for Compute Cloud Programming

    MapReduce job execution typically occurs in sequential phases of parallel steps. These phases can experience unpredictable delays when available computing and network capacities fluctuate or when there are large disparities in inter-node communication delays, as can occur on shared compute clouds. The authors propose a pipeline-based scheduling strategy, called speculative...

    Provided By Harvard University

  • White Papers // Aug 2011

    Collaborative Compressive Spectrum Sensing in a UAV Environment

    Spectrum sensing is of fundamental importance to many wireless applications including cognitive radio channel assignment and radiolocation. However, conventional spectrum sensing can be prohibitively expensive in computation and network bandwidth when the bands under scanning are wide and highly contested. In this paper, the authors propose distributed spectrum sensing with...

    Provided By Harvard University

  • White Papers // May 2011

    Achieving High Throughput Ground-to-UAV Transport Via Parallel Links

    Wireless data transfer under high mobility, as found in Unmanned Aerial Vehicle (UAV) applications, is a challenge due to varying channel quality and extended link outages. The authors present FlowCode, an easily deployable link-layer solution utilizing multiple transmitters and receivers for the purpose of supporting existing transport protocols such as...

    Provided By Harvard University

  • White Papers // Mar 2010

    Sign-Based Spectral Clustering

    Sign-based spectral clustering performs data grouping based on signs of components in the eigenvectors of the input. This paper introduces the concept of sign-based clustering, proves some of its basic properties and describes its use in applications. It is shown that for certain applications where a relatively small number of...
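The idea in the abstract — grouping data by the signs of eigenvector components — can be seen in a toy example. This is an illustration of the general principle, not the paper's published algorithm: two tight groups with weak cross-similarity get opposite signs in the second eigenvector of the similarity matrix, so the sign alone recovers the grouping.

```python
# Toy sign-based grouping: cluster points by the sign of components in an
# eigenvector of the similarity matrix. Group sizes and the cross-similarity
# value eps are arbitrary illustrative choices.
import numpy as np

# Similarity matrix: a 3-point group and a 4-point group, weakly linked.
n1, n2, eps = 3, 4, 0.1
S = np.full((n1 + n2, n1 + n2), eps)
S[:n1, :n1] = 1.0
S[n1:, n1:] = 1.0

# eigh returns eigenvalues in ascending order; take the second-largest
# eigenvector, which splits the two groups by sign.
vals, vecs = np.linalg.eigh(S)
v = vecs[:, -2]

labels = (v > 0).astype(int)
print(labels)  # one sign for points 0-2, the other for points 3-6
```

No distance threshold or iterative assignment is needed: reading off signs is the entire clustering step, which is what makes the approach cheap for the applications the paper targets.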

    Provided By Harvard University

  • White Papers // Aug 2010

    A Location-Dependent Runs-and-Gaps Model for Predicting TCP Performance Over a UAV Wireless Channel

    In this paper, the authors use a finite-state model to predict the performance of the Transmission Control Protocol (TCP) over a varying wireless channel between an Unmanned Aerial Vehicle (UAV) and ground nodes. As a UAV traverses its flight path, the wireless channel may experience periods of significant packet loss,...
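The "runs-and-gaps" idea — bursts of loss separated by loss-free periods — is easiest to see in a stationary two-state Markov process of the Gilbert-Elliott flavor. This is a hedged sketch of that general model class, not the paper's location-dependent fit; the transition probabilities are arbitrary illustrative values.

```python
# Two-state Markov packet-loss process: the channel alternates between a good
# state (no loss) and a bad state (every packet dropped), producing loss runs
# and gaps whose lengths are geometrically distributed.
import random

def simulate(n, p_good_to_bad=0.1, p_bad_to_good=0.5, seed=1):
    rng = random.Random(seed)
    state = "good"
    losses = []
    for _ in range(n):
        losses.append(state == "bad")          # drop every packet in bad state
        if state == "good" and rng.random() < p_good_to_bad:
            state = "bad"
        elif state == "bad" and rng.random() < p_bad_to_good:
            state = "good"
    return losses

losses = simulate(100_000)
rate = sum(losses) / len(losses)
print(round(rate, 3))  # near the stationary bad-state probability 0.1 / (0.1 + 0.5)
```

A location-dependent model such as the paper's would additionally vary these transition probabilities along the UAV's flight path, so that loss runs lengthen where the channel is poor.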

    Provided By Harvard University