Network Measurement

Traffic Localization

Assessing Traffic Localization at a Stub AS and its Implications

A measurement study that assesses the localization of Internet traffic delivered to UOnet, the University of Oregon campus network, and its impact on connection throughput.

Serving user requests from nearby caches or servers has been a powerful technique for localizing Internet traffic, with the intent of providing lower delay and higher throughput to end users while also lowering costs for network operators. This basic concept has led to the deployment of different types of infrastructures of varying degrees of complexity that large CDNs, ISPs, and content providers operate to localize their user traffic. Prior measurement studies in this area have focused mainly on revealing these deployed infrastructures, reverse-engineering the techniques these companies use to map end users to close-by caches or servers, or evaluating the performance benefits that “typical” end users experience from well-localized traffic.

To our knowledge, there has been no empirical study that assesses the nature and implications of traffic localization as experienced by end users at an actual stub-AS. We conduct such a study for the stub-AS UOnet (AS3582), a Research & Education network operated by the University of Oregon. Based on a complete flow-level view of the traffic delivered from the Internet to UOnet, we characterize the stub-AS’s traffic footprint (i.e., a detailed assessment of the locality of the traffic delivered by all major content providers), examine how effectively individual content providers utilize their built-out infrastructures to localize the traffic they deliver to UOnet, and investigate the impact of traffic localization on the throughput perceived by end users served by UOnet. Our empirical findings offer valuable insights into important practical aspects of content delivery to real-world stub-ASes such as UOnet.
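To illustrate the kind of flow-level analysis this involves, the following minimal sketch aggregates delivered bytes per content provider into coarse locality buckets based on the round-trip time to the serving IP. The input format (a CSV with provider, rtt_ms, and bytes columns) and the bucket thresholds are illustrative assumptions, not the actual UOnet dataset or toolchain.

    # Minimal sketch (illustrative only): summarize traffic locality per content
    # provider from flow records. The field names (provider, rtt_ms, bytes) and
    # RTT thresholds are assumptions, not the actual UOnet data format.
    import csv
    from collections import defaultdict

    # Coarse locality buckets keyed on the minimum RTT to the serving IP.
    BUCKETS = [(10.0, "on-net/metro"), (40.0, "regional"), (float("inf"), "distant")]

    def bucket_for(rtt_ms):
        # Map an RTT measurement to the first bucket whose threshold it falls under.
        for limit, label in BUCKETS:
            if rtt_ms <= limit:
                return label

    def locality_footprint(flow_csv):
        # Accumulate delivered bytes per (provider, locality bucket) pair.
        footprint = defaultdict(int)
        with open(flow_csv, newline="") as f:
            for row in csv.DictReader(f):
                label = bucket_for(float(row["rtt_ms"]))
                footprint[(row["provider"], label)] += int(row["bytes"])
        return footprint

    if __name__ == "__main__":
        for (provider, label), total in sorted(locality_footprint("flows.csv").items()):
            print(f"{provider:20s} {label:15s} {total / 1e9:8.2f} GB")

Grouping delivered bytes this way yields a per-provider locality footprint that can then be correlated with the throughput observed by end users.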


Publications

Team

  • Bahador Yeganeh (UO)
  • Prof. Reza Rejaie (UO)
  • Dr. Walter Willinger (NIKSUN Inc.)

Funding

This material is based upon work supported by the National Science Foundation (NSF) Award NeTS-1320977. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Web Complexity

This measurement study re-assesses the complexity of the 2,000 most popular web pages and presents observed trends in their complexity characteristics over recent years.

During the past decade, the Web has become increasingly popular and thus more important for the delivery of content and services over the Internet. At the same time, the number of requested objects, their sizes, and their delivery mechanisms for popular websites have become more complex. This in turn has various implications, including an impact on page load time that directly affects the experience of visiting users. Therefore, it is important to capture and characterize the complexity of popular web pages. An earlier study by Butkiewicz et al. characterized the complexity of 1,700 popular pages in 2011.

In this study, we adopt the methodology proposed by Butkiewicz et al., develop the required tools, and conduct a detailed measurement study to re-assess the complexity of 2,000 popular web pages and present observed trends in their complexity characteristics over the past four years. Our results show that the number of requested objects and contacted servers for each website has increased significantly, and that a growing number of the contacted servers are associated with third parties. Despite these changes, page load time remains largely unchanged and is primarily affected by the same key parameters. Overall, our results shed useful light on trends in website complexity and motivate a range of issues for further exploration.
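As a concrete example of the complexity metrics discussed above, the following minimal sketch derives the number of requested objects, distinct contacted servers, and third-party servers from a HAR capture of a single page load. The file name and the simple domain-suffix check for "third party" are illustrative assumptions; the study's actual measurement tools differ.

    # Minimal sketch (illustrative only): derive basic page-complexity metrics
    # from a HAR capture of a single page load. The HAR path and the simple
    # suffix-based third-party check are assumptions, not the study's tooling.
    import json
    from urllib.parse import urlparse

    def complexity_metrics(har_path, site_domain):
        # Count requested objects, distinct contacted servers, and servers that
        # do not belong to the visited site's own domain.
        with open(har_path) as f:
            entries = json.load(f)["log"]["entries"]

        servers = {urlparse(e["request"]["url"]).hostname for e in entries}
        servers.discard(None)
        third_party = {h for h in servers
                       if h != site_domain and not h.endswith("." + site_domain)}
        return {
            "objects": len(entries),
            "servers": len(servers),
            "third_party_servers": len(third_party),
        }

    if __name__ == "__main__":
        print(complexity_metrics("example.com.har", "example.com"))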

Publications

Team

  • Ran Tian (UO)
  • Prof. Reza Rejaie (UO)