Topics: Information- and content-centric networking; Named Data Networking
 
 
	Authors: Michele Tortelli and Dario Rossi (Telecom ParisTech, France); Emilio Leonardi (Politecnico di Torino, Italy)
Presenter bio: 
		Emilio Leonardi is an Associate Professor at the Dipartimento di Elettronica of Politecnico di Torino. His research interests lie in the performance evaluation of computer networks and distributed systems, and in queueing theory.
 
 
	Abstract: Large-scale deployments of general cache networks, such as Content Delivery Networks or Information-Centric Networking architectures, raise new challenges for their performance evaluation in network planning. On the one hand, analytical models can hardly represent all the detailed interactions of complex replacement, replication, and routing policies on arbitrary topologies. On the other hand, the sheer size of networks and content catalogs makes event-driven simulation techniques inherently non-scalable. We propose a new technique for the performance evaluation of large-scale caching systems that intelligently integrates elements of stochastic analysis within a Monte Carlo simulation approach, which we colloquially refer to as ModelGraft. Our approach (i) leverages the intuition that complex scenarios can be mapped to a simpler equivalent scenario built upon Time-To-Live (TTL) caches; (ii) significantly downscales the scenario to lower computation and memory complexity while preserving its properties to limit accuracy loss; and (iii) is simple to use and robust, as it autonomously converges to a consistent state through a feedback-loop control system, regardless of the initial state. Performance evaluation shows that, with respect to classic event-driven simulation, ModelGraft gains over two orders of magnitude in both CPU time and memory complexity, while limiting accuracy loss below 2%. In addition, we show that ModelGraft extends performance evaluation well beyond the boundaries of classic approaches, enabling the study of Internet-scale scenarios with content catalogs comprising hundreds of billions of objects.
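To give a flavor of the TTL-cache abstraction the abstract builds on, the following is a minimal sketch, not the authors' ModelGraft implementation: a single cache where each object, once requested, remains valid for a fixed TTL (refreshed on every access), fed by Poisson request arrivals over a downscaled Zipf-distributed catalog. All names, parameter values, and modeling choices (unit arrival rate, Zipf exponent, catalog size) are illustrative assumptions, not taken from the paper.

```python
import random


class TTLCache:
    """Toy TTL cache: an object is a hit while its expiry lies in the future.

    Every access refreshes the expiry (TTL with reset). This is the simple
    equivalent model the abstract alludes to, not the full ModelGraft system.
    """

    def __init__(self, ttl):
        self.ttl = ttl
        self.expiry = {}  # object id -> absolute expiration time

    def request(self, obj, now):
        hit = self.expiry.get(obj, -1.0) > now
        self.expiry[obj] = now + self.ttl  # refresh on every access
        return hit


def simulate_hit_ratio(ttl, catalog_size, zipf_alpha, n_requests, seed=0):
    """Monte Carlo estimate of the hit ratio for one TTL cache.

    Assumptions (illustrative only): Poisson arrivals with unit rate and a
    Zipf popularity law over a small, downscaled catalog.
    """
    rng = random.Random(seed)
    objects = range(catalog_size)
    weights = [1.0 / (k + 1) ** zipf_alpha for k in objects]
    cache = TTLCache(ttl)
    now, hits = 0.0, 0
    for _ in range(n_requests):
        now += rng.expovariate(1.0)          # exponential inter-arrival times
        obj = rng.choices(objects, weights=weights)[0]
        hits += cache.request(obj, now)
    return hits / n_requests
```

With the same seed and request stream, a larger TTL can only add hits, so the estimated hit ratio grows monotonically with the TTL; tuning the TTL to match a target cache occupancy is, loosely speaking, the role the abstract assigns to the feedback-loop control.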