IS23: Inference in Stochastic Networks

date: 7/15/2025, time: 14:00-15:30, room: ICS 119

Organizer: Michel Mandjes (Leiden University)

Chair: Michel Mandjes (Leiden University)

TBA

(vacant)

Tal Goldshtein, Elad Domanovitz and Anatoly Khina (Tel Aviv University)

The rapid expansion of wireless connectivity and the Internet of Things (IoT) has heightened the need for low-latency communication in distributed networks, where strict delay constraints redefine traditional capacity limits. Information Velocity (IV) has emerged as a critical metric for quantifying the speed at which information can reliably propagate through cascaded and relay networks under such constraints. While prior work focused on single-message transmission, real-world applications often involve continuous message streams, necessitating a broader analytical framework. In this work, we derive IV bounds for canonical network topologies, including parallel, InTree and OutTree networks, modeled as independent servers with i.i.d. exponential processing times and Poisson message arrivals (independent of the servers' processing times). We derive bounds on the anycast and multicast IV for continuous message streams. Our results provide theoretical insight into delay-limited communication, with practical applications in distributed AI, computing, and real-time edge inference.
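As an illustrative sketch only (not the analysis from the talk), the basic setting can be simulated directly: a Poisson stream of messages traverses a cascade of independent servers with i.i.d. exponential processing times, and an empirical "velocity" is the number of hops traversed per unit of end-to-end delay. All function and parameter names here are hypothetical, chosen for the example.

```python
import random

def tandem_delay(n_servers, mu, lam, n_msgs, seed=0):
    """Illustrative sketch: simulate a Poisson(lam) message stream
    through a cascade of n_servers independent servers with i.i.d.
    Exp(mu) processing times; return the mean end-to-end delay."""
    rng = random.Random(seed)
    # Poisson arrivals: exponential inter-arrival times with rate lam.
    t = 0.0
    arrivals = []
    for _ in range(n_msgs):
        t += rng.expovariate(lam)
        arrivals.append(t)
    # depart[k] = departure time of the most recent message from server k.
    depart = [0.0] * n_servers
    delays = []
    for a in arrivals:
        t = a
        for k in range(n_servers):
            # Service starts once both the message and the server are free.
            t = max(t, depart[k]) + rng.expovariate(mu)
            depart[k] = t
        delays.append(t - a)
    return sum(delays) / len(delays)

# Empirical "velocity": hops traversed per unit of end-to-end delay.
d = tandem_delay(n_servers=20, mu=1.0, lam=0.3, n_msgs=2000)
velocity = 20 / d
```

In a stable cascade each hop costs at least one mean service time, so the empirical velocity stays below the service rate mu; the bounds in the talk make this trade-off precise for more general topologies and for anycast/multicast streams.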

Inference in dynamic random graphs

Michel Mandjes (Leiden University)

The bulk of the random graph literature concerns models that are inherently static, in that features of the random graph at a single point in time are considered. There are strong practical motivations, however, to consider random graphs that evolve stochastically, so as to model networks’ inherent dynamics. In this talk I’ll start by briefly discussing a set of dynamic random graph mechanisms and their probabilistic properties. Key results cover functional diffusion limits for subgraph counts (describing the behaviour around the mean) and a sample-path large-deviation principle (describing the rare-event behaviour, thus extending the seminal result for the static case developed by Chatterjee and Varadhan). The main part of my talk will be about estimating the model parameters from partial information. For instance, we demonstrate how the model’s underlying parameters can be estimated from snapshots of the number of edges alone. We also consider settings in which particles move around on a dynamically evolving random graph, and in which the graph dynamics are inferred from the movements of the particles (i.e., without observing the graph process itself).
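A toy version of the snapshot-based estimation idea (a minimal sketch, not the estimator from the talk) can be set up for the simplest dynamic Erdős–Rényi mechanism: each potential edge independently appears and disappears at fixed rates, and only the total edge count is observed at snapshot times. All names and parameters below are hypothetical, chosen for the example.

```python
import random

def simulate_edge_counts(n, q_on, q_off, dt, n_snap, seed=1):
    """Toy dynamic Erdos-Renyi graph on n vertices: each of the
    N = n(n-1)/2 potential edges independently appears at rate q_on
    and disappears at rate q_off. Return the edge counts observed at
    snapshot times k*dt (small-step discretized simulation)."""
    rng = random.Random(seed)
    N = n * (n - 1) // 2
    p_stat = q_on / (q_on + q_off)
    edges = [rng.random() < p_stat for _ in range(N)]  # start in stationarity
    p_on = q_on * dt    # per-step flip probabilities for a small step dt
    p_off = q_off * dt
    counts = []
    for _ in range(n_snap):
        for i in range(N):
            if edges[i]:
                if rng.random() < p_off:
                    edges[i] = False
            elif rng.random() < p_on:
                edges[i] = True
        counts.append(sum(edges))
    return counts

# Moment estimator of the stationary edge probability from snapshots alone.
counts = simulate_edge_counts(n=40, q_on=0.2, q_off=0.8, dt=0.1, n_snap=500)
N = 40 * 39 // 2  # 780 potential edges
p_hat = sum(counts) / (len(counts) * N)  # near q_on/(q_on+q_off) = 0.2
```

The time-averaged edge count only pins down the stationary edge probability; recovering the individual rates q_on and q_off additionally requires temporal information, e.g. the autocorrelation of the snapshot sequence, which is the kind of partial-information inference the talk addresses.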