Search algorithms on networks are important tools for the organisation of large data sets. A key example is Google PageRank, which assigns a weight to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of measuring each document's relative importance within the set. The weights are assigned via exploration: a page that is linked to by many pages with a high rank receives a high rank itself. Complex networks are modelled as random graphs, and search algorithms are modelled as random walks, which move along the network by randomly picking an edge incident to the currently visited vertex and jumping to the vertex at its other end.
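As a minimal sketch of the walk dynamics just described, the following is one way to simulate a simple random walk on a static graph; the adjacency-dict representation, function name, and toy 4-cycle are illustrative assumptions, not part of the project.

```python
import random

def simple_random_walk(adj, start, steps, rng=None):
    """Simulate a simple random walk on a graph given as an adjacency
    dict: at each step, pick a uniformly random neighbour of the
    current vertex and move there. Returns the visited vertices."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    path = [start]
    v = start
    for _ in range(steps):
        v = rng.choice(adj[v])  # uniform choice among incident edges
        path.append(v)
    return path

# Toy example (hypothetical): a 4-cycle 0-1-2-3-0.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
path = simple_random_walk(adj, start=0, steps=10)
```

Each consecutive pair in `path` is an edge of the graph, which is the defining property of such a walk.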
The goal of the project is to analyse the long-time behaviour of different classes of random walks (simple, non-backtracking, with resets) on different classes of sparse random graphs, evolving randomly over time according to different types of edge-rewiring dynamics. Key questions concern the characterisation of mixing times and cover times. Depending on the relative speeds at which the random graph and the random walk evolve, different speeds of mixing are expected.
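To make one of the walk classes above concrete, here is a sketch of a non-backtracking random walk, which at each step avoids immediately reversing the edge it just traversed; the representation and fallback at degree-1 vertices are illustrative assumptions, not the project's definitions.

```python
import random

def non_backtracking_walk(adj, start, steps, rng=None):
    """Non-backtracking random walk on a graph given as an adjacency
    dict: at each step choose a uniformly random neighbour other than
    the previously visited vertex (falling back to backtracking only
    when the current vertex has no other neighbour)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    path = [start]
    prev, v = None, start
    for _ in range(steps):
        choices = [u for u in adj[v] if u != prev] or adj[v]
        prev, v = v, rng.choice(choices)
        path.append(v)
    return path

# Toy example (hypothetical): on a 4-cycle the walk, once started,
# circles the cycle deterministically, since backtracking is forbidden.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
path = non_backtracking_walk(adj, start=0, steps=10)
```

Forbidding immediate reversals typically speeds up exploration on sparse graphs, which is one reason non-backtracking walks are studied alongside simple ones.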
The project is anchored in probability theory, but is of interest also in computer science for the design of exploration algorithms on complex networks. Mathematical techniques include the theory of Markov processes, coupling methods, combinatorial path-counting arguments, and branching process approximations.
| Supervisors | Frank den Hollander (LU), Luca Avena (LU), Remco van der Hofstad (TU/e) |
| PhD Student | Oliver Nagy |
| Location | Universiteit Leiden (UL) |