Humans generate information at an unprecedented pace, with some estimates suggesting that we now produce on the order of 10²¹ bytes of data per year, millions of times the amount of information in all the books ever written. Processing this "data deluge", as some have called it, requires new tools and new approaches at the interface of statistics, machine learning, network theory, and statistical physics.