Writing in Quanta Magazine, Jennifer Ouellette surveys the techniques researchers are developing to make sense of noisy and unstructured "big data."
Ouellette describes several piecemeal approaches, including topological data analysis (TDA), pioneered by Stanford's Gunnar Carlsson, and compressed sensing, a "mathematical version of Occam's razor."
As effective as these tools have been, though, some who deal in big data think it demands a more unified approach:
Yale University mathematician Ronald Coifman says that what is really needed is the big data equivalent of a Newtonian revolution, on par with the 17th-century invention of calculus, which he believes is already underway. It is not sufficient, he argues, to simply collect and store massive amounts of data; they must be intelligently curated, and that requires a global framework.
Read the piece.