Not sure, but:
- Linear algebra is useful for physics, graphics, machine learning, and data analysis.
- General algebra is the basis of pretty much all modern cryptography.
- Statistics is the basis for the humanities, machine learning, data analysis, manufacturing processes, economics, and so on.
- Optimization theory is useful for economics, machine learning, and engineering in general.
- Discrete mathematics (with graph theory) is the basis for computer science, algorithms, computer networks, and so on.
- Differential equations are the language of physics.
- Fourier (and Laplace) analysis is used in electrical engineering and in digital signal processing (quick sketch after the list).
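To make that last bullet concrete, here's a minimal sketch in plain NumPy (the signal and all the numbers are made up for illustration): an FFT picks the two tones hidden in a noisy-free toy signal.

```python
import numpy as np

# 1 second of signal sampled at 1 kHz: a 50 Hz tone plus a weaker 120 Hz tone.
fs = 1000                       # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)     # time axis, 1000 samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The FFT maps time-domain samples to amplitude per frequency bin.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest bins sit at 50 Hz and 120 Hz.
strongest = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(np.sort(strongest))       # -> [ 50. 120.]
```

That one idea (decompose a signal into frequencies) is basically the whole foundation of filtering, compression, and modulation in DSP.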
So I'd say statistics.
However, in most "high-end" fields you get most of these combined. Physics in general is pretty much linear algebra, calculus, and differential equations at once, and that's before statistics even comes in.
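A tiny illustration of that mix (my own toy example, parameters picked arbitrarily): a damped oscillator is a differential equation, but once you write it as a first-order linear system, linear algebra (matrix exponential, eigenvalues) does all the work.

```python
import numpy as np
from scipy.linalg import expm

# Damped harmonic oscillator: x'' + 2*zeta*omega*x' + omega**2 * x = 0.
# Rewrite as a first-order linear system y' = A @ y with y = [x, x'].
omega, zeta = 2.0, 0.1           # toy parameters for the example
A = np.array([[0.0, 1.0],
              [-omega**2, -2.0 * zeta * omega]])

# Differential equation + linear algebra: the exact solution is y(t) = expm(A*t) @ y0.
y0 = np.array([1.0, 0.0])        # start at x = 1 with zero velocity
for t in (0.0, 1.0, 2.0, 5.0):
    x, v = expm(A * t) @ y0
    print(f"t={t:>4}: x={x:+.4f}  v={v:+.4f}")

# The eigenvalues of A already tell you the behaviour without solving anything:
# negative real part = decay, nonzero imaginary part = oscillation.
print(np.linalg.eigvals(A))
```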
Or take machine learning, where you derive something from Bayes decision rules (statistics and probability), take derivatives and see where it converges (calculus), use optimization to find approximate parameters for the model (optimization, featuring second derivatives of whole matrices at once), all while working in a Hilbert space of possibly infinite dimension defined by a strange yet working inner product (general algebra + madness).
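Here's a minimal sketch of that combination: logistic regression fit with Newton's method. I'm leaving out the kernel/Hilbert-space part to keep it short, and the data and names are made up, but the pipeline is the one described above: a probability model gives a log-likelihood, calculus gives its gradient and Hessian, optimization takes Newton steps, and linear algebra solves them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (everything here is synthetic).
n, d = 200, 3
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -2.0, 0.5])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

# Logistic regression by Newton's method:
#   probability model (statistics)   -> log-likelihood
#   gradient and Hessian (calculus)  -> "second derivatives of a whole matrix at once"
#   Newton step (optimization)       -> solve a linear system (linear algebra)
w = np.zeros(d)
for _ in range(10):
    p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
    grad = X.T @ (p - y)                        # gradient of the negative log-likelihood
    H = X.T @ (X * (p * (1.0 - p))[:, None])    # Hessian of the negative log-likelihood
    w -= np.linalg.solve(H, grad)               # Newton update

print("true w:  ", true_w)
print("fitted w:", np.round(w, 2))
```

Swap the raw features for a kernel matrix and you'd be doing the infinite-dimensional Hilbert-space version of the same thing.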
So, there's that.