Analog is not THE future, although it does have A future in niche areas.
Analog has issues with precision. Analog computations are affected by component aging, component quality, and ambient temperature, and all of these result in a loss of data precision. The OP's video mentions this challenge and attempts to overcome it by converting back to digital over and over.
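To make that concrete, here's a rough toy sketch in Python (my own made-up numbers, nothing from the video): each analog stage applies a slightly wrong gain and adds a bit of noise, and converting back onto a digital grid between stages discards any error smaller than half a step, so it can't pile up.

```python
import random

STEP = 1 / 256  # hypothetical 8-bit grid used when converting back to digital

def analog_stage(x):
    gain_error = random.uniform(-0.002, 0.002)  # component aging / temperature drift
    noise = random.uniform(-0.0005, 0.0005)     # noise picked up on the signal path
    return x * (1 + gain_error) + noise

def redigitize(x):
    # Convert back to digital: error smaller than half a grid step is discarded.
    return round(x / STEP) * STEP

random.seed(1)
pure_analog = redigitized = 0.5
for _ in range(100):
    pure_analog = analog_stage(pure_analog)              # error compounds stage after stage
    redigitized = redigitize(analog_stage(redigitized))  # error is reset at every stage

print(f"pure analog after 100 stages:  {pure_analog:.5f}")
print(f"re-digitized after each stage: {redigitized:.5f}  (target 0.50000)")
```

Of course, in real hardware each of those conversions is an ADC/DAC step that costs power and die area, so the cleanup isn't free.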
Digital systems are more resistant to noise, both on the power and signal lines. Digital systems can detect internal failure and typically throw a system fault rather than output garbage data. Analog systems have to be externally tested to verify that output is within expected parameters for a given input, and input parameters not specifically tested can still produce garbage data. Retesting and adjustment have to be done frequently to avoid signal drift (loss of precision).
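Here's the noise-margin idea in a few lines of Python (toy thresholds I picked, not real logic-level specs): a digital input either resolves cleanly to a 0 or a 1, or the system can raise a fault when the level is out of spec, whereas the same noise lands straight in an analog result.

```python
import random

def digital_read(voltage):
    # A digital input either resolves cleanly to a 0 or a 1, or the system can
    # flag a fault when the level sits in the undefined band in between.
    if voltage >= 0.7:
        return 1
    if voltage <= 0.3:
        return 0
    raise RuntimeError("logic level out of spec: system fault")

random.seed(2)
bit_errors = 0
analog_reading = 0.75
for _ in range(1000):
    noise = random.uniform(-0.25, 0.25)
    if digital_read(1.0 + noise) != 1:  # noise stays inside the digital margin,
        bit_errors += 1                 # so the bit never flips and no fault fires
    analog_reading = 0.75 + noise       # the same noise lands directly in the analog value

print(f"digital bit errors out of 1000 noisy reads: {bit_errors}")
print(f"last analog reading: {analog_reading:.3f}  (should be 0.750)")
```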
Analog is hardwired (including the cool Mythic chips in the OP's post) and cannot be altered to perform a new function. The OP's video compares it to a GPU, but a fairer comparison would be an ASIC. The Mythic chips specifically are subject to cell leakage and wear, a bit like an SSD where every cell is used and there is no over-provisioning.
Analog systems achieve efficiency gains by applying special-purpose circuitry to solve one problem. The same can be done with digital systems for massive gains as well (https://en.wikipedia.org/wiki/Application-specific_integrated_circuit).
Analog is subject to the same limitations as an ASIC, but has many downsides that ASICs do not. The applications where analog systems show appreciable gains over an ASIC are niche.