In the early 1990s, JP Morgan developed software tools it packaged and sold as "RiskMetrics." It standardized the way most banks measured and reported their portfolio risk. For example, it became easy to determine how likely it was that a portfolio would lose, say, 5% of its value. This underlying concept, called Value at Risk (VaR), was baked into the RiskMetrics tool and quickly became the default methodology for measuring risk, not only among traders but also among regulators, bank management, and external portfolio managers and investors.
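To make the concept concrete, here's a minimal sketch of a parametric (normal-distribution) VaR calculation. The function name, the simulated returns, and the 95% level are purely illustrative; RiskMetrics' actual methodology layered on refinements like exponentially weighted volatility estimates and full covariance matrices.

    import numpy as np
    from scipy.stats import norm

    def parametric_var(returns, confidence=0.95):
        """One-day parametric VaR under a normal-returns assumption."""
        mu = np.mean(returns)
        sigma = np.std(returns, ddof=1)
        # Left-tail quantile of the fitted normal, expressed as a positive loss
        return -(mu + sigma * norm.ppf(1 - confidence))

    # Toy data: 1,000 simulated daily returns
    rng = np.random.default_rng(0)
    daily_returns = rng.normal(0.0005, 0.01, 1000)
    print(f"95% one-day VaR: {parametric_var(daily_returns):.2%} of portfolio value")

Read the output as "on 95% of days the portfolio should not lose more than this fraction of its value." Everything beyond that threshold is outside the model's statement, which is exactly where the trouble starts.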

The problem was that VaR had never really been tested in a real-world meltdown. Its models all assumed that, even under duress, markets acted rationally and traders' behavior was uncoordinated and uncorrelated. In 1998, the spectacular failure of Long-Term Capital Management was enabled by its reliance on VaR (among other things, like bad trades and enormous leverage). In a crisis, markets were NOT acting rationally, and neither were traders, causing losses much larger than anticipated by VaR. Unfortunately, people tend to repeat mistakes, so in 2007-8, once again, banks all over the world that relied on VaR calculations to tell them their portfolio risk was small and manageable soon found out that it wasn't.
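A toy simulation (not any bank's actual model) shows how badly a normal-distribution VaR can understate tail losses once returns are fat-tailed; the Student-t distribution here is just a stand-in for crisis-era behavior, before even accounting for correlated, forced selling:

    import numpy as np
    from scipy.stats import norm, t

    # Fat-tailed "true" returns: Student-t with 3 degrees of freedom,
    # rescaled to a 1% daily standard deviation.
    true_returns = t.rvs(df=3, size=100_000, random_state=42) * 0.01 / np.sqrt(3)

    # A Gaussian model fitted to the same data...
    mu, sigma = true_returns.mean(), true_returns.std(ddof=1)
    gaussian_var = -(mu + sigma * norm.ppf(0.001))   # 99.9% VaR

    # ...versus the loss actually realized at the 0.1th percentile.
    realized_var = -np.percentile(true_returns, 0.1)

    print(f"Gaussian 99.9% VaR:  {gaussian_var:.2%}")
    print(f"Realized 99.9% loss: {realized_var:.2%}")

The realized extreme loss comes out roughly twice what the Gaussian model predicts, and that gap is before adding the coordinated liquidations that LTCM and the 2008 banks actually faced.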

http://www.nytimes.com/2009/01/04/magazine/04risk-t.html

tl;dr: JP Morgan would be among the last companies I would trust with a software innovation reliant on new, breakthrough algorithms.




There's a big difference between calculation and automation though. This article starts with an operation that's a bit of both, but the big underlying theme here is the automation of data digestion.

For all the pomp and pedigree of banking, it's mostly just moving data or digesting data. Because of how complex banks are, they're easily a decade behind the times and still HEAVILY reliant on manual processes in Excel. Any real "streamlining" of processes is done through patchwork fixes, with no systemic reworks.

This article implies systemic reworks. Banks live and die on their NII (net interest income), and would gladly gut their back- and middle-office personnel with automation (most have already done so by outsourcing to India and Hungary). Truth be told, rightfully so, given how menial those jobs' tasks are.


>> The program does the mind-numbing job of interpreting commercial-loan agreements

A computer program that interprets written language -- natural-language processing -- relies on highly advanced, complex algorithms and AI to do the interpreting. This is much deeper than mindless "automation" and requires absolute trust in the accuracy of the underlying algo (or hiring back some of those humans).
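As a crude illustration of why accuracy is the whole game (this is not JP Morgan's actual system, just a made-up extractor over made-up clauses), even pulling a single interest rate out of loan language breaks the moment the wording shifts:

    import re

    # Hypothetical loan-agreement sentences and a naive extraction pattern
    clauses = [
        "The Loan shall bear interest at a rate of 4.25% per annum.",
        "Interest shall accrue at LIBOR plus 250 basis points.",
        "The applicable margin is four and one-quarter percent.",
    ]
    RATE_PATTERN = re.compile(r"(\d+(?:\.\d+)?)\s*%")

    for clause in clauses:
        match = RATE_PATTERN.search(clause)
        rate = match.group(1) + "%" if match else "NOT FOUND"
        print(f"{rate:>10}  <-  {clause}")

Only the first clause parses; a production system has to handle the second and third phrasings (and thousands more), which is exactly where the trust problem lives.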



