On May 6, 2010, the stock market plunged 4%, then fell another 6% in mere
minutes before mysteriously rebounding almost as quickly. A reactive computer
execution system, responding to someone else's trade, had sold roughly $2
billion worth of shares in just seven minutes, and the ensuing panic exposed
the fragility of our stock market. After
review, the U.S. Securities and Exchange Commission (SEC) and the
Commodity Futures Trading Commission (CFTC) determined that a
computer algorithm was to blame for the incident, and measures have
been put in place to prevent computers from causing such sudden,
volatile swings in the market. Today, markets depend on the volume
generated by high-speed traders and their computers, but the
computers have no sense of when to intervene during a
crisis; they are entirely oblivious to the catastrophic effects their
own actions may cause. While it is obvious that computers
are an integral part of the stock market, have we allowed them too
much of a role in our fragile economy? What kind of oversight is
there? What are the failsafes? In the end, the computers were only
following their programming, regardless of the possible outcome. As
computers have been integrated into vital components of our
civilization's infrastructure, it is chilling to consider how
far-reaching the effects of a single computer error can be on our way of life.
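
To make that failure mode concrete, here is a minimal sketch in Python of the kind of automated sell algorithm regulators described: one that targets a fixed fraction of recent trading volume and keeps selling no matter what happens to the price. The SimulatedMarket class, the participation rate, and the max_drop failsafe are all hypothetical illustrations, not anyone's actual trading system; the point is only that a few lines of code decide whether the machine ever stops to ask if something has gone wrong.

import random

class SimulatedMarket:
    """Toy market model: selling pressure pushes the price down. Purely illustrative."""

    def __init__(self, price=1150.0):
        self.price = price

    def last_price(self):
        return self.price

    def volume_last_minute(self):
        # Pretend volume observed over the previous minute.
        return random.randint(20_000, 60_000)

    def sell(self, shares):
        # Crude price impact: the more we sell, the lower the price goes.
        self.price *= 1 - shares / 5_000_000


def run_sell_algorithm(market, total_shares, participation_rate=0.09, max_drop=None):
    """Sell total_shares by targeting a fixed fraction of recent volume.

    With max_drop=None the algorithm keeps selling regardless of price --
    the behavior described above. Setting max_drop adds a simple
    circuit-breaker-style failsafe that halts the selling once the price
    has fallen too far below its starting level.
    """
    reference = market.last_price()
    remaining = total_shares
    while remaining > 0:
        if max_drop is not None and market.last_price() < reference * (1 - max_drop):
            break  # failsafe: stop and let a human (or a market-wide rule) step in
        order = min(remaining, int(market.volume_last_minute() * participation_rate))
        market.sell(order)
        remaining -= order
    return market.last_price()


if __name__ == "__main__":
    print("no failsafe:  ", round(run_sell_algorithm(SimulatedMarket(), 4_000_000), 2))
    print("with failsafe:", round(run_sell_algorithm(SimulatedMarket(), 4_000_000, max_drop=0.05), 2))

Run as-is, the version without a failsafe happily sells into a collapsing price, while the version with one stops after a modest decline; the difference is a single conditional check, which is exactly the kind of oversight the questions above are asking about.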