40% of all profit is not enough?

Cees Binkhorst ceesbink at XS4ALL.NL
Tue Sep 22 09:40:16 CEST 2009


REPLY TO: D66 at nic.surfnet.nl

A while ago someone made off with millions by diverting the rounding
differences on computer-generated invoices into a separate account of his own.
This looks very much like the same thing.
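
As a rough illustration of how those rounding fractions add up, here is a
minimal Python sketch; the invoice amounts and the volume are invented for
the example:

    # Minimal sketch of the classic "salami slicing" rounding scheme: bill the
    # customer the rounded-down amount, quietly pocket the fraction.
    from decimal import Decimal, ROUND_DOWN

    invoices = [Decimal("19.987"), Decimal("7.412"), Decimal("103.555")] * 1_000_000

    skimmed = Decimal("0")
    for amount in invoices:
        billed = amount.quantize(Decimal("0.01"), rounding=ROUND_DOWN)
        skimmed += amount - billed

    print(f"skimmed from {len(invoices):,} invoices: {skimmed:,.2f}")
    # -> 14,000.00 from three million invoices, a fraction of a cent at a time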

Regards / Cees

http://tpmcafe.talkingpointsmemo.com/talk/blogs/john_hempton/2009/07/high-frequency-traders-a-phone.php
Why do I get the feeling that the financial sector (Wall Street etc.) is a
separate country - First World, extremely wealthy - while the country of
the American people is a Third/Fourth World country to be exploited for
its wealth in natural resources, cheap labor, pelf etc. by the First World
country.
We are obviously experiencing a new form of imperialism - domestic
imperialism? The question that remains, however, is which 'country' the
government supports.

http://arstechnica.com/tech-policy/news/2009/08/nyse-builds-computer-trading-mothership-worries-abound.ars
Computer-trading worries grow as NYSE builds new datacenter
The NYSE is building a massive datacenter in New Jersey that should
significantly boost the percentage of stock market activity that's nothing
but computers trading against one another for millisecond profits. Critics
are worried that more such trading could destabilize the markets, or worse
By Jon Stokes  | Last updated August 2, 2009 9:00 PM CT

With all of the scrutiny that high-frequency trading is now under in the
media and in Congress, the New York Stock Exchange is probably none too
thrilled that the Wall Street Journal has uncovered fresh details of
NYSE's giant new datacenter, which the exchange is building in a former
New Jersey quarry. The new datacenter will significantly advance the
amount of computer-automated trading that already dominates global
markets, housing as it will "several football fields of cutting-edge
computing equipment for hedge funds and other firms that engage in
high-frequency trading," according to the WSJ. So if you were recently
shocked to learn that an estimated 70 percent of stock trading is just
computers trading against one another, get ready for that number to go
even higher.

The NYSE is reportedly already taking orders from firms that want to lease
space in the datacenter so that they can co-locate their servers with
those that run the exchange in order to execute trades more quickly.
Actually building and running a datacenter of this size is new for the
NYSE, which historically has rented space in others' datacenters. As for
the impact this will have on the markets, the debate rages.
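
To get a feel for why physical proximity matters at these time scales, here
is a rough back-of-the-envelope calculation in Python; the distances and the
roughly two-thirds-of-light-speed signal velocity in fiber are ballpark
assumptions, not figures from the article:

    # One-way propagation delay over optical fiber, where signals travel at
    # roughly two-thirds the speed of light. Distances are illustrative.
    SPEED_IN_FIBER_M_PER_S = 2.0e8

    for label, distance_m in [("co-located, same datacenter", 100),
                              ("across the metro area, ~15 km", 15_000),
                              ("Chicago to New Jersey, ~1,300 km", 1_300_000)]:
        delay_ms = distance_m / SPEED_IN_FIBER_M_PER_S * 1_000
        print(f"{label}: ~{delay_ms:.3f} ms one way")

A co-located server shaves whole milliseconds off every round trip compared
to a distant one, which is exactly the scale of advantage these strategies
compete on.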

Not everyone is happy that the NYSE is poised to massively boost the
already overwhelming amount of computer trading. At issue is not the
simple fact of computers trading against one another over electronic
networks—it's the speed with which they appear to be squeezing the humans
out of the loop, and the potential instability and fragility that may
result from increased automation of global markets.

Yes, there is actually something new here

For every technique or technology that comes under the heading of HFT, you
can dig up an example of how people did this same thing on a much smaller
scale without computers. Therefore, the argument goes, the relatively
recent (see below) use of computers to do two orders of magnitude more of
these activities in a given timeslice is "nothing new," despite the fact
that the computers are now doing this among themselves without human
intervention. This kind of reasoning keeps cropping up all over the
Internet in the current HFT debates, in some instances coming from very
smart people. (Tyler Cowen's response at Marginal Revolution is the best
statement of this view.)

Others, however, are concerned by HFT's rise to dominance. Quant guru Paul
Wilmott, in a widely cited NYT op-ed this last week, says that it's
exactly the potential systemic impact of the recent acceleration of the
speed and volume of HFT activity that has him worried.

"There's nothing new in using all publicly available information to help
you trade," Wilmott acknowledges, in an initial nod to the naysayers. But
he continues: "What's novel is the quantity of data available, the
lightning speed at which it is analyzed, and the short time that positions
are held... The problem with the sudden popularity of high-frequency
trading is that it may increasingly destabilize the market."

Michael Durbin, the man behind Citadel's high-frequency trading desk,
echoed this warning to Reuters' Matthew Goldstein earlier this week:

    "You have multiple HFT trading firms and sometimes their agendas are
complementary and sometimes they're not," explains Durbin, director of
HFT research with Blue Capital Group, a small Chicago-based options
trading firm.

    "There could be a time where these HFT programs unintentionally
collaborate and you have a two- or three-minute period where the
markets are going crazy. Then other traders respond to it and it
simply gets out of control."

My own take on the question was summed up in a recent note to Felix Salmon:

    It's quite remarkable to me that many of the econ and finance folks
who insist that "HFT is the same thing we always did, just way faster"
don't seem to realize that frequency and amplitude matter a whole lot,
and that for any given phenomenon when you suddenly increase those two
factors by an order of magnitude you typically end up with something
very different than what you started with. This is true for isolated
phenomena, and it's doubly true for complex systems, where you have to
deal with systemic effects like feedback loops and
synchronization/resonance.

    What I've noticed anecdotally is that engineers and IT pros are more
concerned about HFT than people who just handle money for a living.
These guys have a keen sense for just how fragile and unpredictable
these systems-of-systems are even under the best of conditions, and
how when things go wrong they do so spectacularly and at very
inconvenient moments (they get paid a lot of money to rush into the
office to put out fires at 4am).

    There's an analogy here with e-voting, which I did quite a bit of work
on. In the e-voting fiasco, you had people who were specialists in
elections but who had little IT experience greenlighting what they
thought was an elections systems rollout, but in actuality they had
signed on for a large IT deployment and they had no idea what they
were getting into. To them, it was just voting, but with computers,
y'know? They found out the hard way that networked computer systems
are a force multiplier not just for human capabilities, but for human
limitations, as well.

In sum, the growing concern with HFT is not that computers are doing
something that people used to do; it's that they're doing a whole lot of
it, very rapidly, so that the market as a system-of-systems now has
starkly different frequency, amplitude, and connectedness characteristics
that may (or may not) give it a new, currently unknown set of emergent
properties. And if a fuse blows while the machines are driving the bus,
we're all traveling so fast that we may hit a wall before the humans in
the vehicle have time to react.
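
To make the speed-versus-reaction-time point concrete, here is a toy sketch
in Python. It is not a market model; the per-tick drift, the tick rate, and
the two-minute human reaction time are invented for illustration (the
two-minute figure anticipates the Lime Brokerage estimate quoted below):

    # Toy illustration: how far a runaway feedback loop can carry a price
    # before a human notices. All parameters are assumptions for the sketch.
    drift_per_tick = 1e-5        # fractional price drop per machine "tick"
    ticks_per_second = 1_000     # machine-speed order/quote rate
    human_reaction_s = 120       # time before a human can intervene

    ticks = ticks_per_second * human_reaction_s
    price = 100.0 * (1.0 - drift_per_tick) ** ticks
    print(f"price after {human_reaction_s}s of unchecked feedback: {price:.2f} "
          "(started at 100.00)")
    # -> roughly 30, i.e. a ~70% slide before anyone touches a keyboard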

Reaction time, or what could possibly go wrong

If something does go wrong, the market moves so quickly now that by the
time the humans intervene, the damage could already be too great to keep
contagion and feedback effects from kicking in and pushing things further
south. This is possible because HFT, by design, is capable of moving the
markets much more rapidly than humans can possibly react to those moves.
Again, speed matters, especially in activities where human judgment plays
a critical role.

An example of what could go wrong happened on September 8, 2008, when a
six-year-old news story about United Airlines' bankruptcy was somehow
republished on Google News with a current timestamp. A source tells me
that the algo traders kicked in at some point and started dumping United's
stock, which eventually lost 75 percent of its value that day. About $1
billion of United's market cap evaporated in 12 minutes, before the humans
figured out that they were looking not at a bankruptcy, but at the type of
GIGO problem (garbage in, garbage out) that's well known in computer
science circles. Much of the stock's value was restored by the next day,
when the market had sorted itself out, but experts worry that something
like this could happen on a much larger scale, especially in response to a
genuine external shock.
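
The failure mode is easy to sketch. Below is a purely hypothetical
illustration in Python of the kind of sanity check that was evidently
missing; the headline, dates, and decision rule are invented for the example:

    # Hypothetical sketch of the GIGO failure: a news-driven rule that trusts
    # the feed's (re)publication timestamp and never looks at the story's
    # original date.
    from datetime import datetime, timedelta

    def should_dump(headline: str, feed_timestamp: datetime,
                    original_publication: datetime) -> bool:
        """Naive rule: sell on any bankruptcy headline that looks fresh."""
        looks_fresh = datetime.now() - feed_timestamp < timedelta(hours=1)
        # Missing check: is original_publication anywhere near feed_timestamp?
        return "bankruptcy" in headline.lower() and looks_fresh

    # A years-old story resurfaces with a current feed timestamp and sails through.
    print(should_dump("UAL files for bankruptcy",
                      feed_timestamp=datetime.now(),
                      original_publication=datetime(2002, 12, 9)))   # -> True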

A paper on the dangers of HFT by Lime Brokerage (uncovered by Zero Hedge)
paints the following picture of just how rapidly things might go awry in
the present environment:

    Lime's familiarity with high speed trading allows us to benchmark some
of the fastest computer traders on the planet, and we have seen
[computerized day trading (CDT)] order placement rates easily exceed
1,000 orders per second. Should a CDT algorithm go awry, where a large
amount of orders are placed erroneously or where the orders should not
have passed order validation, the Sponsor will incur a substantial
time-lag in addressing the issue.

    From the moment the Sponsor's representative detects the problem until
the time the problematic orders can be addressed by the Sponsor, at
least two minutes will have passed. The Sponsor's only tools to
control Sponsored Access flow are to log into the Trading Center's
website (if available), place a phone call to the Trading Center, or
call the Sponsee to disable trading and cancel these erroneous
orders—all sub-optimal processes which require human intervention.
With a two minute delay to cancel these erroneous orders, 120,000
orders could have gone into the market and been executed, even though
an order validation problem was detected previously. At 1,000 shares
per order and an average price of $20 per share, $2.4 billion of
improper trades could be executed in this short timeframe. (emphasis
added)
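
A quick back-of-the-envelope check of Lime's arithmetic, using only the
figures quoted above:

    # Reproducing the Lime Brokerage worst-case estimate quoted above.
    orders_per_second = 1_000    # "order placement rates easily exceed 1,000 orders per second"
    reaction_time_s = 120        # "at least two minutes will have passed"
    shares_per_order = 1_000
    price_per_share = 20.0

    orders = orders_per_second * reaction_time_s
    notional = orders * shares_per_order * price_per_share
    print(f"{orders:,} erroneous orders, ${notional:,.0f} of improper trades")
    # -> 120,000 erroneous orders, $2,400,000,000 of improper trades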

A final point: when there is a chorus of Wall Streeters and economists
telling us in regard to some new, very profitable milestone in complexity
and innovation, "don't worry, the odds of anything going wrong here
are passing slim," does this really reassure anyone in 2009? I think the
burden of proof should be on people like Cowen to demonstrate that
long-term "buy and hold" can peacefully coexist with the degree of opaque,
computer-driven hyperspeculation the markets are barreling towards.

Addendum: the landscape has changed just this year

One of the things that perhaps wasn't clear enough from my previous
article on high-frequency trading (HFT) is that, while people have been
using computers to trade against one another for a decade or more, the
bigger HFT picture has evolved rapidly in just the past few months.

This recent HedgeWeek article tells the typical story that you'll find on
various trading blogs, and it goes something like the following:

Algo trading was on the rise going into the summer of 2007, when
the correlations that the algo traders depend on suddenly began to break
down. The August 2007 "quant quake" was a shakeout that had a catastrophic
impact on a number of the larger funds, and statistical arbitrage strategies
continued to perform very poorly throughout 2008. As a result of this
underperformance, stat arb was generally out of favor throughout 2008,
until the strategy started working again earlier this year. In the past
few months, algo trading has enjoyed a resurgence in popularity, with
current market conditions apparently very favorable to stat arbs in
particular.

But the fact that, throughout the first half of this year, everyone has
been piling back onto the algorithmic trading bandwagon willy-nilly isn't
the only major development in HFT. It's also the case that the 2008 crisis
had the effect of radically shrinking the competitive landscape, leaving
the field to fewer out-sized players whose HFT efforts are being scaled
up: Lehman Bros., Bear Stearns, and Merrill Lynch had all announced major
HFT plans for 2008, and in 2009 these banks no longer exist. Then there's
the flash order issue, which flared up this year and is right now creating
a stir among the exchanges and lawmakers.

All of this together adds up to a significant shift in the high-frequency
trading terrain just this year. And this is happening to markets that are
still coping with the relatively recent rise of electronic communication
networks (ECNs), which are largely (in terms of trading volume) a
post-2000 phenomenon.

If we take a step back and look solely at the percentage of traditional
"open outcry" trades (i.e., a crowd of traders, hollering and
gesticulating at one another) vs. electronic trades on the major
exchanges, we can get a sense for just how fast the market has changed in
the last decade. In 2001, some 90 percent of trading volume on the NYSE
and the Chicago Mercantile Exchange (CME) was carried out by humans on the
trading floor. In 2009, the open outcry volume is under 10 percent on both
exchanges.

Stock and commodity exchanges were based on the open outcry system for
over 100 years, but in just nine years the ratio of open outcry to
electronic trading on these two major exchanges swapped from about 90/10
to about 10/90. That's a lot of computerization in a little time; and as
the NYSE datacenter news shows, the pace of change is accelerating. The
markets are different now, and they're getting more different more
rapidly. The post-crisis Wall Street of 2009 is, in effect, in the process
of doubling down on the preceding 9 years of speed, automation,
consolidation, and leverage.

**********
This message was sent via the informal D66 discussion list (D66 at nic.surfnet.nl).
To subscribe: send an email to LISTSERV at nic.surfnet.nl with only the following in the message body: SUBSCRIBE D66 yourfirstname yourlastname
To unsubscribe: send an email to LISTSERV at nic.surfnet.nl with only the following in the message body: SIGNOFF D66
The online archive can be found at: http://listserv.surfnet.nl/archives/d66.html
**********


