
Accelerating Wall Street 2010

Next Stop: Nanoseconds

With data latency on its way to being measured in nanoseconds, message volume exploding, and intensified demand for innovative trading products, Wall Street organizations are turning to the fastest and newest technologies to stay ahead. This special Wall Street & Technology digital report examines some of the latest innovations in the low-latency arms race, including hardware acceleration, complex event processing and colocation, and provides exclusive insights from WS&T’s Accelerating Wall Street 2010 conference.

Analytics Report
Analytics.InformationWeek.com
June 2010 | $499.00


Report ID: S1330610


Table of Contents

The Low-Latency Imperative: How Fast Is Fast Enough?
Figure 1: 5 Factors Influencing Latency
What’s All the Fuss About?
How Low Can You Go?
Silicon: The Next Tool for Low Latency
What’s the Best Low-Latency Tool? Try People
Data Center Costs, Oversight Challenge the Sell Side
Firms Still Not Analyzing Unstructured Real-Time News

June 2010 © 2010 Wall Street & Technology/InformationWeek Analytics, Reproduction Prohibited


The Low-Latency Imperative: How Fast Is Fast Enough?

Conventional wisdom may assert that everyone on Wall Street wants to be as fast as possible. But latency is a matter of perspective, influenced by trading style, instrument class and even the tools used to measure it.

By Daniel Safarik

Ask any trader, vendor or marketplace operator in the global securities market, “How fast do you need to be in order to be successful?” and the answer will most likely be, “It depends.”

Technological advances move at such a pace, and firms rely on such varying strategies, that the level of latency—defined as the time it takes for an order to travel to a marketplace and to be executed or canceled, and then for a confirmation of that activity to return to its source—acceptable to any given party will vary, though none of the intervals are perceptible to the human eye. For the past few years, hardware and software providers have been able to decrease latency exponentially each year. We have gone from talking about milliseconds to microseconds and even nanoseconds.
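Measured this way, round-trip latency is simply the interval between sending an order and receiving its confirmation. A minimal sketch of that measurement (the gateway object and its methods are stand-ins for illustration, not any real exchange API):

```python
import time

class LoopbackGateway:
    """Toy stand-in for an exchange gateway that acknowledges instantly."""
    def send(self, order):
        self._pending = order

    def recv_ack(self):
        return self._pending

def send_order(gateway, order):
    """Send an order, wait for its acknowledgment, and return the
    round-trip latency in microseconds."""
    t0 = time.perf_counter_ns()
    gateway.send(order)
    gateway.recv_ack()              # confirmation returns to its source
    t1 = time.perf_counter_ns()
    return (t1 - t0) / 1_000        # nanoseconds -> microseconds

latency_us = send_order(LoopbackGateway(), {"symbol": "MSFT", "qty": 100})
print(f"round trip: {latency_us:.1f} microseconds")
```

In practice firms timestamp at the network interface rather than in application code, but the quantity being reported is the same interval.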

Although it may seem to be accepted wisdom that everyone wants to be “as fast as possible,” that’s not necessarily true. And as the market saw graphically and frighteningly on May 6, speed, by itself, is not the ultimate goal. In fact, lacking business rules that acknowledge the full implications of instantaneous transactions, speed is dangerous.

According to Adam Honore, senior analyst at Aite Group, the level of latency considered acceptable by market participants depends largely on several factors, including:

• Trading style. If you have an aggressive trading style that relies on opportunistic pricing differentials, you need to be the fastest. If you are a long-only quant fund, speed is not as critical.

• Instrument class. Generally speaking, equities are the fastest-moving markets, with futures, foreign exchange and fixed income lagging behind.

• Venue characteristics. The capabilities offered by each exchange and marketplace will vary—dark pools, intraday matching, live limit-order books and so on—as will the level of traffic they attract throughout the course of the day.

• Instrument characteristics. Trading shares of highly liquid Microsoft would have a vastly lower latency requirement than an illiquid OTC Bulletin Board stock, for example.

There are ranges of latencies that can be instructive, according to Steve Rubinow, CIO at NYSE Euronext. But it is important to remember that these numbers vary greatly with market conditions and that the methodologies of measuring latency are not consistent, he adds, stressing that if latencies are quoted out of context with market traffic, they are essentially useless.

“Everyone publishes numbers that were generated under the best possible conditions and hopes nobody asks the details, because those details would reveal whether it was comparable or not,” Rubinow says. “Having said all that, to be competitive today, you have to be in the few hundred microseconds of turnaround time.”

When it embarked on its Universal Trading Platform (UTP) program last year, NYSE Euronext stated that it was aiming for 150 microseconds to 400 microseconds of latency per round trip for its European Cash Markets. By comparison, on May 14 NYSE rival Nasdaq OMX published an overall average result of 157 microseconds, while noting that 99.9 percent of orders were completed within 757 microseconds, at a rate of 194,205 orders per second.

To illustrate how quickly the standard moves, the top class was in the tens of milliseconds a year ago, according to Donal Byrne, CEO and founder of Corvil, a Dublin, Ireland-based vendor of latency measurement technology. [Ed. Note: 1 millisecond = 1,000 microseconds.]

About 40 percent of the U.S. equities market volume comes from market makers that are trying to match the latency of the marketplace they are using, notes Shawn Melamed, founder,


Figure 1: 5 Factors Influencing Latency

• Trade logic—the code that runs matching engines and algorithms
• Speed of calculation hardware
• Speed of telecom switch hardware
• Quality and number of connections
• Distance between network nodes

Source: Kevin McPartland, Analyst, TABB Group


president and CEO of New York-based Correlix, which also sells latency monitoring devices to exchanges, including Nasdaq OMX. That group needs the fastest response times, he says.

On the other hand, “For someone who is doing index arbitrage, the average latency they would require is really situational,” Melamed comments. “There is no fixed number there. [Because they need to get information from several marketplaces], they will be tolerant to higher latency, so that you can at least get full price information before you make your decision.”

Knowing Latency Is Half the Battle

The emergence of companies such as Corvil and Correlix, which did not exist five years ago, illustrates an important consideration about latency: Often, knowing the level of latency at a given marketplace is more important than the number itself. Traders—and the algorithms they deploy—now can make decisions about execution venues using latency data, just as they would use fill rate and price quality as decision factors.

At Tacoma, Wash.-based Russell Investments, which operates the Russell 2000 small-cap index, traders rely on this operational transparency to make decisions, bringing the latency data about each execution venue and market-data source right onto trader desktops, relates Jason Lenzo, head of equity and fixed income. “To the extent that we have multiple paths to get to a venue, we can effectively normalize out the native latency within that venue,” he says. “We can then optimize the speed to market across specific optical and telephony links in those networks.”

When evaluating latency, it’s vital to consider all the contributing factors, including the trade logic (the code that runs matching engines and algorithms), the speed of calculation hardware, the speed of telecom switch hardware, the quality and number of connections, and the distance between network nodes, according to Kevin McPartland, analyst at TABB Group. The industry as a whole is rapidly approaching the point where, “The code is so tight, hardware improvements are the main thing that will increase the overall efficiency of the operation,” McPartland says.

Enter vendors such as Blade Network Technologies (telecom hardware), Equinix (colocation hosting) and Nvidia (gaming graphics cards re-tasked to calculate derivatives). Each of these technology providers is feverishly trying to reduce the latency of its layer in the stack.

Santa Clara, Calif.-based Blade makes a 10 Gigabit Ethernet switch that connects feed handlers, algorithm boxes and matching engines at colocation centers, where firms have increasingly found it useful to situate their machines across the hallway from their counterparts, even if their offices and trading staff are on opposite sides of the globe.

By merging routers with switches, Blade has eliminated a layer that previously added precious microseconds to a round trip, explains David Iles, Blade Network Technologies’ director of product management.

“We are providing sub-700 nanoseconds of latency, port to port,” Iles asserts. “We are also deterministic. You don’t want stale market data getting to devices. It has to take the same time to get from Port 1 to Port 24 as it does from Port 1 to Port 2.”

Technologies such as this tend to live side by side in colocation centers run by companies such as Foster City, Calif.-based Equinix. Here, the issue is bandwidth and energy efficiency, both of which are major cost contributors in the low-latency race. Trading firms are increasingly opting for colocation rather than running expensive, high-throughput dedicated fiber from their offices to the marketplace.

“We have a customer in Greenwich, Conn.,” relates John Knuff, general manager, global financial markets, at Equinix. “They were spending about $20,000 a month to get trades to New York. They moved a couple of cabinets in with us. It’s $3,000 to $5,000 a month for a cabinet, and $200 to $300 for the cross-connects. They essentially offset the cost of their colocation by getting rid of the network costs back to their office, which had no economic or competitive advantage.”
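Knuff’s arithmetic is easy to check. A quick sketch using the midpoints of the ranges he quotes (the cabinet and cross-connect counts are illustrative guesses, not figures from the article):

```python
# Hypothetical worked example of the Equinix cost comparison quoted above.
dedicated_network = 20_000   # $/month for office-to-New York connectivity

cabinets = 2                 # "a couple of cabinets" -> assume 2
cabinet_cost = 4_000         # midpoint of the quoted $3,000-$5,000 range
cross_connects = 4           # illustrative guess
cross_connect_cost = 250     # midpoint of the quoted $200-$300 range

colocation = cabinets * cabinet_cost + cross_connects * cross_connect_cost
print(f"colocation: ${colocation:,}/month vs. network: ${dedicated_network:,}/month")
print(f"monthly saving: ${dedicated_network - colocation:,}")
```

Under these assumptions the colocation bill comes in well under half the abandoned network spend, which is the trade-off Knuff describes.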


Think Fast(er)

An equally important question pervades the minds of traders: Once you’re satisfied with the turnaround time to your market, how do you maximize the value of time between transactions? That question interested Tobias Preis, managing director of Artemis Capital Asset Management of Holzheim, Germany, to such a degree that he became one of the first customers of Nvidia’s graphical processing unit (GPU), a processor that has 480 cores, compared to the typical four- to 12-core CPU.

The GPU originally was developed to render high-resolution details for computer games. Preis, also a computational physicist, uses the GPU to calculate time series for the DAX-index futures algorithms he deploys on Eurex.

“The increases in speed represented by the GPU are many times faster than the reductions in latency by the exchanges,” according to Preis, who says he gets by on 100 milliseconds to 150 milliseconds of average latency to Eurex. “We can now perform parallel-computing calculations that used to take one minute in one second.”
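The kind of speedup Preis describes comes from data parallelism: the same arithmetic applied to many elements of a time series at once. A rough illustration in Python, using NumPy vectorization as a stand-in for a GPU’s many cores (the moving-average computation is a generic example, not Preis’s actual algorithm):

```python
import numpy as np

def moving_average_loop(prices, window):
    """Scalar loop: one element at a time, like a single CPU core."""
    out = []
    for i in range(window, len(prices) + 1):
        out.append(sum(prices[i - window:i]) / window)
    return out

def moving_average_parallel(prices, window):
    """Vectorized: the whole series is processed in bulk, the same
    data-parallel pattern a GPU applies across hundreds of cores."""
    kernel = np.ones(window) / window
    return np.convolve(prices, kernel, mode="valid")

prices = np.random.default_rng(0).normal(100, 1, 100_000)
slow = moving_average_loop(prices[:1000], 50)   # loop is too slow for the full series
fast = moving_average_parallel(prices, 50)      # full series in one bulk operation
assert np.allclose(slow, fast[:len(slow)])
```

On a real GPU the bulk operation would be dispatched across the chip’s cores (for example via CUDA), but the programming shift is the same: replace per-element loops with whole-array operations.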

Where will low latency be in a year? Many market participants say they won’t be surprised if the discussion is about nanoseconds. “I know we will break the 100-microsecond barrier,” NYSE’s Rubinow says. Beyond that, it becomes enormously expensive to add each zero behind the decimal point, he notes.

Despite the excitement over low latency and the extreme competitiveness of financial firms and the vendors that serve them, it’s important to keep a clear head about the need for speed, adds Adam Afshar, president of program trading at Atlanta-based Hyde Park Global Investments, which is 100 percent automated and has no manual traders.

“High frequency is just a method for implementing a strategy—it is not the strategy itself,” says Afshar. He notes approvingly that the decreasing cost of technology means that a $10 million investment in technology allows a smaller firm to rival the speed of the biggest banks on Wall Street.

But the key to success in the marketplace, according to Afshar, is adaptability, and that still comes from human ingenuity. For Afshar, going forward, the more interesting question is not “How fast can you hit the market?” but “What do you do with that speed?”

“The bottom line,” he says, “is your adaptability to the nonlinearity of markets.”

What’s All the Fuss About?

Despite the controversy surrounding high-frequency trading, the trading style is beneficial to long-term investors and to the market at large, argues Arzhang Kamarei, managing partner, Tradeworx.

By Melanie Rodier

High-frequency trading remains mired in controversy, with regulators fearing that unscrupulous traders are taking advantage of individual investors. But what critics don’t realize is that high-frequency trading actually is beneficial to long-term investors and to the market at large, according to Arzhang Kamarei, managing partner at Tradeworx, a quantitative investment management firm with expertise in high-frequency market strategy.

“High-frequency trading creates opportunities for long-term investors by providing more liquidity,” asserted Kamarei, who presented the keynote address at Wall Street & Technology’s recent Accelerating Wall Street conference.

The extra liquidity that high-frequency trading provides, he explained, narrows spreads for long-term investors, ultimately helping them get better prices.

“During the turbulent fourth quarter of 2008, it was high-frequency traders that stepped up and provided liquidity,” Kamarei argued. “High-frequency trading provides U.S. markets with better prices and deeper liquidity than markets in any other country or region. It helps smooth the course of long-term investors.”

High-frequency trading is estimated to generate as much as two-thirds of U.S. equities daily trading volume. But as it grows in popularity, it also has attracted the scrutiny of regulators, eager to appease uneasy investors after the financial crisis.

Addressing the controversy surrounding high-frequency trading strategies, Kamarei pointed out that high-frequency trading isn’t always profitable. “High-frequency traders make money through spread capture,” he noted. “They optimize adverse selection to match rebates. More volatility increases spreads.”

In April, the SEC unanimously approved a new proposal that would track transactions by high-frequency trading firms to improve oversight of their activity. Under the new rule, firms will be given unique identifiers and will be required to report next-day transaction data when requested by regulators. This will allow authorities to keep closer tabs on traders that aren’t registered market makers or broker-dealers without having to follow lengthy audit trails from exchanges when they scrutinize a particular firm or trade.

Next-day access to trading data also could assist investigators in finding manipulative, abusive or otherwise illegal trading activity. The SEC estimates that the rule will apply to the largest 400 market participants—firms or individuals whose transactions in exchange-listed securities equal or exceed 2 million shares or $20 million during any calendar day, or 20 million shares or $200 million during any calendar month.

Regulators also have been scrutinizing flash orders, which let traders briefly expose their orders to others in the market, and “naked access,” which allows firms to buy and sell stocks on exchanges using a broker’s computer code without regulators knowing who is making the trades.

Also under fire from regulators is the strategy of colocating at or near exchange data centers, which authorities say gives high-frequency trading firms an unfair advantage over slower traders.

Spurring Competition

Kamarei told attendees, however, that colocation isn’t unfair to long-term investors, as it helps to create competition among high-frequency traders. Further, both high-frequency traders and long-term investors can colocate, he noted.

Meanwhile, Kamarei argued that any attempts to change the market structure will fail to reverse technology advances. Instead, costs will come down for less advanced users, which in turn will drive further adoption of high-frequency trading technology, he said.

As for the future of high-speed trading, Kamarei suggested that sell-side broker-dealers will be the main force for spreading the use of high-frequency trading to all market participants.

He predicted that high-frequency trading volumes will stay at their current levels on a volatility-adjusted basis, but many high-frequency trading desks will go out of business, even as high-frequency technology grows more ubiquitous.

“New high-frequency trading firms will process more dimensional and complex data,” Kamarei said.

Meanwhile, he noted, “The fascination with colocation will decrease as the technology becomes commonplace.”


How Low Can You Go?

Even as firms explore bleeding-edge technologies to lower latency, Lime Brokerage President and CEO Jeffrey Wecker says there’s still plenty of latency that can be eliminated with conventional products and a simple understanding of the nature of latency.

By Ivy Schmerken

To survive on Wall Street, firms believe they need speed, according to Lime Brokerage president and CEO Jeffrey Wecker, who delivered the closing keynote address at Wall Street & Technology’s Accelerating Wall Street conference in May.

Wecker insisted that anyone willing to make the appropriate investments could achieve minimum latency using conventional technology. But with low-latency trading already measured in microseconds, he predicted it would be difficult to reduce latency further.

“We use just about every trick available from conventional technology and have squeezed latency out,” said Wecker. Lime Brokerage is an agency brokerage firm that provides high-throughput, low-latency technologies to high-frequency traders and other proprietary trading shops, as well as more traditional buy-side firms, primarily as a managed service.

“Short of addressing greater volumes of data, the top performers have a good handle on what they need to do to reduce latency,” Wecker continued. “Software architecture is reaching canonical perfection in the order management process.”

Without more research breakthroughs from computer science, he added, reducing latency further will become more and more difficult.


“The latency game is nearly over for data delivery, order management and matching logic.”

—Jeffrey Wecker, Lime Brokerage


Miles to Go to Eliminate Latency

Nonetheless, Wecker acknowledged, latency still exists—from third-party suppliers because of physical distances, in hardware and network equipment, and in software.

Physical distance can add latency depending on “how many miles of wires or fiber connect your systems to the matching engine,” Wecker explained. On the network equipment side, he added, latency stems from the number of pieces of equipment and network hops, including the network interface and any switches through which an order must pass before it reaches the market.

“You can lose 10 microseconds in the authentication before you get to the match,” Wecker said.

But, he continued, “The greater sources of latency do not come from the search for better hardware solutions. I still believe the greatest amount of latency is in code architecture and code restructuring.”

According to Wecker, software can cause latency from the client side, from the broker-dealer validation and from the TCP/IP stack. But, he said, firms can work with providers to restructure their software code.

Wecker cited “asynchronous parallel code” as a programming method for reducing latency, and said he’s spoken with firms that are making the leap to graphical processing units (GPUs) to parallelize their algorithms. But Wecker cautioned firms against going out on “the bleeding edge” with hardware acceleration tools such as field-programmable gate arrays (FPGAs).
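Wecker doesn’t spell out what “asynchronous parallel code” looks like, but the general idea is to stop a thread from blocking on I/O it doesn’t need yet. A minimal Python sketch of the pattern (the venue names and delays are invented for illustration):

```python
import asyncio

async def fetch_quote(venue, delay):
    """Simulate a market-data request that takes `delay` seconds."""
    await asyncio.sleep(delay)
    return f"{venue}: quote"

async def main():
    # Issue all requests concurrently instead of one after another:
    # total wall time is max(delays), not sum(delays).
    quotes = await asyncio.gather(
        fetch_quote("ARCA", 0.02),
        fetch_quote("BATS", 0.01),
        fetch_quote("NSDQ", 0.03),
    )
    return quotes

print(asyncio.run(main()))
```

Low-latency trading systems apply the same principle in compiled languages with event loops and non-blocking sockets, but the restructuring Wecker describes is this shape of change: sequential waits replaced by overlapping ones.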

“This path is potentially full of a lot of pitfalls,” he told the audience. “Suboptimal bus rates have derailed a lot of firms attempting to deploy GPUs.”

Wecker also noted that firms are exploring application-specific integrated circuits (ASICs)—integrated circuits customized for a particular use rather than a general-purpose use—and fabrication technology. But again, he suggested caution. “Going down that path is very foolish and an expensive venture if you don’t have the experience with software architecture,” he said.


Comparing Apples With Oranges

But even if firms deploy these technologies successfully, Wecker emphasized, it’s essential to monitor latency, and to sift through all the marketing hype around the term. “There’s evidence of poor latency monitoring and reporting by different firms,” he said, noting that many vendors offer “apples and oranges comparisons.”

Though there are industry standards for measuring latency, Wecker said, hype surrounds the word. “Frankly, I think the industry can do a lot better to be intellectually honest about latency,” he said.

While customers understand this, Wecker added, they still can get confused. For instance, though many customers are colocated at exchange data centers, “They haven’t made the full investment in either infrastructure, staff or monitoring technology to know if they’re getting the best from the exchange,” he insisted.

In one case, Lime found that the client’s “net latency introduced by switches, telecom providers and software architecture had the equivalent of putting their rack five miles away from the market center,” Wecker revealed. “Colocation has been sold to many firms as the be-all, end-all; it’s not. It doesn’t substitute for understanding the contributors to latency. I would argue that just being in the same data center as the exchange doesn’t help.”

While there is a benefit to colocation and the edge it brings as compared with locating in another data center, Wecker explained, the benefit could be negated by intraprocess latency—meaning delays in processing from inside the boxes and applications, including the match interaction and message acknowledgement. To truly minimize latency, a firm must isolate the causes of latency and break out the measurements, Wecker advised. “You have a right to ask questions about the methodologies used in reported statistics,” he said.

In addition, latency must be examined in context. There is a difference between reporting latency for a single message during normal market activity and reporting latency during peak throughput. “Look at the metric itself,” said Wecker, urging firms to question whether the number is a mean or median value, for an individual message, or for peak throughput.
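The distinction matters because latency distributions are heavily skewed: a handful of slow outliers can drag the mean far above the median. A small sketch with made-up sample data:

```python
import statistics

# Hypothetical round-trip samples in microseconds: mostly fast,
# with a few slow outliers of the kind seen at peak throughput.
samples = [150] * 900 + [200] * 90 + [5000] * 10

mean = statistics.mean(samples)
median = statistics.median(samples)
p999 = sorted(samples)[int(0.999 * len(samples)) - 1]  # 99.9th percentile

print(f"mean:   {mean:.0f} us")    # dragged upward by the outliers
print(f"median: {median:.0f} us")  # what a typical message experiences
print(f"p99.9:  {p999} us")        # what the slowest messages experience
```

Here the mean, median and 99.9th percentile tell three different stories about the same system, which is exactly why Wecker urges firms to ask which statistic a vendor is quoting, and under what traffic conditions it was collected.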

“The nature of scientific latency requires that you approach it in a disciplined way, and that means understanding the full environment,” Wecker said. “That sways you one way or another.”


Silicon: The Next Tool for Low Latency

GPUs and FPGAs can help lower latency, but they often are difficult to program or change, experts caution.

By Melanie Rodier

Firms are relying on the newest and fastest processors to power their cutting-edge trading infrastructures. For the most innovative firms, the low-latency work is taking place in silicon, with the use of graphics cards (GPUs) and FPGAs—reprogrammable silicon chips that give trading firms ultra-low-latency methods for identifying and responding to specific information from data feeds.

According to panelists at Wall Street & Technology’s Accelerating Wall Street conference in May, hardware acceleration can help shave off microseconds, and soon nanoseconds, in latency. FPGA and GPU technology continues to gain momentum.

The benefits are clear to engineers: FPGAs are parallel in nature so different processing operations do not have to compete for the same resources. A GPU can relieve the microprocessor of 2-D or even 3-D graphics rendering and is commonly used in embedded systems, mobile phones and game consoles. Like FPGAs, its highly parallel structure makes it effective for a range of complex algorithms.

But before they rush to implement these new toys, financial institutions still need to evaluate the costs and benefits of GPUs and FPGAs carefully, according to experts who participated in a panel discussion, “Silicon: The Next Tool for Low Latency,” at WS&T’s Accelerating Wall Street conference. “You have to look at the cost of new technologies,” said Andrew Athan, managing member, Athan Capital Holdings, “and ask yourself what is it you’re trying to gain.”

Not all financial institutions will actually gain a trading advantage by shaving a few microseconds off their execution times, Athan suggested. “Will every order I send be matched ahead of a competitor? I am not sure that for every millisecond I gain, I will gain a matched order,” he related. “When we looked at the relative benefits, it didn’t yet make sense for us.”


Panelist Ryan Eavy, associate director, enterprise architecture, CME Group, agreed that FPGAs and GPUs aren’t right for everyone. “You can squeeze milliseconds out of an application—that’s huge,” he said. “But microseconds? It depends if it’s worth it.”

Even if firms do decide to adopt hardware acceleration technologies, there are some limitations they must overcome, the panelists noted. For example, “The hardware can be difficult to debug and to troubleshoot,” according to Peter Krey, founder and president, Krey Associates.

Finding a good team of application developers who understand both the technology behind hardware acceleration as well as the financial industry’s requirements presents its own set of challenges, the panelists concurred.

“The industry is very dynamic, with all the regulations, changes in protocols, algorithms, etc.,” said Athan Capital’s Athan, who is based in San Diego. Being in Southern California has allowed Athan to dip into a large talent pool filled with developers who normally build iPods and cell phones, he noted.

Others are looking beyond the private sector for qualified technologists. With military contractors now using FPGAs for aerospace and defense systems, Andy Bach, SVP, technology, NYSE Euronext, said the exchange has been hiring experienced developers from the defense industry.

But if everyone is leveraging the same hardware, won’t this level out the playing field?

“If you can only compete by buying hardware, you’ll do it,” said Athan. “But if you can compete by hiring smarter people, or building better algorithms, you’ll do that, too.”


What’s the Best Low-Latency Tool? Try People

While faster chips and 10 gigabit Ethernet connections help, having the right developers and engineers is the key to winning the low-latency race.

By Greg MacSweeney

Most financial firms equate being the fastest with having the latest technology. But panelists at May’s Accelerating Wall Street conference agreed that the best tool to reduce latency is not a faster server or FPGA; it’s the right people.

While the latest servers and networks certainly help in the low-latency race, the topic of hiring the right people came up again and again during the “What’s in Your Low-Latency Toolbox” session at WS&T’s recent conference. “If you look into the entire value chain of where things are slowed during the trade [process], you realize you need to have the right developers and the right engineers,” said panelist Steven Sadoff, EVP and CIO at Knight Capital Group. “The biggest reductions in latency come when we optimize our software.”

At Nasdaq OMX, simplifying software has been tremendously beneficial, said CTO Mats Andersson. To do that, he explained, not only must IT and the business be on the same page, but the networking engineers also have to know what the other technology developers are doing. “When we bring our developers and networking people together, we see great results,” Andersson asserted.

But having “simple” software is easier said than done. “Our goal is to be as simple as possible,” noted Scott Ignall, CTO at Lightspeed Financial. “We want our systems to run straight and fast.”

But in order to do that, developers and engineers need to know how the markets operate and understand the business goals, he added. Essentially, they need a financial market IQ, as well as technical acumen, Ignall said.

“It’s very hard to find the right people, the ones who know technology and finance,” he said. “We are a small company and we move quickly. We don’t have time to teach people about the industry.”


Panelist Michael Mollemans, head of electronic execution sales at Daiwa Capital Markets America, agreed. “There is no substitute for financial experience,” he commented. “When I am looking [for technologists], I stay within my circle of industry contacts because they know me and they know the business. They won’t recommend someone who isn’t right.”

Aside from having the best and brightest working on your systems, the panelists also said that while colocation provides a noticeable improvement, 10 gigabit Ethernet connections also provide large latency improvements. “Today, it’s pretty safe to say that 10 gigabit Ethernet is required in this race,” said Nasdaq’s Andersson. “It is preferred by our customers, and 10 gigabit Ethernet is definitely taking off.”

Daiwa’s Mollemans added that 10 gigabit Ethernet is definitely a better option than InfiniBand.

Finally, scalability to handle spikes in market data is a prerequisite in this market, added Lightspeed’s Ignall. “If you don’t have scalability and uptime during the high watermark for market data, it really doesn’t matter,” he said. “Market data is still the challenge. We test and benchmark constantly.”
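Ignall’s point about testing against the “high watermark” can be illustrated with a minimal sketch: replay a simulated burst of ticks and record per-message handling latency percentiles. This is a generic illustration of the technique, not Lightspeed’s actual test harness, and `handle_tick` is a hypothetical stand-in for real message-processing logic.

```python
import time
from collections import deque

def handle_tick(tick):
    # Stand-in for real per-message work (book update, risk check, etc.).
    return tick["price"] * tick["size"]

def benchmark(ticks):
    """Replay ticks as fast as possible and record per-message latency in ns."""
    latencies = deque()
    for tick in ticks:
        start = time.perf_counter_ns()
        handle_tick(tick)
        latencies.append(time.perf_counter_ns() - start)
    ordered = sorted(latencies)
    return {
        "messages": len(ordered),
        "p50_ns": ordered[len(ordered) // 2],       # median latency
        "p99_ns": ordered[int(len(ordered) * 0.99)],  # tail latency
        "max_ns": ordered[-1],
    }

# Simulated burst: 100,000 ticks, the kind of spike a high-watermark test targets.
burst = [{"price": 10.0 + (i % 50) * 0.01, "size": 100} for i in range(100_000)]
stats = benchmark(burst)
```

The tail percentiles matter more than the median here: a system that is fast on average but stalls during a data spike fails exactly when, as Ignall notes, uptime counts most.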


“Our goal is to be as simple as possible. We want our systems to run straight and fast.”

—Scott Ignall, Lightspeed Financial

Page 18: Accelerating wall-street-2010-next-stop-nanoseconds 8049450

18 June 2010 © 2010 Wall Street & Technology/InformationWeek Analytics, Reproduction Prohibited

Data Center Costs, Oversight Challenge the Sell Side

Financial services firms spent $1.8 billion on data centers in 2009, and TABB Group expects 2010 spending to go even higher.

By Justin Grant

Data center oversight and network capacity are the biggest infrastructure-related challenges facing U.S. sell-side equity firms this year, even as spending on the segment continues to rise, TABB Group analyst Kevin McPartland said during his presentation at Wall Street & Technology’s annual Accelerating Wall Street conference in May.

Although costs remain a key concern, U.S. equity firms ramped up their investment in data centers within the past year, according to the recent TABB report “U.S. Equity Technology 2010: The Sell-Side Perspective,” which noted that the larger players each support nearly five data centers on average.

“Clearly, the sell side loves its data centers,” McPartland told attendees. “There’s a lot of horsepower that has to sit behind these equity businesses in the U.S. … It’s getting more and more complex to manage the infrastructure.”

And more costly. Equity firms spent $1.8 billion last year on data centers, with half of that total coming from sell-side shops, according to the TABB Group report, which predicts the sell side’s use of data center space will increase slightly in 2010.

“There’s a race here to try to compete,” McPartland continued. “Despite the cost-consciousness, spending is still high, with the sell side spending the most.” In general, these sell-side shops are pursuing a technology-driven agenda with an eye on lower latency, sleeker infrastructure and shrewder IT investment in the wake of slow budget growth, according to TABB Group.

The report, which was based on conversations with high-ranking executives at 24 sell-side firms, found that proximity hosting has become prominent among all the major broker-dealers, McPartland revealed. “There’s an old mentality where we still need to be close to our equipment,” he said.


Still, the vast majority of U.S. hedge funds are not yet colocated, with 76 percent opting to look to their brokers for infrastructure rather than buying it themselves. But even for hedge funds that do not have an ultra-low-latency trading strategy, location does matter. Proximity also is beneficial to smaller hedge funds, which historically have kept their servers in-house, the report said.

Sell-side firms, meanwhile, also are aiming to boost their network capabilities. Demand for bandwidth is projected to soar this year on the strength of rising trading volumes, which will result in more data being pumped out by exchanges.

McPartland said improved use of bandwidth will be crucial for brokers going forward, since growing data rates and the costs of managing data may slice into margins. This is helping to spark a rush toward server upgrades as well, with most sell-side firms expected to opt for Hewlett-Packard and Intel’s servers, according to TABB Group, which noted that the sell side already has spent $113.5 million this year on network servers.

“Connectivity is getting cheaper, but the price tag is still high,” McPartland said, while also pointing out that the large firms are increasingly opting for hardware acceleration. “For the bulge bracket, everybody’s either using it or is looking into using it. Smaller firms are priced out—it’s too costly to buy and maintain.”

The report also noted that while virtualization at sell-side shops is growing, a completely virtualized and utilized infrastructure is still a long way off for U.S. equities firms. “Even as virtualization improves,” said McPartland, “there will still be some latency.”


“Clearly, the sell side loves its data centers. There’s a lot of horsepower that has to sit behind these equity businesses.”

—Kevin McPartland, TABB Group


Firms Still Not Analyzing Unstructured Real-Time News

Real-time news isn’t reliable enough to include in automated trading strategies, say industry executives.

By Greg MacSweeney

Although the ability to analyze news events in real time has been promoted as the next step in the evolution of automated trading, panelists at Wall Street & Technology’s recent Accelerating Wall Street conference are doubtful the capability can provide an edge.

“Two years ago, only 2 percent of the market was testing trading strategies with real-time news,” said Adam Honore, research director at analyst firm Aite Group, during the panel session “Using High-Performance Databases & CEP in an Automated Trading Strategy.” “Today, two-thirds of the market is at least testing real-time news. The problem, however, is real-time news is not very reliable.”

Panelist Robert Almgren, cofounder of Quantitative Brokers, an agency-only brokerage that provides best execution algorithms and transactional cost measurement for fixed income markets, also said he hasn’t seen much value in analyzing news in an automated strategy. “I have always been a pessimist about news, and I would rather use a quantitative method, such as looking at historical data or economic indicators,” he revealed.

“The problem with news,” Almgren continued, “is humans don’t even know how to interpret it. So how could a computer?” For example, he noted, how do you know the news story is accurate, or if you are really the first person to see it?

Andrew Haines, CIO at GAIN Capital Group, also has been hesitant to deploy the real-time news analysis capability in a trading strategy. “It’s hard to come up with a defensible trading strategy based on unstructured news,” he told attendees.

In addition, Haines said, he decided not to try to analyze Twitter messages for trade ideas. “StreamBase, our CEP [complex event processing] provider, has the ability to analyze Twitter, but we are not using it,” he commented. GAIN Capital, however, is looking at real-time news and Twitter for enhancing risk management, he added, but hasn’t deployed either.

Instead of using CEP to analyze news, Haines noted, GAIN Capital is using the technology on its foreign exchange ECN. “We selected our vendor CEP tool last year,” he reported. “With a vendor-provided tool, we can focus on business functionality rather than building the technology. We have bolted the CEP tool onto our ECN with great success.”

But even without adding real-time news or Twitter messages to an automated trading strategy, all the panelists agreed that managing the sheer volume of traditional data is a challenge. “The data volumes are so huge,” said Quantitative Brokers’ Almgren. “And if you want to go back and add historical data to a calculation, the problem with processing and analyzing all the data is tremendous.”


“It’s hard to come up with a defensible trading strategy based on unstructured news.”

—Andrew Haines, GAIN Capital