Computing Frontiers 2013 Program, here. Maxeler folks are there doing the Dataflow talk. Boy it sure helps to know a little history doesn’t it? It can save you a bunch.
Alex Mansfield, BBC, NASA buys into ‘quantum’ computer, here. Since when did NASA get money to buy stuff? Alex had a draft of this article with “blink of an eye” instead of “fractions of a second,” right?
A $15m computer that uses “quantum physics” effects to boost its speed is to be installed at a Nasa facility.
It will be shared by Google, Nasa, and other scientists, providing access to a machine said to be up to 3,600 times faster than conventional computers.
Unlike standard machines, the D-Wave Two processor appears to make use of an effect called quantum tunnelling.
This allows it to reach solutions to certain types of mathematical problems in fractions of a second.
Quentin Hardy, NYT, Google Buys a Quantum Computer, here.
The Quantum Artificial Intelligence Lab, as the entity is called, will focus on machine learning, which is the way computers take note of patterns of information to improve their outputs. Personalized Internet search and predictions of traffic congestion based on GPS data are examples of machine learning. The field is particularly important for things like facial or voice recognition, biological behavior, or the management of very large and complex systems.
“If we want to create effective environmental policies, we need better models of what’s happening to our climate,” Google said in a blog post announcing the partnership. “Classical computers aren’t well suited to these types of creative problems.”
Google said it had already devised machine-learning algorithms that work inside the quantum computer, which is made by D-Wave Systems of Burnaby, British Columbia. One could quickly recognize information, saving power on mobile devices, while another was successful at sorting out bad or mislabeled data. The most effective methods for using quantum computation, Google said, involved combining the advanced machines with its clouds of traditional computers.
Google and NASA bought in cooperation with the Universities Space Research Association, a nonprofit research corporation that works with NASA and others to advance space science and technology. Outside researchers will be invited to the lab as well.
This year D-Wave sold its first commercial quantum computer to Lockheed Martin. Lockheed officials said the computer would be used for the test and measurement of things like jet aircraft designs, or the reliability of satellite systems.
Scott Aaronson, Shtetl-Optimized, Ask Me Anything! Tenure Edition, here. You don’t really have to read the BBC and NYT coverage of Quantum when you can just ask Aaronson anything.
By popular request, for the next 36 hours—so, from now until ~11PM on Tuesday—I’ll have a long-overdue edition of “Ask Me Anything.” (For the previous editions, see here, here, here, and here.) Today’s edition is partly to celebrate my new, tenured “freedom to do whatever the hell I want” (as well as the publication after 7 years of Quantum Computing Since Democritus), but is mostly just to have an excuse to get out of changing diapers (“I’d love to, honey, but the world is demanding answers!”). Here are the ground rules:
One question per person, total.
Please check to see whether your question was already asked in one of the previous editions—if it was, then I’ll probably just refer you there.
No questions with complicated backstories, or that require me to watch a video, read a paper, etc. and comment on it.
No questions about D-Wave. (As it happens, Matthias Troyer will be giving a talk at MIT this Wednesday about his group’s experiments on the D-Wave machine, and I’m planning a blog post about it—so just hold your horses for a few more days!)
If your question is offensive, patronizing, nosy, or annoying, I reserve the right to give a flippant non-answer or even delete the question.
Keep in mind that, in past editions, the best questions have almost always been the most goofball ones (“What’s up with those painting elephants?”).
First, some background: Under Dodd-Frank, the CFTC was given the task of regulating the $300 trillion market for swaps in the U.S. The basic point was to bring light to a dark market and prevent another AIG by pushing as much of the over-the-counter swaps market as possible onto exchanges where prices and volume are posted. With about 80 percent of those swaps rules written, according to CFTC Chairman Gary Gensler, and a bunch of them now in effect, traders have begun “futurizing their swaps”—that is, trading futures contracts instead of entering into swaps deals. Some say that’s a clever way around Dodd-Frank. Others see it as merely a natural evolution of financial instruments.
Whatever the reason, it’s happening. And as arcane as the details may be, the potential consequences are enormous, as evidenced by Thursday’s packed house. The general consensus of those present was that Thursday was the most crowded CFTC hearing in recent memory. Lawyers and lobbyists lined the walls; congressional staffers and industry suits packed the chairs. More than 150 people crammed into the CFTC’s main conference room, and a healthy number of folks watched on TVs in the hallway outside.
Dodd-Frank has upended the derivatives market, and in the shakeout that follows, there will be winners and losers. Perhaps those with the most at stake are IntercontinentalExchange (ICE) and the Chicago Mercantile Exchange (CME), the two biggest futures exchanges in the U.S. As more companies and traders start favoring futures over swaps, the two exchanges stand to capture a much bigger portion of that activity. The potential losers? Dealers such as Goldman Sachs (GS) that have done a lot of swaps business. Standing at the back of the room, Chris Giancarlo, chair of the Wholesale Markets Brokers’ Association, likened the fight over swaps and futures to “the Maginot Line for the exchanges.”
Easley, de Prado, & O’Hara, SSRN, The Volume Clock: Insights into the High Frequency Paradigm, here. Note the LFT structural weaknesses.
Over the last two centuries, technological advantages have allowed some traders to be faster than others. We argue that, contrary to popular perception, speed is not the defining characteristic that sets High Frequency Trading (HFT) apart. HFT is the natural evolution of a new trading paradigm that is characterized by strategic decisions made in a volume-clock metric. Even if the speed advantage disappears, HFT will evolve to continue exploiting Low Frequency Trading’s (LFT) structural weaknesses. However, LFT practitioners are not defenseless against HFT players, and we offer options that can help them survive and adapt to this new environment.
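The “volume clock” the abstract refers to can be illustrated with volume bars: instead of sampling the market every N seconds, you close a bar every N units traded, so activity (not wall-clock time) drives the sampling. A minimal sketch, with the function name, trade data, and bar size all invented for illustration:

```python
def volume_bars(trades, bar_volume):
    """Group (price, size) trades into bars, closing a bar every
    `bar_volume` units traded -- a 'volume clock' instead of a time clock."""
    bars, vol, prices = [], 0, []
    for price, size in trades:
        prices.append(price)
        vol += size
        while vol >= bar_volume:          # a large trade can close several bars
            bars.append({"open": prices[0], "close": price, "volume": bar_volume})
            vol -= bar_volume
            prices = [price]              # next bar opens at the current price
    return bars

trades = [(100.0, 300), (100.1, 500), (99.9, 400), (100.2, 800)]
bars = volume_bars(trades, bar_volume=1000)
print(bars)
```

On this toy tape, two 1,000-lot bars close; in a quiet market the same code would emit bars rarely, and in a burst of HFT activity it would emit them rapidly, which is exactly the re-clocking the paper describes.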
Nerval’s Lobster, Mars Rover Curiosity: Less Brain Power Than Apple’s iPhone 5, here. So, remember that 8 hour to 238 second 2011 award winning Credit valuation and risk computation on the million dollar FPGA supercomputer with the Dataflow acceleration? That’s the Apollo Creed here, on a good day. There are days (e.g., July 27, 2018, when the Earth-Mars distance is 57.6 million km) when you can send the entire credit portfolio to Mars, compute the entire Risk and Valuation for the portfolio in the down time on the spare computer in the Mars Rover, then send the results back to Earth, and finish in ~360 seconds. That’s just about 50% slower than the 2011 award winning Credit valuation and risk computation on the million dollar FPGA supercomputer with the Dataflow acceleration. So the message here is, I guess: if your computing infrastructure is on Mars … and has less brain power than an iPhone 5 … then you are probably not going to be at the very top of the USD fixed/float Vanilla Swap League tables … on most days. But if you own an iPhone 5 here on earth … you have more brain power … than the 2011 award winning Credit valuation and risk computation on the million dollar FPGA supercomputer with the Dataflow acceleration?
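A back-of-envelope check on that arithmetic: at the quoted 57.6 million km, light time alone dominates, so the round trip is in the same ballpark as the ~360 seconds above whatever the rover's compute time is. A quick sketch (the distance figure is taken from the text; everything else is just physics):

```python
# Sanity-check the Earth-Mars round-trip timing used above.
C_KM_PER_S = 299_792.458      # speed of light in vacuum, km/s
distance_km = 57.6e6          # Earth-Mars distance quoted for July 27, 2018

one_way_s = distance_km / C_KM_PER_S
round_trip_s = 2 * one_way_s  # portfolio out, results back; compute time is extra
print(f"one-way light time ~{one_way_s:.0f} s, round trip ~{round_trip_s:.0f} s")
```

That is roughly 192 s each way, so even a free, instantaneous computation on Mars cannot beat the 238 s FPGA batch back on Earth.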
“To give the Mars Rover Curiosity the brains she needs to operate took 5 million lines of code. And while the Mars Science Laboratory team froze the code a year before the roaming laboratory landed on August 5, they kept sending software updates to the spacecraft during its 253-day, 352 million-mile flight. In its belly, Curiosity has two computers, a primary and a backup. Fun fact: Apple’s iPhone 5 has more processing power than this one-eyed explorer. ‘You’re carrying more processing power in your pocket than Curiosity,’ Ben Cichy, chief flight software engineer, told an audience at this year’s MacWorld.”
David Murphy, Alphaville, The JPMorgan Whale’s regulatory motive, here. Just a wild guess for a movie plot – Whale takes leveraged position in CDX tranches that are no longer heavily traded like back in the day, say 2005. In May 2012 the big, and now publicly exposed, hedge is in the CDX series 9 where the Whale gets picked off by the hunch/pounce/kill boys. The new correlation/hazard rate cooker (the code that computes the inputs to the gaussian copula model) has a problem, maybe with Kodak maybe something else, and the Whale’s desk risk and the 238 second near real-time run time Credit P&L is shot – they are flying totally blind. They try to buy time marking the spreads to cover the model’s flaked out P&L and Risk while they fix it. The risk and regulatory requirements change while all this is going on so there is some JPM Risk executive who now wants someone to explain to him what this all means to his VaR model. Nobody has time to talk to him because the VaR is just for mouth-breathers and there is a real problem here that folks need to think through. I wonder if it was helpful to debug the flaky model in the FPGA supercomputer once the CIO P&L went out, probably not, right? Bruno, Achilles, and probably even Ina got to read up on Verilog programming back in April 2012, cool thx prize winning Dataflow supercomputer implementation of the gaussian copula … Maserati, Bellagio, Bellagio, Kasparov.
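For readers who have never met the model at the center of all this: a one-factor Gaussian copula correlates defaults through a single common factor. The sketch below is a generic textbook Monte Carlo, not anything resembling JPM's implementation; the "cooker" in the plot above would supply the per-name default probabilities and correlation that are hard-coded here.

```python
import random
from statistics import NormalDist

def simulate_portfolio_losses(n_names, p_default, rho, n_paths, seed=7):
    """One-factor Gaussian copula: X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i,
    where M is a common systemic factor. Name i defaults on a path when
    X_i falls below the p_default quantile of the standard normal."""
    threshold = NormalDist().inv_cdf(p_default)
    rng = random.Random(seed)
    losses = []
    for _ in range(n_paths):
        m = rng.gauss(0, 1)                                  # systemic factor
        defaults = sum(
            (rho ** 0.5) * m + ((1 - rho) ** 0.5) * rng.gauss(0, 1) < threshold
            for _ in range(n_names)
        )
        losses.append(defaults / n_names)                    # fraction defaulted
    return losses

losses = simulate_portfolio_losses(n_names=100, p_default=0.02, rho=0.3, n_paths=2000)
print(f"mean default rate: {sum(losses) / len(losses):.3f}")
```

The expensive part in production is not this loop but doing it across thousands of names, tenors, and scenarios every night, which is what the 8-hour batch (and the 238-second FPGA version) was about.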
If JPMorgan had just bought, say, senior tranche protection on a credit index, then while the bank’s position would indeed have been crash-hedged, it would have generated significant earnings volatility as the bonds would not have been marked-to-market but the derivatives would have been. In particular, in a tightening credit environment, such as we had earlier in the year as the ECB injected liquidity into the banking system, the derivatives would have lost money without a corresponding accounting gain on the bonds.
One way around this accounting mismatch is to restructure the derivatives position. The idea is still to be long crash protection — again, by buying protection on senior tranches, for example — but to offset this by also selling protection on the index. If done correctly this position will be indifferent to small moves in credit spreads (‘delta neutral’), but it will make money if there is a big increase in spreads.
This removes the fair value volatility from the position at the cost of introducing correlation risk: the amount of index you need to sell is a function of the correlation between the names in the index, so you have to readjust your hedge as the market price of correlation changes.
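A toy numeric version of that delta-neutral construction, with made-up quadratic payoffs chosen only so that the long-protection tranche leg is convex in the index spread (nothing here is calibrated to a real tranche):

```python
def tranche_pnl(ds, delta=5.0, gamma=40.0):
    """Long senior-tranche protection: linear plus convex gain as spreads widen."""
    return delta * ds + gamma * ds ** 2

def index_pnl(ds, hedge_notional):
    """Short index protection: loses linearly as spreads widen."""
    return -hedge_notional * ds

hedge = 5.0  # cancels the tranche delta at ds = 0, i.e. 'delta neutral'

net = {ds: tranche_pnl(ds) + index_pnl(ds, hedge) for ds in (0.001, 0.01, 0.10)}
for ds, pnl in net.items():
    print(f"spread move {ds:+.3f}: net P&L {pnl:+.5f}")
```

Small spread moves net to roughly zero while a big widening pays off, which is the "crash protection without mark-to-market noise" idea; the catch, as the excerpt says, is that the right hedge notional depends on the market price of correlation, so the position must be re-hedged as correlation moves.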
Deus Ex Macchiato, Whale Watching, the official tour, here. I think this is David Murphy again. Nice website, I should read it more frequently.
The firm’s main problem at this point was that two goals were in conflict. On one hand their position was so large (if unnoticed by regulators) that they would get crushed if they tried to leave too fast; on the other, they needed to leave to reduce capital. The solution, of course, was to try to change how capital was calculated.
the concern that an unwind of positions to reduce RWA would be in tension with “defending” the position. The executive therefore informed the trader (among other things) that CIO would have to “win on the methodology” in order to reduce RWA.
Chris Wilson, Yahoo, What would your signature look like if Jack Lew wrote it? (Interactive), here.
Now, Yahoo News exclusively brings you the Jack Lew Signature Generator. Just type in your name, hit the button, and see what your name would look like in his, er, signature style.
Peter Cotton, Bond Market Microstructure, Liquidity in The Fixed Income Markets: A Panel Discussion at Stanford University, here. Cotton is always entertaining.
What follows is an approximate rendition of part of a panel discussion that took place at the Third Stanford Conference on Quantitative Finance in March 2012. The topic was the future of the fixed income markets. The participants in the discussion include organizer George Papanicolaou, Professor of Mathematics at Stanford University; Tom Eady, Senior Policy Advisor at the SEC; Ravi K. Mattu, Managing Director and Global Head of Analytics at PIMCO; Tanya Beder, Chairman of SBCC Group; Darrell Duffie, Dean Witter Distinguished Professor of Finance at the Stanford GSB; and Jim Toffey, Founder and CEO of Benchmark Solutions. The conference web site contains full biographies and the conference program. The panel discussion was moderated by Kevin McPartland of TABB Group.
Adam Green, The New Yorker, A Pickpocket’s Tale, here. The Bond Market guys will take way more of your stuff, on an expectations basis, than all the Robbinses in the world, just not literally from your sweaty pocket at some discrete moment in time. Not sure if they return all of it at some later point in time, probably not. But that is partially why it is so compelling to see the “Bond Market” folks get so gobsmacked by the FPGA Dataflow supercomputer manufacturers. Who knew, “they” had skills to sell Dataflow to otherwise fully-functioning adults in 2011?
“Come on,” Jillette said. “Steal something from me.”
Again, Robbins begged off, but he offered to do a trick instead. He instructed Jillette to place a ring that he was wearing on a piece of paper and trace its outline with a pen. By now, a small crowd had gathered. Jillette removed his ring, put it down on the paper, unclipped a pen from his shirt, and leaned forward, preparing to draw. After a moment, he froze and looked up. His face was pale.
“Fuck. You,” he said, and slumped into a chair.
Robbins held up a thin, cylindrical object: the cartridge from Jillette’s pen.
Robbins, who is thirty-eight and lives in Las Vegas, is a peculiar variety-arts hybrid, known in the trade as a theatrical pickpocket. Among his peers, he is widely considered the best in the world at what he does, which is taking things from people’s jackets, pants, purses, wrists, fingers, and necks, then returning them in amusing and mind-boggling ways.
Comptroller of the Currency, OCC’s Quarterly Report on Bank Trading and Derivatives Activities Third Quarter 2012, here. That’s a lot of interest rate swaps executed by four banks with traders headed for the exits and totally captured by IT groups who aspire to be as competent as the 2011 award winning London Whale/Maxeler/FPGA supercomputer folks. We know the majority of these interest rate swaps are moving to central clearing in March. As I recall, the bid offer spread on a vanilla interest rate swap is a couple thousand USD, leaving quite a bit of room for spread compression. Downsides: you would probably have to carry overnight positions and the initial capitalization to trade will be significant. If there is even a hint of a whiff of a central limit order book for Interest Rate Swaps it’s gonna be like Agincourt.
The four banks with the most derivatives hold 93.2% of all derivatives, while the largest 25 banks account for nearly 100% of all contracts.
Karen Ann Cullotta, NYT, Libraries See Opening as Bookstores Close, here.
As librarians across the nation struggle with the task of redefining their roles and responsibilities in a digital age, many public libraries are seeing an opportunity to fill the void created by the loss of traditional bookstores. They are increasingly adapting their collections and services based on the demands of library patrons, whom they now call customers.
Anh Nguyen, Computerworld UK, Top JP Morgan and UBS IT execs leave for HPC vendor, here. Wonder if they are going to sell the gaussian copula FPGA implementation to another bank? Computerworld UK appears to be an utter stranger to Irony.
The two new appointments are Stephen Weston, who was managing director and global head of applied analytics at JP Morgan, and Steven Hutt, former managing director and global head of credit analytics at UBS.
Weston was responsible for a major IT project using Maxeler technology at JP Morgan, which enabled the investment bank to run risk analysis and price its global credit portfolio in near real-time.
Almost missed the quotes:
“The overwhelming benefits of dataflow technology are directly measurable and undeniable,” said Weston.
“Having experienced them first-hand at JP Morgan, my new mission is to help the finance industry understand the revolution in value that this technology brings through making real-time computation and scenario analysis feasible, for both the largest and the most complicated problems.”
He said “revolution in value,” very clever.
Lisa Pollack, Alphaville, Two billion dollar ‘hedge’, here. Looks like Pollack and Zerohedge conclude the London Whale position is in CDX tranches. One other thing if I could, JPM spent 3 years getting its 8 hour overnight Gaussian Copula batch into an award winning FPGA supercomputer (see JP Morgan’s London Whale needs Maxeler’s FPGA Supercomputer to run Risk?) that runs in 238 seconds; is that right, Sir? And then there’s this, Recipe for Disaster: The Formula That Killed Wall Street. Now, I’m not Lt. Columbo or anything but help me out here, wouldn’t some people call that means, motive, and … opportunity.
Concerning how one can make a $2bn loss on this, we have become convinced that it’d only be possible if the above was also done with tranches, which would seriously lever up any such position. Several FT Alphaville commenters have alluded to this already — thank you, guys. Even then, a $2bn loss is a lot to chalk up. But if it isn’t that, what else could it be?
Zerohedge, Is The Pain Over For Bruno Iksil? here.
Today, for the first time since the advent of the JPM prop trading fiasco last Thursday, the IG9-10 Year skew has diverged, dipping from -3 bps to -5 bps as the index remained flattish while the intrinsics widened by about 2 bps. While hardly earthshattering, this move likely means that either JPM’s CIO trading desk is playing possum and is no longer unwinding its original pair trade exposure (either because it no longer has anything to unwind, or because it can’t take the pain any more and is out of the market entirely), or the hedge fund consortium has had enough of pushing IG 9 wider in hopes that max pain would force JPM to cover its synthetic leg. As a reminder, this is where last Thursday we said the time to push JPM would likely end. How much additional P&L loss JPM has sustained from Friday through today is a different matter entirely, and we are confident the next announcement from JPM will come momentarily, coupled with the announcement that Bruno Iksil, the last remnant of the CIO desk, and now having completed his duty of unwinding the trade that brought so much pain for Jamie Dimon, has been retired.
Roy Scheider, Jaws, here. Ok I was wrong, “Mammoth Supercomputer needed to keep London Whale Afloat” was not too much.
NYT DealBook, JPMorgan Discloses $2 Billion in Trading Losses, here.
JPMorgan Chase, which emerged from the financial crisis as the nation’s biggest bank, disclosed on Thursday that it had lost more than $2 billion in trading, a surprising stumble that promises to escalate the debate over whether regulations need to rein in trading by banks.
Jamie Dimon, the chief executive of JPMorgan, blamed “errors, sloppiness and bad judgment” for the loss, which stemmed from a hedging strategy that backfired.
The trading in that hedge roiled markets a month ago, when rumors started circulating of a JPMorgan trader in London whose bets were so big that he was nicknamed “the London Whale” and “Voldemort,” after the Harry Potter villain.
WSJ Deal Journal, J.P. Morgan Reveals ‘London Whale’-Size Losses, here.
J.P. Morgan Chase & Co., the nation’s largest bank, surprised the market today, saying it has taken large losses stemming from derivatives bets gone wrong in the bank’s Chief Investment Office.
At 4:30, the bank sent out an unusual notice saying that it would be holding a call at 5 p.m. but included no details about what the call would be about. A person familiar with the matter said the call would include CEO Jamie Dimon and discuss the bank’s quarterly filing.
On the conference call, J.P. Morgan CEO Jamie Dimon said the bank had taken $2 billion in trading losses in the past six weeks and could face an additional $1 billion in second-quarter losses due to market volatility.
DealBreaker, Whale Sushi On The Menu At JPMorgan Executive Lunchroom For Next Few Months, here.
Whaledemort remains something of a riddle wrapped in an enigma wrapped in barnacles, and the Q&A reflected that. BAML’s Guy Moszkowski and others pressed Dimon on, as Moszkowski put it, “why did you feel the need to add synthetic credit exposure?”; others asked a not-unrelated question, which was, roughly, “c’mon Jamie, was this guy actually ‘hedging’ or was this just a crazy prop bet?” Dimon’s answers were not super satisfying but they were clear enough: the Whale was hedging, not adding, credit exposure. But he wasn’t just doing that by getting short lots of bonds or buying lots of CDS. Instead, he was doing something that had him getting long credit via CDX – presumably massive flatteners or tranche trades that were relatively neutral to small moves in credit but made lots of money if things got rapidly worse. These were not prop trades, not massively long credit – rather, the Whale was long credit via longer-dated CDX and short credit via shorter-dated CDX and/or tranches.
That is a simple enough trade, for some value of “enough,” but apparently not simple enough for JPMorgan! At some point they decided to reduce this credit hedge, or “re-hedge” it (Jamie’s exact words vary but whatever, you get the idea, they were short credit through some things and they decided to reduce that short position in some fashion by getting long more CDX or closing some of their shorts or whatever), and that re-hedging was “flawed, complex, poorly reviewed, poorly executed, and poorly managed” but otherwise fine. Except that, also, they fucked up the model.
Salmon, JP Morgan: When basis trades blow up, here.
I’m not sure if it was the biggest quarterly loss of all time, but Merrill Lynch’s $16 billion loss in the fourth quarter of 2008 certainly ranks very high up there in the annals of investment-bank blowups. It happened after the bank had already been taken over by Bank of America, and it was in the middle of the financial crisis, so it didn’t get nearly the amount of attention it deserved. But it was not simply a case of assets plunging in value. Instead, it was, in very large part, a basis trade blowup.
The basis trade is an arbitrage, basically. There are two different ways the market measures credit risk: by looking at credit spreads — the yield on a certain issuer’s bonds, relative to the risk-free rate — or by looking at CDS spreads, which are basically the same thing but set in the derivatives market rather than the cash bond market. Most of the time, CDS spreads and cash spreads are tightly coupled. But sometimes they’re not. And at Merrill, a huge part of that $16 billion loss was reportedly due to a bad basis bet: the basis on many credits became very large and very negative during the financial crisis.
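Salmon's definition reduces to one line of arithmetic. A sketch, with deliberately crisis-flavored illustrative numbers of my own (not from the article):

```python
def cds_cash_basis(cds_spread_bps, bond_yield_pct, risk_free_pct):
    """CDS-cash basis: the CDS spread minus the cash bond's spread
    over the risk-free rate. Near zero when the two markets agree."""
    cash_spread_bps = (bond_yield_pct - risk_free_pct) * 100  # pct -> bps
    return cds_spread_bps - cash_spread_bps

# Crisis-style negative basis: the cash bond yields 9% against a 3% risk-free
# rate (600 bps of cash spread) while CDS protection costs only 400 bps.
basis = cds_cash_basis(cds_spread_bps=400, bond_yield_pct=9.0, risk_free_pct=3.0)
print(f"basis: {basis:+.0f} bps")
```

A deeply negative basis looks like free money (buy the bond, buy protection, lock in the gap), which is exactly the trade that turns into a blowup when funding dries up and the basis goes even more negative before it converges.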
This time around, the basis-trade disaster has happened at JP Morgan, where the famous London Whale seems to have contrived to lose $2 billion on what was meant to be a hedging operation. And once again, although the details are still very murky, the culprit seems to be the CDS-cash basis.
Fast forward to early 2009 and Boaz Weinstein, the former star trader and co-head of credit trading at Deutsche Bank, is down $1bn, Ken Griffin of Citadel is down 50% and John Thain’s Merrill is said to be down $10bn+. Most of these horrific losses are due to a single strategy… the scary negative basis trade.
Bloomberg has written about it here.
And there has even been a book published on this strategy… how many trades can say that!
There are a lot of moving parts in the Dismal tale of Dimon’s demise. The starting point is that Bruno Iksil in the JPMorgan CIO Office, under the premise of hedging the bank’s credit portfolio’s tail risk had placed various tranche trades (levered credit positions with various risk profiles) in the only liquid tranche market that still exists – CDX Series 9 (an ‘orrible portfolio of credits with an initial maturity at the end of 2012). These positions were low cost (steepeners or equity-mezz) but needed a certain amount of day to day care and maintenance (adjusting hedges and so on). As the market rallied, the positions required increasing amounts of protection be sold to maintain hedges (akin to buying into a rally more and more as it rises). His large size in the market left a mark, however, that hedge funds tried to fix – that is, his index trading was making the index extremely rich (expensive) relative to intrinsics (fair-value).
Bloomberg, JPMorgan Trader Iksil Fuels Prop-Trading Debate With Bets, here. London Whale needs some P from Series 9 Investment Grade CDX. Can Bloomberg and some prominent officials help fix the situation?
Zerohedge, From Bruno Iksil’s Personal Profile: Enjoys “Walking Over Water” And Being “Humble”, here. Oh, so the London Whale is shorting tranches of series 9 CDX in $100bn quantities. That starts to make sense given all the hysteria. So they make the purchases at the corporate-level to hedge SCDO exposure that is not actively traded by the desk. Maybe Bruno is the one who needs the Maxeler FPGA Supercomputer Credit batch at JPM (see Credit Derivatives, Flynn’s Architectural Case for Maxeler in 2012?, and Street FP Due Diligence 3: Epic Yet?) to run in 238 seconds. How do you tag that? London Whale needs Mammoth Supercomputer to Stay Afloat? Too much, right? I still suspect that the entire JPM credit batch (as described) completes in less than a minute on a low-end Mac Pro even with the Gaussian Copula positions. Sort of more like “Bruno uses iPhone to Track Purchases.” (see Business Insider, Financial Post, Wall Street Journal blog, Sober Look, New York Times Dealbook, Financial Times Alphaville, blogrunner)
Zerohedge, 31 Dec 2011 Notional Amt. of Derivative Contracts Top 25 Comm. Banks, here. Didn’t MS carry derivative inventory in the past?
Advanced Trading, Deutsche Bank Shaves Trade Latency Down to 1.25 Microseconds, 15 Mar 2011, here. They report:
“This is a bit of a revolution, since it’s breaking a barrier from previously doing a couple of hundreds of microseconds and then 80 microseconds which is the normal software-based Ultra products’ latency,” said Roth. “That is the market standard and now we’re getting into the low-single digit microseconds. That has never been done before,” he said.
Deutsche Bank deployed the patent-pending card in its lab in the first quarter. As of Monday, the first client was ready to begin testing it. “The trade comes into the card, the card does the protocol translation and risk checks,” explained Roth. “We’re bypassing the PC and doing everything in hardware,” explained Roth, who runs the global product development team for equity trading.
The Ultra solution will appeal to the bank’s sponsored access clients who are facing new market access regulations from the Securities and Exchange Commission (SEC) to ban naked access by requiring pre-trade risk checks. Right now, the product is live in Nasdaq’s data center in Cartaret, New Jersey, and it will soon rollout to Direct Edge, Bats and then NYSE Arca. When it gets to Europe, the London Stock Exchange, NYSE Euronext and Xetra will be the main ones where latency really matters, said Roth.
Though other Wall Street firms are working with hardware-based solutions, and in some cases, they are working with vendors in the space, Roth believes that Deutsche Bank has the competitive edge. “Our solution is so far the lowest latency we are aware of that works close to 1 microsecond because we work with standard hardware components,” said Roth.
“If you look at the time horizon, we think we have an edge,” says Roth. “Our vision is that hardware will proliferate in this space over the next 15 to 24 months,” said Roth. “This is going to be the standard in low latency trading and more speed is going to be adopted in algorithms,” he predicted.
However, latency has become such a marketing buzzword in the electronic trading industry that the concept can vary based on how it’s measured. Deutsche Bank measures the latency from wire-to-wire, “when the message hits the card and when the message leaves the card,” said Roth, adding, “There is no ambiguity.” In this type of work, the bank uses a high-resolution oscilloscope that connects to the chip.
But 1.25 is just the start, he said, adding, “Getting the latency below one is actually a tuning exercise.” Roth said he’s “confident” that the bank can get the latency below one microsecond. “This is about engineering. You can do these things if you are really focused and have the right engineering skills available,” he said. “It’s also about applying new techniques to the financial markets,” he continued. While a lot of proprietary trading firms and hedge funds are excited about this, Deutsche Bank is also one of the first firms to use low latency access in algorithms for the buy side, he said.
1) The SEC’s 15c3-5 Market Access Rule and CFTC’s advisory recommendations for DMA will rekindle the latency wars.
Just around the corner looms the SEC’s Market Access Rule to ban naked access. No longer will participating firms have unfettered direct market access while brokers (virtually) look the other way as orders flow into the market with little or no checks or balances. I’m sure it was a profitable enterprise for the broker community to allow this direct channel for those willing to pay a little extra.
Brokers compete on the range of services they offer. They attract clients’ order flow by offering better fill rates, better prices, increased liquidity, etc. The SEC’s rule 15c3-5, which mandates pre-trade risk checks, does not really inhibit the level of service brokers can provide, but it does ensure everyone pays a latency tax for checking the credit limits and order constraints (quantity and price) brokers must enforce.
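The checks themselves are logically trivial, which is why the fight is entirely about latency. A minimal sketch of a 15c3-5-style gate (the function, field names, and limits are all illustrative; real implementations live in FPGA logic precisely to avoid code like this on the hot path):

```python
def pre_trade_check(order, credit_used, credit_limit,
                    max_qty, ref_price, price_collar=0.05):
    """Return (ok, reason) for the checks named above: credit limit,
    quantity cap, and a price collar around a reference price."""
    notional = order["qty"] * order["price"]
    if credit_used + notional > credit_limit:
        return False, "credit limit breached"
    if order["qty"] > max_qty:
        return False, "quantity above cap"
    if abs(order["price"] - ref_price) / ref_price > price_collar:
        return False, "price outside collar"
    return True, "accepted"

ok, why = pre_trade_check({"qty": 500, "price": 101.0}, credit_used=0,
                          credit_limit=1_000_000, max_qty=1_000, ref_price=100.0)
print(ok, why)
```

Every order pays for these comparisons before it reaches the exchange; the sub-microsecond numbers quoted elsewhere in this post are vendors racing to make that unavoidable tax as small as possible.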
As a result, a groundswell is occurring. Pre-trade risk is fast becoming the next latency battleground. While some are scrambling, others such as Morgan Stanley are announcing achievements of microsecond latency. I am sure others will follow with revamped pre-trade risk modules as they leverage multi-core hardware to achieve parallelism in their architecture. A renewed emphasis on FPGA hardware acceleration has also surfaced. FPGA technology has been readily available for a number of years; its success has primarily been in appliance-oriented technology for ticker plants and messaging, such as Exegy and Solace, where it’s an embedded component. An Aite report on Capital Markets Technology spending puts FPGA low on the list of IT spend for infrastructure investments. I think this is primarily due to the fear, uncertainty and doubt surrounding the direct use of non-commodity hardware. From an IT manager’s viewpoint, a series of difficult questions arises regarding FPGA… “complex, non-standard development, handling long-term maintenance, support, diagnosing failures” and lack of experienced talent to hire. Challenging questions and likely the reason for its lackluster success.
Low-Latency.com, FPGA Momentum Accelerates!, 9 Aug 2011, here. They report:
The answer is: quite a lot, judging by various news releases coming my way of late. Here are some highlights:
- Deutsche Bank’s Autobahn equities electronic trading business recently expanded its μltra FPGA products to the US, to provide pre-trade compliance and risk checks in its co-located trading apps at NYSE, NYSE Arca, Nasdaq, Direct Edge and Bats. The claimed performance of the risk checks are 1.35 microseconds for OUCH messages and 1.75 microseconds for FIX messages.
- Nomura extended its NXT execution platform to Direct Edge’s co-lo centre at Equinix in Secaucus, NJ. And it’s claiming latencies of under 1.8 microseconds for fixed-length exchange protocols and 2.8 microseconds for FIX.
- Fixnetix introduced its iX-eCute trading gateway, offering latencies as low as 740 nanoseconds wire-to-wire, with 20+ pre-trade risk checks in less than 100 nanoseconds.
- Burstream rolled out its managed market data service at Nasdaq’s co-lo and Telx’s proximity centre in Chicago, leveraging data feed handling and order book generation technology from NovaSparks.
- TS Associates updated its Application Tap precision time card to make more use of FPGAs for transferring data to host memory, reducing its performance overhead.
- Impulse Accelerated Technologies introduced an FPGA development kit for 10gE ITCH/OUCH protocol handling, allowing CPU/kernel bypass to application memory space.
- Maxeler introduced MaxNode10G, a platform designed for wire-speed processing of multiple 10 gigabit network data streams.
Wallstreetandtech.com, Capital Markets Outlook 2012, here. They report:
Bank of America Merrill Lynch recently announced BofAML Express, an ultralow-latency market access and risk control platform for U.S. equities that provides embedded risk controls with sub-10 microseconds of wire-to-wire latency. Morgan Stanley is using software to shave latency from its compliant direct-market-access platform, Speedway 3.0, which is live with at least five exchanges, including NYSE, ARCA, Nasdaq, BATS and the two Direct Edge exchange platforms.
And Deutsche Bank is employing field-programmable gate array (FPGA)-based devices to lower latency for its risk checks. The platform, known as ultra FPGA, runs from Deutsche Bank’s cabinets at exchange data centers. Latency-monitoring service Correlix RaceTeam recently measured ultra FPGA’s pre-trade risk management gateway latency at 1.35 microseconds for messages sent to Nasdaq and at 1.75 microseconds for FIX messages.
Nomura, which went live in July 2010 with its ultralow-latency market access product, NXT Direct, also has turned to FPGA technology for its pre-trade compliance and risk checks. The bank says the platform offers risk-filtered, wire-to-wire direct connectivity in less than 1.8 microseconds for fixed-length exchange protocols and less than 2.8 microseconds for FIX protocols.
Industry Leaders: Deutsche Bank, Morgan Stanley, Bank of America Merrill Lynch and Lime Brokerage have adopted aggressive strategies to provide low-latency pre-trade risk controls and market access.
Technology Providers: High-performance cloud infrastructure providers include BT Radianz, Thesys Technologies, SunGard Capital Markets, NYSE Technologies, Equinix, EMC, Options IT and VMware. FPGA providers include ACTIVFinancial, Impulse Accelerated Technologies, Altera, Xilinx and Novasparks.