Treasury’s “Big Data” Challenge

May 10, 2012

By Dwight Cass

Pricing and risk management activities require ever more market data, forcing corporates to address the same big data challenges financial institutions face.

What type of business increased its spending on financial market data the most last year? The question brings to mind the endless rows of computers stuffed into northern New Jersey technology centers that execute algorithmic equity and derivatives trades for banks and hedge funds. But neither prop traders nor buy-side quants boosted their data spending as much as treasury and FX.

According to the Burton-Taylor Financial Market Data/Analysis Global Share & Segment Sizing 2012, an annual report on financial market data spending, the foreign exchange and treasury segment boosted its outlay by 12.09 percent in 2011, outpacing growth rates in the investment management (5.97 percent), equity sales and trading (7.60 percent), and fixed income (7.28 percent) segments. While the absolute level of spending in 2011 by the FX and treasury segment remained a relatively small $2.26 billion out of the total global spend of $25 billion, it is by far the fastest-growing segment.

Burton-Taylor doesn’t break out treasury from FX trading, meaning 2011’s volatile currency markets, buffeted by the Eurozone crisis, could account for a significant chunk of the increase in data outlays. But market data providers and consultants say treasury’s appetite for financial market data is burgeoning.

Treasury’s Big Data Challenge

“We are definitely laying a lot more pipes into corporates,” says an account manager at Bloomberg, which edged out Thomson Reuters to post the highest revenue in the industry for the first time in 2011. “Treasury needs independent pricing sources, data to feed risk and cash flow projection models. When Dodd-Frank hits, the collateral management issues [raised by central clearing] will gobble up even more data.”

Treasury is bringing some formerly outsourced activities in-house. These include evaluating the credit and operational risks of supply chain members, monitoring commodity prices in relevant spot and futures markets, assessing the risk profiles of trading partners, lenders, contractors and subcontractors, and performing credit analysis of customers if the company has a vendor financing program.

Treasury has also begun offering services such as data mining and process optimization to other parts of the company, as a way to use its resources most effectively and justify its budget. (See “Treasury Adds Value With In-House Service Offerings,” IT, February 16, 2010.) All this reduces reliance on often suboptimal outside services like Dun & Bradstreet or the credit rating companies.

Treasury’s desire to optimize banking relationships by ensuring that borrowings, derivatives transactions, and bond or share issues are executed as close to the best market price as possible is also boosting demand for market data. These are some of the same drivers that have made electronic trading tools more popular: a desire for efficiency and the chance to reduce reliance on banks and brokers.

Let a Thousand ‘Bergs Bloom

More surprisingly, Burton-Taylor found that spending by what it classifies as the “corporate” segment grew at a faster rate, 7.66 percent, than the equity sales and trading, investment management and fixed income segments. The corporate segment covers all company spending on market data outside of treasury, including the C-suite, corporate strategy groups and investor relations.

“There has definitely been more pressure on corporate strategy teams since the financial crisis,” says Douglas B. Taylor, managing partner of Burton-Taylor. Evaluating acquisition and divestiture opportunities is often what brings financial market data into the C-suite and planning offices. Strategy groups use market and macroeconomic data and trend analyses. Changes to reporting requirements embedded in Dodd-Frank and heightened investor demands for transparency also require investor relations to have data on hand to respond to analyst and investor queries about changes in input prices, borrowing costs and other factors.

But not everyone is seeing a large increase in this sort of demand. “Usually when you get up to the C-suite level, executives are looking for calculated output of risk systems and other trends and analysis,” notes Philip Pettinato, chief technology officer at Reval, which provides treasury and risk management SaaS technology. “They’re really looking for management-level reporting on risks, how they’re hedging, the costs of inputs and other factors,” but not raw data.

Big Data, Big Headaches

The problems involved in acquiring and effectively using big data are now common fodder for the business press. But financial market data is an order of magnitude larger and more granular than the consumer preference or sales information that has garnered the most coverage. In a white paper published earlier this year (“Big Data: Challenges and Opportunities”), Interactive Data, one of the big financial market data providers, framed the size of the issue:

Take the average daily number of pricing ticks processed by Interactive Data: in October 2011, across all traded asset classes in North America, an average of 10.7 billion ticks was processed every trading day. That translates into about 19.3 terabytes of data per year. However, those figures… are averages; peaks can and do occur. For example, on August 8, 2011, over 26 billion ticks were experienced. From an infrastructure perspective, that suggests the need for capacity to handle at least three times the daily average, which will boost costs.
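
Scaled against the averages Interactive Data cites, that August 2011 spike is roughly 2.4 times a typical day, which is where the “at least three times” provisioning guidance comes from once a safety margin is added. A back-of-the-envelope sketch in Python, using the figures quoted above (the 25 percent margin is an illustrative assumption, not a number from the white paper):

```python
# Back-of-the-envelope capacity check using the Interactive Data figures above.
AVG_TICKS_PER_DAY = 10.7e9    # October 2011 North American daily average
PEAK_TICKS_PER_DAY = 26e9     # observed peak on August 8, 2011

peak_to_avg = PEAK_TICKS_PER_DAY / AVG_TICKS_PER_DAY    # ~2.4x

# Provisioning only for the average leaves the feed handler ~2.4x short on the
# worst day; an assumed 25 percent margin on top of the observed peak lands
# near the "at least three times the daily average" the white paper cites.
required_multiple = peak_to_avg * 1.25                   # ~3.0x

print(f"peak/average ratio: {peak_to_avg:.2f}x")
print(f"capacity to provision: {required_multiple:.1f}x the daily average")
```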

For those companies that do take one or more raw feeds, the checks they write to the data providers represent only a small part of the total cost. The rule of thumb is that for every dollar spent on data, another five is spent vetting and processing it into a useful form.
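
Applied to the segment figures cited earlier, that rule of thumb implies an all-in cost well above the headline data spend. A quick illustration in Python (the multiplication is this article’s back-of-the-envelope logic, not a Burton-Taylor estimate):

```python
# Illustrative application of the 1:5 rule of thumb to the 2011 FX and
# treasury segment spend cited earlier (a Burton-Taylor figure); the all-in
# estimate is a back-of-the-envelope illustration only.
fx_treasury_data_spend = 2.26e9      # 2011 segment spend, USD
vetting_per_data_dollar = 5          # rule of thumb: $5 of processing per $1 of data

all_in_cost = fx_treasury_data_spend * (1 + vetting_per_data_dollar)
print(f"implied all-in cost: ${all_in_cost / 1e9:.1f} billion")   # about $13.6 billion
```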

“Many corporates have some sort of a data feed coming in for in-house trade and risk-management systems,” says Reval’s Pettinato. “But the data comes in raw, so they have to check it for accuracy, and then construct the surfaces, curves, regressions… to use in their analyses.”
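
That construction step can be as simple as filtering obviously bad points out of a vendor feed and interpolating what survives into a curve that valuation and cash flow models can query. A minimal sketch, assuming hypothetical quotes and tenors (the data layout here is invented for illustration and is not Reval’s or any vendor’s format):

```python
import numpy as np

# Hypothetical raw rate quotes from a vendor feed: (tenor in years, quoted rate).
# One point is missing, as often happens with raw feeds.
raw_quotes = [(1.0, 0.012), (2.0, 0.015), (3.0, None), (5.0, 0.021), (10.0, 0.028)]

# 1) Vet the raw data: drop missing or implausible points before using them.
clean = [(t, r) for t, r in raw_quotes if r is not None and 0.0 < r < 0.20]
tenors = np.array([t for t, _ in clean])
rates = np.array([r for _, r in clean])

# 2) Construct a usable curve: simple linear interpolation of the cleaned rates,
#    which downstream valuation or cash flow models can query at any tenor.
def curve_rate(t: float) -> float:
    """Interpolated rate at tenor t (flat extrapolation beyond the quoted range)."""
    return float(np.interp(t, tenors, rates))

print(f"4-year rate: {curve_rate(4.0):.4%}")   # interpolated between the 2y and 5y quotes
```

A production system would layer far more validation on top (stale-quote checks, cross-source comparisons, outlier tests) and use proper bootstrapping rather than straight interpolation, which is exactly the work captured by the 1:5 rule of thumb.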

Reval’s signature SaaS hedge accounting and treasury risk management product is one way around the big data problem, since it acquires the data, scrubs it and checks it for accuracy before feeding it into the analytics clients use. For companies with less complex needs, a couple of Bloomberg or Thomson Reuters terminals will do.

But given the intersection of several trends—the desire for analytical self-sufficiency, the need for better market oversight, increasing transparency demands from investors, counterparties and, in some cases, regulators—the complexity of treasury’s data needs looks set to grow. And if the amount spent on data continues to grow at last year’s pace, it won’t be long before the absolute amount is comparable to that spent by financial institutions. If the total implementation cost is six times that amount, it will move to the top of treasury’s list of concerns.
