Treasury executives generating financial forecasts must manually pull data from several corporate-level applications into a central database, then clean and format it, a process that can take a week or more before the data can be fed into their reporting and forecasting systems.
That may sound like once-upon-a-time to the modern treasury executive, but in fact it’s the norm, according to the results of a survey recently published by SunGard.
“Therein lies one crux of the issue,” said Michael Wolk, a partner in SunGard Consulting Services’ information management practice. “Folks capture the data at a point in time, and then it takes effort and hard work to prepare it, and by then management’s needs have changed, and it has to be done all over again.”
Mr. Wolk said he’s run across that scenario frequently among the energy and financial firms he’s worked with and that responded to the survey, and it may be even more common among companies in less technology-dependent industries. The survey backs up his observation: 80 percent of respondents rely on periodic, after-the-fact reporting, indicating that many current business intelligence (BI) programs lack proactive, investigative reporting tactics.
In fact, the survey found just 11 percent of respondents, drawn from 74 energy firms and 93 financial services firms, use proactive and investigative techniques. And only 9 percent employ “what could happen” analysis, the proactive use of operational data.
The survey found that 45 percent of respondents generate reports by extracting data manually, normalizing it with spreadsheets and other tools, and then presenting it in spreadsheet or PDF format. The report notes that spreadsheets are subject to fat-finger errors, and that companies tend to move to more sophisticated data warehouses when the amount of data becomes too large and the spreadsheets too complex.
A central database such as Microsoft’s SQL Server is a good first step away from spreadsheets, Mr. Wolk said, because the data can be normalized more quickly using middleware applications designed for that purpose, and it can be actively managed and reported through dashboards and other third-party tools.
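To make the spreadsheet-to-database step concrete, here is a minimal sketch of that workflow. The CSV export, table schema, and normalization rules are all hypothetical illustrations (the article names no specific formats), and an in-memory SQLite database stands in for a central server such as SQL Server; in practice the middleware Mr. Wolk describes would handle this at scale.

```python
import csv
import io
import sqlite3

# Hypothetical CSV export from one source application; a real pipeline
# would pull extracts from several corporate systems.
raw_export = """date,entity,amount
03/01/2024,Treasury Ops,"1,250.00"
03/02/2024,Treasury Ops,"(300.00)"
"""

def normalize_amount(text):
    """Turn spreadsheet-style amounts like '1,250.00' or '(300.00)' into floats."""
    text = text.replace(",", "")
    if text.startswith("(") and text.endswith(")"):
        return -float(text[1:-1])  # accounting-style parentheses mean negative
    return float(text)

def normalize_date(text):
    """Convert MM/DD/YYYY to ISO YYYY-MM-DD so dates sort correctly in SQL."""
    month, day, year = text.split("/")
    return f"{year}-{month}-{day}"

# In-memory SQLite stands in for a central database such as SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cash_flows (flow_date TEXT, entity TEXT, amount REAL)")
for row in csv.DictReader(io.StringIO(raw_export)):
    conn.execute(
        "INSERT INTO cash_flows VALUES (?, ?, ?)",
        (normalize_date(row["date"]), row["entity"], normalize_amount(row["amount"])),
    )

# Once centralized, the data can be queried by dashboards and reporting tools.
net = conn.execute("SELECT SUM(amount) FROM cash_flows").fetchone()[0]
print(net)  # 950.0
```

The point of the sketch is that normalization happens once, on the way into the database, rather than by hand in every spreadsheet.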
The next step, Mr. Wolk said, is to add forecasting tools, providing insight into future receivables and payables, and other key numbers that can help treasury executives determine a company’s changing capital needs.
“For one customer we built a mini data warehouse that focused on the margin needs related to their trading activity. It forecasted what their margin needs would be next week, next month, etc.,” Mr. Wolk said.
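The article does not describe how that mini data warehouse computed its forecasts, but a simple moving-average projection illustrates the general idea. The margin figures, window size, and horizon below are invented for illustration; a production system would use the firm's actual trading data and a proper forecasting model.

```python
from statistics import mean

# Hypothetical daily margin requirements (in $ millions) from a trading book.
history = [4.2, 5.1, 4.8, 6.0, 5.5, 5.9, 6.3]

def forecast_margin(history, horizon, window=5):
    """Naive moving-average forecast: project the mean of the last
    `window` observations forward `horizon` periods, feeding each
    forecast back in as if it were an observation."""
    values = list(history)
    for _ in range(horizon):
        values.append(round(mean(values[-window:]), 2))
    return values[len(history):]

# Forecast the next five trading days' margin needs.
next_week = forecast_margin(history, horizon=5)
print(next_week)
```

Even a crude projection like this gives treasury a forward view of how much capital trading activity may tie up, which is the insight Mr. Wolk describes.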
That customer was a financial services firm. Corporates using over-the-counter swaps are exempted from clearing and margin requirements, although a proposal by the prudential regulators would require them to post margin, eating up $5.1 billion or more in capital, according to estimates by the Coalition for Derivatives End-Users.
The survey found that more than half of respondents said their companies either define BI functional requirements with little cross-department coordination or authorize and manage BI projects at the department level. A quarter said their projects are strategic and aligned with enterprise-wide goals. And nearly a quarter said their companies have created a BI vision and roadmap that includes governance models and cross-department program management, suggesting an emerging trend.