Tuesday, March 30, 2010

Transaction processing in microseconds?

I attended an Intel Faster City event this evening, the 17th such event, in which Intel talks about its latest, faster, more efficient processors and its technology partners and customers talk about how increased processing and compute power is helping them in their everyday business applications.

The aspect I associate most with Intel's Faster City events is firms like Nomura talking about how faster processors are helping them win the so-called arms race, by reducing latency in algorithmic trading and high-velocity Direct Market Access trading applications.

Ken Robson, chief algo trading architect at Nomura, reeled off a list of benefits his firm had gained from using faster Intel processors: running multiple strategies on a single box, compressing ticker plants. But the thing that struck me most was his comment that, while they do not break the bank, his department pretty much has free rein when it comes to technology spend.

That contrasts sharply with the middle and back office, which, as we know, has historically not enjoyed the level of technology investment that the front office has. Yet, as the crisis reminded us, while the front office talked in microseconds, back-office risk and reference data management systems struggled to keep up, hampered by their batch processing.

You don't hear reference data managers talking in microseconds, nor do you hear transaction banks or payment processors boasting that it only took them a microsecond to transmit a payment to a customer.

However, it appears that may be changing. With data management and risk management being the 'fall guys' of the financial crisis, the expectation now is that they will be among the largest areas of IT investment in the coming months, as the fallout from the crisis forces banks to step up spending in these areas.

Nigel Matthews of Thomson Reuters told attendees at Intel Faster City that next-generation data management was not about batch processes but would be more near real time and event driven, and that front-office technologies were slowly starting to penetrate the middle and back offices.

This can only be good news, particularly when it comes to reference data management and ensuring that the back and middle offices are keeping pace with what is happening in the front office. But I cannot help wondering when the payments business is likely to benefit from these technologies.

While there have been developments in reducing payment processing times, particularly with the advent of UK Faster Payments, which has reduced clearing times for low-value payments from three days to "near real time", payments are still largely batch processed.

Although not all payments are time sensitive, there are certainly customers that would benefit from speedier payments, and certainly banks that would benefit from being able to process more transactions per second using dual or multi-core processors. So why is it that most firms are still using single-core processors?
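
For the sake of argument, here is a minimal, entirely hypothetical sketch of the multi-core idea: fanning a batch of payments out across several cores rather than validating them one after another on a single core. The Payment record and validate_payment check below are made-up stand-ins for illustration, not any bank's actual processing logic; Python's multiprocessing is used simply because it is a convenient way to show the pattern.

# Hypothetical sketch: spreading payment validation across CPU cores
# instead of working through one long batch on a single core.
# Payment and validate_payment are illustrative assumptions only.
from multiprocessing import Pool
from dataclasses import dataclass

@dataclass
class Payment:
    payment_id: int
    amount: float
    currency: str

def validate_payment(payment: Payment) -> bool:
    """Stand-in for per-payment checks (limits, sanctions, format)."""
    return payment.amount > 0 and payment.currency in {"GBP", "EUR", "USD"}

def process_batch(payments, workers=4):
    """Fan the batch out across 'workers' processes (i.e. cores)."""
    with Pool(processes=workers) as pool:
        return pool.map(validate_payment, payments)

if __name__ == "__main__":
    batch = [Payment(i, 100.0 + i, "GBP") for i in range(1_000)]
    results = process_batch(batch)
    print(f"{sum(results)} of {len(batch)} payments passed validation")

The point of the pattern is simply that throughput scales with the number of workers, provided each payment can be checked independently, which is exactly the kind of work that sits idle on a single core today.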

But has the financial crisis really changed anything in terms of the clout back-office reference data managers have when it comes to getting a larger share of the IT begging bowl, so they too can try to win the "arms race"? We'd like to hear from reference data managers who are able to say, with the same confidence as Nomura's chief algo architect, that they have carte blanche when it comes to technology spend.

Monday, March 15, 2010

Data analytics not for "rocket scientists"

We’ve all read the reports that, in the wake of the financial crisis, risk management and analytics need to move to the top of the corporate agenda, and that risk managers should be viewed not as the bogeyman trying to rein in the profit-hungry trading desk’s excessive risk taking, but as a strategic asset within the bank that has the ear of the CEO, CFO and CIO.

While risk managers within some banks did spot the early warning signs of a pending crisis, the risk models and analysis used have also come under harsh criticism in the wake of the crisis, particularly for the failure to translate their output into a language senior executives clearly understood. In other words: if you take this level of exposure in your CDS portfolio, this, this and this will happen, and, by the way, I have sliced and diced the data for you and presented it in a rather colourful line graph or pie chart that can be quickly read and interpreted, rather than as some complex mathematical formula.

What the financial crisis boils down to, notes Venkat Mullur, senior director, industry solutions, TIBCO Spotfire, is that “People who were making the decisions didn’t understand what Value at Risk (VaR) meant” – VaR being a common risk modelling technique used by banks. “There was a cognitive gap between the model and analysis and consumers of that data,” not all of whom were mathematical geniuses.
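
For anyone who, like those decision makers, has never had VaR spelled out: it is simply an estimate of the loss a portfolio should not exceed over a given horizon at a given confidence level. As a rough illustration only (the daily P&L figures below are invented, and historical simulation is just one of several methods banks use), a 95% one-day VaR can be read straight off the sorted loss history:

# Illustrative one-day Value at Risk by historical simulation,
# using invented profit-and-loss numbers; real desks feed in
# thousands of observed or simulated portfolio returns.
import math

def historical_var(daily_pnl, confidence=0.95):
    """Loss not exceeded on 'confidence' of past days."""
    losses = sorted(-p for p in daily_pnl)        # losses as positive numbers, ascending
    rank = math.ceil(confidence * len(losses))    # nearest-rank percentile
    return losses[rank - 1]

pnl_history = [1.2, -0.8, 0.3, -2.5, 0.9, -1.1, 2.0, -0.4, 0.6, -3.2]   # made-up daily P&L, in millions
var_95 = historical_var(pnl_history)
print(f"95% one-day VaR: {var_95:.1f}m")   # "we do not expect to lose more than this on 19 days in 20"

That last comment line is the whole translation problem in miniature: the number only helps if it is presented as "the loss we do not expect to exceed on 19 days out of 20" rather than as a formula.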

Post-crisis I think we can safely assume there were too few so-called "geniuses" within banks, as there was a lot of exposure to things banks did not really understand. If only someone had bothered to portray the risk analysis for them in a more easily digestible manner, then perhaps they would not have all been so keen to pile into CDS. The question is: what to do about it?

Going forward, if all parts of the business within a bank are to understand the outcomes of data analysis across all lines of business, Mullur argues that data or business intelligence needs to be presented in a more easily digestible, flexible and dynamic format.

Business users also need to be able to perform on-the-fly data analysis on a whole host of different data without having to go back to IT. TIBCO’s answer to this dilemma is to leverage the business intelligence and predictive analytics capabilities within its in-memory Spotfire 3.1 platform. Spotfire uses a range of data visualization techniques such as “conditional coloring and lasso and axis marking” that allow for better data analysis of patterns, clusters and correlations among sets of variables. Multiple scale bar charts and combination bar and line plots can also be used to analyse unstructured, ‘free-dimensional’ data to identify key trends (see diagram).
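
To give a flavour of what conditional coloring buys a non-specialist, here is a rough, generic sketch (plain Python and matplotlib on made-up exposure figures, not Spotfire itself): colouring trades by whether they breach a hypothetical risk limit makes the outliers jump out without any statistics.

# Generic conditional-colouring sketch on invented exposure data (not Spotfire):
# trades breaching a hypothetical risk limit are coloured red so clusters
# and outliers stand out at a glance.
import random
import matplotlib.pyplot as plt

random.seed(42)
notional = [random.uniform(1, 100) for _ in range(200)]        # made-up notionals, in millions
exposure = [n * random.uniform(0.01, 0.2) for n in notional]   # made-up exposure per trade
limit = 10.0                                                   # hypothetical risk limit

colours = ["red" if e > limit else "steelblue" for e in exposure]
plt.scatter(notional, exposure, c=colours, alpha=0.7)
plt.axhline(limit, color="grey", linestyle="--", label="risk limit")
plt.xlabel("Notional (millions)")
plt.ylabel("Exposure (millions)")
plt.title("Trades breaching the limit (red) vs within limit (blue)")
plt.legend()
plt.show()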



“Spotfire allows users to analyse data in a more intuitive way and to make better sense of the data needed to predict future events,” says Mullur. Analyst firm Forrester has given Spotfire the thumbs up, saying that it “puts the power of predictive analytics into the hands of any business user, with data visualizations they can understand, and a level of interactivity unmatched by traditional business intelligence (BI)”. That means, says Forrester, that statisticians and business analysts can “prototype, test, and deploy analytics much faster than with alternative statistical modelling environments,” such as spreadsheets, which do not easily allow for ad hoc analysis by business users.

It’s easy to see why TIBCO and Forrester are bullish about Spotfire, particularly when advanced data analytics has in the past been the preserve of “rocket scientists”. So there will be no excuse now for banking CEOs to say they did not understand the risks the business was taking in a particular investment portfolio or line of business when their risk or business manager presents them with colourful line graphs and pie charts of the various statistical analyses they have performed.

And it is not just commercial banks that are likely to benefit. Mullur says TIBCO is also working with global regulators to help them get a better handle on risk analysis.