Tuesday, March 30, 2010

Transaction processing in microseconds?

I attended an Intel Faster City event this evening, the 17th such event, in which Intel talks about its latest faster, more efficient processors and its technology partners and customers talk about how increased compute power is helping them in their everyday business applications.

The aspect I associate most with Intel's Faster City events is firms like Nomura talking about how faster processors are helping them win the so-called arms race, by reducing latency in algorithmic trading and high-velocity Direct Market Access trading applications.

Ken Robson, chief algo trading architect at Nomura, reeled off a list of benefits his firm had gained from faster Intel processors: running multiple strategies on a single box, compressing ticker plants. But what struck me most was his comment that, provided they do not break the bank, his department pretty much has free rein when it comes to technology spend.

That contrasts sharply with the middle and back office, which historically has not enjoyed the level of technology investment the front office has. Yet, as the crisis reminded us, while the front office talked in microseconds, back-office risk and reference data management systems, still reliant on batch processing, struggled to keep up.

You don't hear reference data managers talking in microseconds, nor do you hear transaction banks or payment processors boasting that it only took them a microsecond to transmit a payment to a customer.

However, it appears that may be changing. With data management and risk management cast as the 'fall guys' of the financial crisis, the expectation now is that they will be among the largest areas of IT investment in the coming months as banks are forced to step up spending.

Nigel Matthews of Thomson Reuters told attendees at Intel Faster City that next-generation data management would move away from batch processes towards near-real-time, event-driven processing, and that front-office technologies were slowly starting to penetrate the middle and back offices.
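The distinction Matthews draws can be sketched in a few lines of code: a batch process waits for a full cycle of updates and applies them all at once, while an event-driven process applies each update the moment it arrives. This is a hypothetical illustration only; the function names, fields, and sample identifier are mine and do not reflect Thomson Reuters' actual architecture.

```python
import queue
import threading

# Golden copy of reference data: security id -> latest record.
reference_data = {}

def apply_update(update):
    """Apply a single reference data update to the golden copy."""
    reference_data[update["id"]] = update

# --- Batch style: stage a whole cycle of updates, then apply them ---
def run_batch(updates):
    staged = list(updates)       # wait for the full file/feed
    for update in staged:        # apply everything in one pass
        apply_update(update)

# --- Event-driven style: apply each update as it arrives ---
def run_event_driven(update_queue):
    while True:
        update = update_queue.get()
        if update is None:       # sentinel: feed closed
            break
        apply_update(update)     # latency is per event, not per cycle

# Usage sketch: one event arrives and is applied immediately
q = queue.Queue()
worker = threading.Thread(target=run_event_driven, args=(q,))
worker.start()
q.put({"id": "XS0000000001", "coupon": 5.25})  # hypothetical identifier
q.put(None)
worker.join()
```

The business difference is entirely in the timing: the batch version's data is only as fresh as the last cycle, while the event-driven version's golden copy is current to the last message received.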

This can only be good news, particularly when it comes to reference data management and ensuring that the back and middle offices are keeping pace with what is happening in the front office. But I cannot help wondering when the payments business is likely to benefit from these technologies.

While there have been developments in reducing payment processing times, particularly with the advent of UK Faster Payments, which has reduced clearing times for low-value payments from three days to "near real time", payments are still largely batch processed.

Although not all payments are time sensitive, there are certainly customers that would benefit from speedier payments and there are certainly banks that would benefit from being able to process more transactions per second using dual or multi-core processors. So why is it that most firms are still using single-core processors?

But has the financial crisis really changed anything in terms of the clout back-office reference data managers have when it comes to getting a larger share of the IT begging bowl so they too can try to win the "arms race"? We'd like to hear from reference data managers who can say, with the same confidence as Nomura's chief algo architect, that they have carte blanche when it comes to technology spend.

1 comment:

Stuart Plane, Cadis Software said...

The case for data management projects has only gotten stronger since the crisis, Anita. To make it happen there has to be a strong business case – a six-month ROI, rapid implementations, a 100% implementation success rate and so on. Tangible benefits at a set cost in a specified time frame. No excuses.

The key to securing buy-in and support for data management is showing tangible benefits faster. The crisis has finally worn down C-level patience for bulky data management projects that take years, cost too much thanks to ongoing consultancy and reworking and have a high probability of failure. The focus is now on multiple golden copies which answer multiple user needs and can be implemented in months.

There’s no question that the drivers for data management – ranging from regulation to attracting investor inflows – are all pushing it to the top of the priority list. A recent Gartner survey revealed that 67% of global investment management firms rank data management as their top priority in 2010; Aite Group have also come out and said that the data management market is poised for growth over the next few years.