Thursday, August 26, 2010

FinancialTech Insider has moved

For those of you who have followed financial-i's blog FinancialTech Insider and want to continue to follow it, we have integrated the blog into financial-i's redesigned web site. Go to www.financial-i.com for news and comment on issues pertaining to cash management, payments, trade finance, asset servicing and business solutions in the transaction banking space.

Thursday, April 22, 2010

What EU airport regulators can and should learn from SEPA

"The European single sky". In the wake of the volcanic ash disaster that brought European skies to a standstill, experts are calling for a more united approach across Europe's aviation regulators.

The handling of airport closures across Europe highlighted the fragmented approach that exists among European governments, airport regulators and air traffic control. Commentators now suggest that a more harmonised approach to air traffic control and airport safety across the EU is needed.

What has this got to do with transaction banking, you may ask? Well, efforts to unite Europe's skies reminded me of efforts to unite European payments under the Single Euro Payments Area (SEPA) initiative. And if SEPA is anything to go by, airport regulators could have their work cut out for them.

Of course, SEPA did not merely try to harmonise existing European payment schemes; it proposed replacing them with new pan-European schemes for cross-border credit transfers and direct debits, which end users have been slow to adopt. No one is proposing a new EU-wide air traffic control system per se, but linking air traffic control systems across Europe does present its challenges, and one can already hear the national politicking and objections that are likely to emerge as different national regulators and interests jockey for position.

Europe may have a single currency, but it appears that the EU is far from united when it comes to most other things, and EU-wide payment mechanisms or air traffic control systems are no exception. While these concepts may sound good on paper, in reality they are difficult to implement and are often hijacked or impeded by parochial interests.

Wednesday, April 14, 2010

Paying the price of electronic payments

Cash is no longer king, despite the fact that for the last decade or so we have heard transaction banks bang on about the supremacy of cash, at least to corporate treasurers who are cash-rich or looking to unlock cash trapped in inefficient parts of their business.

However, a report published by the UK Payments Council concludes that cash's reign as king is over: cash usage rose just 7% over the past decade and now accounts for just 59% of all transactions (down from 73% ten years ago), as more consumers use electronic forms of payment such as debit cards, online payments and contactless cards.

The Payments Council predicts that by 2018 fewer than 1% of UK payments will be made by cheque. That contrasts with the US, where cheque volumes are declining year on year but weaning companies and their customers off paper cheques remains a challenge.

Yet with the increasing use of electronic payments, whether debit or credit cards, ACH or online, comes an increased risk of fraud, particularly as transaction volumes rise. Jim Woodworth, head of business services at payments software provider ACI Worldwide, says financial institutions need to ensure that their systems are able to support the growth in the number of electronic payments while reducing the risk of fraud.

Nick Ogden, founder and CEO of Voice Commerce, which provides voice authentication solutions for payments, also highlighted the heightened fraud implications of increased use of electronic payments, whether cards or mobile. "The threat of fraud and identity theft becomes more prevalent as hackers get better at cracking these new payment technologies," he says.

This highlights the need for the industry to devise more secure means of authentication that are cost-effective and non-intrusive for the user. As more payments are made online, banks face two challenges: ensuring that their legacy payments infrastructure, some of which dates back 30 years or more, is up to scratch; and being able to monitor and detect potentially fraudulent transactions in real time and ascertain that someone is who they say they are when making a payment, without the user having to jump through too many onerous hoops.
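
To make the real-time monitoring point concrete, here is a minimal, hypothetical sketch in Python of the kind of check banks are being urged to run as each payment arrives, rather than in an overnight batch. The rules, thresholds and field names are invented for illustration and are not based on any vendor's product.

```python
# Hypothetical real-time fraud check: score each payment as it arrives.
# All rules, thresholds and field names below are illustrative assumptions.
from datetime import datetime, timezone

def fraud_score(payment, usual_countries):
    """Return a crude risk score for a single payment; higher is more suspicious."""
    score = 0
    if payment["amount"] > 10_000:                  # unusually large value
        score += 2
    if payment["country"] not in usual_countries:   # new geography for this customer
        score += 2
    if payment["timestamp"].hour < 6:               # out-of-pattern time of day
        score += 1
    return score

payment = {
    "amount": 12_500,
    "country": "BR",
    "timestamp": datetime(2010, 4, 14, 3, 15, tzinfo=timezone.utc),
}

if fraud_score(payment, usual_countries={"GB", "FR"}) >= 3:
    print("Flag payment for step-up authentication before it is released")
```

A production system would of course score against behavioural models rather than three hard-coded rules, but the point stands: the decision happens per transaction, in-line, before the money moves.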

And as electronic solutions revolutionize the way we pay, are consumers and companies likely to place more emphasis not only on the speed and efficiency with which they can make a payment, but also on how secure it is? In other words, when we shop around for payment services, will security be more front of mind than it has been historically?

Tuesday, March 30, 2010

Transaction processing in microseconds?

I attended an Intel Faster City event this evening, the 17th such event, at which Intel talks about its latest faster, more efficient processors and its technology partners and customers talk about how increased processing and compute power is helping them in their everyday business applications.

The aspect I associate most with Intel's Faster City events is firms like Nomura talking about how faster processors are helping them win the so-called arms race, by reducing latency in algorithmic trading and high-velocity Direct Market Access trading applications.

Ken Robson, chief algo trading architect at Nomura, reeled off a list of benefits his firm had gained from using faster Intel processors: running multiple strategies on a single box, compressing ticker plants. But the thing that struck me most was his comment that, while they do not break the bank, his department pretty much has free rein when it comes to technology spend.

That contrasts sharply with the middle and back office, which historically has not enjoyed the level of technology investment that the front office has. Yet, as the crisis reminded us, while the front office talked in microseconds, back-office risk and reference data management, still reliant on batch processing, struggled to keep up.

You don't hear reference data managers talking in microseconds, nor do you hear transaction banks or payment processors boasting that it only took them a microsecond to transmit a payment to a customer.

However, it appears that may be changing. With data management and risk management cast as the 'fall guys' of the financial crisis, the expectation now is that they will be among the largest areas of IT investment in the coming months, as the fallout from the crisis forces banks to step up spending in these areas.

Nigel Matthews of Thomson Reuters told attendees at Intel Faster City that next-generation data management was not about batch processes but would be more near real time and event driven, and that front-office technologies were slowly starting to penetrate the middle and back offices.

This can only be good news, particularly when it comes to reference data management and ensuring that the back and middle office keep pace with what is happening in the front office. But I cannot help wondering when the payments business is likely to benefit from these technologies.

While there have been developments in reducing payment processing times, particularly with the advent of UK Faster Payments, which has reduced clearing times for low-value payments from three days to "near real time", payments are still largely batch processed.

Although not all payments are time sensitive, there are certainly customers that would benefit from speedier payments and there are certainly banks that would benefit from being able to process more transactions per second using dual or multi-core processors. So why is it that most firms are still using single-core processors?
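
As a rough, hedged illustration of what spreading payment processing across cores buys you, here is a Python sketch that validates a batch of payments across worker processes instead of one by one on a single core. The validation logic and the payment records are invented for the example.

```python
# Illustrative only: fan per-payment work out across all available cores.
from concurrent.futures import ProcessPoolExecutor

def validate(payment):
    """Stand-in for per-payment work: format checks, sanctions screening, etc."""
    return payment["id"], payment["amount"] > 0 and len(payment["iban"]) == 22

payments = [{"id": i, "amount": 100 + i, "iban": "DE" + "0" * 20} for i in range(1_000)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:               # one worker per core by default
        results = list(pool.map(validate, payments, chunksize=100))
    print(sum(ok for _, ok in results), "of", len(results), "payments passed validation")
```

On a single core the same batch would simply loop over validate; the appeal of dual or multi-core chips is that throughput scales with the number of workers without touching the validation logic itself.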

But has the financial crisis really changed anything in terms of the clout back-office reference data managers have when it comes to getting a larger share of the IT begging bowl, so they too can try to win the "arms race"? We'd like to hear from reference data managers who are able to say, with the same confidence as Nomura's chief algo architect, that they have carte blanche when it comes to technology spend.

Monday, March 15, 2010

Data analytics not for "rocket scientists"

We've all read the reports that, in the wake of the financial crisis, risk management and analytics need to move to the top of the corporate agenda, and that risk managers should be viewed not as the bogeyman trying to rein in the profit-hungry trading desk's excessive risk taking, but as a strategic asset within the bank that has the ear of the CEO, CFO and CIO.

While risk managers within some banks did spot the early warning signs of a pending crisis, the risk models and analysis used have also come under harsh criticism in the wake of the crisis, particularly for failing to convey risk to senior executives in a language they clearly understood. In other words: if you take this level of exposure in your CDS portfolio, this, this and this will happen, and, by the way, I have sliced and diced the data for you and presented it in a colourful line graph or pie chart that can be quickly read and interpreted, not some complex mathematical formula.

What the financial crisis boils down to, notes Venkat Mullur, senior director of industry solutions at TIBCO Spotfire, is that "people who were making the decisions didn't understand what Value at Risk (VaR) meant", VaR being a common risk modelling technique used by banks. "There was a cognitive gap between the model and analysis and the consumers of that data," not all of whom were mathematical geniuses.
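
For readers who have not met it, VaR is simply a statement of the loss a portfolio should not exceed over a given horizon at a given confidence level. The Python sketch below shows one common way of arriving at that number, historical simulation; the P&L history and the 99% confidence level are made-up assumptions, not anything from TIBCO or a real bank.

```python
# A minimal sketch of historical-simulation VaR; the P&L figures are invented.
import numpy as np

rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=0.1, scale=2.5, size=500)   # hypothetical daily P&L, in millions

confidence = 0.99
# VaR at 99%: the loss exceeded on only 1% of past days, i.e. the 1st
# percentile of the P&L distribution with the sign flipped.
var_99 = -np.percentile(daily_pnl, (1 - confidence) * 100)

print(f"1-day 99% VaR: {var_99:.2f}m; "
      f"expect to lose more than this on roughly 1 day in 100")
```

Mullur's point is that the gap lay not in computing such numbers but in communicating what they imply to the people signing off on the exposure.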

Post-crisis, I think we can safely assume there were too few so-called "geniuses" within banks, as there was a lot of exposure to things banks did not really understand. If only someone had bothered to portray the risk analysis for them in a more easily digestible manner, then perhaps they would not all have been so keen to pile into CDS. The question is what to do about it.

Going forward, if all parts of the business within a bank are to understand the outcomes of data analysis across all lines of the business, Mullur argues that data or business intelligence needs to be presented in a more easily digestible, flexible and dynamic format.

Business users also need to be able to perform on-the-fly analysis on a whole host of different data without having to go back to IT. TIBCO's answer to this dilemma is to leverage the business intelligence and predictive analytics capabilities within its in-memory Spotfire 3.1 platform. Spotfire uses a range of data visualization techniques, such as conditional coloring and lasso and axis marking, that allow for better analysis of patterns, clusters and correlations among sets of variables. Multiple-scale bar charts and combination bar and line plots can also be used to analyse unstructured, 'free-dimensional' data to identify key trends (see diagram).
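
The diagram itself is not reproduced here, but as a rough illustration of the kind of combined bar-and-line view with conditional coloring described above, here is a short matplotlib sketch. This is not Spotfire, and the desks, exposures and limits are entirely made up.

```python
# Illustrative combined bar-and-line chart with conditional colouring (not Spotfire).
import matplotlib.pyplot as plt

desks = ["Rates", "Credit", "FX", "Equities"]
exposure = [120, 340, 90, 210]      # hypothetical exposure per desk, EUR millions
var_limit = [150, 200, 150, 250]    # hypothetical VaR limit per desk, EUR millions

fig, ax = plt.subplots()
# Bars coloured by condition: green within limit, red over limit
colours = ["green" if e <= l else "red" for e, l in zip(exposure, var_limit)]
ax.bar(desks, exposure, color=colours, label="Exposure")
ax.plot(desks, var_limit, color="black", marker="o", label="VaR limit")
ax.set_ylabel("EUR millions")
ax.set_title("Exposure vs limit by desk")
ax.legend()
plt.show()
```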



"Spotfire allows users to analyse data in a more intuitive way and to make better sense of the data needed to predict future events," says Mullur. Analyst firm Forrester has given Spotfire the thumbs up, saying that it "puts the power of predictive analytics into the hands of any business user, with data visualizations they can understand, and a level of interactivity unmatched by traditional business intelligence (BI)". That means, says Forrester, that statisticians and business analysts can "prototype, test, and deploy analytics much faster than with alternative statistical modelling environments", such as spreadsheets, which do not easily allow for ad hoc analysis by business users.

It's easy to see why TIBCO and Forrester are bullish about Spotfire, particularly when advanced data analytics has in the past been the preserve of "rocket scientists". So there will be no excuses now for banking CEOs to say they did not understand the risks the business was taking in a particular investment portfolio or line of business when their risk or business manager presents them with colourful line graphs and pie charts of the various statistical analyses they have performed.

And it is not just commercial banks that are likely to benefit. Mullur says TIBCO is also working with global regulators to help them get a better handle on risk analysis.

Friday, February 05, 2010

Sybase-Aleri deal plays to the advantage of remaining pure-play CEP vendors

I remember writing about the burgeoning Complex Event Processing (CEP) market two or three years ago, when there were a handful of vendors (StreamBase, Coral8, Aleri, Progress Apama) all vying for market share and using CEP to service different parts of the market. Some, like Progress Apama, focused on CEP and its application in the algo trading space, while Aleri was more focused on the liquidity management side.
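
For readers unfamiliar with the technology, the core idea of CEP is to run queries continuously over streams of events as they arrive, rather than querying data at rest. The toy Python sketch below illustrates that pattern with a single sliding-window rule; the event shape and threshold are invented, and this is not how any of the vendors' engines are actually built.

```python
# Toy illustration of the CEP pattern: evaluate a rule over a sliding window
# of streaming events. Event fields and the threshold are invented.
from collections import deque

class SlidingWindowRule:
    def __init__(self, window_size=5, threshold=1_000_000):
        self.window = deque(maxlen=window_size)   # keep only the last N trades
        self.threshold = threshold

    def on_event(self, trade):
        """Called for every incoming trade; fires when windowed notional spikes."""
        self.window.append(trade["notional"])
        if sum(self.window) > self.threshold:
            print(f"ALERT: notional over the last {len(self.window)} trades "
                  f"exceeds {self.threshold:,}")

rule = SlidingWindowRule()
for notional in [200_000, 300_000, 150_000, 250_000, 400_000]:
    rule.on_event({"notional": notional})   # in practice this feed never stops
```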

With this week's announcement that Sybase has finalised an asset purchase agreement with Aleri, the pure-play CEP market has shrunk virtually overnight. Sybase was already using Coral8's CEP in its real-time analytics platform, RAP, and had a reseller agreement with Coral8.

However, Coral8 was bought by Aleri back in 2008, giving Aleri essentially three CEP products: its own, Coral8's, and OHIO, the project name for its attempt to integrate the two engines. Meanwhile, since 2008 Sybase has had a reseller agreement to offer the Coral8 engine and portal as a general-purpose CEP platform "in conjunction with any Sybase solution globally".

There is a lot of speculation on the web about why Aleri sold up to Sybase, including an article on Wall Street & Technology speculating that Aleri may have been having financial difficulties. However, the truth may lie somewhere in the complicated morass of reseller agreements and the fact that, having acquired Coral8, Aleri was also planning to sell its technology, which Sybase was also reselling.

The official line from Sybase this week is that the Aleri acquisition will position it as a "clear market leader in CEP" and help strengthen its RAP platform with the addition of Aleri's Liquidity Risk Management and Liquidity Management Suite.

However, some commentators I spoke to say that Sybase is unlikely to be a serious contender in the CEP space and that under Sybase's stewardship the Aleri platform could wane, which may be a problem for existing customers.

Furthermore, the so-called OHIO project for merging Coral8 CEP with Aleri CEP also seems unlikely to continue. Hence StreamBase is rubbing its hands together, coming out with the statement that customers of Aleri, Coral8 or Sybase RAP can trade in their products and move over to its platform.

Richard Tibbetts, CTO at StreamBase, said: "It's unlikely that Sybase will maintain four separate products. Aleri had three separate CEP product initiatives: Coral8, Aleri, plus OHIO. Sybase's CEP product, RAP, is yet a fourth code base. As a result, we've been approached by customers of all these products and asked to provide migration strategies to StreamBase."

It seems the true winners from this deal in the pure-play CEP space are likely to be StreamBase and Progress Apama. It appears that Sybase sees CEP not as the be-all and end-all on its own, but as part of an integrated offering that supports analytics and data repositories. To that extent it is unlikely to go head-to-head with StreamBase and Progress Apama on the CEP piece.

Wednesday, February 03, 2010

Clearstream no longer thinks of itself as just a depository but as a commercial bank

At the London Capital Club, Jeffrey Tessler, CEO of Clearstream International, the Luxembourg-based ICSD, painted its competitor, Euroclear, the other half of the ICSD (International Central Securities Depository) duopoly, as being in a much weaker position: directly exposed to the failure of broker/dealer clients during the crisis, undergoing a number of changes at top management level, including the impending arrival of a new CEO, and still stuck in the mindset of a utility rather than a commercial bank moving up the value chain.

One could be forgiven for thinking that old rivalries between the ICSD duopoly, which go back decades, have never really dissipated. However, Tessler was also complimentary towards his Brussels-based counterpart, saying that it too recognised the importance of interoperability amongst CSDs (although the jury is still out on whether Euroclear is likely to join Clearstream's Link Up Markets).

He complimented Euroclear on the work it had done integrating the CSDs within the Euronext markets into one with its Euroclear Settlement of Euronext-zone Securities (ESES). Yet he added that it would be difficult for Euroclear to extend its single platform concept beyond the Euronext markets, so it would have to embrace interoperability and hopefully join Link Up Markets.

When I spoke to outgoing Euroclear CEO Pierre Francotte at Sibos in Hong Kong last September, he said Euroclear was looking at Link Up Markets, but he remained non-committal. It will be interesting to see how Euroclear's new CEO, Tim Howell, approaches the issue of interoperability and Link Up Markets when he finally takes the helm at the Brussels-based ICSD.

If Euroclear decides not to join Link Up Markets, which has built a converter to foster interoperability between different domestic CSDs, Tessler said you could still have bilateral links in all major markets. However, from Link Up Markets' perspective, its ambition is to provide a single point of access into multiple markets. "We believe interoperability as a strategy going forward is the right one," says Tessler. "Regulation is moving in that direction. Instead of destroying local market infrastructure, we want to leverage the infrastructure that exists." For more on Link Up Markets as a single pipe, listen to Jeffrey Tessler.

Clearstream Banking Frankfurt, which is the German domestic CSD within the Deutsche Börse Group, sees its membership of Link Up Markets as a way not only of fostering interoperability among CSDs, but also of moving up the value chain to prepare for a much-changed world post-TARGET2-Securities (T2S), the new settlement platform for euro-denominated securities now due to go live in 2014.

"T2S is like taking a chalkboard and erasing everything off of it," said Tessler. "Through Link Up Markets we will be able to access multiple markets through a single window. We are transforming Clearstream Banking Frankfurt from a domestic CSD into a hub for accessing multiple CSDs throughout Europe and the world."

But Clearstream International, the ICSD part of the business, has far loftier ambitions. Tessler says it plans to become not just a depository but a commercial global custodian like J.P. Morgan or Bank of New York Mellon, providing value-added services such as Global Securities Financing, an increasingly successful part of its business that has grown from a 22% market share in 2002 to 51%.

Interestingly, custodian banks are also customers of Clearstream, and when questioned on whether Clearstream would compete directly with its customers, Tessler said regional subcustodians that act as an intermediary between the broker and the CSD would find their business increasingly threatened. Increasingly, he says, brokers will ask themselves, 'Why do I need a subcustodian?'

Tessler says Clearstream is winning more securities financing business than its competitor Euroclear because of its vertical integration model, which combines the trading functionality of Deutsche Börse with the settlement and collateral management capabilities that also exist within the group, particularly Clearstream Frankfurt's direct links with the Deutsche Bundesbank. For more on why Clearstream has been successful in the securities financing space, listen to Jeffrey Tessler.