No doubt you have read the media speculation in recent weeks surrounding Bank of America's expansion into Europe and Asia. The bank's CEO Ken Lewis has made no secret of the fact that he wants to expand the bank's credit card and corporate and investment banking business in Europe, and the bank is tipped to spend $500 million over the next four years doing just that.
Lewis has persistently denied rumours that acquiring a European bank is part of the expansion strategy. Mind you, it wouldn't be the first time that a major US bank has eyed the European market only to find that the cultural and political barriers to cross-border M&A are too cumbersome to pull it off.
Nevertheless, despite the obstacles and Lewis' denials, rumours persist that an acquisition in Europe may be on the cards, and on 29 November at market close, Bank of America's market cap of $243.71 billion inched ahead of Citigroup's $243.52 billion.
It may have the market cap, but unlike Citigroup, Bank of America lacks a truly global footprint, despite its $3 billion acquisition of a 9% stake in China Construction Bank. Lewis reportedly told The Wall Street Journal he didn't "see the strategic imperative of being on the ground in Europe." But according to an industry source I had lunch with the other day, the bank could still be eyeing a potential acquisition in Europe.
The UK banking sector is certainly ripe for consolidation, with potential targets such as Barclays or Lloyds TSB. But perhaps a major Spanish bank like Banco Santander, given its associations with the Latin American market, would make an interesting partner for Bank of America in Europe?
In October, in an effort to strengthen its foothold in the Latin American market, Santander Central Hispano acquired private banking and premier banking assets from Bank of America's wealth management portfolio. According to Latin Counsel.com the transaction involves the "potential transfer" of customer holdings valued at approximately $4 billion from Bank of America to Santander Private Banking. The holdings consist of accounts of residents in Latin American markets such as Mexico, Argentina, Uruguay, Chile, Brazil and Venezuela.
Wednesday, November 29, 2006
Liquid assets
With all the media hoopla (including my own verbal diarrhoea) surrounding the announcement of multilateral trading facilities (MTFs) like Project Turquoise emerging in response to MiFID, it is easy to get carried away with the newness of it all. After all, it gives us hacks something to write about.
'MTF backed by investment banks challenges exchange monopoly' is a headline few hardened hacks would find easy to ignore. But perhaps I have been a little premature in espousing the virtues of these alternative trading venues and the competitive threat they pose to the exchanges.
The reason I say that is because this morning I listened intently as market participants at a breakfast briefing hosted by Interactive Data commented on whether they believed these new execution venues would be successful in attracting liquidity. Liquidity is, after all, the end game, and if these alternative execution venues don't attract the lion's share of it, then they will be remembered as those that tried to topple the 'emperor' but failed in their 'coup' attempt.
"If they can slash costs in a monopoly industry, then they [MTFs] will succeed," says
Dr Paul Lynch, managing partner, PE Lynch, a UK-based algorithmic trading specialist. However, Lynch believes it is unlikely these new platforms will attract 50% of the London Stock Exchange's liquidity within the first three months. It all boils down to whether these MTFs create better market spreads, he says.
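For the technically minded, Lynch's 'better spreads' test is easy to state concretely. Here is a minimal Python sketch (the venue names and quote data are invented, purely for illustration) of the comparison a trader might run:

```python
# Hypothetical best bid/ask for the same stock on two venues.
quotes = {
    "LSE": (10.00, 10.04),
    "MTF": (10.01, 10.03),
}

for venue, (bid, ask) in quotes.items():
    mid = (bid + ask) / 2
    spread_bps = (ask - bid) / mid * 10_000  # quoted spread in basis points
    print(f"{venue}: spread = {ask - bid:.2f} ({spread_bps:.1f} bps)")
```

On these made-up numbers the MTF quotes a 20 bps spread against the incumbent's 40 bps; only if that holds in practice does the order flow follow.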
The recently announced MTF projects are still unknown quantities and only time will tell what impact they will have in terms of fragmenting liquidity within Europe. Jon Carp, head, electronic brokerage and execution sales, Europe, Crédit Agricole Cheuvreux International Ltd, said he had seriously considered whether Cheuvreux's deal flow justified setting up an MTF or whether it should partner with a consortium of investment banks like Project Turquoise. At the end of the day, it is a business decision a number of brokerages must be mulling over with MiFID looming on the horizon.
Nevertheless, Carp believes that if the LSE were to drastically slash costs in the face of heightened competition, that may encourage some sell-side firms to stay put. "If the cost of trading comes down, it will be more attractive for the banks to say they don't have to build Project Turquoise," he says. But now that the investment banks have partially dipped their toes in the water and found the temperature to their liking, will they want to totally submerge themselves in the new competitive landscape that beckons, or will they need to be thrown a life raft?
Arguably, it's a win-win situation for the investment banks regardless of whether Project Turquoise gets off the ground. Even if they don't attract liquidity, one thing they will have succeeded in doing is forcing the exchanges to reduce costs. Costs will inevitably come down. But if the investment banks do succeed, and surely we can expect to see more MTF announcements in the not too distant future, then what impact will all these venues have on already ballooning market data volumes?
According to Octavio Marenzi, CEO, Celent, who chaired the Interactive Data debate, MiFID says post-trade data can be published on web sites as long as it is "machine readable". 'Does that mean that there will be 60 different data sources?' he asked the esteemed panel. Danny Moore, COO, Wombat Financial Software, hinted that there could be real problems with 'symbology' if post-trade data can be published anywhere. "Symbology is a huge issue," he said. "It would be easier if everyone used the same symbology but somebody has to do the conversion. We can't do that as a vendor so it is pushed back onto the clients."
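To make the symbology headache concrete, here is a hedged sketch of the conversion work Moore says gets pushed back onto clients. The venue codes, local symbols and the common identifier below are all invented; the point is simply that every venue-local symbol has to be mapped to a common identifier before post-trade data from dozens of sources can be aggregated.

```python
# Invented mapping table: (venue, venue-local symbol) -> common identifier.
LOCAL_TO_COMMON = {
    ("LSE", "VOD"): "GB0000000001",
    ("MTF_A", "VOD.L"): "GB0000000001",
    ("MTF_B", "VODAFONE"): "GB0000000001",
}

def normalise(venue: str, local_symbol: str) -> str:
    """Translate a venue-local symbol into the common identifier."""
    try:
        return LOCAL_TO_COMMON[(venue, local_symbol)]
    except KeyError:
        raise ValueError(f"no mapping for {local_symbol!r} on {venue}")

# Three trade reports from three venues collapse to one instrument.
for venue, sym in [("LSE", "VOD"), ("MTF_A", "VOD.L"), ("MTF_B", "VODAFONE")]:
    print(venue, sym, "->", normalise(venue, sym))
```

Trivial for three venues; rather less so when, as Marenzi asks, there are 60 data sources, each publishing wherever it likes.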
Monday, November 27, 2006
Looking for Mr Chips
Having commented ad nauseam last week about the spate of new high speed trading execution venues emerging in Europe to challenge the traditional stock exchanges, on Monday evening I found myself seated in front of a panel, which included some of the protagonists involved in the unravelling of Europe's trading landscape post-MiFID (Markets in Financial Instruments Directive).
Representatives from leading investment banks Credit Suisse (one of the seven banks behind the announced pan-European MTF otherwise known as Project Turquoise) and Lehman Brothers, along with the London Stock Exchange, AtosEuronext, Reuters and BT Radianz, had assembled on the top floor of The Gherkin (architect Norman Foster's homage to the pickled vegetable) in London's CBD as part of Intel's Faster City launch to celebrate the release of its Quad-Core Xeon Processor 5300 series.
Intel delivered the Quad-Core Xeon processors earlier than anticipated having recently launched its Dual-Core Xeon Processor. With customers such as investment banks and market data providers requiring even faster processing speeds and computational capabilities, Richard Curran, vice president, European operations, Intel, told attendees that Intel planned to reduce the number of man years it took to launch the next generation of its micro-architecture, which is scheduled for 2008.
Intel was obviously keen to enlighten the assembled investment bankers and exchanges as to how Quad-Core and Dual-Core Xeon Processors could help them reduce latency through faster processing speeds (a 4.5 times performance gain), while also cutting power consumption (from 110W to 80W) and maximising the use of scarce real estate for housing server farms.
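Some back-of-envelope arithmetic on those quoted figures, for what it's worth (the server-farm size is my own assumption, purely for illustration):

```python
# Figures quoted above: 4.5x performance, 110W down to 80W per processor.
perf_gain = 4.5
watts_old, watts_new = 110, 80
servers = 1_000  # hypothetical farm, one processor per server

# Performance per watt compounds the two improvements.
perf_per_watt_gain = perf_gain * watts_old / watts_new
power_saved_kw = servers * (watts_old - watts_new) / 1_000

print(f"performance per watt: {perf_per_watt_gain:.1f}x")   # ~6.2x
print(f"power saved across farm: {power_saved_kw:.0f} kW")  # 30 kW
```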
In an effort perhaps to demonstrate the point, parked outside The Gherkin were a series of four or five scooters trailing Intel billboards that read something like, 'Good things come in small packages'. Perhaps a racing car would have been more appropriate though as the theme of the evening was 'the need for speed'.
Peter Moss, global head, enterprise solutions, Reuters, chipped in that a year ago, when Reuters was benchmarking microprocessors in its labs, AMD chips were faster than Intel's. But recent studies at its Securities Technology Analysis Centre of the Linux version of Reuters' Market Data System, running on an HP server using Dual-Core Intel Xeon processors, found that Intel had the edge.
The evening's host, Nigel Woodward, head, financial services, Intel, led a panel debate about the 'need for speed' amongst investment banks, exchanges and market data providers in the City of London. He joked that he did not want to turn the discussion into a debate on MiFID, but he may as well have, as the list of panellists he had assembled (investment banks, exchanges, market data providers) meant it was difficult to ignore the heightened competition that is rapidly emerging amongst all of them.
Credit Suisse and Lehman Brothers are already competitors, but if they become systematic internalisers under MiFID or band together to form rival execution venues, which at least one of them has done, then they pose a serious competitive threat to the LSE, Deutsche Börse and Euronext, which will also be competing with one another for business under MiFID.
The question is: will Intel Quad-Core Xeon processors be an essential part of each firm's armoury in the new competitive landscape that beckons? Kevin Covington, head, new product development, at global network provider BT Radianz, likened the quest for speed spurred on by the rise of algorithmic trading, which is only likely to increase under MiFID, to an "arms race".
Whilst the issue of latency dominated the debate, the panellists tiptoed around the real implications of faster trading and execution times. Ultimately it is about customers wanting trades to be executed more quickly and cheaply, but the upshot of all that is a new competitive landscape in which the exchanges will be seriously challenged by supposedly higher-speed and cheaper alternative execution venues. Broker-dealers will also have to constantly prove that they are faster and better than the next guy.
PJ DiGiammarino, CEO of JWG-IT, alluded to the scale of change likely to occur under MiFID when he said he expected 2007 - the year of MiFID - to be the most "memorable of our lives". "Costs have got to come down," he said. John Goodie, global head, exchange business unit, AtosEuronext, didn't beat around the bush, saying that exchange consolidation and price wars were definitely on the cards.
Not surprisingly perhaps, the LSE's representative, CTO Robin Paine, played his cards close to his chest when hinting at the new competitive landscape emerging in the form of Project Turquoise. "The ability to continue to innovate and deliver consistency and predictability in terms of latency" are the challenges ahead for the LSE, he said. But surely it is difficult for any 'monopoly' to innovate to the extent that may be required?
One thing perhaps we can be certain of: post-MiFID, don't be surprised if you look under the hood of a trading engine and find Intel Quad-Core Xeon processors ticking over.
When is ESP not ESP?
Whenever a new concept in technology makes it onto the radar screens of analysts and a few forward-thinking companies, vendors tend to want to share in some of the limelight. That is why, for example, after Gartner analysts coined the phrase Enterprise Service Bus (ESB) and it gained significant currency and publicity, even mainstream EAI vendors that initially rejected the ESB concept were champing at the bit to say, 'We've got an ESB offering too.'
It appears that the same thing may be happening in the event stream processing (ESP) space. In my last post I covered ESP, a relatively nascent market, and how it is being used in trading applications, logistics and company supply chains to enable companies to respond and act on real-time streaming data and events.
A word of warning, however: as ESP is a relatively immature market, definitions of what constitutes ESP differ from vendor to vendor. Phil Howard, research director at Bloor Research, defines an event as "an event of some importance." In other words, an event stream processing application is not interested in every event that may occur.
Events can come, for example, from transaction databases, Bloomberg or Reuters market data feeds, or RFID tags on boxes of books. Event processing is also about managing exceptions, such as credit card fraud detection. The next step up from that is complex event processing (CEP), which Howard says is managing 'a set of different exceptions.' "It is easiest to think of ESP as a pipe with water flowing through it and onto that pipe are placed fine mesh grills," Howard explains. "The water flows through those mesh grills, which are not fixed but interchangeable."
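Howard's pipe-and-grills analogy maps quite directly onto code. A minimal sketch, with an invented event shape and invented thresholds: the grills are interchangeable filter predicates, and only events of some importance emerge from the end of the pipe.

```python
from typing import Callable, Iterable, Iterator

Event = dict
Grill = Callable[[Event], bool]

def pipe(events: Iterable[Event], grills: Iterable[Grill]) -> Iterator[Event]:
    """Pass each event through every grill; drop it at the first one it fails."""
    for event in events:
        if all(grill(event) for grill in grills):
            yield event

# Invented tick events.
ticks = [
    {"symbol": "VOD", "price": 1.31, "size": 500},
    {"symbol": "VOD", "price": 1.45, "size": 50_000},  # unusually large
    {"symbol": "BP",  "price": 5.80, "size": 75_000},
]

# Grills are interchangeable: swap in a different list without touching pipe().
grills = [
    lambda e: e["size"] >= 10_000,   # only block-sized trades
    lambda e: e["symbol"] == "VOD",  # only the instrument we watch
]

for event in pipe(ticks, grills):
    print("event of importance:", event)
```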
In its SOA maturity model, Oracle apparently puts CEP at Level 5, indicating that for most companies it is something to consider implementing much later on, if at all.
However, as Howard points out, firms can implement CEP without having to go down the service-oriented architecture route. In algorithmic trading, for example, which uses event stream processing to detect movements in stocks based on pre-configured algorithms, firms have not necessarily implemented an SOA.
Whilst event stream processing is about handling throughput of data, when it comes to complex event processing, Howard says it is all about implementing technology that can handle complex data streams. Traditional relational databases are less favoured in this environment, as the perception is that they fall far short of the requirements for responding to incoming data streams in a timely fashion.
By now you are probably thinking: isn't ESP or CEP just another form of business intelligence? Well, yes, of sorts. According to Howard, event processing incorporates real-time operational business intelligence. However, he adds, some of the core business intelligence software vendors, such as Business Objects, have technology that is not "process aware", which is needed if companies want to build operational business intelligence platforms based on CEP or ESP.
Howard says some of the database vendors are looking to embed more intelligence into their data warehousing offerings. He cites the example of Sybase, which he says is looking to partner with an event processing vendor on the front end so it can offer a complete solution. IBM's WebSphere Front Office for Financial Markets allows companies to combine and filter data feeds, and although it may act as a front end to an event processing engine, according to Bloor Research it is not an event processing solution as such.
Wednesday, November 22, 2006
Dealing with complexity
OK folks, here goes. The next big thing according to those in the know (analysts) is CEP and ESP on an ESB or SOA for real-time business intelligence or BAM. I thought I would try and cram as many three-letter acronyms into one sentence as possible to show how ridiculous analysts' obsession with three-letter acronyms has become.
By now you are probably thinking: here we go again. First it was ESB (enterprise service bus), then SOA (service-oriented architecture), now it's CEP (complex event processing). As one journalist from Information Age quipped recently at a Progress Software press event about event processing (an umbrella term covering CEP and ESP - event stream processing), 'Everyone is bored of SOA, we've heard it all before,' and, by the way, is anyone actually doing it? So is CEP or ESP just another three-letter acronym destined for the same fate?
Well, unless you design trading algorithms or are a logistics company tracking goods throughout the supply chain, you probably have not heard of CEP or ESP, therefore your boredom threshold is unlikely to have been maxed out yet. And as for whether people are actually doing it, the short answer is: very few. According to Mark Palmer, general manager, Apama division, Progress Software, the Event Driven Architecture market is currently worth $30 to $50 million, small fry by software standards.
A lot of these three-letter acronyms tend to be the 'love child' of computer boffins who spend most of their lives in laboratories dreaming up great whizz-bang technologies that rarely find their way into everyday applications. You could say event processing is one of these technologies, having been pioneered by boffins at Stanford and Cambridge universities. However, Phil Howard, research director at Bloor Research, believes that event processing will be widely used, but that adoption will be gradual. After all, event processing has yet to reach the peak of its hype cycle on Gartner's adoption curve before it descends into the 'trough of disillusionment'.
Having said that, Dr Giles Nelson, director of technology, Progress Software and co-founder of Apama, a Cambridge UK startup (bought by Progress) that developed one of the first CEP engines, did a good job of explaining why you may want to think about adopting event processing some time in the not too distant future, particularly if you are a business that needs to make rapid decisions on streaming data (tick prices, for example) that is "constantly changing". According to Nelson, putting the complexities of the technology itself aside, the nuts and bolts of event processing "is about being able to understand information in real time."
This information could be coming from multiple sources both within and outside the company - RFID tags on cases of goods, for example. But Nelson made a clear distinction between ESP and business intelligence software, which tends to be based on historical data and doesn't allow someone to act on that data in real time. Unlike conventional data management models, where data is indexed and stored and then request/response queries are made on it, if a company needs to act on information in real time, Nelson says there is no time to index data. "That is why SQL is unsuitable for this," he adds. Vendors like Progress Software also tend to favour object-oriented databases over relational ones for ESP.
In a nutshell, says Nelson, ESP is about storing queries and then flowing data (both historical and real-time) across them. Great, you say, but what would I use it for? Well, it has long been used by algorithmic traders who want to test out VWAP and other trading strategies on real-time and historical data. It could also find application under regulations such as MiFID in Europe and RegNMS in the US, where the emphasis is on achieving best price for clients and smart order routing to the cheapest execution venue. As best price is a constantly changing variable, according to Progress it is suited to ESP.
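For illustration, here is a toy version of that inversion using the VWAP example. The tick data and alert threshold are invented, and a real ESP engine would express the standing query in its own query language rather than a hand-rolled class, but the shape of the idea is the same: register the query once, then flow every incoming tick across it, with no store-and-index step.

```python
class StandingVwapQuery:
    """Registered once; every incoming tick updates it - no indexing step."""

    def __init__(self, symbol: str, alert_above: float):
        self.symbol = symbol
        self.alert_above = alert_above  # e.g. 1.02 = alert 2% above VWAP
        self.pv = 0.0   # cumulative price * volume
        self.vol = 0.0  # cumulative volume

    def on_tick(self, symbol: str, price: float, volume: float):
        if symbol != self.symbol:
            return
        self.pv += price * volume
        self.vol += volume
        vwap = self.pv / self.vol  # updated incrementally on each tick
        if price > self.alert_above * vwap:
            print(f"{symbol}: {price} is more than "
                  f"{self.alert_above:.0%} of VWAP {vwap:.4f}")

# Invented tick stream: the third tick trips the alert.
query = StandingVwapQuery("VOD", alert_above=1.02)
for tick in [("VOD", 1.30, 1000), ("VOD", 1.31, 2000), ("VOD", 1.35, 500)]:
    query.on_tick(*tick)
```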
Big Brother is watching
When quizzed at Sibos as to whether he would have done things differently regarding SWIFT allowing US intelligence agencies to monitor traffic on its network, SWIFT CEO Leonard Schrank said he never expected the Brussels-based banking co-operative to end up on the front page of major newspapers.
The bank-owned financial messaging network has tended to avoid mainstream publicity and gone quietly about its business without the average Joe on the street knowing or caring what really goes on in its La Hulpe headquarters. Not for much longer, it seems: the Wall Street Journal carried a report on Tuesday that the EU was likely to concur with the Belgian Privacy Commission's ruling that SWIFT violated European privacy laws when it allowed intelligence agencies to monitor transactions on its network.
SWIFT hopes that the global community can agree on a set of data privacy standards to help organisations like itself in this situation. In its legal rebuttal to the Belgian Privacy Commission's ruling, SWIFT argues that "the boundary between security and data privacy must be defined by governments."
It did not take too kindly to the Privacy Commission's finding that SWIFT had "committed a serious error of judgement". SWIFT's argument is that as it simply transmits financial messages on behalf of financial institutions based on their instructions, and does not access the data in those messages, it is merely a "data processor" rather than a "data controller", and as such it has complied with Belgian privacy law.
The question, though, is: should SWIFT be granting US intelligence agents access to the data in those messages without the permission of the banks sending them? Financial-i carried a report in its last issue saying that whilst SWIFT had alerted the G10 banks to its decision to allow US intelligence agents access to the messages, it had not informed its member banks. Obviously, with 7,000 member banks, informing all of them would be an onerous task.
But surely the major banks with the most traffic on SWIFT deserved to be informed? After all, SWIFT prides itself on the security of its network, and banks using that network therefore assume that the messages they transmit on it are not going to be seen or tampered with by unauthorised parties.
SWIFT is correct in saying that a global data privacy framework needs to be formulated so that inconsistencies of interpretation - where one country says it is OK to monitor transactions on a private network and another says it is not - do not arise again. However, by the same token, SWIFT perhaps also needs to address its own internal governance in terms of letting its member banks know what it is doing, regardless of whether the laws of a particular country in which it operates force it to allow access to the transactions it carries.
Much like a packet of cigarettes carries a warning about the health risks, perhaps financial messaging networks should come with a warning that transactions transmitted on their networks may be monitored for intelligence and surveillance purposes. After all, isn't this the Big Brother era we live in?
Monday, November 20, 2006
Train wreck ahead chapter 2
At the Sibos conference in Sydney, Karen Cone, CEO of TowerGroup, described the lack of automation in the derivatives space as a "train wreck waiting to happen".
Analysts like to use colourful language when describing processing inefficiencies, and regulators don't mince their words when calling on banks to automate, or else.
The reason for all the concern? Derivatives volumes are growing at a 'meteoric' (I thought I would try to be as colourful as the analysts) rate, with business in credit default swaps increasing by 52% in the first six months of this year to $26 trillion, according to ISDA figures.
When any business that is complex and not automated grows at such a rate, the regulators start biting their fingernails, and it is then left to the industry to work out a way to help the regulators sleep at night.
Such is the case with OTC derivatives. However, the conclusions drawn from a recent roundtable event, sponsored by enterprise content management provider Interwoven and comprising representatives from buy-side firms, suggest that regulators' blanket approach to automation may be ignoring some of the finer points and idiosyncrasies that make OTC derivatives 'unique'.
The roundtable concluded that the proliferation of initiatives to automate OTC derivatives had failed to consider the systemic risk implications. In other words, as buy-side firms are not well acquainted with the "structuring" of OTC derivatives, automating them may lull firms into a false sense of security about their level of risk exposure.
"The discussion on automation has to have a huge caveat against it because there is systemic risk in automating a product [firms] do not truly understand," said Jos Stoop, general manager, financial services solutions, Interwoven. The roundtable stressed that OTC derivatives are "unique, bespoke products" and that automation may seek to standardise where no ‘standard’ exists.
Despite efforts to tackle the backlog of confirmations for "vanilla" credit derivatives, roundtable participants indicated that there may be a "two year time lag" between new products being developed and participants agreeing on processes and standards for automating confirmations.
Some indication of the complexities associated with automating derivatives was given by Bill Stenning, vice president, business development, of the DTCC's Deriv/SERV. Whilst the DTCC has been successful in deploying master confirmations to cover credit default swaps, he said that it was difficult to produce standards for the more structured end of the market, which was always one step ahead of standard setters.
According to Stoop, automating derivatives confirmations is not just about standardising data but standardising communication and then leaving it to counterparties to agree on what the data should mean.
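Put crudely, the matching side of the problem reduces to comparing each counterparty's view of the same trade, field by field. A deliberately simplified sketch (the field set below is invented and far cruder than FpML or Deriv/SERV's actual message formats):

```python
# Invented, simplified economic terms of a CDS confirmation.
ECONOMIC_FIELDS = ["reference_entity", "notional", "spread_bps", "maturity"]

def match_confirmations(ours: dict, theirs: dict) -> list:
    """Return the economic fields on which the two confirmations disagree."""
    return [f for f in ECONOMIC_FIELDS if ours.get(f) != theirs.get(f)]

ours   = {"reference_entity": "ACME Corp", "notional": 10_000_000,
          "spread_bps": 85, "maturity": "2011-12-20"}
theirs = {"reference_entity": "ACME Corp", "notional": 10_000_000,
          "spread_bps": 90, "maturity": "2011-12-20"}

breaks = match_confirmations(ours, theirs)
print("matched" if not breaks else f"breaks on: {breaks}")  # breaks on spread
```

The standardisation problem Stoop describes is everything this sketch assumes away: agreeing what the fields are, and what the values in them mean, for products that are bespoke by design.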
LSE does it again
Just the other day I remarked that in the ongoing exchange consolidation wars, the London Stock Exchange and Deutsche Börse stood out like pimples on a pumpkin in failing to achieve greater economies of scale.
Deutsche Börse's failure is not for lack of trying; it has courted the LSE on more than one occasion, and then Euronext, but was rebuffed each time. While Deutsche Börse plays the rebuffed suitor, the LSE seems comfortable in the role of 'rebuffer', having just rejected Nasdaq's bid for the second time, according to FT reports.
After the failure of its first bid, which valued the LSE at 950 pence a share, Nasdaq CEO Robert Greifeld tried his luck with a second-round offer of 1,243 pence a share, which he described as a "full and fair offer." Well, try telling that to Clara Furse, CEO of the LSE, who appears to be hedging her bets: either she is happy for the exchange to go it alone in the ongoing exchange consolidation battle, or she is waiting for a better offer.
Octavio Marenzi, CEO, Celent, says that the LSE has staked out a fiercely independent path and its recent outstanding earnings mean that Furse feels under no pressure to be bought out by anyone. "Firms making bids for the London Stock Exchange continue to show rather odd behaviour," says Marenzi. "Most notably, there is the tendency to make offers below the current market value of the exchange. Nasdaq appears to be attempting this again, with predictable consequences -- either the bid must be raised or it will fail."
Will Nasdaq retreat from its current bid like Deutsche Börse has from its attempts to court Euronext? It is anyone's guess. However, despite healthy earnings at the LSE, can it afford to go it alone for much longer in view of recent announcements by Equiduct that it will establish a pan-European exchange and that seven investment banks will build an MTF?
As firms gear up for MiFID, a somewhat different trading landscape awaits the national exchanges, which have enjoyed monopoly status. But as rival "high speed" execution venues emerge offering lower trading tariffs and cheaper post-trade reporting services than the LSE, can Furse and the LSE's shareholders sustain their go-it-alone stance?
Wednesday, November 15, 2006
Equiduct responds to MTF announcement
Below I have pasted the response of Bob Fuller, CEO of Equiduct, to the announcement by seven leading investment banks that they will launch an electronic trading platform to rival the leading European exchanges.
Unlike the consortium of investment banks, which will build a multilateral trading facility, or MTF, Equiduct, which will also launch in 2007, aims to provide a pan-European platform for trading liquid shares post-MiFID. Although the announcement by the consortium of Tier 1 investment banks signals that they will build their own solution, Fuller anticipates that there will be enough appetite for Equiduct's pan-European platform amongst Tier 2 and Tier 3 investment banks that do not want to shoulder the upfront investment costs associated with MiFID compliance.
"Today's announcement that a group of seven leading investment banks will confirm detailed plans to launch a trading rival to the London Stock Exchange is exciting news, and confirms the growing market requirement for alternative solutions to help meet the challenge of MiFID implementation within Europe," says Fuller.
"At Equiduct we believe that their announcement validates our analysis on the need for MiFID-compliant, Europe-wide trading. However it's interesting that they've decided to go down the MTF route, rather than opting as Equiduct has done to set up as a fully regulated pan-European electronic platform.
"This, coupled with our open access to clearing and settlement providers, gives us a distinctive proposition. We'll also be interested to see how the European marketplace relates to an offering owned by a small number of Tier-1 banks.
"With Equiduct we'll be primarily focusing on the significant number of Tier-2 and Tier-3 investment banks that can't afford to build their own solutions, however we also expect that we'll be continuing our discussions with Tier-1 investment banks including many of those currently involved in other projects as we're finding that our platform brings solutions regarding MiFID best execution that cannot easily be replicated.
Investment banks take on Europe's exchanges
The onset of MiFID in Europe's capital markets and the indecision of Europe's stock exchanges over whom they should merge with appears to be fuelling discontent amongst the world's largest investment banks.
Tired of waiting for consolidation talks between the big three exchanges - the LSE, Euronext and Deutsche Börse - to bear fruit, leading investment banks such as Goldman Sachs, UBS, Merrill Lynch and Morgan Stanley have taken matters into their own hands, announcing the formation of a new "high speed" electronic trading system for European shares, which will launch in 2007. The platform will offer trading at tariffs lower than those provided by Europe's leading exchanges.
You don't need to be a rocket scientist to have seen this one coming, although that doesn't seem to have prevented the exchanges from dragging their feet. Many analysts predicted that MiFID, which removes the 'concentration rule' that forced trades in a number of countries to be conducted on national exchanges, would result in alternative trading venues being established by third parties, including investment banks.
Just recently, EASDAQ rose phoenix-like from the ashes with the announcement of a new pan-European exchange, Equiduct, also launching next year and leveraging EASDAQ technology, which will provide a single point of connectivity for trading "liquid shares".
Surely the LSE, Deutsche Börse and Europe's other myriad exchanges can no longer avoid the inevitable: consolidation and tariff reductions. The question is: can Europe afford to support all the national exchanges plus the new trading facilities that are likely to emerge post-MiFID? And what does this mean in terms of fragmenting liquidity? Presumably, trade flows will eventually gravitate towards those platforms that are not only cheaper but faster, and that can provide best execution and a whole raft of services around pre- and post-trade transparency.
But for now, poor old Deutsche Börse, which can't seem to find anyone that wants to merge with it, the LSE, and, to a lesser extent, Euronext (which looks like it will accept the NYSE's bid) are out in the cold in terms of finding suitable bedfellows to shack up with to help deliver greater economies of scale.
Tuesday, November 14, 2006
IBM in bid for Chinese bank
Bloomberg carried an interesting tidbit of news yesterday regarding rumours that IBM would join Citigroup's $3 billion bid for China's Guangdong Development Bank.
According to the Bloomberg report, two unnamed sources said IBM may take a 5% stake in the Guangdong bank, which would help it in its efforts to win a contract from the Chinese bank to upgrade its IT systems. If the deal goes ahead, it will certainly signal the start of an interesting trend in IT services delivery, particularly in the fast growing Chinese market, where a number of major IT services providers are eager to win new business.
It does raise a number of questions, though. If IBM were to gain a 5% stake in Guangdong Development Bank, does that mean it would have the right of first refusal over any IT work that needs to be done within the bank? And what would this mean for other IT services providers that may also want to bid for business with the bank?
Admittedly, 5% is not a controlling stake, and arguably a company of IBM's scale, with a strong IT services and software business devoted to the financial services sector, would have a lot to offer a Chinese bank looking to upgrade its IT systems. But the real question is: if Big Blue buys a stake in a bank it wants to do business with, does that give it an unfair advantage over other IT services providers seeking to win business from the same bank?
In a separate development, the Bloomberg report went on to talk about allegations that IBM paid $225,000 to a sales agent who is accused of bribing the former chairman of China Construction Bank, the country's fourth largest bank. According to FT reports, Zhang Enzhao, former chairman of China Construction Bank, was jailed for 15 years last week after being convicted of taking bribes in return for granting information technology contracts.
Monday, November 13, 2006
Claims and counterclaims in VoIP space
The tendency for some software and hardware providers to claim that they are first with something always sets alarm bells ringing in my head.
Recently we published an article in financial-i magazine on the use of Voice over IP in financial institutions' business continuity strategies. Obviously miffed that they were not mentioned in the piece, IPC Systems, a leading provider of VoIP turrets to trading floors, got in contact saying they were first in this area and were disappointed not to be mentioned.
Financial-i has covered the VoIP market since 2000. However, the way vendors evangelise about the very technologies they are promoting can sometimes be misleading. VoIP on the trading floor is gaining wider acceptance amongst financial institutions, and there is no doubt that for those firms wanting to set up a robust communications infrastructure incorporating remote and mobile trading facilities, VoIP ticks a lot of the boxes.
My understanding of IPC Systems is that it offers an IP-only solution, having installed IP at 40,000 desktops. The flip side, however, is that not all companies want a pure IP solution. Some want to maintain their legacy TDM (Time Division Multiplexing) applications and migrate gradually to IP. There are also concerns about the voice quality of IP networks over longer distances, particularly when you start talking about cross-border installations. Taking all of this into consideration, is an IP-only approach to VoIP on the trading floor the best approach?
Some of IPC Systems' competitors have told me that whilst its IP-only approach may work in the US, it does not necessarily fly on this side of the Atlantic. Despite the merits of VoIP technology, adoption will not happen overnight. Although I have heard of some companies moving totally to IP, a hybrid approach combining TDM and IP appears to work best for others.
Also, there seems to be some confusion around who was first to offer a combined TDM and IP solution. Last year Etrali (now called Orange Business Services Trading Solutions) announced that it would offer traders a choice of both TDM and IP. BT Trading Systems made a similar announcement. So what is the difference between the two?
Well, according to Orange Business Services, what sets its offering apart from BT's is that it provides both TDM and IP as standard within the turret, meaning nothing has to be changed when a company decides to migrate from TDM to IP.
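To illustrate why 'both as standard within the turret' matters for migration, here is a minimal Python sketch of the underlying idea: the dealerboard codes against one voice-transport interface, so swapping TDM for IP becomes a configuration change rather than a rewrite. All class and method names here are hypothetical, purely for illustration.

```python
# Minimal sketch of the migration argument: if the turret exposes one
# call interface over interchangeable transports, moving a desk from
# TDM to IP is a configuration change, not a rewrite. All names are
# hypothetical, for illustration only.

from abc import ABC, abstractmethod

class VoiceTransport(ABC):
    @abstractmethod
    def dial(self, line: str) -> str: ...

class TDMTransport(VoiceTransport):
    def dial(self, line: str) -> str:
        return f"TDM circuit opened to {line}"

class IPTransport(VoiceTransport):
    def dial(self, line: str) -> str:
        return f"SIP session established with {line}"

class Turret:
    """The dealerboard only ever sees the VoiceTransport interface."""
    def __init__(self, transport: VoiceTransport):
        self.transport = transport

    def call(self, line: str) -> None:
        print(self.transport.dial(line))

# Migrating a desk is a one-line change in configuration:
Turret(TDMTransport()).call("FX desk, Frankfurt")
Turret(IPTransport()).call("FX desk, Frankfurt")
```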
As the hype around VoIP reaches fever pitch, it is all too easy to prescribe it as a cure-all for a company's connectivity and communication needs. But maybe we need to take a step back and read the fine print on what the technology can and cannot do, and what vendors in this space are actually offering, before we all clamber aboard the bandwagon.
Wednesday, November 08, 2006
Hats off to Red Hat
What a month it has been for Linux and the open source software movement. Software companies that were slow to heed the open source mantra (providing open access to an application's source code so that other developers can change and improve it) are now embracing Linux with new-found religious zeal.
At Oracle OpenWorld 2006 recently, CEO Larry Ellison, intent on world domination in the enterprise application software space, announced enterprise-level support for Linux in the form of Oracle's Enterprise Linux Program. Some commentators billed the announcement as Ellison living up to his 'pirate of Silicon Valley' reputation. For more on this visit Network World.
Oracle's announcement has re-ignited the debate around patent infringement in open source software development. In 2003, SCO Group slapped a $3 billion lawsuit on IBM alleging that Big Blue had illegally copied SCO's proprietary UNIX code into its Linux operating system. In order to head off any patent infringement allegations from the major software vendors, Red Hat told its customers it would rewrite code found to violate another's intellectual property.
Late last week Microsoft, a long-standing critic of the open source software movement, got into bed with Novell, which, following its acquisitions of Ximian and SUSE, shifted its allegiance from the Unix to the Linux camp. In an effort to preserve the dominance of its Windows servers and operating system, Microsoft has long resisted the allure of the open source movement.
But with Ellison upping the stakes in terms of Oracle's enterprise application support for Linux, Microsoft obviously twigged that it could not afford to maintain its nonchalant stance towards Linux for much longer, so it jumped into bed with one of Linux's biggest proponents, Novell.
Microsoft's business and strategic partnership with Novell is meant to help solve integration issues for customers running mixed Windows and Linux environments. These customers are likely to benefit most from the announcement, which addresses some of the key interoperability challenges that have prevented Linux from gaining wider market traction. The Oracle and Microsoft announcements should also mean wider availability of products and enterprise applications with Linux embedded in them.
But what will the true proponents of Linux, those who have always believed software should be by the people, for the people, make of Microsoft's sudden about-face and Ellison's efforts to hijack a technology that truly innovative software companies like Red Hat helped pioneer?
While no software company should be exempt from competition, does Microsoft and Oracle's tightening grip on Linux challenge the true essence of the open source software movement: open access to code, free redistribution of software, and the redistribution of any modifications under the same conditions as the original software licence? For more information on the core principles of open source software click here.
Unfortunately, true software innovators like Red Hat are likely to be overshadowed or swallowed up by their far bigger and wealthier competitors: the very companies that first shunned open source developers as 'pond life' on the outer edges of the software development community.
I wonder what Linus Torvalds, the father of the open source software movement, makes of all this. Would he see it as a boon for Linux or a retrograde step?
The illustration above first appeared in Le Virus Informatique hors serie numero 01. For more humorous illustrations on Linux and Microsoft, click here.
Tuesday, November 07, 2006
The 'no frills' network provider
Last night I attended a function for luminaries of London's banking and financial services industry. Surrounded by bankers and technologists, I was asked by one attendee, 'Isn't this a little dry for you?' One doesn't make a habit of attending after-hours functions frequented by City bankers talking shop.
On this occasion, however, I was motivated by more than the prospect of some banker letting slip a tidbit of information that might give me something to share with you on the blog. The main reason for attending was to hear SWIFT CEO Leonard Schrank speak in less formal surroundings than the Sibos conference (yes, I know I am a glutton for punishment, as if I hadn't heard enough SWIFT speak at Sibos in Sydney). Away from the rehearsed script and glitzy video presentations, I thought Mr Schrank might make an off-the-cuff remark that would have his PR people tearing their hair out.
I have to say I was disappointed. Lenny, as he prefers to be called in such settings, delivered a précis of the opening plenary speech he made at Sibos in Sydney, interspersed with a few personal anecdotes. The gist was that SWIFT's future growth strategy hinges on conquering the BRIC countries, standardising derivatives contracts and enhancing corporate access to its network.
We had heard it all before, though Mr Schrank admitted that his own personal knowledge of the derivatives industry was limited (it appears some people would not know a credit default swap if it jumped up and bit them on the nose). But given its successful track record in standards development, SWIFT sees no reason why it cannot standardise a derivatives contract.
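To make the standardisation point concrete, here is a minimal Python sketch of what a machine-readable derivatives confirmation can look like. The element names loosely echo FpML, the derivatives mark-up standard mentioned later in this post, but they do not follow the real FpML schema; the trade details are invented and everything here is illustrative.

```python
# Illustrative only: a toy, FpML-flavoured confirmation for a credit
# default swap. Element names loosely echo FpML but do NOT follow the
# real schema; the point is that a standardised contract is just a
# structured message any counterparty's systems can parse.

import xml.etree.ElementTree as ET

trade = ET.Element("creditDefaultSwap")
ET.SubElement(trade, "tradeDate").text = "2006-11-07"
ET.SubElement(trade, "referenceEntity").text = "Example Corp plc"  # hypothetical
ET.SubElement(trade, "notional", currency="USD").text = "10000000"
ET.SubElement(trade, "fixedRate").text = "0.0125"  # 125 bps per annum
ET.SubElement(trade, "scheduledTerminationDate").text = "2011-12-20"

print(ET.tostring(trade, encoding="unicode"))
```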
One question from the audience was whether SWIFT would take the next step of allowing interoperability between corporates without them having to join a Closed User Group (CUG). It seems, however, that SWIFT is focused only on the top 2,000 corporates, the GEs, Microsofts and IBMs of the world that are multi-banked and can afford to participate in CUGs. SMEs do not figure in its 2010 growth strategy.
Interestingly, at most banking events these days, the name PayPal rears its head. Bankers like to punish themselves by asking, 'Why didn't we think of that?' Well, you didn't think of it, so get over it and move on, or take a leaf out of SWIFT's book. Apparently SWIFT executives do not lose sleep at night worrying whether PayPal is going to be the next big online threat to their business.
But it does raise an interesting question: where is SWIFT's next biggest threat going to come from? Schrank did allude to the card providers such as Visa, but Visa is owned by the banks, and as we know banks are not known for moving quickly or being innovative. Some of the dinner guests also reminded me that SWIFT is a monopoly and, despite the annual rebate carrot it likes to wave in front of its members, expensive compared to other IP networks.
A charming banker from India sitting next to me said he did not think SWIFT would be successful in expanding its network into emerging markets as most of India's domestic banks could not afford to connect to SWIFT. Furthermore, he said, India already had its own TCP/IP network, INFINET (Indian Financial Network), which is used for messaging, electronic debit and clearing, online processing, trade in government securities, centralised funds and inter-branch reconciliation.
I asked another gentleman at my dinner table why, given the proliferation of bandwidth and the ubiquity of the internet, there were no serious contenders to SWIFT. He replied that there would be, comparing it to what happened in the airline industry when low cost 'no frills' carriers emerged to challenge the hegemony of airlines like British Airways.
But there are already IP network providers that could challenge SWIFT's dominance as the major financial messaging network. Looking at the intelligence Cisco Systems is embedding in its networks, anything is possible. So why are the likes of Cisco, BT Radianz and Savvis not providing a serious alternative to SWIFT? Part of the reason appears to be security. As SWIFT likes to remind members tempted to jump ship to another network, in more than 30 years of operation no one has seriously cracked its multiple layers of authentication.
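For readers wondering what a 'layer of authentication' looks like in practice on a financial messaging network, below is a minimal, generic Python sketch of one such layer: a keyed MAC over a payment instruction, proving both origin and integrity. This is emphatically not SWIFT's actual scheme, which layers PKI, hardware security and more on top; the key and message here are invented.

```python
# One illustrative layer of message authentication: a keyed MAC proves
# a payment instruction came from a holder of the shared key and was
# not altered in transit. Generic sketch; NOT SWIFT's actual scheme.

import hmac
import hashlib

SHARED_KEY = b"demo-key-exchanged-out-of-band"  # hypothetical

def sign(message: bytes) -> str:
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels in the comparison
    return hmac.compare_digest(sign(message), tag)

instruction = b"PAY USD 1,000,000 TO ACCT 123 VALUE 2006-11-07"
tag = sign(instruction)

print(verify(instruction, tag))                       # True
print(verify(b"PAY USD 9,000,000 TO ACCT 666", tag))  # False: tampered
```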
For now SWIFT can continue to rely on the robustness of its network. However, as a leading figure from UK bank Barclays pointed out, at some point the Belgian banking co-op may need to address splitting the standards it carries on its network from its governance. That however is unlikely to happen any time soon.
Although SWIFT may be happy to carry other standards on its network, FpML for example (one audience member remarked it had no choice), Schrank does not believe its future lies in merely providing the bandwidth or the pipeline that carries financial messages. With Schrank scheduled to retire next year, one wonders what his successor will make of all this. Will it be more of the same from SWIFT or a fresh approach?
Friday, November 03, 2006
EASDAQ makes a comeback
This is an important postscript to my rant earlier this week on the emergence of a new pan-European exchange, Equiduct.
Equiduct is a new offering spearheaded by Bob Fuller, formerly of Dresdner and the Joint Working Group on the IT implications of MiFID. Due to go live in the second quarter of next year, the Belgian-regulated electronic platform will provide pre- and post-trade services in accordance with MiFID guidelines. It aims to be a single, Europe-wide point of connectivity for investment banks that operate as 'systematic internalisers', and for small exchanges or banks that want to establish their own MiFID-compliant trading facilities without making the upfront investment.
In my earlier post I mentioned that Equiduct will be based on the technology platform used by NASDAQ Europe, formerly EASDAQ. Last year EASDAQ sold the rights to its trading platform to NASDAQ in North America. Now it appears it is rising from the ashes in the guise of Equiduct.
This comment from Dr Jos B. Peeters, president of EASDAQ, suggests that Equiduct is EASDAQ's latest attempt at reviving its pan-European platform (albeit an upgraded version) in response to MiFID. "We are delighted that Bob has accepted to spearhead our Equiduct initiative," said Dr Peeters. "He brings a vast experience with trading infrastructure and the implications of MiFID to the table."
Thursday, November 02, 2006
I'll have some intelligence with my data
For many years (some may argue it still is) banking was all about products: locking the customer in for life to a particular bank's 'you beaut' online cash management application or securities servicing offering.
Ultimately, however, customers have come to realise that as the business of banking (moving money and securities around) has become commoditised, a lot of these 'you beaut' banking products start to look the same: they provide similar levels of functionality, albeit packaged somewhat differently.
The banks, however, have been a lot slower than their customers to cotton on to this. In fact, a constant gripe of corporate customers is that banks continue to push proprietary solutions at them.
The real value for the banks is not in their products, though they are unlikely to admit that. It is difficult to picture a candid banker telling a corporate customer, 'Well, actually our products are mediocre, but our mining of customer data, and how we leverage it to provide you with a better level of service, is amazing.'
I would maintain (that is perhaps why I am not a banker) that more banks should be having conversations like this with their customers. Banks go on a lot about how much data they collect on customers and their transaction histories. But few banks are leveraging this data in any meaningful or value-added way.
There are many reasons for this. Legacy technology investments mean a lot of this data is stored in silos, in departments that do not talk to one another. It is the good old 'integrate your silos' argument. However, the banks will not be able to use that excuse for much longer as the hype and promise around service-oriented architectures, data mining and customer intelligence becomes reality.
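As a toy illustration of what 'integrating the silos' amounts to, the Python sketch below joins two hypothetical departmental extracts on a shared customer identifier and derives a trivial insight neither silo could produce alone. All field names and figures are invented.

```python
# Toy illustration of the silo problem: payments and FX sit in separate
# systems, and value only emerges once they are joined on a common
# customer key. All field names and figures are invented.

payments_silo = [
    {"customer_id": "C001", "monthly_wires": 42, "avg_wire_usd": 1.2e6},
    {"customer_id": "C002", "monthly_wires": 3,  "avg_wire_usd": 8.0e3},
]
fx_silo = [
    {"customer_id": "C001", "fx_volume_usd": 9.5e7, "pairs": ["EURUSD", "USDJPY"]},
    {"customer_id": "C002", "fx_volume_usd": 1.1e5, "pairs": ["GBPUSD"]},
]

# Join the two departmental views into one customer picture.
fx_by_id = {row["customer_id"]: row for row in fx_silo}
for p in payments_silo:
    combined = {**p, **fx_by_id.get(p["customer_id"], {})}
    # A trivial 'insight' no single silo could produce on its own:
    heavy = combined["monthly_wires"] > 10 and combined["fx_volume_usd"] > 1e6
    print(combined["customer_id"], "cross-sell FX hedging?", heavy)
```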
Let's be honest though: whilst banks are large users of technology, and large portions of their businesses are wholly reliant on it for everyday operations, they have not fully grasped the real potential of intelligent, media-rich IP-based technologies.
The analyst community predicts that rich media applications such as video, delivered over broadband connectivity, will add a new dimension to data aggregation and management.
To some extent online sites like Amazon have given us a taster of the potential for gathering customer intelligence via the web and then regurgitating it back at us in the form of personalised information and book or DVD selections based on our buying history. Increasingly, the internet is becoming an experience that is not only interactive but tailored to our specific tastes and interests.
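For the curious, the Amazon-style trick can be boiled down to something surprisingly simple: count which items are bought together, then recommend the most frequent companions. Below is a minimal 'customers who bought X also bought Y' sketch in Python; the purchase histories are invented, and real recommender systems are far more sophisticated.

```python
# Minimal 'customers who bought X also bought Y' sketch: count how often
# items co-occur across purchase histories, then recommend the items
# most often bought alongside a given one. Histories are invented.

from collections import Counter
from itertools import combinations

histories = [
    {"fx_guide", "treasury_handbook", "mifid_primer"},
    {"fx_guide", "treasury_handbook"},
    {"mifid_primer", "basel_notes"},
    {"fx_guide", "basel_notes", "treasury_handbook"},
]

co_counts: dict[str, Counter] = {}
for basket in histories:
    for a, b in combinations(sorted(basket), 2):
        co_counts.setdefault(a, Counter())[b] += 1
        co_counts.setdefault(b, Counter())[a] += 1

def recommend(item: str, n: int = 2) -> list[str]:
    """Items most frequently bought alongside `item`."""
    return [other for other, _ in co_counts.get(item, Counter()).most_common(n)]

print(recommend("fx_guide"))  # top co-purchases, led by 'treasury_handbook'
```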
Why can't banking be like this too? I am not talking about simplistic banking applications that merely customise data on a corporate treasurer's most recent transactions or their global liquidity position. Some banks may argue it is difficult even to drill down into legacy systems and provide that level of information.
However, the banking experience in general is nowhere near as interactive or engaging as it could be. It doesn't really leverage the intelligence banks gather on their customers to provide an experience that is not only richer but tailor-made to a particular customer's needs, whether that is buying FX, selling securities, or transferring money halfway round the world.
Isn't it time banks actually started doing something intelligent with all the information they gather on their customers, customised down to the individual or group level, rather than trying to flog them the latest one-size-fits-all banking application?