
Market Liquidity Analysis (MLA) Video June 12, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing.

I added a video of the Aleri MLA tool on Lab49’s website here.

CEP Partners @ SIFMA June 12, 2008

Posted by newyorkscot in Complex Event Processing.

I spent some time with our CEP friends at SIFMA this week – lots of new stuff going on…

Aleri – their booth was BUZZING over the last few days, mainly due to the very cool (Lab49-built) user interface for the Market Liquidity Analysis (MLA) tool. I spoke to a lot of partners, reporters and client prospects about this, almost all of whom were really excited and impressed by the offering. Aleri also announced a number of feature upgrades.

Apama – announced version 4.0, which vastly improves latency, plus some new offerings, including a deal with NYSE Euronext’s Advanced Trading Solutions group to provide their algo platform as a hosted offering. They also announced a port to Solaris 10 and a new “real-time pricing accelerator”. Just as important, Apama had an open bar at the conference – those beers were needed late in the afternoon.

Coral8 – announced a packaged offering with Wombat (i.e. NYSE Euronext ATS), and also picked up a nice client announcement in the form of market surveillance for LiquidNet.

Oracle – spoke to the folks at Oracle about their new combined Oracle-BEA offering that incorporates the Oracle Continuous Query Language (CQL), the BEA Weblogic Event Server Java container, Coherence for distributed caching, plus some new graphical improvements on the IDE front (finally!) and a High Availability (HA) story. Clearly an enterprise/scalability play.

Streambase – announced their new 6.0 version with an improved development & testing environment that now includes the ability to debug CEP code through breakpoints, plus support for unit testing (we got a demo and it is very cool). This should really help with those nasty synchronization timing problems we have seen when presenting market data and complex derived data simultaneously to the user. Streambase are also narrowing their industry focus down to the FS and Federal verticals, hiring more domain experts and refreshing their executive team. Good news all around.
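To make the unit-testing point concrete, here is a minimal sketch of the idea in plain Python (the operator class and event shapes are hypothetical illustrations, not the StreamBase API): with a fixed, replayable input stream, the values a derived-data operator emits are deterministic, so timing-sensitive logic can be pinned down by ordinary assertions.

```python
# Illustrative only: a toy rolling-average "operator" and a unit test for it,
# sketching the kind of deterministic testing a CEP debugger/test harness enables.
# The class and event shapes here are hypothetical, not StreamBase APIs.
from collections import deque

class MovingAverageOperator:
    """Emits the rolling average of the last `window` prices it receives."""
    def __init__(self, window):
        self.prices = deque(maxlen=window)

    def on_event(self, price):
        self.prices.append(price)
        return sum(self.prices) / len(self.prices)

def test_moving_average_is_deterministic():
    op = MovingAverageOperator(window=3)
    outputs = [op.on_event(p) for p in [10.0, 20.0, 30.0, 40.0]]
    # With a fixed, replayable input stream the derived values are exact,
    # which is what makes synchronization bugs reproducible in a test.
    assert outputs == [10.0, 15.0, 20.0, 30.0]

test_moving_average_is_deterministic()
print("ok")
```

The same pattern scales up: replay a recorded market-data stream into the operator network and assert on the derived outputs, instead of eyeballing them in a live UI.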

Lab49 CEP Podcast – Part V April 15, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.
4 comments

The fifth installment of the CEP podcast series has been posted here and on the Lab49 website. This is the third and final part of the interview I did with Daniel Chait of Lab49 and Brad Bailey of the Aite Group.

All of these CEP podcasts and other audio/video sessions we have recorded can be found on the Lab49 website at: http://www.lab49.com/insidethelab/audiovideo

Lab49 CEP Podcast – Part IV April 1, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.

The fourth installment of the CEP Podcast series has been posted here. This podcast is the second part of the interview I conducted with Brad Bailey of the Aite Group and Daniel Chait of Lab49. In this part we are continuing to discuss the challenges facing capital markets firms and how CEP provides an opportunity to manage these market challenges in new ways.

All of these CEP episodes, and other podcasts we have done, can be found at http://www.lab49.com/insidethelab/audiovideo

Lab49 CEP Podcast – Part III March 7, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.

The third installment of the CEP Podcast series has been posted here. This podcast is the first part of the interview I conducted with Brad Bailey of the Aite Group and Daniel Chait of Lab49. In the interview we discussed many of the market, client and technology trends that are driving the CEP marketplace. The second and third parts of the interview will be posted in the next week or two.

Lab49 CEP Podcast – Part II February 22, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.

The second part of the Lab49 CEP podcast Daniel Chait and I recorded is now available here.

In the next podcast to be published (already recorded), I will be discussing with Brad Bailey (ex-Aite Group analyst) some of the industry and client trends he has found through his research on CEP. Daniel also pitched in as a technical expert.

Really Massive Complex Event Processing February 20, 2008

Posted by newyorkscot in Complex Event Processing, HPC, Science.
1 comment so far

Scientific American ran a feature this month on the Large Hadron Collider (LHC) being built by CERN to conduct the largest physics experiments ever. Aside from its sheer physical scale, one of the remarkable aspects of the project is the massive volume and frequency of the data generated, making it probably the most impressive combination of complex event processing and distributed grid computing ever:

  • The LHC will accelerate 3000 bunches of 100 billion protons to the highest energies ever generated by a machine, colliding them head-on 30 million times a second, with each collision spewing out thousands of particles at nearly the speed of light.
  • There will be 600 million particle collisions every second, each one called an “event”.
  • The millions of channels of data streaming away from the detector produce about a megabyte of data from each event: a petabyte, or a billion megabytes, of it every two seconds.
  • More details here and diagram here.
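The quoted figures hang together, roughly: a back-of-the-envelope check (rounding the article's numbers) shows that ~600 million 1 MB events per second comes out at about a petabyte every couple of seconds.

```python
# Back-of-the-envelope check of the quoted LHC figures (rounded, as in the article).
events_per_sec = 600e6          # 600 million collisions ("events") per second
bytes_per_event = 1e6           # ~1 MB of detector data per event
bytes_per_petabyte = 1e15       # a petabyte is a billion megabytes

pb_per_sec = events_per_sec * bytes_per_event / bytes_per_petabyte
print(f"{pb_per_sec:.1f} PB/s")   # ≈ 0.6 PB/s, i.e. roughly a petabyte every ~2 seconds
```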

This massive amount of streaming data needs to be converged, filtered and then processed by a tiered grid network. Starting with a few thousand computers/blades at CERN (Tier 0), the data is routed (via dedicated optical cables) to 12 major institutions around the world (Tier 1), and then finally down to a number of smaller computing centers at universities and research institutes (Tier 2). Interestingly, the raw data coming off the LHC is saved onto magnetic tape (allegedly the most cost-effective and secure format).
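The tiered fan-out described above can be sketched in a few lines. This is purely illustrative structure (the site names and filter are made up, and CERN's actual software is nothing this simple): Tier 0 discards most raw events up front, and the survivors are distributed across the Tier 1 institutions.

```python
# A minimal sketch (hypothetical, not CERN's actual software) of tiered routing:
# Tier 0 at CERN filters raw events; survivors fan out across Tier 1 sites.
import itertools

TIER1_SITES = ["FNAL", "RAL", "IN2P3"]   # illustrative site names

def tier0_filter(events, keep_if):
    """Tier 0: discard the vast majority of raw events up front."""
    return [e for e in events if keep_if(e)]

def route_to_tier1(events, sites):
    """Fan surviving events out across the Tier 1 institutions."""
    routed = {site: [] for site in sites}
    for event, site in zip(events, itertools.cycle(sites)):
        routed[site].append(event)
    return routed

raw = [{"id": i, "energy": i * 0.5} for i in range(10)]
kept = tier0_filter(raw, keep_if=lambda e: e["energy"] > 2.0)
by_site = route_to_tier1(kept, TIER1_SITES)
print({site: len(events) for site, events in by_site.items()})
```

The interesting engineering is in the filter: the real trigger systems must decide, in hardware and in microseconds, which one-in-millions events are worth keeping.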

I wonder how many nanoseconds they took to consider which CEP vendor they wanted to use for this project?!

The CEP Podcasts February 11, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.

Daniel Chait and I will be hosting a series of podcasts discussing Complex Event Processing (CEP) in Capital Markets. The first podcast is a general discussion on trends leading to the demand for CEP, some of the primary use cases and some discussion on what to look for in a CEP vendor. It has been split into two parts: Part One is here. The series is managed by VoicesInBusiness and sponsored by BEA.

Clients, CEP and Trends January 30, 2008

Posted by newyorkscot in Audio & Video, Complex Event Processing, Marketing.

At Lab49, we finished a very strong year with a flurry of client and marketing activity. On the client side, we have seen a remarkable uptick in demand from the buy-side (hedge funds) and exchanges. This is in addition to our traditional tier-one sell-side client base.

Complex Event Processing was a big growth area for us (as was large scale computing and WPF), with us continuing to build relationships with the main vendors in the space. Many of these companies have brought us into client opportunities, wanted us to build demos specific to financial services’ front offices, and engaged us in various marketing-related activities. This month we launched a whitepaper with BEA based on our fixed income CEP demo using the BEA Weblogic Event Server (of course, that was not the tipping point for Oracle to buy BEA 2 days later 🙂 ). Recently, Daniel Chait wrote a nice article for Aleri on the practicalities of selecting a CEP vendor. Watch this space for a series of podcasts that I will be hosting with Daniel that will discuss CEP in capital markets with various people from the industry.

In November, a few of us did some brainstorming on some of the cool technologies and IT trends that we think will emerge or gain traction in 2008. When we ran these by a number of the horizontal and fintech press, we got a very positive response, resulting in a whole series of articles, features and podcasts we will be writing or contributing to over the next few weeks/months. Network World did a podcast on some of these trends with Daniel during their “Predictions Week”.

Separately, Joe was also interviewed in an article discussing SOAP vs REST.

Oracle Buying BEA January 16, 2008

Posted by newyorkscot in Complex Event Processing, HPC.

Just announced an hour ago, for $8.5 billion. It will now be interesting to see what the Weblogic product strategy looks like with respect to providing distributed cache integration, i.e. when will Coherence be part of the stack?

BEA-Intel-Lab49 Whitepaper on CEP in Capital Markets January 14, 2008

Posted by newyorkscot in Articles, Complex Event Processing, Marketing, Visualization / UX.

Finally, the whitepaper we have been writing with BEA and Intel on how the BEA Weblogic Event Server can be leveraged in the front office has been published here and on the Lab49 website. The paper outlines general trends in the CEP market; use cases for the application of CEP; and a reference architecture. This work is based on the fixed income demo Lab49 built with BEA which focused specifically on analytics integration and performance (as well as a cool WPF visualization).

Data Streaming Crosses the Chasm December 20, 2007

Posted by newyorkscot in Complex Event Processing, HPC, Marketing, Visualization / UX.

Lab49’s Daniel Chait provides SDTimes’ editor-in-chief David Rubinstein with his views on CEP, HPC, multi-core processing, WPF/Flex, etc., here. Nice article, David.

Lab49 Client & Marketing Update.. December 12, 2007

Posted by newyorkscot in Client Engagement Mgt, Complex Event Processing, HPC, Marketing, SOA / Virtualization, Visualization / UX.

The last few months and weeks have been a bit mad with a host of new client projects coming online, and a tidal wave of marketing activities.

On the project front, and despite some dodgy market conditions, we have been continuing to see (and have started) some interesting projects around automated trading: from market simulation environments to real-time pricing to risk management systems. We have also seen more projects on the buy-side where the level of innovation and adoption of the latest technologies is still impressive. In advanced visualization (specifically, WPF/Silverlight), we are starting to see interest across various trading businesses which is very promising going forward. We also continue to be involved in quite a few projects involving grid computing, distributed cache, etc.

On the marketing front, we have been busy publishing new articles, have contributed to a number of features in various industry publications, and are currently in the process of writing some thought leadership pieces for technology and finance publications. We have also been doing some great sales & marketing activities with some of our technology partners, including working on some new client opportunities and developing some demos leveraging WPF and CEP platforms. (We will also be starting to talk a bit more openly about our various partnerships.)

What’s great about the recent flurry of project and marketing activity has been the balance across high performance computing (grid, cache, etc), Java (J2EE, Spring, open source), Microsoft (WPF, Silverlight) and other technologies (messaging, market data, visualization, etc), which really helps to show Lab49’s depth and breadth across the technology space. Some highlights from the last few months include:

Lots more news, articles, features, partner updates, etc in the pipeline that I will post as they happen..

New Capital Markets Benchmarking Council September 25, 2007

Posted by newyorkscot in Complex Event Processing, HPC.

The Securities Technology Analysis Center (STAC) recently announced the creation of a new benchmark council that includes some of the leading securities firms such as JPMC, Citigroup and HSBC. The new council will establish benchmarks in three areas:

  • Market data: benchmarks based on workloads such as direct exchange-feed integration, market data distribution, tick storage and retrieval, etc.
  • Analysis: benchmarks based on workloads such as trading algorithms, price generation, risk calculation, etc.
  • Execution: benchmarks based on workloads such as smart order routing, execution-related messaging, etc.

“STAC Benchmarks will measure the performance of software such as market data systems, messaging middleware, and complex event processing systems (CEP), as well as new underlying technologies, such as hardware-based feed and messaging solutions, hardware-based analytics accelerators, compute and data grid solutions, InfiniBand and 10-gigabit Ethernet networks, multicore processors, and the latest operating system and server technologies.”

Benchmarks in complex event processing, huh? That will be interesting. Will they be based on specific sets of use cases? Will they give us insight into the hype and myth behind the various CEP vendors’ proclamations of processing gazillions of messages/sec? Will they tell us what happens when you try to scale these products? I wonder what the various CEP vendors think about this…?

Some FAQs here.

BEA Event Server Fixed Income Demo September 19, 2007

Posted by newyorkscot in Audio & Video, Client Engagement Mgt, Complex Event Processing, Marketing.
5 comments

Lab49 has been working with BEA on their new Weblogic Event Server (WLEVS) product in the complex event processing space. As part of this effort, we have been building a demo of how one could automatically reprice a portfolio of fixed income instruments against streaming market data. In addition to WLEVS, we used QuantLib’s C++ libraries and Lab49’s own market data simulator, with the front end built in WPF.

A couple of us from Lab49 attended the BEA World Conference in San Francisco last week and I presented the demo as part of the “Introduction to Event Server” session.

Click on the image below to view a screencast of the demo (this is a .wmv version for the moment).

We were able to run the demo pricing 400 bonds/sec, with 4.6ms latency, on an Intel Quad processor – the folks at Shuttle helped us with their latest snazzy new XPC box. We also developed a cool WPF visualization of the Event Processing Network where we can automatically generate XAML from the underlying EPN configuration which we then data-bind to the event server’s performance monitoring meta-data.
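The core repricing idea can be sketched without any of the real machinery. This is plain Python, not WLEVS or QuantLib, and the portfolio, yields and single-yield-curve assumption are all made up for illustration: each market-data tick carries a new yield, and an event handler reprices every bond by discounting its cash flows at that yield.

```python
# A sketch of event-driven repricing (hypothetical example; the demo itself
# used WLEVS for the event network and QuantLib for the analytics).
def bond_price(face, coupon_rate, years, ytm, freq=2):
    """Plain-vanilla fixed-coupon bond priced by discounting its cash flows."""
    coupon = face * coupon_rate / freq
    periods = years * freq
    rate = ytm / freq
    pv_coupons = sum(coupon / (1 + rate) ** t for t in range(1, periods + 1))
    pv_face = face / (1 + rate) ** periods
    return pv_coupons + pv_face

portfolio = [
    {"name": "5y 4% bond", "face": 100, "coupon_rate": 0.04, "years": 5},
    {"name": "10y 5% bond", "face": 100, "coupon_rate": 0.05, "years": 10},
]

def on_tick(ytm):
    """Event handler: reprice the whole portfolio on each yield update."""
    return {b["name"]: round(bond_price(b["face"], b["coupon_rate"],
                                        b["years"], ytm), 2)
            for b in portfolio}

for ytm in (0.04, 0.045):   # two simulated market-data ticks
    print(ytm, on_tick(ytm))
```

The throughput question is then how fast `on_tick` can run per instrument; in the real demo the heavy lifting sat in native C++ analytics behind the event server, which is how the 400 bonds/sec figure was reached.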

I will put the demo and more information on the Lab49 website, and hopefully we will get it onto BEA’s Dev2Dev portal soon…