TabbFORUM: The Market Data Microservice Revolution

By: Stephane Dubois (CEO, Xignite)

To deploy market data in the cloud, the best approach is a hybrid one. Rather than building an entirely new market data technology stack in the cloud, as some very large firms have chosen to do, and taking on the significant maintenance burden that comes with it, firms can deploy battle-tested microservices.
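As a rough sketch of what that hybrid approach can look like from a consumer's perspective, the Python snippet below pulls a last quote from a cloud-hosted market data microservice over HTTPS. The endpoint URL, query parameters, environment variable, and response fields are hypothetical placeholders used for illustration, not a specific vendor's API.

import os
import requests

# Hypothetical cloud-hosted market data microservice endpoint.
# The URL, parameters, and response fields are illustrative only.
QUOTE_URL = "https://quotes.example-marketdata.com/v1/GetLastQuote"

def get_last_quote(symbol: str) -> dict:
    """Fetch the latest quote for a symbol from the microservice."""
    response = requests.get(
        QUOTE_URL,
        params={
            "symbol": symbol,
            # Assumed token-based authentication, read from the environment.
            "token": os.environ["MARKET_DATA_API_TOKEN"],
        },
        timeout=5,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    quote = get_last_quote("AAPL")
    print(quote.get("Last"), quote.get("Time"))

Because the service is consumed over standard web protocols, it can back an internal application, a spreadsheet, or a downstream microservice without any of the on-premise distribution layers described below.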

It’s been more than 17 years since Xignite launched its first API and more than 10 since we announced our first AWS-based solution, and still, the world of market data remains riddled with legacy technology: firehose feeds and FTP files remain the norm for data distribution. But this is about to change. The logistical challenges created by the pandemic will likely give larger financial institutions the impetus they need to accelerate their transition to the cloud. And nothing is likely to have as much impact on the market data industry going forward as the deployment of microservices.

Flashback to Legacy 

Most of the technology used by financial institutions has its roots in the pre-internet early ’90s (think Triarch/Tib), and although this technology was upgraded over time, the core architectural principles behind it never really changed: a multi-tiered, on-premise tech stack designed to carry data from high-bandwidth dedicated private networks via proprietary protocols down multiple layers of on-premise distribution technologies (colocation, distribution servers, feed handlers, access points, data hubs, message buses, etc.). The net result (after years of iterative layering of new functionality) has been a complex hodge-podge of proprietary software and hardware that has limited compatibility from version to version, requires extensive experience and training to maintain, is overkill for basic data distribution needs, and simply cannot easily be upgraded or unplugged without putting the business at risk.

Total Lack of Innovation 

To be fair, the reasons behind such heavy architectures were legitimate. At the time, no available bandwidth could handle the market data volumes of the day, there was no cloud or grid computing, there were no widely accepted interoperability protocols, and only very basic Application Programming Interface (API) standards existed. In other words, technology simply was not far enough along to enable alternatives.

To read the full article, visit TabbFORUM.
