Data is the new latency
— Jeff Denworth
Yes, I’m quoting myself now 🙂
For more than a decade, many of the world’s most technically proficient trading firms have sought competitive advantage by leveraging faster trading routes and building over-clocked trade execution engines to eke out nanoseconds of advantage over competing traders. The one-upmanship of high-frequency trading is a famous tale that’s unfolded in best-selling books like Flash Boys and on the big screen in films like The Hummingbird Project. For over 10 years, savvy firms have worked to build systems and networks that trade ever closer to the speed of light, and while these efforts have produced a quantum leap in trade engine reaction time, the speed of light has remained constant and network improvements have run into a wall of diminishing returns…each additional latency gain yields a fractionally smaller advantage in the quest for alpha.
So, if reaction time is no longer a competitive trading advantage…what’s next?
While several (in)famous trading firms were chasing the almighty nanosecond, a new class of trading strategies emerged from the realization that it is now possible to predict market conditions using statistical models. Here, enriched market data, alternative data and large-scale simulations make it possible to evolve the discourse from reactive trading to predictive trading. Simulation has subsequently given rise to AI-based inference as classic Monte Carlo codes have been replaced with next-generation machine learning engines. In every case, the amount and quality of data inform the accuracy of the applied statistical model. If yesterday’s trading organizations worked to react to trade flows using high-performance infrastructure, today’s successful trading teams understand market dynamics by analyzing vast reserves of data and can now predict and capitalize on market events before they happen.
Enter G-Research, a London-based quantitative finance research firm with a mission to apply advanced technology to predict events in financial markets. In 2021, their team came to understand the awesome power that can be derived by marrying massive AI infrastructure with massive datasets to, in their words, “construct an even bigger crystal ball.” Data, and the ability to process it at massive scale, has become the strategic advantage for this world leader in quantitative finance.
The term “partnership” is oft-overused when infrastructure vendors discuss their customers, yet there is no more fitting word to describe the relationship between G-Research and VAST. VAST has been fortunate to partner with one of the most supportive, intelligent and resourceful customers in the world; since 2021, we’ve pioneered new frontiers in data-driven artificial intelligence workloads together. The G-Research team challenges us to be a better company and to continuously improve our offering, a mission we happily accept and one that benefits the entire VAST customer community.
Rarely do trading firms talk openly about their technology decisions. Because of this unwritten rule of silence, we were surprised and humbled when the team sent over a blog post they had authored to showcase the work we’ve done together. Chris Goddard, G-Research’s CTO and Partner, once explained to me that selecting a new data platform after a decade of experience with a previous system is “equivalent to getting a heart transplant.” We’re happy to report that the procedure went off without a hitch, and our patient is bigger, faster and stronger following this successful operation.
Without further ado, it’s my pleasure to direct you to their website, where you can learn more about how they go about building an even bigger crystal ball with the VAST Data Platform.
Happy Reading.
- Jeff