Having successfully conducted three webinars in its Tech Talks series, the London Stock Exchange Group (LSEG) will host its fourth webinar on 15 September 2021. The series sparks conversations that educate audiences on trending topics in finance and technology; the upcoming webinar focuses on “Efficient Information Exchange Using Memory Transplant”, presented by Chief Architect Rajith Priyanga.
Rajith Priyanga joined the Group in 2004 and currently heads the Post Trade Systems engineering team at LSEG Technology. He has led the design and build of the next-generation Post Trade system that runs at the London Clearing House (LCH) – Europe’s largest financial clearing organisation. Prior to this role, he worked on surveillance systems, smart order routing systems, and exchange systems, including the trading and surveillance systems for the London Stock Exchange (LSE).
He obtained his BSc in Computer Science and Engineering from the University of Moratuwa, and is also reading for a Master of Arts at the University of Colombo. Speaking with Roar, Rajith gives us a little insight into the upcoming webinar, delving into the concept of “snapshottable” memory and how it can be utilised for efficient information exchange.
- Could you give us a brief overview of your topic, “Efficient Information Exchange Using Memory Transplant”?
Back in 2015, when we were developing our next clearing system for the London Clearing House, which went live in February 2020, we realised that conventional methods of retrieving data from databases or files were too time-consuming and inefficient. Hence, we focused on delivering a platform that would drastically improve operational efficiency for financial services firms.
With memory transplant and the proper implementation of high-performance technology, organisations can benefit from seamless processes and transitions. The upcoming webinar will focus on the concept we used over the course of this complex project, and how it was applied to achieve a significant increase in efficiency.
- What features does this concept hold to improve performance for financial services?
Financial services firms routinely manage millions of transactions, so reducing processing time is a constant objective. Our goal was to implement a better real-time clearing system, capable of handling the highest rate required whilst processing trades in real time, validating risk, and providing responses back to members with no disruptions. At the time, existing technology was not advanced enough to handle this, so everything was built from scratch.
Our inspiration stemmed from two popular movies that ingeniously used memory: The Matrix (1999) and Source Code (2011). We utilised the concept of cloning memory (i.e. taking a “snapshot” of data), retaining a backup of critical business processes, and transplanting this cloned memory across different systems for instant updates. This allowed real-time processing and batch processing to occur in parallel, with no pauses or interruptions.
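The interview does not spell out how the snapshot is taken, but one well-known way to get this behaviour on POSIX systems is fork()’s copy-on-write semantics: the child process receives a frozen clone of the parent’s memory, while the parent keeps mutating its own copy (Redis uses the same trick for its RDB snapshots). The sketch below only illustrates that general technique, not LSEG’s implementation; the Trade structure and the in-memory “book” are invented for the example.

```c
/* Minimal sketch of snapshot-based parallel processing on a POSIX
 * system. Not LSEG's implementation: fork()'s copy-on-write pages
 * simply give the child a frozen, consistent view of memory while
 * the parent continues real-time updates. */
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

#define N_TRADES 4

typedef struct {
    long   trade_id;
    double notional;
} Trade;

static Trade book[N_TRADES];          /* in-memory "clearing book" */

static void run_batch(void) {
    /* Child process: iterates over its frozen snapshot of `book`. */
    double total = 0.0;
    for (int i = 0; i < N_TRADES; i++)
        total += book[i].notional;
    printf("[batch] snapshot total notional: %.2f\n", total);
}

int main(void) {
    for (int i = 0; i < N_TRADES; i++)
        book[i] = (Trade){ .trade_id = i, .notional = 100.0 };

    pid_t pid = fork();               /* copy-on-write "memory clone" */
    if (pid < 0) { perror("fork"); return 1; }

    if (pid == 0) {                   /* child: batch over the snapshot */
        run_batch();
        _exit(0);
    }

    /* Parent: real-time processing continues without pausing; these
     * writes land on fresh pages and never disturb the child's view. */
    for (int i = 0; i < N_TRADES; i++)
        book[i].notional += 1.0;
    printf("[real-time] book updated while batch runs\n");

    waitpid(pid, NULL, 0);
    return 0;
}
```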
The data structure we used was highly efficient: it loaded only the necessary information, with no redundant or duplicated data, cutting down on unnecessary storage. Initially, the process of loading a database took minutes, but with this process of “snapshotting” and transplanting memory, we were able to update systems and process data almost instantaneously. For example, the platform we implemented for LCH is configured to clear and risk-manage 20 million trades per day, and it successfully processed the service’s highest-ever clearing volumes within its first five days of operation!
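Again, the exact data structure is not described here, but the minutes-to-instant jump Rajith mentions is typical of moving from parsing a database on startup to memory-mapping a snapshot file: the operating system maps the pages in place, so “loading” no longer costs parsing time. A hedged sketch, assuming a hypothetical snapshot file book.snap written as raw, pointer-free fixed-size records:

```c
/* Hedged sketch: loading a snapshot by memory-mapping it on a POSIX
 * system. "book.snap" and its record layout are hypothetical, not
 * LSEG's format; the point is that mmap() makes the data usable
 * immediately, with no parsing or copying. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include <sys/stat.h>

typedef struct {
    long   trade_id;
    double notional;
} Trade;

int main(void) {
    int fd = open("book.snap", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    /* Map the snapshot directly into memory: the "load" is just a
     * page-table operation, independent of how big the file is.   */
    const Trade *book = mmap(NULL, st.st_size, PROT_READ,
                             MAP_PRIVATE, fd, 0);
    if (book == MAP_FAILED) { perror("mmap"); return 1; }

    size_t n = st.st_size / sizeof(Trade);
    printf("snapshot holds %zu trades; first id = %ld\n",
           n, n ? book[0].trade_id : -1L);

    munmap((void *)book, st.st_size);
    close(fd);
    return 0;
}
```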
- What is the biggest challenge you believe financial services face when it comes to information exchange and data management?
I believe there are two main challenges. Firstly, conventional clearing systems usually lag behind real time, whereas the latest trading systems, surveillance systems, and risk management systems are all real-time. Information is meant to be shared instantaneously, and processes need to be optimised to handle that.
Secondly, security is critical for data management. In terms of financial services, it’s vital to ensure that data is secure whilst maintaining efficient data movement. Achieving both of these without compromising on either factor is the real challenge.
- Data security in the digital age is crucial. How does this concept support data security on a larger scale?
It’s important to implement multiple layers of data protection. Efficient data transfer, regardless of scale, happens within a core network with strict access privileges. Backups are carried out daily, and all data is encrypted. At LSEG Technology, we used the concept of memory transplant in conjunction with high-performance technology, including risk and collateral management, to ensure our platform provided secure data management at all times.
- How well do you believe data is managed in Sri Lanka? How much room do you believe we have for improvement?
I believe everyone in Sri Lanka has the means to utilise backup technology; it is accessible to anyone who knows how to operate it. However, data management involves multiple processes that are not just technical in nature, and I believe we need better data regulations in place.
Many regulations and protocols need to be formed from a business perspective. For example, a backup policy should also take access privileges, duplicate handling, security, and other such factors into consideration. In terms of improvement, we need a more holistic process enforced within organisations, because many of them lack a strict data governance policy.
End of Q & A
Interested in learning more about achieving efficient information exchange through innovative techniques? Rajith’s upcoming webinar will share some eye-opening insights on how cloning and memory transplant can drastically improve your operational processes.
Follow the link below and sign up to participate in this webinar:
https://lseg.zoom.us/webinar/register/WN_3UM3FCw1TJ-nAOcjyRQiyA