The Monetary Authority of Singapore’s (MAS) Project Guardian has been making waves in recent months for its progressive approach to digital assets.
For example, MAS recently launched the world’s first live buyback transaction using a digitally native bond on a public blockchain. Innovations like these mark a major milestone in the evolving institutional approach to digital assets.
While optimistic about MAS’s developments, Ralf Kubli, board member of the Casper Association, the organization responsible for overseeing Casper Network, emphasizes that this innovation is not without challenges.
Kubli believes that a crucial but often overlooked aspect of the tokenization process is the issue of standardization.
In an interview with CryptoSlate, he explained that current asset tokenization practices focus primarily on digitizing the asset itself, but neglect to include the associated liabilities and cash flows in this digital transformation. This results in the creation of asset-backed tokens that are added to blockchains, usually accompanied by a simple PDF of the terms and conditions.
Kubli believes that this approach, while seemingly efficient, still requires manual intervention for cash flow calculations, potentially leading to errors and discrepancies. He points out that this lack of transparency and auditability of cash flows is very similar to the problems that caused the 2008 banking crisis. Furthermore, Kubli argues that the key to averting a similar economic catastrophe is to ensure that cash flows are digitized, tokenized into a machine-executable format and, crucially, standardized.
In the interview that follows, Ralf Kubli delves deeper into these challenges and explores possible paths to a more secure and efficient future in asset tokenization.
You have highlighted the lack of standardization in asset tokenization practices as a major issue. Can you elaborate on the risks and challenges this poses, especially in the context of the recent Monetary Authority of Singapore initiative?
The recent announcement of the Monetary Authority of Singapore’s Project Guardian initiative is a big step towards demonstrating the benefits that tokenization can bring. However, these tokenized assets still do not use standards that make them both secure and interoperable across the financial ecosystem. The current projects do not define the payment obligations, which means that the cash flows of the financial instrument are not captured in a machine-readable and machine-executable term sheet. If we don’t do that, we will still face the same risks that have plagued the financial sector for years.
As for the challenges, it may take some time for everyone to adopt the same standards, but if projects like MAS’s really want to make progress, they have to.
You mentioned that tokenization platforms often overlook liabilities and cash flows. How crucial is it to include these elements in the tokenization process, and what would be the ideal approach to achieve this?
As it stands, most tokenized assets do not contain algorithmic descriptions of their liabilities or cash flows. They simply tokenize a PDF version of a contract, meaning people still have to manually read, interpret and process it, then track down the associated documents detailing the financial contract. This completely undermines the point of tokenization and does not meaningfully move the financial sector forward.
By implementing cash flow logic into the smart contracts representing these assets, they become ‘smart financial contracts’ that are now machine readable, executable and auditable. This allows us to truly enjoy the benefits that tokenization brings, allowing much faster, more efficient and transparent financing.
Ultimately, incorporating cash flows and payment obligations into smart financial contracts solves the reconciliation problem, both within and between financial companies, while enabling systemic risk management.
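As a rough illustration of what embedding cash flow logic in a token could mean, here is a minimal sketch assuming a simple annual-coupon fixed-rate bond. The class and field names are hypothetical, not any particular platform’s API; the point is that every payment obligation is derived deterministically from the contract terms, with no PDF to interpret.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class FixedRateBond:
    """Minimal 'smart financial contract': terms plus machine-executable cash flows."""
    notional: float      # principal amount
    annual_rate: float   # fixed coupon rate, e.g. 0.05 for 5%
    issue_year: int
    maturity_year: int

    def cash_flows(self):
        """Deterministically derive every payment obligation from the terms."""
        flows = []
        for year in range(self.issue_year + 1, self.maturity_year + 1):
            flows.append((date(year, 1, 1), self.notional * self.annual_rate))  # coupon
        flows.append((date(self.maturity_year, 1, 1), self.notional))           # redemption
        return flows

bond = FixedRateBond(notional=1_000_000, annual_rate=0.05,
                     issue_year=2024, maturity_year=2027)
for when, amount in bond.cash_flows():
    print(when, amount)
```

Because the schedule is computed rather than documented, any counterparty, auditor or regulator running the same logic over the same terms sees the same obligations.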
You draw parallels with the 2008 banking crisis and suggest that a lack of transparency in cash flows can be dangerous. How can blockchain and tokenization technologies be used to prevent such economic risks in the future?
By automating financing through tokenization, any company’s balance sheet can be fully monitored in near real time. Because the financial assets appearing on these companies’ balance sheets are forward-looking, static and dynamic “what if?” simulations can be run at any time.
Companies will be able to see exactly where they stand in terms of liquidity and easily model how they would fare under all conceivable economic conditions. This should effectively reduce the risk of events like those that led to the 2008 crisis, as well as the more recent volatility and contagion we have seen.
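A toy sketch of such a “what if?” liquidity simulation, using made-up monthly net cash flows and a hypothetical stress scenario that haircuts inflows by 30%:

```python
def cumulative_liquidity(start_cash, monthly_flows, inflow_haircut=0.0):
    """Project end-of-month cash, applying a haircut to positive flows only."""
    cash, path = start_cash, []
    for flow in monthly_flows:
        cash += flow * (1 - inflow_haircut) if flow > 0 else flow
        path.append(cash)
    return path

flows = [120, -80, 150, -260, 90]  # hypothetical net monthly cash flows

base = cumulative_liquidity(100, flows)                         # business as usual
stress = cumulative_liquidity(100, flows, inflow_haircut=0.3)   # 30% fewer inflows

print("base:  ", base)
print("stress:", stress)
# Index of the first month the company runs dry under stress, if any:
print("shortfall month:", next((i for i, c in enumerate(stress) if c < 0), None))
```

With contract-level cash flows available in machine-executable form, scenarios like this can be generated directly from the balance sheet rather than rebuilt by hand.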
Understanding the current status of every financial contract on a company’s balance sheet in an algorithmic and standardized form will also reduce regulatory burden, enabling effective and progressive regulation and systemic risk analysis for many companies.
Do you see the move by the Monetary Authority of Singapore as a step towards addressing these tokenization challenges globally, or is it more of a local effort? How can other supervisors learn from this?
Many MAS initiatives are developed in collaboration with various regulators; therefore, everything that happens in Singapore with major international financial companies is global in nature.
What do you think the future holds for the regulation of tokenized assets? How important is international cooperation in standardizing these practices?
Tokenized financial assets will revolutionize the way financial systems work. You can think of it as upgrading the plumbing of the capital markets. Tokenization is already happening at scale with cash and cash equivalents (deposit tokens, money market funds, T-bills, etc.). In fund tokenization, many major players are investing heavily (such as Fidelity, Franklin Templeton and KKR).
For debt, structured instruments and derivatives, algorithmic definitions of the cash flows of the underlying financial instrument are a prerequisite for the successful adoption of infrastructure for tokenized financial assets.
A bond or mortgage remains a bond or mortgage when it is tokenized. Therefore, regulators should welcome a DLT-enabled financial infrastructure, where it is much easier to track which party has which obligation.
Without the cash flows within the tokens representing debt, structured instruments or derivatives, these tokens will remain dumb and not provide the necessary efficiency in price discovery and post-trade automation.
What are some potential solutions or innovations that you foresee that could address the standardization problem in asset tokenization?
What is needed is a comprehensive set of open banking standards that algorithmically define how financial contracts interact. Combining tokenization with clearly defined standards can bring a new level of efficiency, transparency and legitimacy to finance and business. Fortunately, standards that can address these issues already exist, most notably those set out by the Algorithmic Contract Types Universal Standards (ACTUS) Research Foundation. Implementing such a framework is what tokenization requires if it is to be truly adopted.
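ACTUS specifies a precise taxonomy of contract types with a standardized term dictionary. The sketch below is a loose illustration of the idea only, not the actual ACTUS algorithms: the field names are simplified stand-ins, and the contract type is modeled on the principal-at-maturity pattern. What matters is that every party derives the schedule from the same standardized terms with the same algorithm.

```python
def pam_schedule(terms):
    """Principal-at-maturity-style schedule from a standardized term sheet.
    Any party running this same algorithm on the same terms gets identical flows."""
    flows = []
    for t in range(1, terms["term_years"] + 1):
        flows.append(("IP", t, terms["notional"] * terms["rate"]))  # interest payment
    flows.append(("MD", terms["term_years"], terms["notional"]))    # maturity redemption
    return flows

# Standardized term sheet (simplified, illustrative field names)
term_sheet = {"contract_type": "PAM", "notional": 500_000,
              "rate": 0.04, "term_years": 3}

issuer_view = pam_schedule(term_sheet)
investor_view = pam_schedule(term_sheet)
assert issuer_view == investor_view  # no reconciliation needed
```

Standardization is what makes the tokens interoperable: issuer, investor and regulator all compute from one shared definition instead of reconciling private interpretations.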
Do you believe the issues you have identified with tokenization are specific to stablecoins or indicative of a broader trend in the financial system?
The truth is that using stablecoins for payments involves little innovation in finance. Innovations in payments are mistaken for innovations in finance, but finance is the exchange of cash over time, while payments are simply the exchange of cash today.
DeFi currently consists mainly of over-collateralized lending, which leaves it a niche form of financing, as very little over-collateralized lending exists in the real world. DeFi loans must be so heavily collateralized because DeFi is unable to calculate the cash flows or liabilities of a loan without human intervention.
As I said, in order to innovate and attract institutions, liabilities and cash flows must be tokenized, machine-executable, and, perhaps most importantly, standardized. With sound financial logic underlying the blockchain-based tokenization we see today, DeFi can grow beyond its niche status and become the revolutionary technology it aims to become.
What advice would you give to blockchain innovators and regulators to effectively address these challenges?
For innovators, don’t just build a new payment rail – that just creates a new channel that needs to be independently monitored. Instead, leverage smart financial contracts that can be controlled through automation. This is the real innovation.
As for regulators, understand that embracing tokenization that follows agreed-upon standards will make your job much easier. All these tools and rails will be transparent and enforced through code. That means it won’t even be possible for companies to do things like overvalue positions or shift liabilities, and if they somehow did, it would be completely visible.
Finally, what is your vision for the future of blockchain and tokenization in creating a more efficient, transparent and stable financial ecosystem?
This is the first time in sixty years, since the introduction of computers in banks, that we have been able to address and solve the key problems facing the banking and financial systems. By implementing open-source, algorithmic financial contracts, tomorrow’s financial world will run far more efficiently: balance sheets will reconcile within minutes or hours, and fraud will be sharply reduced or eliminated.
If done right, blockchain can truly provide the reliability needed to improve enterprise-wide risk management and make systemic risk management possible again. I think this will happen; it will just take a little while to get everyone on board.