A Brief History of Tokenization
A singular idea has made its way into the mainstream blockchain conversation and stands to profoundly transform the way we think of, and invest in, physical assets. Aptly referred to as “tokenizing an asset”, tokenization centers on transforming the ownership rights of a particular asset into a digital format, specifically a token. The idea extends to facilitating multi-party ownership of what would normally be considered indivisible assets such as artwork, music content, and even extensive land and property holdings. Factor in the capabilities of blockchain technology and you end up with a network that allows for a far easier exchange of these assets, be they divisible, indivisible, or completely intangible (“All hail NFTs!”). Tokenization-related projects built on the blockchain are not hard to find across the Internet, although it’s worth stressing that this isn’t the first iteration of tokenization’s use case.
HISTORY OF TOKENIZATION
In truth, the concept of tokenization and its application have origins dating back to the early 2000s, when tokens were issued as a data-protection mechanism, with the underlying objective of satisfying an organization’s security and compliance requirements without exposing sensitive data. The process essentially exchanges sensitive values for non-sensitive placeholders, again called tokens. The beauty here is that these tokens are indecipherable and irreversible, so the original data cannot be recovered from them, ultimately preventing data theft and similar threats. This is still used in practice today, whereby organizations host the stored tokens in the cloud, mitigating the risk of storing any sensitive information internally. The motto “no data, no theft” springs to mind. In 2001, this digital tokenization idea was applied to protecting credit card information by organizations such as TrustCommerce, eliminating the need for merchants to store credit card data internally and thus vastly increasing the security and privacy of cardholder information.
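The exchange of sensitive values for placeholder tokens can be sketched in a few lines. This is a minimal illustration only, assuming an in-memory vault; the class and method names (`TokenVault`, `tokenize`, `detokenize`) are hypothetical, and a production system would keep the vault in hardened, access-controlled storage.

```python
import secrets

class TokenVault:
    """Illustrative token vault: swaps sensitive values for random tokens."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it bears no mathematical relationship
        # to the original value and cannot be reversed without the vault.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the party holding the vault can recover the original data.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # e.g. a card number
assert token != "4111-1111-1111-1111"          # merchant stores only this
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

A merchant holding only the token has nothing worth stealing, which is exactly the “no data, no theft” property described above.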
The concept of tokenization actually predates this practical use, however. We need only think of a casino and its daily operations to explore this further. Imagine walking blissfully into a casino ready to make bank. You approach the cashier’s desk, hand over your cash, and “hey presto”, you receive chips in return, chips that can be thought of as tokens themselves, their underlying value pegged to the cash you handed in. But wait, let’s rewind even further …
In the late 17th century, paper currency was born in the American colonies, primarily to fund military expeditions at the time, and the practice of printing paper notes quickly spread to other colonies. Over the course of the next 100 years or so, the familiar US dollar sign was introduced, various permutations of banknotes were created to deter counterfeiting, and banknote quality improved. During this period, commercial banks would secure, or “tokenize”, these notes with physical gold; in other words, the paper notes were a tokenized representation of the underlying store of value held, namely gold. Individuals would then use these notes to transact much as we do today, allowing the exchange of objects of value, a far cry from the bartering economies that stood centuries before.
In response to the fluctuating financial needs of the United States, the Federal Reserve Act of 1913 established a national banking system, instilling the Federal Reserve with the responsibility of acting as the United States’ central bank and sole issuer of the US dollar. At that time, it was still possible to redeem US dollars with the Federal Reserve for physical gold, and it wasn’t until 1971 that this gold standard finally came to an end, giving birth to the fiat system. Under the fiat system, currency notes are issued in good faith by a country’s central bank: the central bank stands behind the banknote and declares that this unit of fiat currency must be universally accepted as a specific unit of value.
TOKENIZATION IN TODAY’S WORLD
Fast-forward to today’s world and we’re spoilt with exposure to a new age of tokens in a blockchain context, where anything and everything of value can be tokenized, traded, transferred, and invested in. Cryptocurrencies such as Bitcoin, Litecoin, and Dash, to name a few, enable the trading and exchange of their digital tokens, carry a varying store of value, and can be used as mediums of payment. In this instance, the digital token of a cryptocurrency is itself an asset.
However, the next evolution of asset tokenization currently being deployed relates to real-world asset classes. Tokenizing an asset, and loosely speaking, any asset for that matter, involves issuing part or full ownership of that asset on a blockchain, in a digital token format. In essence, that token represents ownership of the underlying tangible or intangible asset, allowing the underlying economic value of that asset to be conferred to the token itself. Appreciating that asset tokenization bears virtually no limit, you can only imagine the compelling, far-reaching, and perhaps even ludicrous implications and opportunities that this process presents across a multitude of industry sectors. Of course, this is only heightened by tokenization’s relationship with blockchain technology, particularly as both grow more capable.
The justification for tokenization spreads far and wide, and its popularity continues to grow. Consider a typically indivisible physical asset such as a work of art. It can now be segmented into divisible portions via the tokenization process, each portion represented by an underlying token, thereby enabling fractional ownership unlike anything we’ve seen before.
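The fractional-ownership idea can be made concrete with a toy ledger. This is a hedged sketch, not any particular blockchain’s API: the `FractionalAsset` class, its method names, and the participant names are all invented for illustration, and a real deployment would record transfers on-chain rather than in memory.

```python
from collections import defaultdict

class FractionalAsset:
    """Toy ledger: one indivisible asset split into fungible ownership tokens."""

    def __init__(self, name: str, total_tokens: int, issuer: str):
        self.name = name
        self.total_tokens = total_tokens
        self.holdings = defaultdict(int)
        self.holdings[issuer] = total_tokens  # issuer starts with all tokens

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        # Moving tokens moves fractional ownership of the underlying asset.
        if self.holdings[sender] < amount:
            raise ValueError("insufficient tokens")
        self.holdings[sender] -= amount
        self.holdings[receiver] += amount

    def ownership_share(self, holder: str) -> float:
        # Fraction of the underlying asset the holder's tokens represent.
        return self.holdings[holder] / self.total_tokens

painting = FractionalAsset("Painting", total_tokens=1000, issuer="gallery")
painting.transfer("gallery", "alice", 250)
assert painting.ownership_share("alice") == 0.25  # alice owns a quarter
```

The artwork itself never moves; only the tokens change hands, which is what makes an otherwise indivisible asset tradable in fractions.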
What’s the outcome? The bottom line is that tokenization significantly lowers barriers to investment and opens up a broad spectrum of asset markets to new pools of investors. A broader investor market, in turn, improves liquidity, increases asset transferability, and deepens the market overall. These core characteristics free investors from the restrictions of their geographical location, while unnecessary intermediary brokers and their associated costs are largely, if not completely, eradicated (shock!).
It’s no surprise, then, to see that trading instruments within the financial sector are also being heavily tokenized. Across the DeFi sector, we can see notable strides in expanding these markets to allow the trading of digital tokens that represent underlying assets that would usually only be tradable on traditional financial exchanges. From trading fractional units of gold via the cryptocurrency world to gaining exposure to NASDAQ-listed stocks represented by digital tokens, the opportunities appear endless and growing. What makes this all the more remarkable is that ownership of these digital tokens (or any tokens, for that matter) is indisputably and immutably recorded on the blockchain, independent of where the asset is physically stored or traded.
All it takes is an open mind to see the true value of asset tokenization in its entirety. The benefits it affords an individual or enterprise are prolific, but we must also appreciate that its full potential is still unfolding. Today, the world and the wider blockchain ecosystem continue to explore the prospects tokenization offers and have come to understand that it is rapidly becoming an essential vehicle for transforming the conventional precedents of asset ownership and management around us.
Author - Matthew Romu