Jesus Rodriguez: Security Token Standards without Application are “Figments of Hallucination”
In a recent publication, Jesus Rodriguez explained why the security token industry needs to shift its focus. Rather than developing numerous security token standards centered on the structure of tokens, Rodriguez argues, the industry should focus on the areas that require interoperability between different market participants.
The Many Security Token Standards on Ethereum
There is no doubt that the young security token industry has quite a few token standards already. The following is a list of the most known standards in the space:
- Harbor’s Regulated Token (R-Token)
- OpenFinance Network’s Smart Securities Standard (S3)
- Securitize’s Digital Securities Protocol
- TokenSoft’s ERC-1404
- Polymath’s ST-20
- Atomic Capital’s Digital Security Standard (DSS)
- Swarm’s SRC20
- Securrency’s CAT-20 and CAT-721
- The ERC-1400
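Despite their different names, most of these standards share one core mechanism: a transfer-restriction check layered on top of an ordinary fungible-token ledger, so that compliance rules can block a transfer before it settles. The following Python sketch simulates that pattern; it is illustrative only and does not reproduce any specific standard's API (the class and constant names are invented).

```python
# Illustrative simulation of the pattern common to security token standards:
# a plain balance ledger plus a restriction check consulted before transfers.
# All names (RestrictedToken, SUCCESS, etc.) are hypothetical.

SUCCESS = 0
NOT_WHITELISTED = 1
INSUFFICIENT_BALANCE = 2

MESSAGES = {
    SUCCESS: "transfer allowed",
    NOT_WHITELISTED: "receiver not on the investor whitelist",
    INSUFFICIENT_BALANCE: "sender balance too low",
}

class RestrictedToken:
    def __init__(self):
        self.balances = {}      # address -> token balance
        self.whitelist = set()  # addresses cleared by the issuer's KYC/AML process

    def detect_restriction(self, sender, receiver, amount):
        """Return a restriction code; SUCCESS (0) means the transfer may proceed."""
        if receiver not in self.whitelist:
            return NOT_WHITELISTED
        if self.balances.get(sender, 0) < amount:
            return INSUFFICIENT_BALANCE
        return SUCCESS

    def transfer(self, sender, receiver, amount):
        code = self.detect_restriction(sender, receiver, amount)
        if code != SUCCESS:
            raise ValueError(MESSAGES[code])
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
```

The detect-then-transfer shape, where a numeric code maps to a human-readable rejection reason, is the approach ERC-1404 popularized; the other standards listed above combine similar checks with their own compliance and disclosure machinery.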
Some believe the high volume of development is a good thing, reflecting the high potential of tokenizing the traditional financial securities sector.
However, Jesus Rodriguez, managing partner at Invector Labs, thinks the industry's focus is a little off-track.
In a recent publication, Rodriguez wrote the following:
“Obviously, there is a segment of the community that believes in the need for standardization. I tend [to] subscribe to a different thesis. The security token space is too nascent, it [is] still missing 99% of the infrastructure required to be a relevant vehicle for securities and there are simply not enough security tokens issued to make a statistically-significant sample. At this stage, standards act more like a constraining force rather than a vehicle for innovation. At this point, we simply don’t know enough about how security tokens are going to evolve and we certainly haven’t encountered any challenges that [require] standardization. In my opinion, standards without applications are figments of hallucination.”
The Right and Wrong Way to Think of Security Token Standards Explained
Rodriguez is of the opinion that there is both a right and a wrong way to think about security token standards.
The ‘wrong way’ involves establishing multiple standards before an industry has gained any real traction. The history of technology illustrates this through Service-Oriented Architecture (SOA).
SOA was designed to solve interoperability issues with web services. Its development set off a rivalry among many enterprises, each of which produced its own web-service standards with varying approaches to security. Eventually, the protocols became so complex that even their creators could not implement them, says Rodriguez.
“The end result was the entire industry shifted to simpler approaches like the representational state transfer (REST) that rely on universal internet protocols such as HTTP instead of committee-designed standards.”
By contrast, the ‘right way’ involves a natural evolution that stems from competition. An example of this, Rodriguez explains, is internet browsers.
“The intense innovation has caused consumers to use different browsers forcing the need for interoperability. As a result, the best technologies in the space such as HTML5 or Google v8 become widely adopted within the entire ecosystem.”
While the wrong areas in which to think about standards involve the structure of tokens, the right areas, especially for security tokens, are those that deal with interoperability between different market participants. As Rodriguez says,
“In the context of security tokens, standards should focus less on the structure of the tokens and more on the areas of the market that require interactions between different participants.”
With that said, Rodriguez has identified five areas that he suggests are better suited for standardization than others:
- Integration with Exchanges: to include listing, transferring, and notifications.
- Information Disclosure: protocols for publishing and disclosing information pertinent to specific security tokens.
- On-Chain Compliance: due to the regulatory compliance needed in security tokens.
- Liquidity: perhaps the biggest challenge for the industry, this should be addressed at the protocol level rather than left primarily to market interactions.
- Ownership: since security tokens ultimately concern ownership claims, focusing here could speed up security token adoption.
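To make the interoperability framing above concrete, an interoperability-first standard would specify the interface that issuers, exchanges, and other participants agree on, rather than the token's internal structure. The sketch below is a hypothetical illustration of the first two areas (exchange integration and information disclosure); every name in it is invented.

```python
# Hypothetical sketch of an interoperability-focused standard: a shared
# interface that any exchange implementation agrees to honor, covering
# listing (with a pointer to disclosure documents) and event notifications.
# All names are invented for illustration.

from abc import ABC, abstractmethod

class ExchangeIntegration(ABC):
    @abstractmethod
    def list_token(self, token_id: str, disclosure_uri: str) -> bool:
        """Register a security token, pointing at its disclosure documents."""

    @abstractmethod
    def notify(self, token_id: str, event: str) -> None:
        """Push a corporate-action or compliance event to the exchange."""

class InMemoryExchange(ExchangeIntegration):
    """A toy implementation an issuer could test against."""

    def __init__(self):
        self.listings = {}  # token_id -> disclosure URI
        self.events = []    # (token_id, event) log

    def list_token(self, token_id, disclosure_uri):
        self.listings[token_id] = disclosure_uri
        return True

    def notify(self, token_id, event):
        self.events.append((token_id, event))
```

The point of such an interface, in Rodriguez's framing, is that competition happens in the implementations behind it, while the interactions between participants stay standardized.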
All in all, Rodriguez says we need a shift from security token standards to the actual tokenization of securities:
“At this moment, we don’t need standards, we need more and better security tokens. After all, a security token standard that hasn’t been implemented in any security tokens is the definition of an oxymoron.”
What do you think of Jesus Rodriguez’s remarks on the current state of security token standards? Does the industry need to shift focus when it comes to standardization, or is implementation just around the corner? We’d love to know what you think in the comments below.
Image courtesy of CIO Review.