DTCC Rejects ‘Walled Garden’ Approach in Its Tokenization Strategy, Digital Assets Chief Says
The Depository Trust & Clearing Corporation (DTCC), the backbone of U.S. securities settlement, is positioning its tokenization strategy to be network-agnostic and interoperable—rather than locked into any single blockchain or proprietary ecosystem.
Handling around $10 trillion in securities transactions each day, DTCC is approaching digital assets with the caution expected of a systemically important market utility. Yet it’s also signaling a clear willingness to plug into multiple networks, provided that risk management and data standards are not compromised, according to Nadine Chakar, Global Head of DTCC Digital Assets.
Speaking at a virtual forum, Chakar emphasized that DTCC’s vision for tokenized securities is explicitly opposed to building closed, isolated systems.
“We’re not building walled gardens,” she said. For Chakar, true interoperability means “being able to move things seamlessly from one chain to another, without risk [or] extra expenses.” The goal is a market where tokenized instruments can circulate across platforms and technologies without fragmentation or friction.
Tokenization Without Network Lock-In
DTCC’s strategy is not tethered to a single blockchain, consortium, or vendor. While many tokenization projects in traditional finance have debuted on private or permissioned chains, DTCC is signaling that it wants flexibility: integration with different networks, different models, and potentially both public and private infrastructure.
That stance matters because of DTCC’s central role in post-trade infrastructure. If the firm were to select a single preferred chain and push all activity there, it could tilt the entire market. Instead, by keeping its approach network-neutral, it is leaving room for competitive innovation among technology providers, while insisting that any solution must meet stringent risk, security, and data requirements.
Chakar also acknowledged that the company’s vision is inevitably influenced by its legacy and regulatory obligations. DTCC’s systems were built to safeguard stability in highly regulated markets, and that mindset carries over into its experiments with digital assets. Tokenization, in this framework, is not about abandoning existing safeguards, but about embedding them in new rails.
Risk Controls and Data Standards Above All
While the firm talks openly about interoperability and open architecture, its priorities remain consistent with its traditional mandate: controlling systemic risk, ensuring accurate recordkeeping, and protecting market integrity.
Any tokenized securities platform associated with DTCC must:
– Respect existing regulatory frameworks and reporting obligations
– Preserve clear ownership records and settlement finality
– Maintain robust cybersecurity and resilience
– Support standardized data formats that can be audited, reconciled, and integrated with current systems
For DTCC, interoperability is not just a technical challenge—such as bridging tokens from one chain to another—but also a legal and data one: it must be unambiguous who owns what, under which jurisdiction, and with what rights at any given moment, even as assets move between networks.
This is why Chakar repeatedly ties interoperability to “without risk.” If tokenized assets can jump between chains but the process introduces ambiguity, loss of control, or inconsistent records, DTCC would see that as unacceptable.
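The data-standards point above can be illustrated with a toy sketch (all names and fields here are hypothetical illustrations, not any actual DTCC schema): an ownership record whose legal attributes must carry over unchanged when the asset migrates to another network, so the record stays auditable and reconcilable after the move.

```python
from dataclasses import dataclass, replace
from datetime import datetime, timezone

@dataclass(frozen=True)
class OwnershipRecord:
    """Toy ownership record for a tokenized security (illustrative only)."""
    isin: str          # instrument identifier, stable across networks
    owner: str         # legal owner of record
    jurisdiction: str  # governing law for the holding
    rights: str        # e.g. voting / income entitlements
    network: str       # chain the token currently lives on
    quantity: int
    as_of: datetime    # point in time the record is asserted for

def migrate(record: OwnershipRecord, new_network: str) -> OwnershipRecord:
    """Move the asset to another network. Only the network (and the
    timestamp) may change; owner, jurisdiction, rights, and quantity
    must carry over intact, or the move would create ambiguity."""
    return replace(
        record,
        network=new_network,
        as_of=datetime.now(timezone.utc),
    )
```

The frozen dataclass mirrors the requirement that records be immutable facts to audit against, rather than mutable state that can drift between systems.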
From Legacy Rails to Tokenized Markets
DTCC’s infrastructure is deeply rooted in the traditional post-trade environment—centralized books and records, established clearing processes, and compliance with regulators across multiple jurisdictions. Chakar doesn’t deny that heritage; instead, she frames tokenization as an evolution on top of a proven foundation.
In practice, this means DTCC is more likely to:
– Build connective tissue between legacy systems and new blockchain-based platforms
– Experiment with tokenized representations of traditional instruments (like equities, bonds, or funds) rather than purely native crypto assets
– Focus on incremental, production-grade use cases rather than speculative experiments
The firm’s stance reflects a broader trend in institutional finance: tokenization isn’t being pursued as a radical break, but as a way to modernize existing markets—improving settlement, collateral mobility, and operational efficiency—while maintaining oversight.
What Interoperability Really Means for Institutions
When Chakar talks about seamless movement of assets “from one chain to another,” she’s pointing at one of the biggest unresolved issues in digital asset markets. Today, tokenized instruments and stablecoins often live in siloed environments, where a version on one chain is not easily fungible with the same asset on another.
For large financial institutions, this fragmentation creates:
– Operational complexity: multiple versions of the same asset must be tracked and reconciled
– Liquidity fragmentation: order books and pools are split across chains
– Additional cost: specialized infrastructure is required for each network
DTCC’s vision of interoperability implies a future where an institution can hold or transact a tokenized security without caring which underlying chain it currently lives on—or can switch networks as needed, without introducing new risk or incurring high switching costs.
That requires not only technical bridges, but trusted governance, clear legal frameworks, and consistent data models across all enabled networks.
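The reconciliation burden described above can be sketched in a few lines (a toy model with made-up instrument and chain names, not any real institutional system): holdings of one instrument fragmented across chains must sum back to a single canonical position, and any instrument where they do not is a break that someone has to investigate.

```python
from collections import defaultdict

def reconcile(canonical: dict[str, int],
              chain_holdings: list[tuple[str, str, int]]) -> list[str]:
    """Compare a canonical book of records against per-chain token
    balances. chain_holdings rows are (instrument, chain, quantity).
    Returns the instruments whose cross-chain total disagrees with
    the canonical position."""
    totals: dict[str, int] = defaultdict(int)
    for instrument, _chain, qty in chain_holdings:
        totals[instrument] += qty
    return sorted(
        inst for inst in canonical
        if totals.get(inst, 0) != canonical[inst]
    )

holdings = [
    ("BOND-A", "chain-1", 600),
    ("BOND-A", "chain-2", 400),   # two versions of the same asset
    ("FUND-B", "chain-1", 250),   # 50 units unaccounted for
]
breaks = reconcile({"BOND-A": 1000, "FUND-B": 300}, holdings)
# BOND-A reconciles (600 + 400 = 1000); FUND-B is a break
```

Every additional network multiplies the rows to aggregate here, which is the operational-complexity cost the list above names.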
Why DTCC’s Position Matters for Tokenization
Because of its central role in the global securities plumbing, DTCC’s approach could heavily influence how tokenization evolves in mainstream finance.
If it succeeds in building interoperable, standardized tokenization rails, several outcomes become more likely:
– Broader institutional adoption: Banks, asset managers, and brokers may be more comfortable using tokenized instruments if they are integrated into familiar DTCC workflows.
– Reduced fragmentation: Instead of dozens of isolated tokenization platforms, markets may converge around common standards and interoperable infrastructures.
– Regulator-friendly innovation: By embedding tokenization in existing oversight frameworks, DTCC can offer a path that balances innovation with investor protection and systemic stability.
Conversely, if major market utilities were to choose closed, proprietary models, tokenization could devolve into isolated pockets of liquidity, limiting its transformative potential for capital markets.
Public vs. Private Chains: A Neutral, Conditional Stance
Although the company has not explicitly endorsed one category of blockchain over another, its emphasis on risk and controls suggests that permissioned or highly governed environments will play an important role in early implementations.
However, by rejecting “walled gardens,” DTCC leaves open the possibility that public networks, or hybrid architectures, could be involved—so long as they can meet institutional-grade standards around:
– Security and resilience
– Regulatory compliance
– Identity, KYC, and AML controls
– Governance and change management
In this sense, DTCC is not publicly staking out a maximalist position on any specific technology. Instead, it’s defining the conditions under which any technology—public, private, or hybrid—can plug into its tokenization model.
Implications for Market Structure and Competition
An open, interoperable tokenization framework backed by a central market utility could reshape competition across financial infrastructure providers.
On one hand, interoperability can:
– Lower switching costs for institutions
– Prevent lock-in to a single vendor or chain
– Encourage innovation at the application and protocol layers
On the other hand, it sets a high bar for technology providers hoping to integrate with DTCC-linked systems—especially around compliance, reliability, and data integrity. Players that cannot meet those standards may find themselves sidelined, regardless of their technical sophistication.
For issuers and asset managers, an interoperable model could make it more attractive to tokenize products, knowing they are not constrained to one platform and can reach multiple venues and investors under a unified framework.
The Balancing Act: Innovation vs. Systemic Responsibility
DTCC’s messaging underscores the central tension facing traditional financial institutions exploring digital assets: how to leverage the efficiency and programmability of blockchain technology without introducing new forms of systemic risk.
Chakar’s comments suggest a deliberate, stepwise approach:
– Start from established obligations to safeguard markets
– Experiment with tokenization in controlled settings
– Demand interoperability to avoid fragmentation
– Refuse to sacrifice risk management for speed or hype
This is not the fast-moving world of retail crypto speculation; it is a slow rebuild of core market infrastructure. But precisely because DTCC sits at the center of the system, even small moves toward tokenization can have outsized impact.
A Long-Term, Infrastructure-Level Transformation
While the company’s approach may appear conservative compared to crypto-native projects, it aligns with the scale and responsibilities of a clearance and settlement giant.
By insisting it is “not building walled gardens” and by defining interoperability as movement “without risk [or] extra expenses,” DTCC is signaling the kind of market it wants to help build: one in which tokenized assets are not stranded in isolated silos, but integrated into a coherent, multi-network financial fabric.
If that vision is realized, tokenization will not just be a new speculative niche; it could become a standard feature of how securities are issued, traded, and settled across the global financial system—rooted in the past, but architected for a more connected, programmable future.
