# Minting and Trading Data-Backed Tokens

## Overview

Data tokenization allows you to connect your data to the wider DeFi ecosystem while retaining the rights to your data and its privacy.

Data tokenization is a lot like creating an NFT except the tokenized asset is data, and ownership of the token does not necessarily imply ownership of the underlying data. We refer to the tokens created through data tokenization as data-backed tokens (DBTs) to emphasize that the value of the token is derived from the data itself.

This tutorial presents a speed-run of data tokenization: creating a token, tokenizing a data asset, and transferring rights to the asset by transferring the token. It assumes that you're already familiar with the rest of Parcel's features, including documents, grants, and jobs. If you can follow what's going on in the Running Compute Jobs tutorial, you should be good to go. If you'd like a more detailed overview (is that an oxymoron?), there's also the Parcel Primer for your consideration.

## Example

Consider the problem of buying digital artwork: artists don't want to release high-res copies until the buyer has paid, but the buyer doesn't want to pay without assurance of receiving the art. This issue actually applies to NFTs on public blockchains, and the less-than-ideal workaround is to use the off-chain marketplace as an escrow service. A better solution could use Parcel:

  1. Artist uploads private, high-res art to Parcel.
  2. Artist adds the data to a token that grants its holder full access.
  3. Artist runs (or approves) a Parcel compute job that creates an auditable low-res preview.
  4. Token is bridged to Ethereum where it gets listed in a DEX contract.
  5. Buyer purchases the bridged token on Ethereum and automatically receives the token on Parcel, where the data can be downloaded.

In this way, the buyer can be assured that the art will be delivered and the seller is guaranteed to receive payment.
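The atomic-swap idea behind steps 4 and 5 can be sketched in plain JavaScript. This is an illustration of the escrow concept only, not the Parcel or bridge API; the `EscrowSale` class and all names in it are hypothetical:

```javascript
// Hypothetical sketch of the sale in steps 4-5: the sale contract releases
// the token to the buyer and the payment to the seller in a single step, so
// neither party can end up holding both the token and the money.
class EscrowSale {
  constructor(seller, tokenId, priceWei) {
    this.seller = seller;
    this.tokenId = tokenId;
    this.priceWei = priceWei;
    this.settled = false;
  }

  // Called by the buyer; succeeds only with the exact payment attached.
  buy(buyer, paymentWei) {
    if (this.settled) throw new Error('sale already settled');
    if (paymentWei !== this.priceWei) throw new Error('wrong payment amount');
    this.settled = true;
    // Both transfers happen in the same "transaction": the token goes to the
    // buyer and the payment to the seller; a failure would revert both.
    return { tokenOwner: buyer, paidTo: this.seller, amount: paymentWei };
  }
}

const sale = new EscrowSale('bob', 'art-token-1', 1000n);
const result = sale.buy('acme', 1000n);
console.log(result.tokenOwner); // 'acme'
```

In the real flow, this role is played by an on-chain DEX contract holding the bridged token, which is why neither party needs to trust the other.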

Data tokenization can also be used for things like credit scoring, monetizing personal data, and basically anything that looks like making trades based on (private) data. Whatever you'd like to build, this series of tutorials aims to get you started in the right direction!

## Mint a Token

The fun begins when you mint a token. At the very least, the token must have a supply and an associated grant that will be given to holders of the token. You can also set the token's name and some other things, but don't worry about those now; they'll be the subject of later tutorials. When a token is first minted, it doesn't contain any data assets. You can still transfer it to others and, when assets are added, the holders will automatically have access according to the token's grant.

Here's how that looks:

```js
console.log('Creating a new data-backed token owned by Bob, the artist.');
const artToken = await parcelBob.mintToken({
  name: "Bob's Saguaro Sunset painting",
  grant: {
    condition: null, // Allow full access to anyone holding the token.
  },
  supply: 1, // It's an NFT.
});
console.log('The art token is:', artToken);
```

## Tokenize a Data Asset

For data to be tokenizable, it must be owned by the Parcel escrow identity, a special identity that can't be accessed by anyone and exists only to hold data assets: a programmatic trusted third party, if you will. Transferring ownership of data to the escrow identity works like any other ownership change: you forgo the ability to access the data, or to grant others access, unless you hold a grant yourself. There is one key difference, however: when you hold a token's entire supply, you can de-tokenize the asset and reclaim it from the escrow identity. The escrow identity is necessary to guarantee that token holders won't have assets pulled out from under them.
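The ownership rules described above can be modeled in a few lines of plain JavaScript. This is a conceptual sketch, not the Parcel API; `DataToken` and every name in it are hypothetical. It captures the two invariants: an asset must be in escrow before it can be tokenized, and only a holder of the entire supply may reclaim it.

```javascript
// Hypothetical model of the escrow rule: assets added to a token must be
// owned by a special 'escrow' identity, and only someone holding the token's
// entire supply can de-tokenize (reclaim) them.
class DataToken {
  constructor(supply, minter) {
    this.supply = supply;
    this.balances = new Map([[minter, supply]]);
    this.assets = new Set();
  }

  addAsset(assetId, assetOwners) {
    // The asset must already belong to the escrow identity.
    if (assetOwners.get(assetId) !== 'escrow') {
      throw new Error('asset must be owned by the escrow identity');
    }
    this.assets.add(assetId);
  }

  detokenize(holder, assetOwners) {
    // Holding the full supply is required to pull assets back out of escrow.
    if ((this.balances.get(holder) ?? 0) !== this.supply) {
      throw new Error('must hold the entire supply to de-tokenize');
    }
    for (const assetId of this.assets) assetOwners.set(assetId, holder);
    this.assets.clear();
  }
}

const owners = new Map([['doc-1', 'escrow']]);
const token = new DataToken(1, 'bob');
token.addAsset('doc-1', owners);
token.detokenize('bob', owners); // Bob holds the full supply, so this works.
console.log(owners.get('doc-1')); // 'bob'
```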

Once that important prerequisite is fulfilled, tokenization is quite simple:

```js
console.log('Create a new document and prepare it for tokenization.');
const artContents = String.raw`
  ........::::::::::::..           .......|...............::::::::........
     .:::::;;;;;;;;;;;:::::.... .     \   | ../....::::;;;;:::::.......
         .       ...........   / \\_   \  |  /     ......  .     ........./\
...:::../\\_  ......     ..._/'   \\\_  \###/   /\_    .../ \_.......   _//
.::::./   \\\ _   .../\    /'      \\\\#######//   \/\   //   \_   ....////
    _/      \\\\   _/ \\\ /  x       \\\\###////      \////     \__  _/////
  ./   x       \\\/     \/ x X           \//////                   \/////
 /     XxX     \\/         XxX X                                    ////   x
-----XxX-------------|-------XxX-----------*--------|---*-----|------------X--
       X        _X      *    X      **         **             x   **    *  X
      _X                    _X           x                *          x     X_
`;
const artDocument = await parcelBob.uploadDocument(artContents, {
  owner: 'escrow', // ⚠️  The data must be owned by the escrow identity to be tokenized. This can be done after uploading, too.
  toApp: undefined,
}).finished;

console.log('Add the document to the token.');
await artToken.addAsset(artDocument.id);
// More data assets can also be added (by anyone).
```

## Transfer Access by Transferring a Token

And now, the piece that we've all been waiting for: granting data access to the holder of the data token!

```js
// Bob transfers the token.
const transferReceipt = await parcelBob.transferToken(
  artToken.id,
  1, // Transfer one token, the entire supply.
  acmeIdentity.id,
);
console.log('Receipt of token transfer to Acme, the buyer:', transferReceipt);

// Acme can now download the underlying data.
const artChunks = [];
const artDownload = parcelAcme.downloadDocument(artDocument.id);
for await (const chunk of artDownload) {
  artChunks.push(chunk);
}

console.log("Acme now has access to the art! (And Bob doesn't anymore.)");
assert.strictEqual(Buffer.concat(artChunks).toString(), artContents);
```


You can view the full example in the Parcel Examples repository.