Polkadot Vault

Polkadot Vault - Turn your smartphone into a hardware wallet

Polkadot Vault is a mobile application that allows any smartphone to act as an air-gapped crypto wallet. This is also known as "cold storage".

You can create accounts in Substrate-based networks, sign messages/transactions, and transfer funds to and from these accounts without any sort of connectivity enabled on the device.

You must turn off, or even physically remove, the smartphone's Wifi, Mobile Network, and Bluetooth modules to ensure that the phone containing these accounts is never exposed to online threats. Switching to airplane mode suffices in many cases.

Disabling the mobile phone's networking abilities is a requirement for the app to be used as intended; check our wiki for more details.

Have a look at the tutorial on our wiki to learn how to use Polkadot Vault together with Polkadot-js app.

Any data transfer from or to the app happens via QR codes. This way, the most sensitive piece of information, the private keys, never leaves the phone. The Polkadot Vault mobile app can be used to store any Substrate account, including accounts on the Polkadot (DOT) and Kusama (KSM) networks.

Key features

  • This is not a complete crypto wallet in itself. The Vault does not sync with the blockchain, so it does not know your account balance, whether transactions were successful, or even whether the account exists! It is a cold wallet app that only stores keys and reads and signs messages. It should always be used together with a hot wallet such as polkadot.js.
  • The Vault alone does not make your accounts secure. You must maintain security yourself. An air gap should be only one part of your security protocol; improper use of Vault could still lead to loss of funds and/or secrets.
  • When properly used, Vault provides the best security achievable on Substrate networks to date.

System requirements

Currently, Vault is available only for iOS. An Android version is coming soon.

Getting Started

These tutorials and docs are heavily outdated at the moment; please treat them as references, or help us improve them.

If you are upgrading from an older version of Vault, please see the changelog and Upgrading Vault.

Please note that the Vault app is an advanced tool designed for maximum security and complex features. In many use cases, more user-friendly tools would be sufficient.

Getting started guide

User Guides


Legacy versions

Older versions of this app may be useful for development; however, they are not safe for use in production. They are available at the following branches:


Polkadot-Vault is GPL 3.0 licensed.



What is Vault?

Vault is an app for an air-gapped device: it turns an offline device, usually a smartphone, into a secure hardware wallet. Vault offers you a way to securely generate, store, manage, and use your blockchain credentials.

Should I use Vault?

Vault is optimized for the highest security requirements. If you already manage many accounts on multiple networks, Vault is great for you. If you have little experience with blockchain networks but still want good security affordances, you might find the learning curve steep. We strive to make Vault as intuitive as possible; get in touch via signer@parity.io or GitHub Issues if you can help us get there!

How does an offline device communicate with the outside world?

Communication happens through scanning and generating QR codes. Input QRs scanned with Vault interact with the keys stored in Vault to generate response QRs on behalf of those keys. Usually, an input QR is a blockchain transaction, and the response QR is a signature for that transaction. Tried-and-true cryptographic algorithms power these QR codes, along with some smart engineering that makes your dedicated device safe to use.

How do I keep my keys secure?

Vault is a safe way to use your keys. However, that alone won't be enough to keep your keys secure. Devices break and get lost. This is why we always recommend backing up your seed phrases and derivation paths on paper. We are such big fans of paper backups that we even support a dedicated tool, Banana Split, which strengthens your paper backups by splitting them into shards.

How do I know I am not interacting with malicious apps or actors?

The Vault does not interact with a network. The app itself has no way to check whether an app or an account you're interacting with is malicious. If you use Vault with the PolkadotJS Browser Extension, PolkadotJS Apps, or the Signer Component Browser Extension, they rely on a community-driven, curated list of potentially less-than-honest operators (https://polkadot.js.org/phishing/#) to prevent you from interacting with certain sites and addresses. However, there are no limitations on using Vault with other tools.

I want to play with Vault to get a better feeling of how it works. Is there a way to do it without spending valuable tokens?

Yes. In Vault, add a key for an address on the Westend network and request test tokens for that address; see the step-by-step guide on the Polkadot Network Wiki.

You can use test tokens in the same way you would use value-bearing tokens.

For example, with PolkadotJS Apps you can create a transaction on behalf of your account, generate a signature with Vault, and submit it to the network. All of this happens without your keys ever leaving the offline device.


What networks does Vault support?

Off-the-shelf, Polkadot Vault supports the Polkadot, Kusama, and Westend networks. But it is not limited to these networks: more experienced users can generate metadata for any network to expand the capability of Polkadot Vault.

How can I update metadata version for a network?

Parity verifies and publishes recent metadata versions on the Metadata Update Portal. With an off-the-shelf Vault, you can scan one of the multipart QR "movies" the same way you scan a transaction QR: in Vault, open the scanner, scan the QR for the respective network, and accept the new metadata.

Currently, Metadata Update Portal follows Polkadot, Kusama, and Westend network metadata updates. Parity is open to collaboration with participants of other networks and is currently exploring safe and more decentralized ways of publishing verified metadata.

If you want to update networks that you've added manually, please follow the Add Metadata steps in Add New Network guide.

Why do I need to update network metadata versions at all?

It's a safety feature. Substrate-based blockchain networks can be updated and otherwise changed; without a recent metadata version for a network, Vault cannot parse a transaction correctly, so you cannot read and verify what you are signing. Since Vault is an app for an air-gapped device, you have to update the network metadata by using the camera.

How can I add a new network to Vault?

Parity verifies and publishes network specs on the Metadata Update Portal. To add one of the listed networks, click "Chain Specs" in the Metadata Update Portal and scan the network specs QR the same way you scan a transaction QR: in Vault, open the scanner, scan the QR, and accept the new network spec. Then scan the multipart QR "movie" containing recent metadata for this network.

Can I add a network that does not have network specs and metadata QR published anywhere?

Yes. Follow the Add New Network step-by-step guide.

Currently, the process requires you to have rust, subkey, and a clone of the parity-signer repository on your machine.

Seeds and keys

Can I import my keys from polkadot{.js} apps or extension to Polkadot Vault?

Yes. Keys are compatible between polkadot{.js} and Polkadot Vault, except for the keys generated with Ledger (BIP39). To import seed keys into Polkadot Vault, you need to know:

  1. Seed phrase
    It should always be backed up on paper!
  2. The network you are adding the address to, and whether the Polkadot Vault installed on your device has metadata for that network.
    If (2) is not one of the default built-in networks, you will need to add the network yourself or find a distribution center for adding networks.
  3. Derivation path
    Only needed if you are importing a derived key; keys generated with polkadot{.js} are usually seed keys.

In Polkadot Vault, go to Keys, press the "Plus" icon in the top right of the screen, select "Recover seed", enter a display name to identify your seed, press "Next", and enter the seed phrase. Done, you've got your seed key imported!
If you are importing a derived key, select the seed from which your key is derived, select the account's network, press the "Plus" icon next to "Derived keys", and enter your derivation path.
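Before typing a phrase in, a quick word-count sanity check can catch transcription mistakes. The snippet below is a hypothetical helper, not part of Vault, and it checks only the word count, not the BIP39 wordlist or checksum; the phrase shown is the well-known Substrate development phrase and must never be used for real funds.

```shell
# A BIP39 seed phrase has 12, 15, 18, 21 or 24 words.
phrase="bottom drive obey lake curtain smoke basket hold race lonely fit walk"
set -- $phrase   # split the phrase into positional parameters
case "$#" in
  12|15|18|21|24) echo "plausible word count: $#" ;;
  *)              echo "unexpected word count: $#" >&2 ;;
esac
```

Vault itself performs full BIP39 validation when you enter the phrase, so this is only a pre-flight check on your transcription.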

What is the difference between seed key and derived key? Why should I use derived keys?

A seed key is a single key pair generated from a seed phrase. You can “grow” as many derived keys as you like from a single seed by adding derivation paths to your seed phrase.

Learn more about types of derivation paths on substrate.io.

A derivation path is sensitive information, but knowing the derivation path alone is not enough to recover a key. A derived key cannot be backed up without both ingredients: the seed phrase (which can be shared between multiple keys) and the derivation path (unique to each key “grown” from that seed).

The main reason to use derived keys is how easy it is to back up (and restore from a backup) a derivation path compared to a seed phrase.

What is an identicon, the image next to my keys?

An identicon is a visual hash of a public key, a unique picture generated from your public key. The same public key should have the same identicon regardless of the application. It is a good tool for quickly distinguishing between keys. However, when interacting with keys, e.g. verifying the recipient of a transaction, do not rely on identicons alone; it is better to check the full public address.

How can I rename one of my seeds?

Due to security considerations, you cannot rename a seed. Instead, please back up the seed and its derived keys, remove the seed, and add it again under a new name.

Security and Privacy

Device security

Polkadot Vault is built to be used offline. The mobile device used to run the app will hold important information that needs to be kept securely stored. It is therefore advised to:

  • Get a separate mobile device.
  • Make a factory reset.
  • Enable full-disk encryption on the device, with a reasonable password (might not be on by default, for example for older Android devices).
  • Do not use any kind of biometrics such as fingerprint or face recognition for device decryption/unlocking, as those may be less secure than regular passwords.
  • Once the app has been installed, enable airplane mode and make sure to switch off Wifi, Bluetooth or any connection ability of the device.
  • Only charge the phone on a power outlet that is never connected to the internet. Only charge the phone with the manufacturer's charging adapter. Do not charge the phone on public USB chargers.

How to get it and use it?

Install the app

The app is available in beta for Android and iOS:

Please double-check the origin of the app carefully, and make sure that the company distributing it is Parity Technologies. The usual security advice applies to this air-gapped wallet:

  • When creating an account using Polkadot Vault Mobile app, make sure to write down the recovery phrase and store it in safe places.
  • Always double check the information of the transactions you are about to sign or send.
  • Make sure to first transfer a small amount of funds with the app and verify that everything is working as expected before transferring larger amounts.

How to update Polkadot Vault securely

Once Polkadot Vault is installed, your device should never go online; doing so would put your private keys at risk. To update, you will need to:

  1. Make sure you possess the recovery phrase for each of your accounts. You can find it in Polkadot Vault by:
  • v4.0 choosing an identity > click the user icon at the top right > “Show Recovery Phrase”
  • v2.2 tapping an account > 3 dots menu at the top right > “Backup Recovery Phrase”
  • v2.0 tapping an account > tap on the account address > “Backup Recovery Phrase”
  2. Factory reset the device.
  3. Enable full-disk encryption on the device and set a strong password (might not be on by default, for example for older Android devices).
  4. Do not use any kind of biometrics such as fingerprint or face recognition for device decryption/unlocking, as those may be less secure than regular passwords.
  5. Install Polkadot Vault from the Apple store or Android store, or download the APK from Polkadot Vault's Github repository (make sure you are on the right website and verify the checksum).
  6. Once the app has been installed, enable airplane mode and make sure to switch off Wifi, Bluetooth, and any other connection ability the device has.
  7. Only charge the phone on a power outlet that is never connected to the internet. Only charge the phone with the manufacturer's charging adapter. Do not charge the phone on public USB chargers.
  8. Recover your accounts.
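The checksum verification mentioned in the install step can be sketched as follows. The file name and digest here are placeholders; use the actual artifact and the SHA-256 checksum published on the release page.

```shell
# Sketch: compare a downloaded file's SHA-256 digest against the
# published one. Returns success only on an exact match.
verify_checksum() {
  # $1 = downloaded file, $2 = expected hex digest from the release page
  actual=$(sha256sum "$1" | cut -d' ' -f1)
  [ "$actual" = "$2" ]
}

# Usage with the real artifact and published digest (placeholders here):
#   verify_checksum polkadot-vault.apk "<published-sha256>" && echo "checksum OK"
```

If the digests differ, do not install the file: it may be corrupted or tampered with.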

What data does it collect?

None, it's as simple as that. The Polkadot Vault Mobile Android and iOS apps do not send any sort of data to Parity Technologies or any partner and work completely offline once installed.

Polkadot Vault Accounts Management

Polkadot Vault v4 has introduced the Hierarchical Deterministic Key Derivation (HDKD) feature for Substrate networks. This article explains how to use this feature.

  • Notice: The UI may vary between versions, but the functionality is the same across v4 versions.


A seed is the starting point for generating accounts. The seed itself does not have any network affiliation. For Substrate networks, generating a new account means entering a derivation path and choosing a network. With this feature, you can manage as many accounts as needed with just one safely stored seed phrase.

Key Generation

Create an account for a Substrate-based network.

Key generation (that is, account creation) works as follows with your created Identity:

  • Go to key manager and create a new seed or select an existing one
  • Choose a network
  • Tap on any key
  • Tap Derive or N+1 Button
  • In the path derivation screen, input any path and name you like (or accept the naming suggestion)
  • (optional) type a password
  • Tap the Derive Button
  • Done, you can start using the new address.

The form of path

Paths also refer to the chain codes described in schnorrkel [1], though the format differs from BIP-32 style:

  • Soft derivation starts with a single slash, like: /soft
  • Hard derivation starts with a double slash, like: //hard

Users are able to create any combination of hard derivation with // and/or soft derivation with /.

The encoded string is limited to 32 bytes.

For technical information about soft and hard derivations on Substrate, please refer to the introduction here.

A path can also contain an optional password; in the Subkey standard, the password is prefixed with ///. However, for convenience, the Vault device has a separate password entry field with password confirmation. Do not add /// to the derivation field, as it will result in an error; instead, omit /// and type the password into its dedicated field. The password is not stored on the device and will be required for any operation that needs the account's private key. There is no way to restore this password if it is lost, so please back it up carefully.
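The rule above can be illustrated with a small pre-check. This helper is hypothetical, not part of Vault; it only mirrors what the app rejects when a Subkey-style password suffix appears in the derivation field.

```shell
# Hypothetical helper mirroring Vault's rule: the derivation field must
# not contain a "///password" suffix - the password belongs in the
# separate password field instead.
check_derivation() {
  case "$1" in
    *///*) echo "error: put the password in the password field, not the path" >&2
           return 1 ;;
    *)     return 0 ;;
  esac
}

check_derivation "//polkadot/0" && echo "path accepted"   # prints "path accepted"
```

A field like `//hard///secret` would be rejected; enter `//hard` as the path and `secret` in the password field.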

Further notes

  • With the same BIP32 seed, users can create keys on different networks.
  • Each derived account is bound to certain networks, which prevents it from being misused in another network until it is explicitly added for that network as well. The root account is available for all networks by default.


  1. https://github.com/w3f/schnorrkel
  2. https://wiki.polkadot.network/docs/en/learn-keys



Android version release, minor fixes

New in version 5.0.0


No more TypeScript or React Native. The backend is written entirely in Rust; the frontends are native.



The number of dependencies was greatly reduced; there is no npm/yarn/nodejs/cocoapods, etc. All dependencies are handled by:

  • Cargo (rust packages)
  • Xcode (only default iOS frameworks are used)
  • Gradle

Rust backend

The Rust libraries were moved back into the repository. Crypto functions are imported from Substrate. All logic and most of the storage is written in Rust. An important hack here is that the rust/signer crate has two versions of Cargo.toml, one for the Android and one for the iOS architecture, as target library features could not be adjusted by normal means.

Native frontend

The frontends for both iOS and Android were rewritten in native frameworks. Thus, standard out-of-the-box build scripts can be used for building once the Rust libraries are built and linked.


Secure seed storage

Secrets are stored in the device's encrypted storage, and some effort is made to prevent them from leaking into system memory. Thus, everything is as safe as the phone itself: the same credentials used for unlocking the phone are used to unlock the seeds. The user is responsible for keeping those credentials adequate.

Transaction preview

Transaction content is shown before signing; signing bare hashes is not allowed, but signing messages is possible.

History feature

The Vault now logs all operations it performs. It is important to remember that this is not a log of account operations, but a log of device history. This history can be cleared if needed, but cannot be modified by other means. Any detected presence of a network connection is also logged.

N+1 derivation

A much-requested feature that makes Vault automatically increment numbered keys on creation.

Network and metadata updates

All network data updates can now be performed by scanning QR codes. Whenever an update is needed, you most probably just need to scan a QR video. Don't worry about skipped frames: it is a fountain code, so you only need enough frames, in any order.

All updates can be signed, and the signing key is trusted on first use, so a Vault device should be linked to a single source of authority on correct metadata.

Key re-use in different networks

A key can be used in only one network. Need to re-use a key in another network? Just create a key with the same derivation path in that network to allow re-use, and it will work.

User Guides

Starting with Vault

This is the suggested usage pattern; adjust it to your own security protocol only if you are certain you know what you are doing.


Factory reset the phone

The Vault should be installed in the most secure environment possible. To achieve that, the phone should be reset to its factory state.

Wipe the phone to factory state. This is a good time to install a newer version of the operating system if you like. Make sure your system is genuine by all means provided by the OS vendor.

Set up phone

Before installing the Vault, you need to set up the phone. It is essential that you enable a sufficient authentication method; your secret seeds in Vault are only as safe as the phone is. Seed secrets are protected with hardware encryption based on the vendor's authentication protocol. Other than that, you might want to select dark mode (Vault remains dark for historic reasons).

Install Vault

Download the signed application through an application store or from GitHub. Make sure the signature is valid! Install the app. Do not start the app just yet!

Disable network

Before starting the Vault, you should make sure that all connectivity is disabled. Many operating systems allow only partial network monitoring; although there are network detection features in Vault, they are limited and serve only an informational function. The user is responsible for maintaining the air-gapped state! The simplest way to disable connectivity is to set the phone to airplane mode. Advanced users might want to use physical methods to further protect the phone from connections. Perform all preparations before starting the Vault app!

First start

When you first launch Vault, it prompts you to read and accept the terms and conditions and the privacy policy. Once that is done, the database is pre-populated with built-in networks and Vault is ready for use. It can import network data and read transactions, but to sign anything you need to create keys.

Create keys

Open the Key Manager by tapping the bottom-left symbol. On a fresh start you will be prompted to create a seed (otherwise, you can always create more seeds by tapping the New seed button in the Key Manager). Enter any convenient seed name (it carries no meaning and is not used anywhere except on this particular Vault device) and, if you would like to use a custom seed phrase, switch to recovery mode and type the seed phrase. A custom seed phrase should be used only to recover or import existing keys; do not input a custom seed phrase unless it is properly random! The security of your accounts relies on the randomness of the seed phrase. If you are generating a new seed phrase, use the built-in random generator and do not input a custom seed phrase.

Once you tap the create button, you will be prompted to authenticate yourself. This will happen every time the cryptographic engine of the phone is used to handle seeds: on all creations, backups, derivations, and signatures, and in some OS versions on starting the Vault.

You will see the created secret seed. Please back it up on paper and store it in a safe place. If you lose your Vault device or it becomes non-functional, you will be able to recover your keys using this seed phrase. Anyone who knows this phrase can recover your keys. If you lose this seed phrase, though, it will be impossible to recover your keys. You can check the seed phrase anytime in the Settings menu, but make sure that it is backed up at all times.

Once you dismiss the seed phrase backup screen, the seed and some associated keys will be created. For every network known to the Vault, a network root derivation key will be generated, hard-derived from the seed phrase with the network name. A root key will also be generated and made available in all networks. Do not use the root key unless you know what you are doing!

To learn more about key generation, read the subkey specifications, which Vault follows tightly, and the Vault key management docs.
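The network-root derivation described above is, per the Subkey convention, plain string assembly: the network name is appended to the seed phrase as a hard junction. A sketch, using the Substrate development phrase (never use it for real funds):

```shell
# Sketch of the convention Vault follows: a network root key is
# hard-derived by appending //<network name> to the seed phrase.
seed="bottom drive obey lake curtain smoke basket hold race lonely fit walk"  # dev phrase only
for network in polkadot kusama westend; do
  echo "${seed}//${network}"
done
```

Each printed string is a secret URI that a tool such as subkey could turn into the corresponding network root key pair.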

Export public key

Once you have a keypair you would like to use, you should first export it to a hot wallet. Tap the key and select the Export button. You will see the export QR code to use with the hot wallet.

Details on signing with Polkadot.js Apps

Upgrading Vault

First of all, you need to be certain you want to upgrade Vault. Starting from v5, all network information can be downloaded through QR codes, and upgrades are needed only to add new software features.

Preparation to upgrade

Back up your keys

Make sure your keys are backed up - you should have all seed phrases and derivations recorded somewhere. Once you proceed to the next step, there is no way to recover lost information.

Make sure to back up all keys for all networks you use. Ideally you should already have some kind of backup, make sure it is up to date.

Wipe Vault device

Once you are certain that you have backed up everything, open settings, select Wipe all data button and confirm your action. All data in the Vault will be factory reset; congratulations!

Factory reset the phone

When the Vault is removed, wipe the phone to factory state. This is a good time to install a newer version of the operating system if you like. Make sure your system is genuine by all means provided by the OS vendor.

Set up phone

Before installing the Vault, you need to set up the phone. It is essential that you enable a sufficient authentication method; your secret seeds in Vault are only as safe as the phone is. Seed secrets are protected with hardware encryption based on the vendor's authentication protocol. Other than that, you might want to select dark mode (Vault remains dark for historic reasons).

Install Vault

Download the signed application through an application store or from GitHub. Make sure the signature is valid! Install the app. Do not start the app just yet!

Disable network

Before starting the Vault, you should make sure that all connectivity is disabled. Many operating systems allow only partial network monitoring; although there are network detection features in Vault, they are limited and serve only an informational function. The user is responsible for maintaining the air-gapped state! The simplest way to disable connectivity is to set the phone to airplane mode. Advanced users might want to use physical methods to further protect the phone from connections. Perform all preparations before starting the Vault app!

Start the Vault

Start the app. Read and accept the information provided on launch. Congratulations! The app is ready to use now; however, it does not have any keys and only a few basic built-in networks. Proceed with setting up keys and adding new networks.

Add New Network

Polkadot Vault supports adding any Substrate-based network, and updating an existing network, via QR code.

After you've installed the required software, you need to add the Network Specs to Vault and then add the Network Metadata for this network, so that Vault can decode transactions and you can read and verify what you are signing on this network.

If you only need to update metadata for an already existing network, you only need to update the Network Metadata. An off-the-shelf Vault comes with networks that you can update by scanning the multipart QR codes containing recent metadata for these networks at the Metadata Update Portal.

Network Specs

  1. Get
  2. Sign
  3. Feed into Vault

Network Metadata

  1. Get
  2. Sign
  3. Feed into Vault


You will need:

  • Network details
    • RPC endpoint (websocket URL)
      Hint: You can find RPC endpoints for some of the public networks, e.g. in the polkadot-js/apps repository
    • Encryption algorithm
  • rust
  • a clone of parity-signer repository
  • subkey
  • Dedicated keypair specifically for signing updates
    Please make sure you have a backup of the <secret-phrase> and the Public key (hex) of this keypair. You will be able to update a network only with metadata that is signed by the same keypair as the network specs. You can generate the keypair with any tool of your choice, e.g. with subkey: subkey generate.

Let's get started!

Add Network Specs

Get Network Specs

In parity-signer/rust/generate_message

cargo run add-specs -u <network-ws-url> --encryption <crypto>

// e.g.
cargo run add-specs -u wss://statemint-rpc.polkadot.io --encryption sr25519

For networks supporting several tokens:

cargo run add-specs -d -u <network-ws-url> --encryption <crypto> --token-decimals <decimals> --token-unit <SYMBOL>

// e.g.
cargo run add-specs -d -u wss://karura-rpc-0.aca-api.network --encryption sr25519 --token-decimals 12 --token-unit KAR

Now your <specs-file> is in parity-signer/rust/files/for_signing.

Hint: you can read more about interface with hot database if you want to maintain it.

Sign Network Spec

Get signature

In parity-signer/rust/files/for_signing

cat <spec-file> | subkey sign --suri <seed-phrase-and-derivation>
// e.g.
cat sign_me_add_specs_statemint_sr25519 | subkey sign --suri "bottom drive obey lake curtain smoke basket hold race lonely fit walk//Alice"

This will return a <signature> you need to make a signed QR.

Make signed QR

In parity-signer/rust/generate_message

cargo run --release make --goal qr --crypto <crypto> --msg add-specs --payload <spec-file> --verifier-hex <public-key> --signature-hex <signature>
// e.g.
cargo run --release make --goal qr --crypto sr25519 --msg add-specs --payload sign_me_add_specs_statemint_sr25519 --verifier-hex 0x927c307614dba6ec42f84411cc1e93c6579893859ce5a7ac3d8c2fb1649d1542 --signature-hex fa3ed5e1156d3d51349cd9bb4257387d8e32d49861c0952eaff1c2d982332e13afa8856bb6dfc684263aa3570499e067d4d78ea2dfa7a9b85e8ea273d3a81a86

Now your <spec-qr> is in parity-signer/rust/files/signed

Feed Network Specs into Vault

In Vault open scanner, scan your <spec-qr> and approve chain specs.

Add Network Metadata

Get Network Metadata

In parity-signer/rust/generate_message

cargo run load-metadata -d -u `<network-ws-url>`
// e.g.
cargo run load-metadata -d -u wss://statemint-rpc.polkadot.io

This will fetch a fresh <metadata_file>, update the database with it, and, most relevant to us currently, generate a file with the message body in parity-signer/rust/files/for_signing.

Sign Network Metadata

Get Signature

In parity-signer/rust/files/for_signing

cat <metadata-file> | subkey sign --suri <seed-phrase-and-derivation>
// e.g.
cat sign_me_load_metadata_statemintV800 | subkey sign --suri "bottom drive obey lake curtain smoke basket hold race lonely fit walk//Alice"

Make signed QR

In parity-signer/rust/generate_message

cargo run --release make --goal qr --crypto <crypto> --msg load-metadata --payload <metadata-file> --verifier-hex <public-key> --signature-hex <signature>
// e.g.
cargo run --release make --goal qr --crypto sr25519 --msg load-metadata --payload sign_me_load_metadata_statemintV800 --verifier-hex 0x927c307614dba6ec42f84411cc1e93c6579893859ce5a7ac3d8c2fb1649d1542 --signature-hex 6a8f8dab854bec99bd8534102a964a4e71f4370683e7ff116c84d7e8d5cb344efd3b90d27059b7c8058f5c4a5230b792009c351a16c007237921bcae2ede2d84

This QR might take some time to generate. After it is finished, you can find your <metadata-qr> in parity-signer/rust/files/signed. It is a multipart QR "movie"; if your image viewer does not render it correctly, we suggest opening it in a browser.

Feed Network Metadata into Vault

In Vault open scanner, scan your <metadata-qr> and accept new metadata.

Congratulations! You've fetched network specs, signed them, fed them into Vault, fetched recent metadata for the network, signed and fed it into Vault as well. Now you are ready to safely sign transactions on this network.

Polkadot Vault tutorial with Polkadot-js apps

This tutorial will walk you through setting up a Kusama account with the Polkadot Vault Android or iOS app and then using this account together with Polkadot-js apps to see your balance, transfer funds, or perform any extrinsic from this account.

  • Notice: The UI may vary between versions, but the functionality is the same across v4 versions.


1. Get Polkadot Vault mobile application

Device security

Polkadot Vault is meant to be used offline. The mobile device used to run Polkadot Vault will hold valuable information that needs to be kept securely stored. It is therefore advised to:

  • Get a Polkadot Vault dedicated mobile device.
  • Make a factory reset.
  • Enable full-disk encryption on the device, with a reasonable password (might not be on by default, for example for older Android devices).
  • Do not use any biometrics such as fingerprint or face recognition for device decryption/unlocking, as those may be less secure than regular passwords.
  • Once Polkadot Vault has been installed, enable airplane mode and make sure to switch off Wifi, Bluetooth or any connection ability of the device.
  • Only charge the phone using a power outlet that is never connected to the internet. Only charge the phone with the manufacturer's charging adapter. Do not charge the phone on public USB chargers.

Please find more info here about the Polkadot Vault application.

Install Polkadot Vault mobile application

Install Polkadot Vault, making sure that it originates from Parity Technologies.

2. Setup or recover an Identity

When launching the app for the first time, no identity has been set up yet. At this stage, you will either want to create an identity directly from your mobile device or recover an identity previously created.

Create an identity

Tap on the Create button, and give a name to this identity.

In the next step, your recovery phrase will be presented to you. Think of it as a master key. If you lose it, you lose your money. Write this recovery phrase down and store it in a safe place. If your phone gets stolen/broken/forgotten this will be the only way to recover your account.

You will then be asked to choose a pin code. This pin will be needed later on to unlock your account to manage the identity or sign a transaction.

The next screen will allow you to select a network to generate an account. If you choose an Ethereum network, the related Ethereum account will be generated for the identity.

If you choose a Substrate network (like Kusama), you will first create a root account, and then you will be able to derive more accounts with specified paths and names. The name can be changed later on, but once the path is set, it cannot be changed. More information about path derivation can be found here.
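Derivation paths follow the Substrate convention. A few illustrative examples (the path names here are placeholders, not recommendations):

```
//kusama             a hard-derived account under the "kusama" junction
//kusama//staking    a further hard derivation from it
//kusama/soft        a soft derivation (single slash)
//kusama///secret    the part after /// is an optional password, never stored
```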

For each derived account, you will be able to see the address and its related QR code.

create account

Recover an identity with your recovery phrase

If you already have an account created with either Polkadot Vault or any other wallet, you can recover it as follows:

  • Tap on the top right side user icon, and choose + Add Identity.
  • Input the new identity name and tap the Recover Identity button.
  • Type in the recovery phrase; word suggestions help you prevent typos. The field will turn red if the recovery phrase is not a valid BIP39 phrase.
  • Tap Recover Identity.
  • Choose a PIN and confirm it by typing it again.
  • Once the identity is generated, you can select the network to create the first account.

NOTICE: For V3 users, after recovering the seed phrase of a Kusama account, the account will appear as an identity root account alongside the identity name in the network selection screen.

3. Add Polkadot Vault's account to Polkadot-js apps

To be able to follow this tutorial and interact with the blockchain from a freshly created account on Polkadot Vault, you will first need to get some KSM on this account. Polkadot-js apps allows you to manage your Vault account seamlessly.

  • Visit Polkadot-js apps website.
  • Go to Accounts from the left sidebar.
  • Click on Add via QR button in the top right-hand corner.
  • It will ask for webcam permission so that you can scan the Polkadot Vault account's QR code; accept it.
  • On Polkadot Vault, choose the account you want to copy the address of.
  • Scan the QR code displayed on your phone with your computer's webcam. Make sure the QR code is fully displayed on your mobile's screen.
  • You can now name this account on Polkadot-js apps.

4. Sign a transaction

Assuming that your Polkadot Vault account now has funds, you will be able to send some funds securely to anyone, without transferring your private key, and without needing any internet connection on your mobile phone.

  • On Polkadot-js apps, click on the send button next to your account.
  • On Polkadot-js apps, enter the address of the account you want to send funds to. Make sure to try with a small amount of money first before sending larger amounts.
  • Click on Make Transfer.
  • Review the transaction; you can add a tip to this transaction.
  • Click on Scan via QR Code when you're done.

Polkadot Vault Polkadot send transaction

You will now be presented with a QR code that represents the transaction. Since this transaction is sending funds from your Polkadot Vault mobile app account, only this account (sitting on your phone) can sign and authorize this transaction. This is what we'll do in the next steps:

  • From the Polkadot Vault account overview, tap the scan button on the top right and scan the QR code presented by the Polkadot-js apps website.
  • Review the transaction addresses and the amount to send on your phone. The amount and addresses must match what you've entered in apps. If you got phished, this is where you can realize it and reject the transaction.
  • Once you're sure, scroll down and click Sign Transaction to enter your pin and get the QR code of the signed transaction.

Sign Polkadot apps transaction

Your phone has now signed the transaction offline using your Polkadot Vault account private key. The QR code that is now displayed on your phone represents a signed transaction that can be broadcasted. We will do this in the next steps:

  • On Polkadot-js apps, click on Scan Signature QR, this will ask to turn on your webcam again.
  • Face your phone's display to your webcam for the website to be able to read the signed transaction.
  • Your transaction is sent automatically.
  • Congrats you just sent funds from an air-gapped account :)

Recover Account from PolkadotJS

The default behavior on Polkadot Vault and PolkadotJS Apps is a little different. This tutorial will walk you through recovering an account on Polkadot Vault from PolkadotJS Apps.

  • Notice: The UI may vary between versions, but the functionality described here is the same as of v4.

Get the mnemonic phrase and path

When creating an account on PolkadotJS, it will first give you a mnemonic phrase without any key derivation. You can change this by clicking the Advanced creation options button, where you can specify any derivation path you like; leave it empty if you do not want one.

Create Account on PolkadotJS Apps

Recover identity with mnemonic phrase

On Polkadot Vault, each mnemonic phrase represents an identity. Every account starts with an identity, and an identity can derive an unlimited number of accounts. So let's first recover the identity from the mnemonic phrase.

recover the identity.

After tapping one network from the list, you will have the default account created for this network with a default path (as in the image above, //polkadot). This one is different from the one created on Polkadot.js Apps, because the paths are different.

create the account by path

On Polkadot Vault, accounts are grouped by network, and the accounts generated the default way are always prefixed with the network name, for example //polkadot//fund or //polkadot//staking. So to recover an account with an arbitrary path, we need to tap Add Network Account -> Create Custom Path.

Recover Account

Here we can input the path from PolkadotJS Apps; if you do not have a specific path, just leave it empty. Then, after choosing the network, we will have the same account as the one created on PolkadotJS Apps.



First and foremost, make sure you have the latest Rust installed in your system. Nothing will work without Rust.

If you get errors like cargo: feature X is required, it most likely means you have an old version of Rust. Update it by running rustup update stable.


iOS

1. You probably already have Xcode installed if you are reading this. If not, go get it.

2. Compile the core Rust library first:

cd scripts && ./build.sh ios

3. Open the NativeSigner.xcodeproj project from the ios folder in your Xcode and click Run (Cmd+R).

4. The first time you start the app, you will need to put your device into Airplane Mode. In the iOS simulator, you can do this by turning off WiFi on your Mac (yes, this is an official Apple-recommended way).

However, we strongly recommend that you use a real device for development, as some important parts (e.g. camera) may not work in the simulator.


Android

1. Download Android Studio.

2. Open the project from the android directory.

3. Install NDK. Go to File -> Project Structure -> SDK Location. Next to the "Android NDK location" section, click "Download Android NDK" button.

We highly recommend updating all existing plugins and SDKs for Kotlin, Gradle, etc., even if you just downloaded a fresh Android Studio. It's always a good idea to restart Android Studio after that. This can save you many hours on Stack Overflow trying to fix random errors like "NDK not found".

4. Connect your device or create a virtual one. Open Tools -> Device Manager and create a new phone simulator with the latest Android.

5. (macOS) Specify the path to Python in local.properties.


6. Run the project (Ctrl+R). It should build the Rust core library automatically.

Vault structure

Architectural structure

At the top level, Vault consists of the following parts:

  1. Rust backend core
  2. FFI interface
  3. Native frontend
  4. Database

Rust backend

There are 3 endpoints in the rust folder: signer, the source of the library used by Vault itself; generate_message, which is used to update the Vault repo with new built-in network information and to generate over-the-airgap updates; and qr_reader_pc, a minimalistic app to parse QR codes that we had to write since there was no reasonably working alternative.

Sub-folders of the rust folder:

  • constants — constant values defined for the whole workspace.
  • db_handling — all database-related operations for Vault and the generate_message tool. Most of the business logic is contained here.
  • defaults — built-in and test data for the database.
  • definitions — objects used across the workspace are defined here.
  • files — contains test files and is used for build and update generation processes. Most contents are gitignored.
  • generate_message — tool to generate over-the-airgap updates and maintain the network info database on the hot side.
  • navigator — navigation for the Vault app; it is implemented in Rust to unify app behavior across platforms.
  • parser — parses signable transactions. This is internal logic for transaction_parsing that is used when a signable transaction is identified, but it could be used as a standalone lib for the same purpose.
  • printing_balance — small lib to render tokens with proper units.
  • qr_reader_pc — small standalone PC app to parse QR codes in the Vault ecosystem. Also capable of parsing multiframe payloads (theoretically; in practice it is not feasible due to low PC webcam performance).
  • qr_reader_phone — logic to parse QR payloads in Vault.
  • qrcode_rtx — multiframe erasure-encoded payload generator for the Signer update QR animation.
  • qrcode_static — generation of static QR codes used all over the workspace.
  • signer — FFI interface crate to generate bindings that bridge native code and the Rust backend.
  • transaction_parsing — high-level parser for all QR payloads sent into Vault.
  • transaction_signing — all operations that can be performed when the user accepts a payload parsed with transaction_parsing.

FFI interface

For interfacing between Rust code and the native interface, we use the uniffi framework. It is a framework intended to aid building cross-platform software in Rust, especially for re-using components written in Rust in smartphone application development. Apart from Vault itself, one of the most notable users of the uniffi framework is Mozilla Application Services.

The uniffi framework provides a way for the developer to define a clear and typesafe FFI interface between components written in Rust and languages such as Kotlin and Swift. This approach leads to a much more robust architecture than implementing a homegrown FFI by, say, passing JSON-serialized data back and forth between Kotlin and Rust code. Here is why.

Suppose the application needs to pass the following structure through the FFI from Kotlin to Rust or back:

#[derive(Serialize, Deserialize)]
struct Address {
    street: String,
    city: String,
}

This would mean that on the Kotlin side of the FFI there would have to be some way of turning this type from JSON into a Kotlin type. It may be some sort of schema or even manual JSON value-by-key data extraction.

Now suppose this struct is changed by adding and removing some fields:

#[derive(Serialize, Deserialize)]
struct Address {
    country: String,
    city: String,
    index: usize,
}

After this change on the Rust side, the developer would have to remember to reflect it on the Kotlin and Swift sides, and if that is not done, there is a chance that the mismatch will not be caught at build time by CI. It is quite hard to remember everything, and having a guarantee that such things are caught at compile time is much better than not having it. One of the things uniffi solves is exactly this: it provides compile-time guarantees of typesafety.
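With uniffi, shared types are declared once and the Kotlin and Swift counterparts are generated from that declaration, so a change like the one above breaks the build instead of silently desynchronizing the sides. A hypothetical UDL fragment for the struct above might look like this (illustrative only, not taken from the Vault codebase):

```
dictionary Address {
    string country;
    string city;
    u64 index;
};
```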

The other concern with the JSON serialization approach is performance. As long as small objects are transferred back and forth, it is no trouble encoding them into strings. But suppose the application requires transferring bigger blobs of binary data, such as png images or even metadata files. Using JSON would force the developer to encode such blobs as strings before passing them into the FFI and decode them back into binary on the other side. uniffi helps avoid this as well.

Native frontend

Native frontends are made separately for each supported platform. To keep things uniform, interfaces are kept as simple as possible and as much code as possible lives in the unified Rust component. Platform-specific functions, including runtime management and threading, are accessed through the native frameworks. The structure of the native frontend follows the modern (2022) reactive design pattern of the View-Action-Model triad; thus, all backend logic is located in the data model section, along with a few native business logic components.

It is important to note that native navigation is not used, due to subtle differences in its seemingly uniform design across platforms. Navigation is instead implemented on the Rust side and, as an additional advantage, is tested there at a lower computational cost for CI pipelines.


Database

For storage of all data except secrets, a sled database is used. The choice was based on its light weight, reliability, and portability.

Vault database structure

Functional structure

Vault has the following systems:

  • Secure key management
  • Signing
  • Transaction parsing
  • Transaction visualization
  • Airgap data transfer
  • Airgap updating
  • Network detector
  • Logging
  • Self-signing updating capability
  • UI

These systems are located in different parts of the app, and some of them rely on hot-side infrastructure. The general design goal was to isolate as much as possible in easily maintainable Rust code and keep only necessary functions on the native side. Currently, those include:

  • Hardware secret storage: we rely on the hardware vendor's KMS in accordance with best practices.
  • Network detector: network operations are limited by the OS, and we try to keep the app's network access permissions to a minimum while still maintaining simple breach detection.
  • Camera: currently, native image capture and recognition implementations by far surpass 3rd-party ones. This might change in the future, but at least image capture will be managed by the OS to maintain platform compatibility.
  • UI: we use native frameworks and components for rendering and user interaction, for the best look and feel of the app.

Secure key management

Keypairs used in Vault are generated from a secret seed phrase, a derivation path, and an optional secret password, in accordance with the specifications described in the subkey manual, using code imported directly from the Substrate codebase for best conformance.

Secret seed phrase storage

The secret seed phrase is stored as a string in the device's original KMS. It is symmetrically encrypted with a strong key that is either stored in a hardware-protected keyring or uses biometric data (in the case of legacy Android devices without the strongbox system). Access to secrets is managed by the operating system's built-in authorization interface. Authorization is required for creation of seeds, access to seeds, and removal of seeds. One special case is the addition of the first seed on the iOS platform, which does not trigger the authorization mechanism since the storage is empty at that moment. This is in agreement with the iOS key management system design, but potentially allows an attacker to replace a single key by adding it to an empty device; this attack is countered by requiring authorization on seed removal.

Thus, the source of truth for secret seeds is the KMS. To synchronize the rest of the app, the list of seed identifiers is sent to the backend on app startup and on every event that changes this list, by calling update_seed_names(Vec<String>).

The random seed generator and seed recovery tools are implemented in Rust. These are the only 2 cases where a seed originates outside the KMS.

Derivation path management

The most complex part of key management is the storage of derivation strings and public keys. Improper handling here may lead to the user losing control over their assets.

Key records are stored as strings in the database, associated with secret seed identifiers, a crypto algorithm, and a list of allowed networks. The public key and its cryptographic algorithm are used to deterministically generate the database record key; thus, by design, distinct key entries directly correspond to addresses on chain.

Creation of new records requires generation of public keys through the derivation process, for which the secret seed must be queried, so adding items to this database requires authentication.

Substrate keys can natively be used across all networks supporting their crypto algorithm. This may lead to accidental re-use of keys; it is therefore not forbidden by the app, but networks are isolated unless the user explicitly expresses the desire to enable a key in a given network. On the user's side, this is abstracted into the creation of independent addresses; however, the real implementation stores addresses with public keys as storage keys and thus does not distinguish between networks. To isolate networks, each key stores a field with a list of allowed networks, and when the user "creates" an address with the same pubkey as an existing one, it is just another network added to that list.
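A minimal sketch of this record layout (field and type names are hypothetical, not the actual database schema): keying the store by public key and attaching a list of allowed networks means that "creating" an address in a new network only extends that list.

```rust
use std::collections::HashMap;

// Hypothetical key record: one entry per public key, networks attached.
#[derive(Debug)]
struct KeyRecord {
    derivation_path: String,
    allowed_networks: Vec<String>,
}

fn enable_in_network(
    db: &mut HashMap<Vec<u8>, KeyRecord>,
    pubkey: Vec<u8>,
    path: &str,
    network: &str,
) {
    db.entry(pubkey)
        .and_modify(|record| {
            // Same pubkey already known: just allow one more network.
            if !record.allowed_networks.iter().any(|n| n == network) {
                record.allowed_networks.push(network.to_string());
            }
        })
        .or_insert_with(|| KeyRecord {
            derivation_path: path.to_string(),
            allowed_networks: vec![network.to_string()],
        });
}
```

"Creating" the same address twice in different networks thus yields one record with two allowed networks, matching the isolation model described above.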

Keys can be imported through a QR code created by the generate_message tool (instructions). A plaintext newline-separated list of derivations should be supplied to the tool along with a network identifier; the import is thus bound to a certain network. However, it is not bound to any particular seed: the user can select any of the created seeds and, after authorization, create keys with the given paths. Bulk import of password-protected keys is forbidden at the moment.

Optional password

The optional password (the part of the derivation path after ///) is never stored; only addresses that have a password in their derivation path are marked. Thus, the password is queried every time it is needed, with a tool separate from the OS authentication interface but shown together with the authentication screen, as the password is always used with a secret seed phrase.

Memory safety in secure key management

All memory handled by the native frameworks relies on their memory protection mechanisms (JVM virtualization, Swift isolation and automatic memory management). However, when secrets are processed in Rust, no inherent memory safety features of this kind are available, so to prevent secrets from remaining in memory after their use, the zeroize library is used.
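The pattern is roughly the following, shown here as a hand-rolled sketch of the zero-on-drop idea (the app itself relies on the zeroize crate, which additionally guards against compiler reordering and other subtleties):

```rust
// Sketch only: a secret buffer that wipes itself when dropped.
struct Secret(Vec<u8>);

impl Secret {
    fn wipe(&mut self) {
        for byte in self.0.iter_mut() {
            // Volatile write so the zeroing is not optimized away.
            unsafe { std::ptr::write_volatile(byte, 0) };
        }
    }
}

impl Drop for Secret {
    fn drop(&mut self) {
        self.wipe();
    }
}
```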


Signing

Every payload to be signed is first extracted from the transfer payload in agreement with the UOS specification and the polkadot-js implementation. Only payloads that can be parsed and visualized may be signed, to avoid blind signing; thus, on a parser error, no signable payload is produced and the signing procedure is not initiated.

When the signable payload is ready, it is stored in the TRANSACTION tree while the user decides whether to sign it. While it is in storage, the database checksum is monitored for changes.

Signing uses a private key generated from the KMS-protected secret seed phrase, the derivation string, and the optional password. The signing operation itself is imported directly from the Substrate codebase as a dependency.

The signing event, or its failure, is logged, and the signature wrapped in the UOS format is presented as a static QR image on the phone.

Transaction parsing

The transaction parsing process is described in the UOS format documentation.

Transaction visualization

A signable transaction is decomposed into hierarchical cards for clarity. All possible SCALE-decodable types are assigned to generalized visualization patterns ("transaction cards"), with some types having special visualizations (balance formatted with proper decimals and units, identicons added to identities, etc.). Each card is assigned an order and an indent that allow the cards to be shown in a lazy view environment. Thus, any network that meets minimal metadata requirements should be decodable and visualizable.

Some cards also include documentation entries fetched from the metadata. These can be expanded in the UI on touch.

Thus, the user has the opportunity to read the whole transaction before signing.

Airgap data transfer

Transactions are encoded in QR codes in accordance with the UOS standard. QR codes can be sent into Vault through static frames or dynamic multiframe animations, and back only as static frames. QR codes are read through the native image recognition system and decoded by the Rust backend; output QR codes are generated in png format by the backend. There are 2 formats of multiframe QR codes: legacy multiframe and RaptorQ multiframe. The legacy multiframe format requires all frames in the animation to be collected and is thus impractical for larger payloads. The RaptorQ multiframe format allows any sufficient subset of frames to be collected and thus allows large payloads to be transferred effortlessly.

Fast multiframe transfer works efficiently at 30 fps. Typical large payloads currently contain up to 200 frames. The transfer could theoretically complete in under 10 seconds; in practice, it completes in under 1 minute.

Airgap updating

Vault can download new networks and metadata updates from QR data. To prevent malicious updates from compromising security, a system of certificates is implemented.

Updates can be generated by any user; they can also be distributed in signed form to delegate the validity check to trusted parties. These trusted parties sign the metadata with their asymmetric key (certificate) and become verifiers once their update is uploaded to Vault. There are 2 tiers of certificates, "general" and "custom", with the first allowing more comfortable use of Vault at the cost of allowing only one general verifier.

The rules for verifier certificates are designed around the simplicity of the security protocol: one trusted party becomes the main source of trust, and updates generated by it are simply accepted. If that party does not have all required updates available, another party can be added as a custom verifier. That verifier is not allowed to change specs at will, and suspicious activity by a custom verifier would interfere with network usage, thus stopping the user from doing potentially harmful things. This allows a less stringent security policy on the user's side.

It is important to note that certificates cannot be effectively revoked, given the air-gapped nature of the app; it is therefore recommended to keep certificate keys on air-gapped Vault devices if updates signed by these certificates are distributed publicly.

More on certificates

Network detector

An additional security feature is the network detector. While the app is on, it runs in the background (on a low-priority thread) and attempts to monitor network availability. This detector is implemented differently on different platforms and has different features and limitations; however, it does not and cannot provide full connectivity monitoring, and proper maintenance of the airgap depends on the user. The Vault device should always be kept in airplane mode, and all other connectivity should be disabled.

The basic idea of the network detector is that when network connectivity is detected, 3 things happen:

  1. The event is logged in history.
  2. A visual indication of network status is presented to the user (a shield in the corner of the screen and a message in the alert activated by the shield).
  3. Certain Vault functions are disabled (user authentication, seed and key creation, etc.): the features that bring secret material into active app memory from storage.

When network connectivity is lost, only the visual indication changes. To restore the clean state of Vault, the user should acknowledge the safety alert by pressing the shield icon, reading and accepting the warning. Upon acknowledgment, the event is logged in history, the visual indication turns green, and all normal Vault functions are restored.
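The alert logic sketched above can be modeled as a small state machine (names are illustrative, not the actual implementation):

```rust
// Hypothetical model of the network-alert states described above; the real
// implementation lives in the Rust navigation state, not in this form.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Alert {
    Safe,   // green shield: airgap believed intact
    Active, // connectivity is detected right now
    Past,   // connectivity was seen and not yet acknowledged
}

#[derive(Clone, Copy)]
enum Event {
    ConnectivityUp,
    ConnectivityDown,
    UserAcknowledged,
}

fn on_event(state: Alert, event: Event) -> Alert {
    match (state, event) {
        (_, Event::ConnectivityUp) => Alert::Active,
        (Alert::Active, Event::ConnectivityDown) => Alert::Past,
        // Acknowledging only restores the clean state once offline again.
        (Alert::Past, Event::UserAcknowledged) => Alert::Safe,
        (s, _) => s,
    }
}
```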

Network detector in iOS

Airplane mode detection is forbidden on iOS and may lead to the app being removed from the App Store. Thus, the detector relies on probing network interfaces; if any network interface is up, the network alert is triggered.

Network detector in Android

The network detector is triggered directly by the airplane mode change event.

Bluetooth, NFC, etc.

Other possible network connectivity methods are not monitored. Even though it is possible to add detectors for them, accessing their status would require the app to request corresponding permissions from the OS, thus reducing the app's isolation and decreasing overall security: first, by increasing the chance of a leak in a breach event, and second, by making a corrupt fake app that leaks information through the network appear more normal. Furthermore, there are reports that on some devices the network might be connected through a cable while in airplane mode, and there has been no research into what debugging through a cable is capable of in airplane mode. Thus, the network detector is a convenience tool and should not be relied on as the sole source of security; the user is responsible for device isolation.


Logging

All events that happen in Vault are logged by the backend in the history tree of the database. In the user interface, all events are presented in chronological order on the log screen. On the same screen, the history checksum can be seen and custom text entries can be added to the database. The checksum uses the time a record was added to history in its computation and is therefore impractical to forge.

Events presented on the log screen are colored to distinguish "normal" and "dangerous" events. The records shown give minimal important information about each event. On tap, a detailed info screen is shown, where all events that happened at the same time are presented in detail (including transactions, which are decoded for review if metadata is still available).

The log can also be erased for privacy; the erasure event is logged and becomes the first event in the recorded history.

Self-signing updating capability

Vault can sign network and metadata updates that can be used by other Vaults. The user can select any update component present in Vault and any key available for any network, and generate a QR code which, upon decoding, can be used by generate_message or a similar tool to generate an over-the-airgap update. See the detailed documentation.

This feature was designed for elegance, but it is quite useful for maintaining an update signing key in large update distribution centers, as it allows secure storage of a secret certificate key that cannot practically be revoked if compromised.


UI

The user interface is organized through a View-Action-DataModel abstraction.


Vault's visual representation is abstracted into 3 visual layers placed on top of each other: screen, modal, and alert. This structure is mostly an adaptation of iOS design guidelines, as Android's native UI is much more flexible and it is easier to adapt it to iOS design patterns than vice versa. At most one of each component can be presented at a time. A screen component is always present in the app, but sometimes it is fully or partially blocked by other components.

Modals and alerts are dismissed on the goBack action; screens have complex navigation rules. Modals require the user to take action and interrupt the flow. Alerts are used for short informational interruptions, like error messages or confirmations.

In addition to these, a header bar is always present on screen, and a footer bar is present in some cases. The footer bar always has the same structure and only allows navigation to one of the navigation roots. The top bar might contain a back button, the screen name, and an extra menu button; the status indicator is always shown on the top bar.


Almost all actions available to the user are in fact handled by a single operation: the action() backend function, which is called through the pushButton native interface. On the native side, this operation is debounced by time. On the Rust side, actions are performed on a static mutex storing the app state; if the mutex is blocked, the action is ignored, as are actions that are not allowed in the current navigation state. Thus, the state of the app is protected against undefined concurrency effects by the hardware-button-like behavior of action().

Most actions lead to a change of the shown combination of screen, modal, and alert; but some actions, for example those involving keyboard input, alter the contents of a UI component. In most cases, all parameters of UI components are passed as states (a more or less similar concept on all platforms), and the frontend framework detects updates and seamlessly performs proper rendering.

action() accepts 3 parameters: the action type (enum), action data (&str), and secret data (&str). Secret data is used to transfer secret information, and care is taken to always properly zeroize its contents; action data, by contrast, may contain large strings and is handled normally.
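A rough model of this guard (heavily simplified; the real action() also carries the navigation logic and zeroizes secret data):

```rust
use std::sync::Mutex;

// Hypothetical app state guarded by a static mutex, as described above.
static STATE: Mutex<u32> = Mutex::new(0);

/// Returns true if the action was applied, false if it was dropped
/// because another action currently holds the state.
fn action(new_screen: u32) -> bool {
    match STATE.try_lock() {
        Ok(mut screen) => {
            *screen = new_screen;
            true
        }
        // Mutex busy: ignore the action, like a debounced hardware button.
        Err(_) => false,
    }
}
```

Using try_lock instead of lock is what gives the hardware-button behavior: a concurrent call is dropped rather than queued.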

Data model

The data model as seen by the native UI consists of 3 parts: secret seed content, network detection state, and screen contents. The secret seed content consists of the list of seed names that are used as handles to fetch secret material from secure storage. The network detection state is a 4-state variable describing the current detection status (safe state, network is currently detected, network was detected before, error state). The rest of the data model is a black box in Rust.

On the Rust side, the model is generated by the navigation crate. The state of the app is stored in a lazy static State object, and sufficient information for View rendering is generated into an ActionResult object that is sent to the native layer on each action update.


Troubleshooting

Rust side builds but app crashes on start (Android) or complains about symbols not found (iOS)

One common reason for this is an inconsistency in the uniffi version: make sure that the installed version matches the one stated in Cargo.toml.

Build for Android fails on macOS, build for iOS fails on linux

This is a known issue that does not seem to be solvable at the moment. Please use 2 machines, as we do.

Links to rust docs of the apps


This document provides an interpretation of the UOS format used by Polkadot Vault. The upstream version of the published format has diverged significantly from the actual implementation, so this document represents the current state of the UOS format that is compatible with Polkadot Vault. It only applies to networks compatible with Polkadot Vault, i.e. Substrate-based networks. The document also describes special payloads used to maintain a Polkadot Vault instance.

Therefore, this document effectively describes the input and output format for QR codes used by Polkadot Vault.


The Vault receives information over an air-gap as QR codes. These codes are read as u8 vectors and must always be parsed by the Vault before use.

QR codes can contain information that a user wants to sign with one of the Vault keys, or they may contain update information to ensure smooth operation of the Vault without the need for a reset or connection to the network.

QR code content types

  1. Transaction/extrinsic - a single transaction that is to be signed
  2. Bulk transactions - a set of transactions that are to be signed in a single session
  3. Message - a message that is to be signed with a key
  4. Chain metadata - up-to-date metadata that allows the Vault to read transaction content
  5. Chain specs - adds a new network to the Vault
  6. Metadata types - used to update older versions of runtime metadata (V13 and below)
  7. Key derivations - used to import and export Vault key paths

QR code structure

QR code envelope has the following structure:

  • QR code prefix - 4 bits
  • content - byte-aligned content
  • ending spacer - 4 bits
  • padding - remainder

The QR code prefix always starts with the 0x4 symbol, indicating "raw" encoding.

The subsequent 2 bytes encode the content length. Using this number, the QR code parser can instantly extract the content and disregard the rest of the QR code.

The actual content is shifted by half a byte; otherwise, it is a normal byte sequence.
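A sketch of unpacking this envelope, assuming exactly the layout above (4-bit 0x4 prefix, then a 16-bit big-endian length and the content, all shifted by half a byte):

```rust
/// Sketch: unpack the raw QR envelope described above.
/// Returns the content bytes, or None if the prefix or length is wrong.
fn parse_qr_envelope(raw: &[u8]) -> Option<Vec<u8>> {
    // The first nibble must be 0x4 ("raw" encoding).
    if raw.len() < 3 || raw[0] >> 4 != 0x4 {
        return None;
    }
    // Undo the half-byte shift: content byte i is the low nibble of raw[i]
    // followed by the high nibble of raw[i + 1].
    let shifted: Vec<u8> = raw
        .windows(2)
        .map(|w| (w[0] << 4) | (w[1] >> 4))
        .collect();
    // The first two re-aligned bytes are the big-endian content length;
    // everything past the declared length (spacer, padding) is discarded.
    let len = u16::from_be_bytes([shifted[0], shifted[1]]) as usize;
    shifted.get(2..2 + len).map(|content| content.to_vec())
}
```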

Multiframe QR

The information transferred through the QR channel into Vault is always enveloped in multiframe packages (although the minimal number of packages is 1). There are two standards for the multiframe: RaptorQ erasure coding and legacy non-erasure multiframe. The type of envelope is determined by the first bit of the QR code data: 0 indicates legacy multiframe, 1 indicates RaptorQ.

RaptorQ multipart payload

RaptorQ (RFC 6330) is a variable-rate (fountain) erasure code protocol with a reference implementation in Rust.

Wrapping content in RaptorQ protocol allows for arbitrary amounts of data to be transferred reliably within reasonable time. It is recommended to wrap all payloads into this type of envelope.

Each QR code in a RaptorQ encoded multipart payload contains the following parts:

| bytes [0..4] | bytes [4..] |
| :- | :- |
| 0x80000000 \|\| payload_size | RaptorQ serialized packet |
  • payload_size MUST contain payload size in bytes, represented as big-endian 32-bit unsigned integer.
  • payload_size MUST NOT exceed 0x7FFFFFFF
  • payload_size MUST be identical in all codes encoding the payload
  • payload_size and RaptorQ serialized packet MUST be stored by the Cold Vault, in no particular order, until their amount is sufficient to decode the payload.
  • Hot Wallet MUST continuously loop through all the frames showing each frame for at least 1/30 seconds (recommended frame rate: 4 FPS).
  • Cold Vault MUST be able to start scanning the Multipart Payload at any frame.
  • Cold Vault MUST NOT expect the frames to come in any particular order.
  • Cold Vault SHOULD show a progress indicator of how many frames it has successfully scanned out of the estimated minimum required amount.
  • Hot Wallet SHOULD generate sufficient number of recovery frames (recommended overhead: 100%; minimal reasonable overhead: square root of number of packets).
  • Payloads fitting in 1 frame SHOULD be shown without recovery frames as static image.

Once a sufficient number of frames is collected, they can be processed into a single payload and treated as the data vector ("QR code content").
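The four-byte frame header described above can be split off as follows (a sketch under the stated layout; `parse_raptorq_frame` is our own name, and actual packet decoding is left to a RaptorQ library):

```rust
/// Split one RaptorQ multipart frame into the declared payload size and
/// the serialized RaptorQ packet. The leading bit distinguishes RaptorQ
/// frames (1) from legacy multipart frames (0).
fn parse_raptorq_frame(frame: &[u8]) -> Result<(u32, &[u8]), &'static str> {
    if frame.len() < 4 {
        return Err("frame too short");
    }
    let header = u32::from_be_bytes([frame[0], frame[1], frame[2], frame[3]]);
    if header & 0x8000_0000 == 0 {
        return Err("not a RaptorQ frame");
    }
    // payload_size MUST NOT exceed 0x7FFFFFFF, so masking the flag
    // bit away recovers it exactly.
    let payload_size = header & 0x7fff_ffff;
    Ok((payload_size, &frame[4..]))
}

fn main() {
    // 0x80000000 | 1024-byte payload, followed by packet bytes.
    let mut frame = 0x8000_0400u32.to_be_bytes().to_vec();
    frame.extend_from_slice(&[0xde, 0xad]);
    let (size, packet) = parse_raptorq_frame(&frame).unwrap();
    assert_eq!(size, 1024);
    assert_eq!(packet, &[0xde, 0xad]);
}
```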

Legacy Multipart Payload

In the real implementation, the Polkadot Vault ecosystem generalized all payloads as multipart messages.

| bytes position | [0] | [1..3] | [3..5] | [5..] |
| :- | :- | :- | :- | :- |
| content | 00 | frame_count | frame | part_data |

  • frame MUST be the number of the current frame, counted from 0000, represented as a big-endian 16-bit unsigned integer.
  • frame_count MUST be the total number of frames, represented as a big-endian 16-bit unsigned integer.
  • part_data MUST be stored by the Cold Vault, ordered by frame number, until all frames are scanned.
  • Hot Wallet MUST continuously loop through all the frames showing each frame for about 2 seconds.
  • Cold Vault MUST be able to start scanning the Multipart Payload at any frame.
  • Cold Vault MUST NOT expect the frames to come in any particular order.
  • Cold Vault SHOULD show a progress indicator of how many frames it has successfully scanned out of the total count.

Once all frames are combined, the part_data must be concatenated into a single binary blob and treated as data vector ("QR code content").
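A sketch of the legacy frame header parsing, assuming the layout above with frame_count preceding the frame number (function name is ours):

```rust
/// Parse one legacy multipart frame: a 0x00 marker byte, then two
/// big-endian u16 values (total frame count, current frame number),
/// then the part_data for this frame.
fn parse_legacy_frame(frame: &[u8]) -> Result<(u16, u16, &[u8]), &'static str> {
    if frame.len() < 5 {
        return Err("frame too short");
    }
    if frame[0] != 0x00 {
        return Err("not a legacy multipart frame");
    }
    let frame_count = u16::from_be_bytes([frame[1], frame[2]]);
    let frame_number = u16::from_be_bytes([frame[3], frame[4]]);
    Ok((frame_count, frame_number, &frame[5..]))
}

fn main() {
    // Frame 1 of 3, carrying two part_data bytes.
    let (count, number, part) =
        parse_legacy_frame(&[0x00, 0, 3, 0, 1, 0xaa, 0xbb]).unwrap();
    assert_eq!((count, number), (3, 1));
    assert_eq!(part, &[0xaa, 0xbb]);
}
```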

Informative content of QR code

Every QR code content starts with a prelude [0x53, 0x<encryption code>, 0x<payload code>].

0x53 is always expected and indicates Substrate-related content.

<encryption code> for signables indicates encryption algorithm that will be used to generate the signature:

0x00 Ed25519
0x01 Sr25519
0x02 Ecdsa

<encryption code> for updates indicates encryption algorithm that was used to sign the update:

0x00 Ed25519
0x01 Sr25519
0x02 Ecdsa
0xff unsigned

Derivations import and testing are always unsigned, with <encryption code> always 0xff.

Vault supports the following <payload code> variants:

0x00 legacy mortal transaction
0x02 transaction (both mortal and immortal)
0x03 message
0x04 bulk transactions
0x80 load metadata update
0x81 load types update
0xc1 add specs update
0xde derivations import

Note: old UOS specified 0x00 as mortal transaction and 0x02 as immortal one, but currently both mortal and immortal transactions from polkadot-js are 0x02.

Shared QR code processing sequence:

  1. Read the QR code, try interpreting it, and pass the resulting hexadecimal string into Rust (the hexadecimal string is to be changed to raw bytes soon). If the QR code is not processable, nothing happens and the scanner keeps trying to catch a processable one.
  2. Analyze the prelude: is it Substrate? is it a known payload type? If not, Vault always produces an error and suggests scanning a supported payload.

Further processing is done based on the payload type.
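The prelude analysis in step 2 can be sketched as follows (names are ours; unknown payload codes are left for the caller to reject based on the supported <payload code> list above):

```rust
#[derive(Debug, PartialEq)]
enum Encryption {
    Ed25519,
    Sr25519,
    Ecdsa,
    Unsigned,
}

/// Interpret the three-byte prelude [0x53, <encryption code>, <payload code>].
/// Returns the encryption and the raw payload code.
fn parse_prelude(content: &[u8]) -> Result<(Encryption, u8), &'static str> {
    match content {
        [0x53, enc, payload, ..] => {
            let encryption = match *enc {
                0x00 => Encryption::Ed25519,
                0x01 => Encryption::Sr25519,
                0x02 => Encryption::Ecdsa,
                0xff => Encryption::Unsigned,
                _ => return Err("unknown encryption code"),
            };
            Ok((encryption, *payload))
        }
        _ => Err("not Substrate-related content"),
    }
}

fn main() {
    // 0x53 0x01 0x02: an Sr25519-signable transaction payload.
    assert_eq!(
        parse_prelude(&[0x53, 0x01, 0x02]).unwrap(),
        (Encryption::Sr25519, 0x02)
    );
}
```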


Transaction has the following structure:

| prelude | public key | SCALE-encoded call data | SCALE-encoded extensions | network genesis hash |
| :- | :- | :- | :- | :- |

Public key is the key that can sign the transaction. Its length depends on the <encryption code> declared in transaction prelude:

| Encryption | Public key length, bytes |
| :- | :- |
| Ed25519 | 32 |
| Sr25519 | 32 |
| Ecdsa | 33 |

Call data is the Vec<u8> representation of the transaction content. Call data must be parsed by Vault prior to signature generation and becomes a part of the signed blob. Within the transaction, the call data is SCALE-encoded, i.e. effectively prefixed with the compact of its length in bytes.

Extensions contain data additional to the call data, and also are part of a signed blob. Typical extensions are Era, Nonce, metadata version, etc. Extensions content and order, in principle, can vary between the networks and metadata versions.

Network genesis hash determines the network in which the transaction is created. At the moment genesis hash is fixed-length 32 bytes.

Thus, the transaction structure could also be represented as:

| prelude | public key | compact of call data length | **call data** | **SCALE-encoded extensions** | network genesis hash |
| :- | :- | :- | :- | :- | :- |

Bold-marked transaction pieces are used in the blob for which the signature is produced. If the blob is short, 257 bytes or below, the signature is produced for it as is. For blobs longer than 257 bytes, 32 byte hash (blake2_256) is signed instead. This is inherited from earlier Vault versions, and is currently compatible with polkadot-js.
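The compact length prefix is what later lets the parser cut the call data out of the combined blob. A minimal sketch of that split, covering only the single-, two- and four-byte compact modes (helper names are ours):

```rust
/// Decode a SCALE compact length prefix, returning the value and how
/// many bytes the prefix itself occupied. The big-integer mode is not
/// expected for realistic call lengths.
fn compact_len(data: &[u8]) -> Result<(usize, usize), &'static str> {
    let first = *data.first().ok_or("empty data")?;
    match first & 0b11 {
        0 => Ok(((first >> 2) as usize, 1)),
        1 => {
            if data.len() < 2 {
                return Err("data too short");
            }
            Ok(((u16::from_le_bytes([data[0], data[1]]) >> 2) as usize, 2))
        }
        2 => {
            if data.len() < 4 {
                return Err("data too short");
            }
            let v = u32::from_le_bytes([data[0], data[1], data[2], data[3]]);
            Ok(((v >> 2) as usize, 4))
        }
        _ => Err("big-integer compact mode not expected for call length"),
    }
}

/// Split the combined blob into call data and extensions data.
fn separate_call_and_extensions(blob: &[u8]) -> Result<(&[u8], &[u8]), &'static str> {
    let (call_len, prefix_len) = compact_len(blob)?;
    let rest = &blob[prefix_len..];
    if rest.len() < call_len {
        return Err("call and extensions could not be separated");
    }
    Ok((&rest[..call_len], &rest[call_len..]))
}

fn main() {
    // 0x0c = compact(3): three call bytes, the rest is extensions.
    let blob = [0x0c, 1, 2, 3, 9, 9];
    let (call, ext) = separate_call_and_extensions(&blob).unwrap();
    assert_eq!(call, &[1, 2, 3]);
    assert_eq!(ext, &[9, 9]);
}
```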

Transaction parsing sequence

  1. Cut the QR data and get:

    • encryption (single u8 from prelude)
    • transaction author public key, its length matching the encryption (32 or 33 u8 immediately after the prelude)
    • network genesis hash (32 u8 at the end)
    • SCALE-encoded call data and SCALE-encoded extensions as a combined blob (everything that remains in between the transaction author public key and the network genesis hash)

    If the data length is insufficient, Vault produces an error and suggests loading an undamaged transaction.

  2. Search the Vault database for the network specs (from the network genesis hash and encryption).

    If the network specs are not found, Vault shows:

    • public key and encryption of the transaction author key
    • error message that suggests adding the network with the found genesis hash
  3. Search the Vault database for the address key (from the transaction author public key and encryption). Vault will try to interpret and display the transaction in any case. Signing will be possible only if the parsing is successful and the address key is known to Vault and is extended to the network in question.

    • Address key not found. Signing not possible. Output shows:

      • public key and encryption of the transaction author key
      • call and extensions parsing result
      • warning message that suggests adding the address into Vault
    • Address key is found, but it is not extended to the network used. Signing not possible. Output shows:

      • detailed author key information (base58 representation, identicon, address details such as address being passworded etc)
      • call and extensions parsing result
      • warning message that suggests extending the address into the network used
    • Address key is found and is extended to the network used. Vault will proceed to try and interpret the call and extensions. Detailed author information will be shown regardless of the parsing outcome. The signing will be allowed only if the parsing is successful.

  4. Separate the call and extensions. Call is prefixed by its length compact, the compact is cut off, the part with length that was indicated in the compact goes into call data, the part that remains goes into extensions data.

    If no compact is found or the length is insufficient, Vault produces an error that call and extensions could not be separated.

  5. Get the metadata set from the Vault database, by the network name from the network specs. Metadata is used to interpret extensions and then the call itself.

    If there are no metadata entries for the network at all, Vault produces an error and asks to load the metadata.

    RuntimeMetadata versions supported by Vault are V12, V13, and V14. The crucial feature of the V14 is that the metadata contains the description of the types used in the call and extensions production. V12 and V13 are legacy versions and provide only text identifiers for the types, and in order to use them, the supplemental types information is needed.

  6. Process the extensions.

    Vault already knows in which network the transaction was made, but does not yet know the metadata version. Metadata version must be one of the signable extensions. At the same time, the extensions and their order are recorded in the network metadata. Thus, all metadata entries from the set are checked, from newest to oldest, in an attempt to find metadata that both decodes the extensions and has a version that matches the metadata version decoded from the extensions.

    If processing extensions with a single metadata entry results in an error, the next metadata entry is tried. The errors would be displayed to user only if all attempts with existing metadata have failed.

    Typically, the extensions are quite stable in between the metadata versions and in between the networks, however, they can be and sometimes are different.

    In legacy metadata (RuntimeMetadata version being V12 and V13) extensions have identifiers only, and in Vault the extensions for V12 and V13 are hardcoded as:

    • Era era
    • Compact(u64) nonce
    • Compact(u128) tip
    • u32 metadata version
    • u32 tx version
    • H256 genesis hash
    • H256 block hash

    If the extensions could not be decoded as the standard set, or not all of the extensions blob is used, the Vault rejects this metadata version and adds an error into the error set.

    Metadata V14 has extensions with both identifiers and properly described types, and Vault decodes extensions as they are recorded in the metadata. For this, ExtrinsicMetadata part of the metadata RuntimeMetadataV14 is used. Vector signed_extensions in ExtrinsicMetadata is scanned twice, first for types in ty of the SignedExtensionMetadata and then for types in additional_signed of the SignedExtensionMetadata. The types, when resolved through the types database from the metadata, allow to cut correct length blobs from the whole SCALE-encoded extensions blob and decode them properly.

    If any of these small decodings fails, the metadata version gets rejected by the Vault and an error is added to the error set. Same happens if after all extensions are scanned, some part of extensions blob remains unused.

    There are some special extensions that must be treated separately. The identifier in SignedExtensionMetadata and ident segment of the type Path are used to trigger types interpretation as specially treated extensions. Each identifier is encountered twice, once for ty scan, and once for additional_signed scan. In some cases only one of those types has non-empty content, in some cases it is both. To distinguish the two, the type-associated path is used, which points to where the type is defined in Substrate code. Type-associated path has priority over the identifier.

    Path triggers:

    | Path | Type is interpreted as |
    | :- | :- |
    | Era | Era |
    | CheckNonce | Nonce |
    | ChargeTransactionPayment | tip, gets displayed as balance with decimals and unit corresponding to the network specs |

    Identifier triggers are used if the path trigger was not activated:

    | Identifier | Type, if not empty and if there is no path trigger, is interpreted as | Note |
    | :- | :- | :- |
    | CheckSpecVersion | metadata version | gets checked with the metadata version from the metadata |
    | CheckTxVersion | tx version | |
    | CheckGenesis | network genesis hash | must match the genesis hash that was cut from the tail of the transaction |
    | CheckMortality | block hash | must match the genesis hash if the transaction is immortal; Era has same identifier, but is distinguished by the path |
    | CheckNonce | nonce | |
    | ChargeTransactionPayment | tip, gets displayed as balance with decimals and unit corresponding to the network specs | |

    If the extension is not a special case, it is displayed as normal parser output and does not participate in deciding if the transaction could be signed.

    After all extensions are processed, the decoding must yield following extensions:

    • exactly one Era
    • exactly one Nonce <- this is not so currently, fix it
    • exactly one BlockHash
    • exactly one GenesisHash <- this is not so currently, fix it
    • exactly one metadata version

    If the extension set is different, this results in Vault error for this particular metadata version, this error goes into error set.

    The extensions in the metadata are checked on the metadata loading step, long before any transactions are even produced. Metadata with incomplete extensions causes a warning on load_metadata update generation step, and another one when an update with such metadata gets loaded into Vault. Nevertheless, such metadata loading into Vault is allowed, as there could be other uses for metadata except signable transaction signing. Probably.

    If the metadata version in extensions does not match the metadata version of the metadata used, this results in Vault error for this particular metadata version, this error goes into error set.

    If the extensions are completely decoded, with correct set of the special extensions and the metadata version from the extensions match the metadata version of the metadata used, the extensions are considered correctly parsed, and Vault can proceed to the call decoding.

    If all metadata entries from the Vault database were tested and no suitable solution is found, Vault produces an error stating that all attempts to decode extensions have failed. This could be caused by a variety of reasons (see above), but so far the most common one observed was users having metadata in the Vault that is not up-to-date with the metadata on chain. Thus, the error must include a recommendation to update the metadata first.

  7. Process the call data.

    After the metadata with correct version is established, it is used to parse the call data itself. Each call begins with u8 pallet index, this is the decoding entry point.

    For V14 metadata the correct pallet is found in the set of available ones in pallets field of RuntimeMetadataV14, by index field in corresponding PalletMetadata. The calls field of this PalletMetadata, if it is Some(_), contains PalletCallMetadata that provides the available calls enum described in types registry of the RuntimeMetadataV14. For each type in the registry, including this calls enum, encoded data size is determined, and the decoding is done according to the type.

    For V12 and V13 metadata the correct pallet is also found by scanning the available pallets and searching for the correct pallet index. Then the call is found using the call index (second u8 of the call data). Each call has an associated set of argument names and argument types, however, the argument type is just a text identifier. The type definitions are not in the metadata, and transaction decoding requires supplemental types information. By default, the Vault contains types information that was constructed for Westend when Westend was still using V13 metadata, and it has so far been reasonably sufficient for simple transactions parsing. If the Vault does not find the types information in the database and has to decode the transaction using V12 or V13 metadata, an error is produced, indicating that there are no types. Otherwise, for each encountered argument type the encoded data size is determined, and the decoding is done according to the argument type.

    There are types requiring special display:

    • calls (for cases when a call contains other calls)
    • numbers that are processed as the balances

    Calls in V14 parsing are distinguished by Call in ident segment of the type Path. Calls in V12 and V13 metadata are distinguished by any element of the set of calls type identifiers in string argument type.

    At the moment the numbers that should be displayed as balance in transactions with V14 metadata are determined by the type name type_name of the corresponding Field being:

    • Balance
    • T::Balance
    • BalanceOf<T>
    • ExtendedBalance
    • BalanceOf<T, I>
    • DepositBalance
    • PalletBalanceOf<T>

    Similar identifiers are used in V12 and V13, the checked value is the string argument type itself.

    There could be other instances when the number should be displayed as balance. However, sometimes the balance is not the balance in the units in the network specs, for example in the assets pallet. See issue #1050 and comments there for details.

    If no errors were encountered while parsing and all call data was used in the process, the transaction is considered parsed and is displayed to the user, either ready for signing (if all other checks have passed) or as read-only.

  8. If the user chooses to sign the transaction, the Vault produces QR code with signature, that should be read back into the hot side. As soon as the signature QR code is generated, the Vault considers the transaction signed.

    All signed transactions are entered in the history log, and could be seen and decoded again from the history log. Transactions not signed by the user do not go in the history log.

    If the key used for the transaction is passworded, user has three attempts to enter the password correctly. Each incorrect password entry is reflected in the history.

    In the time interval between Vault displaying the parsed transaction and the user approving it, the transaction details needed to generate the signature and history log details are temporarily stored in the database. The temporary storage gets cleared each time before and after use. Vault extracts the stored transaction data only if the database checksum stored in navigator state is same as the current checksum of the database. If the password is entered incorrectly, the database is updated with "wrong password" history entry, and the checksum in the state gets updated accordingly. Eventually, all transaction info can and will be moved into state itself and temporary storage will not be used.


Alice makes a transfer to Bob in the Westend network.



| Part | Meaning | Byte position |
| :- | :- | :- |
| 53 | Substrate-related content | 0 |
| 01 | Sr25519 encryption algorithm | 1 |
| 02 | transaction | 2 |
| d435..a27d | Alice public key | 3..=34 |
| a404..4817 | SCALE-encoded call data | 35..=76 |
| a4 | Compact call data length, 41 | 35 |
| 0403..4817 | Call data | 36..=76 |
| 04 | Pallet index 4 in metadata, entry point for decoding | 36 |
| e143..423e | Westend genesis hash | 154..=185 |










Call content is parsed using Westend metadata, in this particular case westend9010

| Call part | Meaning |
| :- | :- |
| 04 | Pallet index 4 (Balances) in metadata, entry point for decoding |
| 03 | Method index 3 in pallet 4 (transfer_keep_alive), search in metadata what the method contains. Here it is MultiAddress for transfer destination and Compact(u128) balance. |
| 00 | Enum variant in MultiAddress, AccountId |
| 8eaf..6a48 | Associated AccountId data, Bob public key |
| 0700e8764817 | Compact(u128) balance. Amount paid: 100000000000 or, with Westend decimals and unit, 100.000000000 mWND. |
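The balance rendering above can be sketched as follows (simplified: the decimals from the network specs are applied directly, without the milli/micro prefix scaling the Vault UI performs; `format_balance` is our own name):

```rust
/// Render a raw balance with the decimals and unit from the network
/// specs. 100000000000 with Westend's 12 decimals is 0.1 WND, which
/// the Vault additionally rescales to 100.000000000 mWND.
fn format_balance(amount: u128, decimals: u32, unit: &str) -> String {
    let divisor = 10u128.pow(decimals);
    let integer = amount / divisor;
    // Zero-pad the fractional part to the full decimals width,
    // then drop trailing zeros for display.
    let frac = format!("{:0width$}", amount % divisor, width = decimals as usize);
    let frac = frac.trim_end_matches('0');
    if frac.is_empty() {
        format!("{integer} {unit}")
    } else {
        format!("{integer}.{frac} {unit}")
    }
}

fn main() {
    assert_eq!(format_balance(100_000_000_000, 12, "WND"), "0.1 WND");
    assert_eq!(format_balance(5, 0, "UNIT"), "5 UNIT");
}
```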


Extensions content

| Extensions part | Meaning |
| :- | :- |
| b501 | Era: phase 27, period 64 |
| b8 | Nonce: 46 |
| 00 | Tip: 0 pWND |
| 32230000 | Metadata version: 9010 |
| 05000000 | Tx version: 5 |
| e143..423e | Westend genesis hash |
| 538a..3f33 | Block hash |
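The Era value above can be checked against Substrate's two-byte mortal era encoding (a simplified sketch; `decode_mortal_era` is our own name):

```rust
/// Decode a two-byte mortal Era: the low four bits of the little-endian
/// u16 give the period as a power of two, the remaining bits give the
/// quantized phase.
fn decode_mortal_era(b0: u8, b1: u8) -> (u64, u64) {
    let encoded = u16::from_le_bytes([b0, b1]) as u64;
    let period = 2u64 << (encoded & 0b1111);
    // Phases are stored quantized for long periods.
    let quantize_factor = (period >> 12).max(1);
    let phase = (encoded >> 4) * quantize_factor;
    (period, phase)
}

fn main() {
    // b501 from the example above: period 64, phase 27.
    assert_eq!(decode_mortal_era(0xb5, 0x01), (64, 27));
}
```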





Message has the following structure:

| prelude | public key | [u8] slice | network genesis hash |
| :- | :- | :- | :- |

[u8] slice is represented as String if all bytes are valid UTF-8. If not all bytes are valid UTF-8, the Vault produces an error.

It is critical that the message payloads are always clearly distinguishable from the transaction payloads, i.e. it is never possible to trick user to sign transaction posing as a message.

Current proposal is to enable message signing only with Sr25519 encryption algorithm, with designated signing context, different from the signing context used for transactions signing.

Bulk transactions

Bulk transactions is a SCALE-encoded TransactionBulk structure that consists of concatenated Vec<u8> transactions.

Bulking is a way to sign multiple transactions at once and reduce the number of QR codes to scan.

Bulk transactions are processed in exactly the same way as single transactions.


Update has the following general structure:

| prelude | verifier public key (if signed) | update payload | signature (if signed) | reserved tail |
| :- | :- | :- | :- | :- |

Note that the verifier public key and signature parts appear only in signed uploads. Preludes [0x53, 0xff, 0x<payload code>] are followed by the update payload.

Every time user receives an unsigned update, the Vault displays a warning that the update is not verified. Generally, the use of unsigned updates is discouraged.

For update signing it is recommended to use a dedicated key, not used for transactions. This way, if the signed data was not really the update data, but something else posing as the update data, the signature produced could not do any damage.

| Encryption | Public key length, bytes | Signature length, bytes |
| :- | :- | :- |
| Ed25519 | 32 | 64 |
| Sr25519 | 32 | 64 |
| Ecdsa | 33 | 65 |
| no encryption | 0 | 0 |

reserved tail currently is not used and is expected to be empty. It could be used later if the multisignatures are introduced for the updates. Expecting reserved tail in update processing is done to keep code continuity in case multisignatures introduction ever happens.

Because of the reserved tail, the update payload length has to be always exactly declared, so that the update payload part could be cut correctly from the update.

A detailed description of the update payloads, and of the form in which they are used in the update itself and for generating the update signature, can be found in the Rust module definitions::qr_transfers.

add_specs update payload, payload code c1

Introduces a new network to Vault, i.e. adds network specs to the Vault database.

Update payload is ContentAddSpecs in to_transfer() form, i.e. double SCALE-encoded NetworkSpecsToSend (second SCALE is to have the exact payload length).

Payload signature is generated for SCALE-encoded NetworkSpecsToSend.

Network specs are stored in dedicated SPECSTREE tree of the Vault database. Network specs identifier is NetworkSpecsKey, a key built from encryption used by the network and the network genesis hash. There could be networks with multiple encryption algorithms supported, thus the encryption is part of the key.

Some elements of the network specs could be slightly different for networks with the same genesis hash and different encryptions. These are:

  • Invariant specs, identical between all different encryptions:

    • name (network name as it appears in metadata)
    • base58 prefix

    The reason is that the network name is, and the base58 prefix can be, a part of the network metadata, and the network metadata is not encryption-specific.

  • Specs static for given encryption, that should not change over time once set:

    • decimals
    • unit

    To replace these, the user would need to remove the network and add it again, i.e. it won't be possible to do by accident.

  • Flexible display-related and convenience specs, that can change and could be changed by simply loading new ones over the old ones:

    • color and secondary color (both currently not used, but historically are there and may return at some point)
    • logo
    • path (default derivation path for network, //<network_name>)
    • title (network title as it gets displayed in the Vault)

load_metadata update payload, payload code 80

Loads metadata for a network already known to Vault, i.e. for a network with network specs in the Vault database.

Update payload is ContentLoadMeta in to_transfer() form, and consists of concatenated SCALE-encoded metadata Vec<u8> and network genesis hash (H256, always 32 bytes).

Same blob is used to generate the signature.

Network metadata is stored in the dedicated METATREE tree of the Vault database. Network metadata identifier is MetaKey, a key built from the network name and the network metadata version.

Metadata suitable for Vault

Network metadata can get into Vault and be used by Vault only if it complies with the following requirements:

  • metadata vector starts with b"meta" prelude
  • part of the metadata vector after b"meta" prelude is decodable as RuntimeMetadata
  • RuntimeMetadata version of the metadata is V12, V13 or V14
  • Metadata has System pallet
  • There is Version constant in System pallet
  • Version is decodable as RuntimeVersion
  • If the metadata contains base58 prefix, it must be decodable as u16 or u8

Additionally, if the metadata V14 is received, its associated extensions will be scanned and user will be warned if the extensions are incompatible with transactions signing.

Also, in the case of V14 metadata, the type of the encoded data stored in the Version constant is itself stored in the metadata types registry and in principle could be different from RuntimeVersion above. At the moment, the type of Version is hardcoded, and any other types would not be processed and would get rejected with an error.
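The first two requirements above can be sketched as follows (our own helper; this relies on the SCALE-encoded RuntimeMetadata enum starting with its variant index, which for V12-V14 equals the version number):

```rust
/// Check the b"meta" prelude and the RuntimeMetadata version byte of a
/// metadata vector, returning the version if it is supported by Vault.
fn check_metadata_prelude(meta: &[u8]) -> Result<u8, &'static str> {
    if !meta.starts_with(b"meta") {
        return Err("no b\"meta\" prelude");
    }
    // The byte right after the prelude is the RuntimeMetadata enum
    // variant index, i.e. the metadata version.
    let version = *meta.get(4).ok_or("metadata too short")?;
    match version {
        12 | 13 | 14 => Ok(version),
        _ => Err("RuntimeMetadata version not supported"),
    }
}

fn main() {
    // 0x0e = 14: a V14 metadata prelude.
    assert_eq!(check_metadata_prelude(b"meta\x0e\x00\x00"), Ok(14));
    assert!(check_metadata_prelude(b"nope").is_err());
}
```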

load_types update payload, payload code 81

Load types information.

Type information is needed to decode transactions made in networks with metadata RuntimeMetadata version V12 or V13.

Most of the networks are already using RuntimeMetadata version V14, which has types information incorporated in the metadata itself.

The load_types update is expected to become obsolete soon.

Update payload is ContentLoadTypes in to_transfer(), i.e. double SCALE-encoded Vec<TypeEntry> (second SCALE is to have the exact payload length).

Payload signature is generated for SCALE-encoded Vec<TypeEntry>.

Types information is stored in SETTREE tree of the Vault database, under key TYPES.


Vault can accept both verified and non-verified updates, however, information once verified can not be replaced or updated by a weaker verifier without full Vault reset.

A verifier could be Some(_) with corresponding public key inside or None. All verifiers for the data follow trust on first use principle.

Vault uses:

  • a single general verifier
  • a network verifier for each of the networks introduced to the Vault

General verifier information is stored in SETTREE tree of the Vault database, under key GENERALVERIFIER. General verifier is always set to a value, be it Some(_) or None. Removing the general verifier means setting it to None. If no general verifier entry is found in the database, the database is considered corrupted and the Vault must be reset.

Network verifier information is stored in dedicated VERIFIERS tree of the Vault database. Network verifier identifier is VerifierKey, a key built from the network genesis hash. Same network verifier is used for network specs with any encryption algorithm and for network metadata. Network verifier could be valid or invalid. Valid network verifier could be general or custom. Verifiers installed as a result of an update are always valid. Invalid network verifier blocks the use of the network unless the Vault is reset, it appears if user marks custom verifier as no longer trusted.

Updating verifier could cause some data verified by the old verifier to be removed, to avoid confusion regarding which verifier has signed the data currently stored in the database. The data removed is called "hold", and user receives a warning if accepting new update would cause hold data to be removed.

General verifier

General verifier is the strongest and the most reliable verifier known to the Vault. General verifier could sign all kinds of updates. By default the Vault uses Parity-associated key as general verifier, but users can remove it and set their own. There could be only one general verifier at any time.

General verifier could be removed only by complete wipe of the Vault, through the Remove general certificate button in the Settings. This will reset the Vault database to the default content and set the general verifier to None, which will then be updated to the first verifier encountered by the Vault.

Expected usage for this is that the user removes old general verifier and immediately afterwards loads an update from the preferred source, thus setting the general verifier to the user-preferred value.

General verifier can be updated from None to Some(_) by accepting a verified update. This would result in removing "general hold", i.e.:

  • all network data (network specs and metadata) for the networks for which the verifier is set to the general one
  • types information

General verifier could not be changed from Some(_) to another, different Some(_) by simply accepting updates.

Note that if the general verifier is None, none of the custom verifiers could be Some(_). Similarly, if the verifier is recorded as custom in the database, its value can not be the same as the value of the general verifier. If found, those situations indicate the database corruption.

Custom verifiers

Custom verifiers could be used for network information that was verified, but not with the general verifier. There could be as many custom verifiers as needed at any time. A custom verifier is considered weaker than the general verifier.

Custom verifier set to None could be updated to:

  • Another custom verifier set to Some(_)
  • General verifier

Custom verifier set to Some(_) could be updated to general verifier.

These verifier updates can be done by accepting an update signed by a new verifier.

Any of the custom network verifier updates would result in removing "hold", i.e. all network specs entries (for all encryption algorithms on file) and all network metadata entries.
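The verifier upgrade rules above can be modeled as a small check (an illustrative sketch, not the Vault data model; type and function names are ours):

```rust
#[derive(Clone, PartialEq, Debug)]
enum NetworkVerifier {
    General,
    // Custom verifier: Some(public key bytes) or None.
    Custom(Option<Vec<u8>>),
}

/// Whether an existing network verifier may be replaced without a Vault
/// reset: a None custom verifier may move to a Some(_) custom verifier
/// or to the general one, a Some(_) custom verifier only to the general
/// one; a verifier never gets weaker.
fn upgrade_allowed(old: &NetworkVerifier, new: &NetworkVerifier) -> bool {
    use NetworkVerifier::*;
    match (old, new) {
        (Custom(None), Custom(Some(_))) | (Custom(_), General) => true,
        // Otherwise only the same verifier is acceptable.
        _ => old == new,
    }
}

fn main() {
    use NetworkVerifier::*;
    assert!(upgrade_allowed(&Custom(None), &General));
    assert!(upgrade_allowed(&Custom(Some(vec![1])), &General));
    assert!(!upgrade_allowed(&General, &Custom(None)));
}
```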

Common update processing sequence:

  1. Cut the QR data and get:

    • encryption used by verifier (single u8 from prelude)
    • (only if the update is signed, i.e. the encryption is not 0xff) update verifier public key, its length matching the encryption (32 or 33 u8 immediately after the prelude)
    • concatenated update payload, verifier signature (only if the update is signed) and reserved tail.

    If the data length is insufficient, Vault produces an error and suggests loading an undamaged update.

  2. Using the payload type from the prelude, determine the update payload length and cut payload from the concatenated verifier signature and reserved tail.

    If the data length is insufficient, Vault produces an error and suggests loading an undamaged update.

  3. (only if the update is signed, i.e. the encryption is not 0xff) Cut verifier signature, its length matching the encryption (64 or 65 u8 immediately after the update payload). Remaining data is reserved tail, currently it is not used.

    If the data length is insufficient, Vault produces an error and suggests loading an undamaged update.

  4. Verify the signature for the payload. If this fails, Vault produces an error indicating that the update has invalid signature.
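Steps 1-3 above can be sketched for the signed case as follows (our own helper names; payload_len is assumed to come from the payload-type-specific length rules, and signature verification itself is left out):

```rust
/// Parts of a signed update, cut from everything after the prelude.
struct SignedUpdate<'a> {
    verifier_key: &'a [u8],
    payload: &'a [u8],
    signature: &'a [u8],
    reserved_tail: &'a [u8],
}

/// Cut a signed update using the per-encryption key and signature
/// lengths (32/64 for Ed25519 and Sr25519, 33/65 for Ecdsa). A reserved
/// tail may follow the signature, hence the explicit payload length.
fn cut_signed_update(
    data: &[u8],    // everything after the prelude
    encryption: u8, // 0x00, 0x01 or 0x02
    payload_len: usize,
) -> Result<SignedUpdate<'_>, &'static str> {
    let (key_len, sig_len) = match encryption {
        0x00 | 0x01 => (32, 64),
        0x02 => (33, 65),
        _ => return Err("unsigned updates carry no key or signature"),
    };
    if data.len() < key_len + payload_len + sig_len {
        return Err("update data too short");
    }
    let (verifier_key, rest) = data.split_at(key_len);
    let (payload, rest) = rest.split_at(payload_len);
    let (signature, reserved_tail) = rest.split_at(sig_len);
    Ok(SignedUpdate { verifier_key, payload, signature, reserved_tail })
}

fn main() {
    let mut data = vec![0u8; 32]; // Sr25519 verifier public key
    data.extend_from_slice(&[7, 7]); // 2-byte payload
    data.extend_from_slice(&[0u8; 64]); // signature; empty reserved tail
    let update = cut_signed_update(&data, 0x01, 2).unwrap();
    assert_eq!(update.verifier_key.len(), 32);
    assert_eq!(update.payload, &[7, 7]);
    assert!(update.reserved_tail.is_empty());
}
```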

add_specs processing sequence

  1. Update payload is transformed into ContentAddSpecs and the incoming NetworkSpecsToSend are retrieved, or the Vault produces an error indicating that the add_specs payload is damaged.

  2. Vault checks that there is no change in invariant specs occurring.

    If there are entries in the SPECSTREE of the Vault database with same genesis hash as in newly received specs (the encryption not necessarily matches), the Vault checks that the name and base58 prefix in the received specs are same as in the specs already in the Vault database.

  3. Vault checks the verifier entry for the received genesis hash.

    If there are no entries, i.e. the network is altogether new to the Vault, the specs could be added into the database. During the same database transaction the network verifier is set up:

    | add_specs update verification | General verifier in Vault database | Action |
    | :- | :- | :- |
    | unverified, 0xff update encryption code | None or Some(_) | (1) set network verifier to custom, None (regardless of the general verifier); (2) add specs |
    | verified by a | None | (1) set network verifier to general; (2) set general verifier to Some(a), process the general hold; (3) add specs |
    | verified by a | Some(b) | (1) set network verifier to custom, Some(a); (2) add specs |
    | verified by a | Some(a) | (1) set network verifier to general; (2) add specs |

    If there are entries, i.e. the network was known to the Vault at some point after the last Vault reset, the network verifier in the database and the verifier of the update are compared. The specs could be added into the database if:

    1. no verifier mismatch is encountered (i.e. the verifier is the same or stronger)
    2. the received data causes no change in the specs that are static for a given encryption
    3. the specs are not yet in the database in exactly the same form

    Note that if exactly the same specs as are already in the database are received with an updated verifier and the user accepts the update, the verifier will be updated and the specs will stay in the database.

    | add_specs update verification | Network verifier in Vault database | General verifier in Vault database | Action |
    | :- | :- | :- | :- |
    | unverified, 0xff update encryption code | custom, None | None | accept specs if good |
    | unverified, 0xff update encryption code | custom, None | Some(a) | accept specs if good |
    | unverified, 0xff update encryption code | general | None | accept specs if good |
    | unverified, 0xff update encryption code | general | Some(a) | error: update should have been signed by a |
    | verified by a | custom, None | None | (1) change network verifier to general, process the network hold; (2) set general verifier to Some(a), process the general hold; (3) accept specs if good |
    | verified by a | custom, None | Some(a) | (1) change network verifier to general, process the network hold; (2) accept specs if good |
    | verified by a | custom, None | Some(b) | (1) change network verifier to custom, Some(a), process the network hold; (2) accept specs if good |
    | verified by a | custom, Some(a) | Some(b) | accept specs if good |
    | verified by a | custom, Some(b) | Some(a) | (1) change network verifier to general, process the network hold; (2) accept specs if good |
    | verified by a | custom, Some(b) | Some(c) | error: update should have been signed by b or c |

    Before the NetworkSpecsToSend are added in the SPECSTREE, they are transformed into NetworkSpecs and get the order field (display order in Vault network lists) added. Each new network specs entry is added at the end of the list.
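The new-network branch of the verifier logic above (the first table) can be sketched as follows. The types and function names here are simplified stand-ins for illustration, not the Vault's actual implementation.

```rust
// Simplified stand-in for the Vault's verifier types; illustrative only.
#[derive(Debug, PartialEq)]
enum NetworkVerifier {
    Custom(Option<String>), // Custom(None) means "custom, unverified"
    General,
}

/// Decide verifiers for a network that is new to the Vault database.
/// `update_verifier`: Some(key) if the update is signed, None for 0xff.
/// `general_verifier`: the general verifier currently in the database.
/// Returns (network verifier to set, resulting general verifier).
fn verifier_for_new_network(
    update_verifier: Option<&str>,
    general_verifier: Option<&str>,
) -> (NetworkVerifier, Option<String>) {
    match (update_verifier, general_verifier) {
        // unverified update: custom, None; general verifier untouched
        (None, g) => (NetworkVerifier::Custom(None), g.map(String::from)),
        // signed, no general verifier yet: the signer becomes the general verifier
        (Some(a), None) => (NetworkVerifier::General, Some(a.to_string())),
        // signed by the general verifier itself: network uses the general verifier
        (Some(a), Some(b)) if a == b => (NetworkVerifier::General, Some(b.to_string())),
        // signed by someone else: custom verifier for this network only
        (Some(a), Some(b)) => (
            NetworkVerifier::Custom(Some(a.to_string())),
            Some(b.to_string()),
        ),
    }
}
```

The hold processing (re-checking data previously tied to the old verifier) is omitted here; only the verifier selection itself is shown.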

load_meta processing sequence

  1. Update payload is transformed into ContentLoadMeta, from which the metadata and the genesis hash are retrieved, or the Vault produces an error indicating that the load_metadata payload is damaged.

  2. Vault checks that the received metadata fulfills all Vault metadata requirements outlined above. Otherwise an error is produced indicating that the received metadata is invalid.

    Incoming MetaValues are produced, containing the network name, the network metadata version, and an optional base58 prefix (if it is recorded in the metadata).

  3. Network genesis hash is used to generate VerifierKey and check if the network has an established network verifier in the Vault database. If there is no network verifier associated with the genesis hash, an error is produced, indicating that network metadata can be loaded only for networks already introduced to the Vault.

  4. SPECSTREE tree of the Vault database is scanned in search of entries with genesis hash matching the one received in payload.

    Vault accepts load_metadata updates only for the networks that have at least one network specs entry in the database.

    Note that even if the verifier in step (3) above is found, this does not necessarily mean that the specs are found (for example, if a network verified by the general verifier was removed by the user).

    If the specs are found, the Vault checks that the network name and, if present, base58 prefix from the received metadata match the ones in network specs from the database. If the values do not match, the Vault produces an error.

  5. Vault compares the verifier of the received update with the verifier for the network from the database. The update verifier must be exactly the same as the verifier already in the database. If there is a mismatch, Vault produces an error indicating that the load_metadata update for this network must be signed by the specified verifier (general or custom) or be unsigned.

  6. If the update has passed all checks above, the Vault searches for the metadata entry in the METATREE of the Vault database, using network name and version from update to produce MetaKey.

    If the key is not found in the database, the metadata could be added.

    If the key is found in the database and the metadata is exactly the same, the Vault produces an error indicating that the metadata is already in the database. This is expected to be quite a common outcome.

    If the key is found in the database and the metadata is different, the Vault produces an error: such metadata is not acceptable. This situation can occur if there was a silent metadata update or if the metadata is corrupted.
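Step 6 amounts to a three-way decision on the (network name, version) key. A hedged sketch, with a plain HashMap standing in for the METATREE and the key pair standing in for MetaKey:

```rust
use std::collections::HashMap;

// Illustrative outcomes of the step-6 lookup; names are invented.
#[derive(Debug, PartialEq)]
enum MetaOutcome {
    Add,               // key absent: metadata could be added
    AlreadyInDatabase, // same key, identical metadata: error, common outcome
    NotAcceptable,     // same key, different metadata: silent update or corruption
}

fn check_received_metadata(
    metatree: &HashMap<(String, u32), Vec<u8>>,
    name: &str,
    version: u32,
    received: &[u8],
) -> MetaOutcome {
    // (name, version) plays the role of the MetaKey
    match metatree.get(&(name.to_string(), version)) {
        None => MetaOutcome::Add,
        Some(existing) if existing.as_slice() == received => MetaOutcome::AlreadyInDatabase,
        Some(_) => MetaOutcome::NotAcceptable,
    }
}
```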

load_types processing sequence

  1. Update payload is transformed into ContentLoadTypes, from which the types description vector Vec<TypeEntry> is retrieved, or the Vault produces an error indicating that the load_types payload is damaged.

  2. load_types updates must be signed by the general verifier.

    | load_types update verification | General verifier in Vault database | Action |
    | :- | :- | :- |
    | unverified, 0xff update encryption code | None | load types if the types are not yet in the database |
    | verified by a | None | (1) set general verifier to Some(a), process the general hold; (2) load types, warn if the types are the same as before |
    | verified by a | Some(b) | reject types, error indicates that load_types requires general verifier signature |
    | verified by a | Some(a) | load types if the types are not yet in the database |

    If the load_types verifier is the same as the general verifier in the database and the types are the same as the types in the database, the Vault produces an error indicating that the types are already known.

    Each time the types are loaded, the Vault produces a warning: load_types is a rare and quite unexpected operation.
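The load_types verifier rule can be sketched as a small function; the types and names are invented for illustration, and the case of an unsigned update arriving while a general verifier is already set is an assumption consistent with "must be signed by the general verifier".

```rust
// Illustrative outcomes; not the Vault's actual types.
#[derive(Debug, PartialEq)]
enum TypesOutcome {
    Load { new_general_verifier: Option<String> },
    Reject, // error: load_types requires the general verifier signature
}

fn process_load_types(
    update_verifier: Option<&str>,
    general_verifier: Option<&str>,
) -> TypesOutcome {
    match (update_verifier, general_verifier) {
        // no general verifier set yet: unsigned types may be loaded
        (None, None) => TypesOutcome::Load { new_general_verifier: None },
        // general verifier set, unsigned update: rejected
        // (assumption, consistent with the rule stated above)
        (None, Some(_)) => TypesOutcome::Reject,
        // first signed update sets the general verifier (general hold processed)
        (Some(a), None) => TypesOutcome::Load { new_general_verifier: Some(a.to_string()) },
        // signed by the current general verifier
        (Some(a), Some(b)) if a == b => {
            TypesOutcome::Load { new_general_verifier: Some(b.to_string()) }
        }
        // signed by anyone else: rejected
        (Some(_), Some(_)) => TypesOutcome::Reject,
    }
}
```

In both Load cases the types are only actually written if they are not yet in the database.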

Derivations import, payload code de

Derivations import has the following structure:

| prelude | derivations import payload |
| :- | :- |

Derivations import payload is a SCALE-encoded ExportAddrs structure.

It does not contain any private keys or seed phrases.

ExportAddrs structure holds the following information about each key:

  • name and public key of the seed the derived key belongs to
  • ss58 address of the derived key (H160 for Ethereum-based chains)
  • derivation path
  • encryption type
  • genesis hash of the network the key is used in
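The per-key information listed above might be modeled as a struct like the following. The field names and types are assumptions for illustration, not the actual ExportAddrs definition.

```rust
// Hypothetical shape of one derived-key record; not the real ExportAddrs.
struct ExportedDerivation {
    seed_name: String,        // name of the seed the derived key belongs to
    seed_public_key: Vec<u8>, // public key of that seed
    address: String,          // ss58 address (H160 hex for Ethereum-based chains)
    derivation_path: String,  // e.g. "//polkadot//0"
    encryption: String,       // encryption type, e.g. "sr25519"
    genesis_hash: [u8; 32],   // network the key is used in
}
```

Note that no field carries a private key or seed phrase; the import is safe to transfer over QR code.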

When processing a derivations import, all data after the prelude is transformed into ExportAddrs. The network genesis hash, encryption, and derivations set are derived from it, or the Vault produces a warning indicating that the derivations import payload is corrupted.

Vault checks that the network for which the derivations are imported has network specs in the Vault database. If not, a warning is produced.

Vault checks that the derivation set contains only valid derivations. If any derivation is unsuitable, a warning is produced indicating this.

If the user accepts the derivations import, Vault generates a key for each valid derivation.

If one of the derived keys already exists, it gets ignored, i.e. no error is produced.

If there are two derivations with identical path within the payload, only one derived key is created.
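The last two rules, skipping already-existing keys and collapsing duplicate paths within one payload, can be sketched as a simple filter. Names are illustrative, not the Vault's implementation.

```rust
use std::collections::HashSet;

/// Given the derivation paths already present for a seed/network and the
/// paths from the import payload, return the paths for which keys will
/// actually be created.
fn keys_to_create<'a>(
    existing: &HashSet<&'a str>,
    imported_paths: &[&'a str],
) -> Vec<&'a str> {
    let mut seen = HashSet::new();
    imported_paths
        .iter()
        .copied()
        .filter(|p| !existing.contains(p)) // already-existing keys are ignored, no error
        .filter(|p| seen.insert(*p))       // identical paths create one key only
        .collect()
}
```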

Vault ecosystem

The Vault repository contains three tools that are part of the Vault ecosystem.

Greater Vault ecosystem: