What is tokenization?

Tokenization is the process of replacing sensitive data with unique identification symbols (tokens) that retain the essential, non-sensitive information about the data without compromising its security. The token itself has no exploitable value; only the tokenization system can map it back to the original data.
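The idea can be sketched with a minimal vault: a random token stands in for the sensitive value, and the mapping back to the original lives only inside the vault. This is an illustrative toy (class and method names are hypothetical), not a production tokenization design.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens back to original values."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, value: str) -> str:
        # Token is random, so it cannot be reversed without the vault.
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only the holder of the vault can recover the original value.
        return self._vault[token]

vault = TokenVault()
card = "4111 1111 1111 1111"
token = vault.tokenize(card)
assert token != card                      # token reveals nothing
assert vault.detokenize(token) == card    # vault restores the original
```

In practice, tokens are often format-preserving (e.g. keeping the length or last four digits of a card number) so downstream systems can handle them unchanged.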