hello@peernova.com   |  (669) 400-7800

Cuneiform Platform

Delivering auditability at scale for the financial industry by combining Big Data, Blockchain and Cloud technologies.

Cuneiform is named after one of the earliest writing systems, which was most often used for accounting. Financial institutions today generate digital records at Big Data scale. However, digital records on their own are often trivial to modify.

Conventional blockchain-oriented projects have proposed innovative mechanisms for making data tamper-evident or adding non-repudiation properties to data. Unfortunately, these technologies cannot scale to the quantity and velocity of data required for true enterprise applications. PeerNova’s Cuneiform platform addresses these issues by delivering auditability at scale.

The Cuneiform platform adds data integrity and non-repudiation techniques inspired by blockchain technology to a modern Big Data platform.  It also includes capabilities required by the financial industry such as provable event lineages, granular security policies, and the ability to redact subsets of confidential data while still being able to authenticate the whole.

Enterprise solution
at the junction of Blockchain, Big Data, and Cloud


Multiple levels of integrity checks, data at rest and data in motion encryption.


Cryptographic event lineage for audit and compliance.

Permissioned Access

Unlike conventional blockchain implementations, Cuneiform supports industry-standard authentication and authorization.


Ability to mask individual fields in messages and transactions, while allowing 3rd party validation.


Micro-services based architecture for horizontal scalability.

Heterogeneous Data Sources

Data can be ingested from legacy data platforms.


Data is recorded and validated in real-time, with minimal latency.


Create live dashboards and gain insights from cryptographically validated datasets.


Enable various internal and external trusted parties to access data and applications in a permissioned manner.

Partial Replication

Ability to fully validate data using a partially replicated copy shared amongst cloud participants.

Hybrid Cloud

Flexibility for Public Cloud, Private Cloud and Hybrid installations.

Automated Deployment

Scripts and automation tools for monitoring, logging, upgrading, and orchestration.

“Blockchain could dramatically reduce the cost of transactions and, if adopted widely, reshape the economy.”

Harvard Business Review – Jan 2017

Cuneiform System Features

The Cuneiform platform serves financial institutions in use cases that involve event lineage, reconciliation between parties and counterparties, data sharing, audit, and compliance challenges. These capabilities are achieved by performing integrity checks, providing privacy via redaction, and supporting replay of business logic.

Assertions of
Integrity Checks

Transaction data ingested into the Cuneiform system is stored as key-value pairs, the key being a one-way cryptographic hash (SHA-256 or SHA-512) of the data. The data itself is extracted according to a data schema protocol. Thus, every data value has an equivalent key, which is its cryptographic hash. Given a specific hashing operation, the computation of a key for any given data value is deterministic.

Any third party computing the key of that same data value with the same hashing operation will obtain the same hash. If the original and derived keys differ, the retrieved data value has been corrupted. This property is critical when data is presented to a third party, who can then independently compute the key of any given piece of data; it is a powerful integrity-checking characteristic that the data model provides naturally.
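The content-addressed scheme above can be sketched in a few lines of Python. This is an illustrative toy, not PeerNova's implementation: the key of each value is its SHA-256 hash, so any party can recompute the key and detect corruption.

```python
import hashlib

def make_key(value: bytes) -> str:
    """Derive the deterministic key for a data value: its SHA-256 hash."""
    return hashlib.sha256(value).hexdigest()

# A toy key-value store keyed by content hash.
store = {}

def put(value: bytes) -> str:
    key = make_key(value)
    store[key] = value
    return key

def verify(key: str) -> bool:
    """An independent party recomputes the key; a mismatch means corruption."""
    return make_key(store[key]) == key

k = put(b'{"trade_id": 42, "qty": 100}')
assert verify(k)                                # intact data validates
store[k] = b'{"trade_id": 42, "qty": 999}'      # simulate tampering
assert not verify(k)                            # tampering is detected
```

Because the hash is deterministic, a third party needs no shared secret to perform this check, only the data value and the agreed hashing operation.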

Immutability via
External Anchoring

The use of chained Merkle trees within a Cuneiform system forms the basis for integrity checking, as any specific data set results in a unique set of Merkle roots. When the system records the Merkle roots, it is making a commitment that a future user or recipient of the same data will be able to recompute the same root values.

While it is assumed that Merkle roots recorded by a Cuneiform system will always be accurate, independent third parties should be confident that these root values remain unchanged and have not been substituted with alternate data. Cuneiform solves this with a process called External Anchoring, by which root hash signatures are computed and recorded in such a way that an independent third party can trust them and verify that the specific root hash signatures existed at a specific, observable point in time.
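A minimal Merkle-root computation illustrates the commitment described above. This is a generic sketch (SHA-256, duplicate-last-node padding for odd levels), not Cuneiform's exact tree construction: anyone holding the same events recomputes the identical root, while any substitution changes it.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute the Merkle root of a list of leaf values."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])     # hash adjacent pairs upward
                 for i in range(0, len(level), 2)]
    return level[0]

batch = [b"event-1", b"event-2", b"event-3"]
root = merkle_root(batch)
# A third party with the same events recomputes the identical root...
assert merkle_root(batch) == root
# ...while any substitution changes it.
assert merkle_root([b"event-1", b"event-X", b"event-3"]) != root
```

Anchoring such a root externally (for example, publishing it to an independent ledger with a timestamp) lets a third party confirm the root existed at that point in time without trusting the recording system.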

Privacy
via Redaction

In a permissionless distributed ledger, each participant has transparent access to a full copy of the ledger, so any privacy assurance must be handled at a layer of the stack above the consensus protocol.

When organizations and persons hold different roles, shared data often contains sensitive information that needs to be protected from open access. In such cases, data is salted: random data is used as an additional input to the cryptographic hash function. Moreover, the Cuneiform platform can redact or obfuscate the true value and nature of some of the composite data, while all individual participants can still attest to the un-redacted contents.

Replay of
Business Logic

Financial institutions are required to submit OATS reports to FINRA on a nightly basis. FINRA uses this audit trail system to recreate events in the lifecycle of orders and monitor trading practices. This is one example of a compliance system requiring auditability of data and state, and most organizations face a heavy burden in meeting such requirements.

Complete auditability is the ability to make assertions about the state of the system at any given time by replaying the business logic on an input data set at every step and validating that the final outputs match.
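Replay-based validation can be sketched as below. The event structure and `business_logic` step are hypothetical stand-ins for illustration: an auditor re-runs a deterministic function over the event log and checks the hash of each intermediate state against the hashes recorded at ingest time.

```python
import hashlib
import json

def digest(obj) -> str:
    """Canonical hash of a state: JSON with sorted keys, then SHA-256."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def business_logic(state, event):
    """Hypothetical deterministic step: apply one order event to the state."""
    state = dict(state)
    state[event["order"]] = state.get(event["order"], 0) + event["qty"]
    return state

def replay(events, recorded_state_hashes):
    """Re-run the logic over the event log, validating each recorded state."""
    state = {}
    for event, expected in zip(events, recorded_state_hashes):
        state = business_logic(state, event)
        if digest(state) != expected:
            return False                 # state diverged from the audit trail
    return True

events = [{"order": "A", "qty": 10}, {"order": "A", "qty": -4}]
# Hashes an auditor would have recorded at ingest time:
recorded, s = [], {}
for e in events:
    s = business_logic(s, e)
    recorded.append(digest(s))
assert replay(events, recorded)
```

Because the logic is deterministic and each state hash is committed up front, any later tampering with either the events or the recorded states causes the replay check to fail.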