Syntalium Wiki

Crypto Data Sources Used in Market Intelligence

TL;DR

Data quality determines model quality. Cross-validate feeds and downgrade confidence when data integrity degrades.

Clear explanation

Institutional pipelines combine multiple feed classes: trades, order books, derivatives basis, and liquidity signals.

Every source has failure modes such as stale updates, symbol mismatches, and venue outages.

Syntalium validates feed freshness and consistency before producing score or state outputs.
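Cross-validating freshness and consistency before scoring can be sketched as follows. This is a minimal illustration, not Syntalium's actual implementation; the function name, thresholds, and tick layout are assumptions for the example.

```python
# Minimal sketch of pre-scoring feed validation (illustrative only).
# MAX_AGE_S and MAX_DIVERGENCE are hypothetical thresholds; real
# systems tune these per feed and per instrument.
MAX_AGE_S = 2.0          # maximum acceptable staleness, seconds
MAX_DIVERGENCE = 0.005   # 0.5% max divergence from the cross-venue mid

def validate_feeds(ticks: dict, now: float) -> dict:
    """Return per-venue validity flags based on freshness and
    cross-venue price consistency. `ticks` maps venue -> (ts, mid)."""
    # Build a reference mid-price from fresh feeds only.
    mids = [mid for ts, mid in ticks.values() if now - ts <= MAX_AGE_S]
    ref = sum(mids) / len(mids) if mids else None
    flags = {}
    for venue, (ts, mid) in ticks.items():
        fresh = now - ts <= MAX_AGE_S
        consistent = ref is not None and abs(mid - ref) / ref <= MAX_DIVERGENCE
        flags[venue] = fresh and consistent
    return flags
```

A feed that is stale, or that diverges too far from the consensus mid, is marked invalid and can be excluded before any score or state output is produced.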

Technical example: validation before scoring

A derivatives feed degrades during an outage. Validation flags suppress aggressive model reactions.

  1. Measure freshness and sequence continuity.
  2. Detect latency spikes and data gaps.
  3. Apply confidence penalties.
  4. Store flags in SNAP payload for audit.
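The four steps above can be sketched in one validation pass. This is an illustrative sketch, not the production logic; `FeedSample`, the thresholds, and the halving penalty are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values are feed-specific.
MAX_AGE_S = 1.0       # freshness bound, seconds
MAX_LATENCY_S = 0.25  # receive-time latency bound, seconds

@dataclass
class FeedSample:
    ts: float       # exchange timestamp
    recv_ts: float  # local receive timestamp
    seq: int        # venue sequence number

def validate(samples: list, now: float) -> dict:
    """Run the freshness, sequence, and latency checks, then derive
    a multiplicative confidence penalty for downstream scoring."""
    flags = {"stale": False, "gap": False, "latency": False}
    # Step 1: freshness of the latest sample.
    if not samples or now - samples[-1].ts > MAX_AGE_S:
        flags["stale"] = True
    # Steps 1-2: sequence continuity and latency spikes.
    for prev, cur in zip(samples, samples[1:]):
        if cur.seq != prev.seq + 1:
            flags["gap"] = True
        if cur.recv_ts - cur.ts > MAX_LATENCY_S:
            flags["latency"] = True
    # Step 3: halve confidence for each failed check.
    penalty = 0.5 ** sum(flags.values())
    # Step 4: flags plus confidence travel in the SNAP payload for audit.
    return {"flags": flags, "confidence": penalty}
```

A degraded derivatives feed then surfaces as explicit flags and a reduced confidence value rather than an aggressive model reaction.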

ASCII model

Exchange trades ----\
Order book updates --+--> Data validation --> Feature engine --> SNAP/Score
Derivatives basis ---/
Latency + gaps ------> Confidence penalties

Source classes and controls

Source class | Key risk         | Control
------------ | ---------------- | ----------------------------------
Trade feed   | Missing bursts   | Sequence + freshness checks
Order book   | Depth distortion | Aggregation + persistence filters
Derivatives  | Basis anomalies  | Venue normalization bounds
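A persistence filter of the kind listed for order-book data can be sketched briefly. This is a hedged illustration, not a documented Syntalium control; the function name and the 80% presence threshold are assumptions.

```python
from collections import Counter

def persistent_levels(snapshots: list, min_presence: float = 0.8) -> set:
    """Keep only price levels present in at least `min_presence` of
    the given book snapshots; transient (spoof-like) depth that
    flickers in and out is filtered away. `snapshots` is a list of
    sets of price levels."""
    counts = Counter(level for snap in snapshots for level in snap)
    n = len(snapshots)
    return {level for level, c in counts.items() if c / n >= min_presence}
```

Levels that appear only briefly never reach the feature engine, which reduces depth distortion from fleeting orders.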

FAQ

Why not use one exchange only?

Single-source pipelines are fragile during local outages and anomalies.

Do delayed feeds distort models?

Yes. Stale data can create false transitions and poor risk decisions.

Should on-chain data be included?

It is useful for context, but intraday execution still depends on live market microstructure.