Syntalium Wiki
Crypto Data Sources Used in Market Intelligence
TL;DR
Data quality determines model quality. Cross-validate feeds and downgrade confidence when data integrity degrades.
Clear explanation
Institutional pipelines combine multiple feed classes: trades, order books, derivatives basis, and liquidity signals.
Every source has failure modes such as stale updates, symbol mismatches, and venue outages.
Syntalium validates feed freshness and consistency before producing score or state outputs.
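The freshness and cross-venue consistency checks described above can be sketched as follows. This is a minimal illustration, not Syntalium's actual implementation; the thresholds (`max_staleness_s`, `max_rel_spread`) are hypothetical placeholders.

```python
def feed_is_fresh(last_update_ts: float, now_ts: float,
                  max_staleness_s: float = 2.0) -> bool:
    """Accept a feed only if its latest update is recent enough for scoring."""
    return (now_ts - last_update_ts) <= max_staleness_s

def venues_consistent(mid_prices: list[float],
                      max_rel_spread: float = 0.005) -> bool:
    """Cross-validate venues: flag disagreement beyond a relative tolerance."""
    lo, hi = min(mid_prices), max(mid_prices)
    return (hi - lo) / lo <= max_rel_spread
```

A pipeline would run both checks before the feature engine and downgrade confidence, rather than hard-fail, when either check trips.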
Technical example: validation before scoring
A derivatives feed degrades during an outage. Validation flags suppress aggressive model reactions.
- Measure freshness and sequence continuity.
- Detect latency spikes and data gaps.
- Apply confidence penalties.
- Store flags in SNAP payload for audit.
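The four steps above can be sketched end to end: derive flags from health metrics, apply confidence penalties, and store everything in an audit payload. The field names, penalty schedule, and thresholds are illustrative assumptions, not the real SNAP schema.

```python
def build_snap_flags(staleness_s: float, seq_gaps: int, latency_ms: float,
                     max_staleness_s: float = 2.0,
                     max_latency_ms: float = 250.0) -> dict:
    """Turn feed health metrics into audit flags and a confidence multiplier."""
    flags = {
        "stale": staleness_s > max_staleness_s,       # freshness check
        "sequence_gap": seq_gaps > 0,                 # continuity check
        "latency_spike": latency_ms > max_latency_ms, # latency check
    }
    # Each raised flag halves confidence (hypothetical penalty schedule).
    confidence = 1.0
    for raised in flags.values():
        if raised:
            confidence *= 0.5
    return {"flags": flags, "confidence": confidence}
```

With this shape, a degraded derivatives feed lowers confidence instead of letting the model react aggressively to bad data, and the stored flags explain the downgrade after the fact.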
ASCII model
Exchange trades ----\
Order book updates --+--> Data validation --> Feature engine --> SNAP/Score
Derivatives basis ---/
Latency + gaps ------> Confidence penalties
Source classes and controls
| Source class | Key risk | Control |
|---|---|---|
| Trade feed | Missing bursts | Sequence + freshness checks |
| Order book | Depth distortion | Aggregation + persistence filters |
| Derivatives | Basis anomalies | Venue normalization bounds |
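As one example of the controls in the table, venue normalization bounds for the derivatives class can be sketched as a clamp on the perp-spot basis. The bound value is a hypothetical placeholder.

```python
def normalize_basis(spot: float, perp: float,
                    bound: float = 0.02) -> tuple[float, bool]:
    """Clamp the perp-spot basis to venue bounds and flag anomalies."""
    basis = (perp - spot) / spot
    anomalous = abs(basis) > bound       # basis anomaly detection
    clamped = max(-bound, min(bound, basis))
    return clamped, anomalous
```

Clamping keeps one venue's anomalous print from dominating downstream features, while the anomaly flag feeds the same confidence-penalty path as the other checks.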
Internal links
- Market Status model
See how data quality affects state confidence.
- Verification workflow
Validate source-informed snapshots with SHA256.
- What is SNAP
Data quality context is captured in immutable snapshots.
- What is Market Score
Understand source impact on score reliability.
FAQ
Why not use one exchange only?
Single-source pipelines are fragile during local outages and anomalies.
Do delayed feeds distort models?
Yes. Stale data can create false transitions and poor risk decisions.
Should on-chain data be included?
It is useful for context, but intraday execution still depends on live market microstructure.