Data Quality Dimensions in Action: Assessing and Managing Issues in Securities Lending
- guillaumelangeac
- Oct 3
Updated: Nov 10

On an agency lending desk, traders negotiate loans in seconds, operations teams manage collateral movements, and reporting teams prepare regulatory filings like SFTR. Each of these functions assumes one thing: that the data driving them is reliable. But how do we know?
That’s where data quality dimensions meet reality. Validity, completeness, consistency, accuracy — these aren’t abstract ideas. They’re lenses through which we test whether the data aligns with business expectations and rules.
The process starts by assessing data against expectations (a brief code sketch of such checks follows this list):
Does every UTI and LEI appear where required in SFTR fields?
Do loan rates fall within acceptable ranges?
Are collateral allocations consistent across front-, middle-, and back-office systems?
Do positions in the agent lender’s feed reconcile with custodian records?
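To make this concrete, here is a minimal sketch of how a few of these expectation checks might be expressed in Python with pandas. The column names (uti, counterparty_lei, loan_rate, isin, quantity), formats, and rate thresholds are illustrative assumptions, not the actual SFTR schema or any vendor's data model.

```python
# Illustrative only: column names, formats, and thresholds are assumptions.
import pandas as pd

def assess_loans(loans: pd.DataFrame) -> pd.DataFrame:
    """Flag records that break basic validity and completeness expectations."""
    checks = pd.DataFrame(index=loans.index)
    # Completeness and validity: every record should carry a well-formed UTI and LEI.
    checks["uti_ok"] = loans["uti"].fillna("").str.match(r"^[A-Z0-9]{1,52}$")      # UTI length limit assumed
    checks["lei_ok"] = loans["counterparty_lei"].fillna("").str.match(r"^[A-Z0-9]{18}[0-9]{2}$")  # 20-char LEI
    # Reasonableness: loan rates within an agreed business range (range assumed here).
    checks["rate_ok"] = loans["loan_rate"].between(-5.0, 20.0)
    return checks

def reconcile_positions(agent: pd.DataFrame, custodian: pd.DataFrame) -> pd.DataFrame:
    """Consistency: compare agent-lender quantities against custodian records per ISIN.
    Assumes both frames carry 'isin' and 'quantity' columns."""
    merged = agent.merge(custodian, on="isin", how="outer", suffixes=("_agent", "_cust"))
    merged["quantity_break"] = merged["quantity_agent"].fillna(0) != merged["quantity_cust"].fillna(0)
    return merged[merged["quantity_break"]]
```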
For businesses doing this systematically for the first time, the insights can be eye-opening: undocumented dependencies, redundant fields, contradictory rules, or values that technically conform but make no business sense.
Done well, this exercise gives clarity on:
Uses of data (who relies on it and why).
Consumers of data (traders, ops, risk, clients, regulators).
Risks and issues (where errors creep in).
Non-conformance levels (how often data breaks the rules).
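For illustration, those non-conformance levels can be quantified directly from rule results. A hypothetical sketch, building on the boolean checks frame assumed in the earlier example:

```python
# Hypothetical follow-on: turn per-record pass/fail flags into per-rule
# non-conformance rates that can be shared with stakeholders.
import pandas as pd

def non_conformance_summary(checks: pd.DataFrame) -> pd.DataFrame:
    """checks: one boolean column per rule, True where the record conforms."""
    summary = pd.DataFrame({
        "failed": (~checks).sum(),   # records breaking each rule
        "total": len(checks),
    })
    summary["non_conformance_pct"] = 100 * summary["failed"] / summary["total"]
    return summary.sort_values("non_conformance_pct", ascending=False)
```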
Findings then need to be shared with stakeholders — data stewards, SMEs, ops managers, traders — to validate what’s important. Only then can issues be prioritized, whether as quick wins or longer-term fixes.
Common Data Quality Challenges in Securities Lending
1. Lack of Oversight and Governance
Data quality often sits low on management’s priority list. Senior leaders track loan balances, spreads, and settlement rates, but rarely ask whether the trade, collateral, and counterparty data underpinning those metrics is complete or accurate. The result is recurring capture errors, mismatches, and reporting inaccuracies that require manual fixes.
Governance gaps — unclear ownership, lack of controls, weak accountability — keep data quality a “check-the-box” exercise instead of a strategic asset.
The impact is broad: higher costs, failed trades, regulatory risk, and lower productivity. In some markets, data issues only surface once operations are already disrupted. Stronger oversight and governance can prevent many of these issues before they snowball.
2. Errors from Manual Data Entry
Even in highly automated securities finance workflows, manual entry remains a weak link — especially for exceptions, recalls, margin calls, or special trades.
A wrong counterparty bank account number leads to failed settlements.
Inaccurate collateral setup misaligns balances, impacting exposure and P&L.
Missing fields during recall instructions delay timely settlement.
Interfaces make matters worse when they’re cluttered, unintuitive, or missing validation checks. If a field accepts 13 characters where a 12-character ISIN is required, invalid identifiers sneak in; if dropdowns are overloaded, staff choose the wrong value. The downstream effect? Failed payments, misreported trades, damaged client trust, and regulatory scrutiny. Training, accountability, and better UI and validation controls can close these gaps.
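As a sketch of the kind of field-level validation a capture screen could enforce, the function below checks an ISIN’s length, country prefix, and check digit (a Luhn-style calculation over the expanded alphanumerics). The function name and where it sits in the workflow are illustrative; the ISIN structure itself is standard.

```python
import re

def is_valid_isin(isin: str) -> bool:
    """Validate an ISIN: 12 characters, 2-letter country prefix, Luhn check digit."""
    isin = (isin or "").strip().upper()
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
        return False
    # Expand letters to numbers (A=10 ... Z=35) and apply the Luhn algorithm.
    digits = "".join(str(int(c, 36)) for c in isin)
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right of the expansion
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Example: reject the entry at capture time instead of letting it flow downstream.
assert is_valid_isin("US0378331005")        # well-formed 12-character ISIN
assert not is_valid_isin("US03783310051")   # 13 characters: should never reach settlement
```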
3. Issues in Data Processing Functions
Data flows through a complex ecosystem: trading platforms, collateral systems, custodians, and reporting engines. At each hop, errors can creep in.
Siloed teams: When front office doesn’t know how their input feeds downstream, collateral systems may miscalculate margin.
Inconsistent execution: Different desks book the same type of loan differently, creating mismatches.
Regulatory changes: If SFTR rules change but aren’t implemented consistently, reports become inaccurate.
System changes: If one platform alters security ID formats without notifying downstream users, reconciliations fail.
Without consistent documentation, training, and proactive monitoring, these issues cascade into settlement delays, margin miscalculations, and compliance risks.
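One way to picture proactive monitoring is a lightweight feed check that flags when identifier formats drift from what downstream reconciliations expect. The field names and expected patterns below are assumptions for illustration.

```python
# Illustrative feed monitor: flag records whose security identifiers no longer
# match the format downstream reconciliations expect.
import re
from collections import Counter

EXPECTED_FORMATS = {
    "isin": re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$"),
    "sedol": re.compile(r"^[B-DF-HJ-NP-TV-Z0-9]{6}[0-9]$"),
}

def format_drift_report(records: list[dict]) -> Counter:
    """Count records per identifier field that fail the expected pattern."""
    breaches = Counter()
    for rec in records:
        for field, pattern in EXPECTED_FORMATS.items():
            value = str(rec.get(field) or "")
            if value and not pattern.match(value):
                breaches[field] += 1
    return breaches
```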
4. Weakness in System Design and Integration
Even the best teams struggle if the systems themselves are flawed. Poor design often shows up as:
Broken referential integrity: orphaned records, missing parent-child relationships.
Duplicates: trades appearing twice, overstating exposure.
Mapping gaps: collateral allocated to wrong fields, misaligned IDs.
Timing errors: collateral posted after cut-off but logged incorrectly.
Data type mismatches: field lengths or numeric ranges that don’t match business needs, causing data loss or corruption.
In securities lending, these flaws distort reporting, impair collateral management, and inflate risk. Stronger enforcement of integrity, uniqueness, and validation rules at the system design level is essential.
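As an illustration, and not a substitute for enforcing these rules in the system design itself, uniqueness and referential integrity can also be monitored at the data level. The table and column names below are hypothetical.

```python
# Hypothetical trade and collateral tables; column names are assumptions.
import pandas as pd

def integrity_report(trades: pd.DataFrame, collateral: pd.DataFrame) -> dict:
    """Basic uniqueness and referential-integrity checks."""
    duplicated = trades["trade_id"].duplicated(keep=False)
    orphaned = ~collateral["trade_id"].isin(trades["trade_id"])
    return {
        # Uniqueness: duplicated trade identifiers overstate exposure.
        "duplicate_trade_ids": sorted(trades.loc[duplicated, "trade_id"].unique()),
        # Referential integrity: every collateral row should reference an existing parent trade.
        "orphaned_collateral_rows": int(orphaned.sum()),
    }
```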
From Discovery to Action
Assessing data against the quality dimensions provides the structured lens. But the value comes from turning insights into action:
Validate findings with stakeholders — traders, ops, clients.
Prioritize issues based on business risk and impact.
Fix at the source — whether through governance, training, process redesign, or system improvements.
When done right, this cycle transforms data from a silent risk into a strategic asset. For a securities lending business, the payoff is clear: fewer disputes, faster settlements, stronger client trust, and lower operational costs.
Please note that this article was written with the DAMA-CDMP course material as a backbone, relating its concepts to the securities lending market.


