Third-Party Data for Externally Managed Portfolios
Rebecca Beasley-Cockroft, Clearwater Analytics
Russ Warren, Thomson Reuters
Agenda
• Data aggregation for a data vendor
• How Clearwater aggregates vendor data
• TR data validation
• Clearwater data validation
Learning Objectives
• Understand the complexities of aggregating data from multiple sources
• Learn about best practices for data validation to ensure data quality
• Learn about the benefits of using a golden copy of data for your accounting, performance, and overall reporting needs
Polling Question #1
Client Data
Universal Security Master File
Clean, Validated, and Enriched Data
Reporting • Reconciliation
• Thomson Reuters
• S&P
• IDC
• Morningstar
Data Aggregation
TR and Data Aggregation – GovCorp
• Source from story or vendor feed (skeletal)
• Issuer and issue set up (vendor or offering docs)
New Bond
• Completeness check
• Leverage underwriter, clearing house, web, SEC filing, and issuer sources
• Quality checks back to source
• Make corrections as necessary
Terms
Monitor
Finalize
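The new-bond workflow above (skeletal record in, completeness check, then source more terms before finalizing) can be sketched as a simple gate. This is a minimal illustration; the required-term names are invented, not Clearwater's or Thomson Reuters' actual schema.

```python
# Hypothetical sketch of the completeness check in the new-bond workflow.
# REQUIRED_TERMS is an illustrative set of fields, not a real schema.
REQUIRED_TERMS = {"issuer", "cusip", "coupon", "maturity", "issue_date"}

def completeness_check(bond):
    """Return the set of required terms still missing from a skeletal record."""
    return {field for field in REQUIRED_TERMS if bond.get(field) in (None, "")}

# A skeletal record sourced from a story or vendor feed:
skeletal = {"issuer": "Acme Corp", "cusip": "123456AB7", "coupon": 4.25}
missing = completeness_check(skeletal)
# Missing terms would then be sourced from the underwriter, clearing house,
# web, SEC filings, or the issuer before the record is finalized.
```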
TR and Data Aggregation – GovCorp
Extraction
• Change tracking
• Data transformation
• Data distribution
Updates
• DBoR syncing
• Intraday
• EOD
Monitoring
• Visual dashboards
• System traffic
• Database growth
• Abnormal status
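The change-tracking step in the extraction stage above boils down to diffing successive snapshots of a record so only changed fields are distributed downstream. A minimal sketch, with invented field names:

```python
# Illustrative change tracking: diff two snapshots of a security record.
def track_changes(old, new):
    """Return {field: (old_value, new_value)} for every field that differs."""
    keys = set(old) | set(new)
    return {k: (old.get(k), new.get(k)) for k in keys if old.get(k) != new.get(k)}

before = {"coupon": 4.25, "rating": "A", "price": 101.2}
after = {"coupon": 4.25, "rating": "A-", "price": 101.2}
delta = track_changes(before, after)
# Only the changed field (here, the rating) needs to flow downstream.
```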
TR and Data Aggregation
TR and Data Aggregation – Due Diligence
Thomson Reuters data is gathered, processed, and stored in accordance with the following tenets:
• Data is acquired from verifiable and reputable sources.
• The global marketplace is constantly monitored for new and alternate data sources.
• An active and rigorous rights management process exists to ensure that all usage and distribution complies with legal requirements.
• As members of a variety of financial industry organizations, we seek to identify and spread good practices among data vendor organizations.
TR and Data Aggregation – Database Model Design
Thomson Reuters data is processed and stored in several major proprietary databases whose design takes into account the following:
• The granularity required of the data record
• Accurate representation of the information
• Derivation of value from data relationships
• Ease of maintenance
• Timely data retrieval
• Effective operational delivery to products
Polling Question #2
Bringing in data from multiple vendors
• Creating a connection with the vendor
• Timing of the data
• Understanding each vendor’s data model
• Storing the raw data
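The last point above, storing the raw data, is typically done before any parsing: each vendor payload is kept untouched and tagged with its source and arrival time, so timing differences can be seen and records can be replayed. A sketch under those assumptions (names are illustrative):

```python
# Hypothetical raw-data store: payloads are kept byte-for-byte, tagged with
# vendor and arrival time, before any normalization happens.
import datetime

raw_store = []  # append-only list standing in for durable storage

def ingest(vendor, payload):
    """Record a raw vendor payload with its arrival time, untouched."""
    record = {
        "vendor": vendor,
        "received_at": datetime.datetime.now(datetime.timezone.utc),
        "payload": payload,  # kept as-is for audit and replay
    }
    raw_store.append(record)
    return record

ingest("TR", b'{"cusip": "123456AB7", "price": 101.2}')
```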
Universal Security Master File
• Thomson Reuters
• S&P
• IDC
• Morningstar
Creating a “Golden Copy”
• Normalization of vendors’ data into a single data model
• Aggregation of data sources
• Translation of data into the Clearwater reconciliation data model
Universal Security Master File
• Thomson Reuters
• S&P
• IDC
• Morningstar
Clean, Validated, and Enriched Data
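The normalization step above, mapping each vendor's data model onto a single internal model, can be sketched as a per-vendor field translation. The vendor field names here are invented for illustration, not the vendors' actual feed layouts:

```python
# Hypothetical normalization: per-vendor field maps onto one internal model.
FIELD_MAPS = {
    "TR":  {"Cpn": "coupon", "MatDt": "maturity"},
    "IDC": {"coupon_rate": "coupon", "maturity_date": "maturity"},
}

def normalize(vendor, record):
    """Translate a vendor record into the single internal field model."""
    mapping = FIELD_MAPS[vendor]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

tr = normalize("TR", {"Cpn": 4.25, "MatDt": "2030-01-15"})
idc = normalize("IDC", {"coupon_rate": 4.25, "maturity_date": "2030-01-15"})
# Both records now share one shape, so they can be aggregated and compared.
```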
Using the Clearwater “golden copy” downstream
• Reconciliation
• Accounting
• Performance
• Compliance Reporting
Data Validation
Core Data Quality Principles
Completeness Accuracy Timeliness
Polling Question #3
TR Data Flow from Source (Exchanges and Third Parties)
Example validation checks
• Sanity Error – A price has moved beyond a certain specified range
• Integrity Error – A price is inconsistent with related values (e.g., a traded price is lower than the recorded daily low)
• Jump Check Error – A price has jumped more than the percentage specified for the market over the past three days
• Flat Check Error – A price has not changed
• Missing Error – A price is missing
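The checks listed above can be expressed as simple rules over a price history. This is a minimal sketch; the range, jump threshold, and three-day flat window are assumed parameters, not Thomson Reuters' actual settings:

```python
# Illustrative price validation: sanity, jump, flat, and missing checks.
# Thresholds are assumptions for the sketch, not real production values.
def validate_price(history, new_price, price_range=(0.0, 10_000.0), jump_pct=20.0):
    """Return a list of error labels for a new price given recent history."""
    if new_price is None:
        return ["missing"]                       # Missing Error
    errors = []
    lo, hi = price_range
    if not (lo <= new_price <= hi):
        errors.append("sanity")                  # moved beyond the specified range
    if history:
        base = history[-1]
        if base and abs(new_price - base) / base * 100 > jump_pct:
            errors.append("jump")                # jumped more than jump_pct
        if len(history) >= 3 and all(p == new_price for p in history[-3:]):
            errors.append("flat")                # price has not changed
    return errors
```

An integrity check (e.g., comparing a trade price against the recorded daily low) would need the related fields as extra inputs, so it is omitted from this sketch.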
TR Validation – Quality Control
• Exchange
• Other source
• Validation layer
• Database update
• Product
TR Validation – Quality Control
Other Data Checks on Thomson Reuters Databases
Thomson Reuters employs a variety of mechanisms to monitor data quality:
• Price reconciliation
• File delivery time checks
• Data comparison
• Data quality sweeps
• Raw data checks
• Second pair of eyes
TR Data Validation – Quality Assurance
Feed Monitoring Program
Thomson Reuters employs a variety of mechanisms to monitor the quality and timeliness of data supplied by vendors:
• QC checks on data accuracy, completeness and timeliness
• Monitoring feed outages and data interruptions at global technical centers
• Metrics are used in the creation and management of Service Level Agreements (SLA) with suppliers
• Regular service reviews are held with key strategic suppliers globally
TR Data Validation – Quality Assurance
Data Quality Measurement
The Data Quality Measurement Program at Thomson Reuters is owned by the Global Quality Manager:
• The program consists of a series of Accuracy, Completeness, Timeliness and Freshness measurements and metrics which are run on core datasets on a regular basis
• Dedicated quality teams around the world monitor data from over 30 countries to provide a national, regional and global view of data quality
• Root cause analysis is initiated to investigate issues, with accountability for results assigned to managers and team leaders; their teams' performance is evaluated and reviewed by senior management on a monthly basis
TR Data Validation – Quality Assurance
Quality Control Programs
Thomson Reuters data audits are performed either on the database or on the product to replicate the customer experience:
• The Data Quality Measurement Program focuses on new data entered into the databases over the course of the previous calendar month
• The Data Audit Program provides an assessment of data quality across the depth of available data history.
• Data Audits are normally designed to assess levels of accuracy, timeliness and completeness in a statistically valid sample of data
Validation as part of data aggregation
• Detection of missing vendor files
• Insufficient data to map
• Data models restrict the types of data that can be represented
• Minimum data requirements to pass information downstream
• Errors for invalid values
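The minimum-data requirement above acts as a gate: a record that lacks required fields raises an error instead of flowing downstream. A sketch with invented field names:

```python
# Hypothetical minimum-data gate for aggregation-time validation.
# MINIMUM_FIELDS is illustrative, not Clearwater's actual requirement set.
MINIMUM_FIELDS = {"cusip", "price", "as_of_date"}

def gate(record):
    """Pass a record downstream only if it meets the minimum data requirements."""
    missing = MINIMUM_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record rejected, missing: {sorted(missing)}")
    return record

ok = gate({"cusip": "123456AB7", "price": 101.2, "as_of_date": "2015-06-01"})
```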
Validation of the “golden copy”
• Data consistency checks across the golden copy
» Conditions that should always be true
» Data sets that should be consistent
» Stale data checks
• Data comparison against publicly available sources (SEC, EMMA, etc.)
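The consistency checks above fall into two kinds: invariants that should always hold across the golden copy, and stale-data checks. A minimal sketch, with assumed field names and an assumed five-day staleness threshold:

```python
# Illustrative golden-copy consistency checks: one always-true condition
# plus a stale-data check. Fields and the age limit are assumptions.
import datetime

def consistency_errors(record, today, max_age_days=5):
    """Return labels for violated invariants and staleness in one record."""
    errors = []
    if record["maturity"] <= record["issue_date"]:
        errors.append("maturity on or before issue date")  # should always be false
    if (today - record["last_updated"]).days > max_age_days:
        errors.append("stale data")
    return errors

record = {
    "issue_date": datetime.date(2020, 1, 15),
    "maturity": datetime.date(2030, 1, 15),
    "last_updated": datetime.date(2015, 5, 20),
}
errs = consistency_errors(record, today=datetime.date(2015, 6, 1))
```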
Validation of downstream systems
• Reconciliation system’s validation based on reconciliation data
• Reporting systems’ validation based on reporting data
Resolving data inaccuracies
• Determining the source of the inaccuracy
• Assessing the type of change needed
• Correcting data as quickly as possible
Outcome of Aggregation and Validation
Questions?
Reminders
• Take the post-session survey in the Clearwater Events app and scan out as you exit
• Take the Clearwater Client Benchmark Survey in room 440 and earn Clearwater swag
• Don't miss the Monday networking reception from 4:30–6:30 p.m.
• Presentation is available in the app