Your KYC Customer Onboarding Data Template
- Recommended attributes to collect
- Key activities to track
- Extraction guidance
KYC Customer Onboarding Attributes
Activity Name (ActivityName)
The name of the specific task or event that occurred at a point in time during the onboarding process.

- Description: The Activity Name describes a step in the KYC onboarding workflow, such as 'Application Submitted', 'Document Review Performed', or 'Application Approved'. Each activity represents a distinct action or milestone in the process. This attribute is critical for constructing the process map, which visually represents the flow of activities. It allows for the analysis of process variants, bottlenecks between specific steps, and the frequency of rework loops.
- Why it matters: This attribute forms the backbone of the process map, allowing you to visualize and analyze the sequence of events in the customer onboarding journey.
- Where to get: Typically found in an event log or audit trail table within LexisNexis Risk Solutions that tracks process steps.
- Examples: 'Application Submitted', 'Initial Screening Performed', 'Documents Requested', 'Compliance Review Completed'
Customer Application (CustomerApplication)
The unique identifier for each customer onboarding application, serving as the primary case ID.

- Description: The Customer Application is the central identifier that links all related activities and data points for a single customer's onboarding journey. It starts when an application is submitted and follows the case until it is either completed or rejected. In process mining, this attribute is essential for grouping all events into a cohesive case, allowing for end-to-end analysis of the onboarding lifecycle. It enables the reconstruction of the entire process flow for each applicant, which is fundamental for calculating cycle times, analyzing process variants, and tracking an application's status over time.
- Why it matters: This is the fundamental Case ID. Without it, you cannot trace the end-to-end journey of a customer application, making process analysis impossible.
- Where to get: This is the primary case identifier within the LexisNexis Risk Solutions case management module.
- Examples: APP-2023-001234, APP-2023-005678, APP-2024-009101
Event Timestamp (EventTimestamp)
The precise date and time when a specific activity started.

- Description: This timestamp marks the beginning of an activity, providing the chronological order for all events within a case. It is the foundation for all time-based analysis in process mining. Using the Event Timestamp, it is possible to calculate the duration of activities, the waiting time between them, and the total end-to-end cycle time of the onboarding process. This data is essential for identifying bottlenecks, monitoring SLA adherence, and understanding process efficiency.
- Why it matters: This timestamp is essential for ordering events chronologically and calculating all time-based metrics, such as cycle times and bottlenecks.
- Where to get: Located in the event log or audit trail tables alongside the Activity Name.
- Examples: 2023-10-26T10:00:00Z, 2023-10-26T11:30:00Z, 2023-10-27T14:15:00Z
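As a minimal sketch of how the three core attributes above come together, the snippet below builds a chronologically ordered event log and derives the waiting time between consecutive activities. The column names follow this template's suggested names; the data rows are illustrative, not from any real system.

```python
import pandas as pd

# Toy event log using the template's three core attributes.
events = pd.DataFrame({
    "CustomerApplication": ["APP-2023-001234"] * 3,
    "ActivityName": [
        "Application Submitted",
        "Initial Screening Performed",
        "Documents Requested",
    ],
    "EventTimestamp": pd.to_datetime([
        "2023-10-26T10:00:00Z",
        "2023-10-26T11:30:00Z",
        "2023-10-27T14:15:00Z",
    ]),
})

# Order events within each case: this ordering underpins the process map
# and every time-based metric built on the log.
events = events.sort_values(["CustomerApplication", "EventTimestamp"])

# Waiting time between consecutive activities in the same case.
events["WaitingTime"] = events.groupby("CustomerApplication")["EventTimestamp"].diff()
```

The first event of each case has no predecessor, so its waiting time is left as `NaT`.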
Application Status (ApplicationStatus)
The current or final status of the customer application.

- Description: This attribute reflects the overall status of the case at a given point in time or its final outcome. Common statuses include 'In Progress', 'Approved', 'Rejected', or 'Pending Information'. Application Status is vital for tracking the outcomes of the onboarding process. It is used in the 'Application Rejection Reasons & Stages' and 'Daily Throughput and Application Status' dashboards to monitor success rates and operational flow. Analyzing how status changes over time provides insight into the case lifecycle.
- Why it matters: Tracks the outcome of each application, which is essential for calculating key KPIs like the Application Rejection Rate and monitoring throughput.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This is typically a key field on the main case or application object.
- Examples: 'In Progress', 'Approved', 'Rejected', 'Pending Customer Information'
Assigned User (AssignedUser)
The unique identifier of the user or agent responsible for performing the activity.

- Description: This attribute identifies the specific individual who executed a task, such as a compliance officer performing a document review. It helps in analyzing workload distribution and individual performance. In analysis, Assigned User is key to the 'Resource Allocation and Workload' dashboard. It allows for filtering the process map by user, comparing performance across team members, and identifying opportunities for training or workload rebalancing. It can also help pinpoint bottlenecks caused by specific user groups.
- Why it matters: This attribute is crucial for analyzing resource performance, workload distribution, and identifying opportunities for automation or resource optimization.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. Usually found in audit trails or task management tables.
- Examples: j.doe, m.smith, k.chen
Department (Department)
The business department or team to which the assigned user belongs.

- Description: The Department attribute specifies the functional group responsible for an activity, such as 'Compliance', 'Onboarding Operations', or 'Fraud Prevention'. This attribute is used to analyze the process from a departmental view, enabling analysis of handoffs between different teams. It is a primary dimension in the 'Resource Allocation and Workload' dashboard and helps identify cross-functional inefficiencies or delays in communication between departments.
- Why it matters: Allows for analysis of process handoffs and performance by functional area, helping to identify cross-departmental bottlenecks.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. May need to be joined from a user or HR master data table.
- Examples: 'Compliance', 'Onboarding Team', 'KYC Analysts', 'Customer Support'
End Time (EndTime)
The precise date and time when an activity was completed.

- Description: This timestamp marks the completion of an activity. The difference between the End Time and the Start Time for an event represents its processing time. End Time is crucial for accurately calculating how long each step takes, which is a primary input for the 'Activity Processing & Waiting Times' dashboard. It helps differentiate between the time a resource actively worked on a task versus the time the case was waiting for the next step to begin.
- Why it matters: Enables the precise calculation of activity processing time, which is essential for identifying inefficient steps and analyzing resource workload.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. Often available in event logs that record both start and end events.
- Examples: 2023-10-26T10:45:10Z, 2023-10-26T11:55:30Z, 2023-10-28T09:05:00Z
Risk Level (RiskLevel)
The calculated risk level of the customer application, such as Low, Medium, or High.

- Description: LexisNexis Risk Solutions specializes in assessing risk. This attribute represents the output of that assessment, categorizing each application based on its potential risk profile. The risk level often dictates the required intensity and duration of the due diligence process. This attribute is the core dimension for the 'Risk Level vs. Onboarding Duration' dashboard. Analyzing the process by risk level can reveal whether high-risk applications take significantly longer, as expected, or whether low-risk applications are being unnecessarily delayed. It helps in validating and refining risk-based onboarding strategies.
- Why it matters: Crucial for risk-based analysis, helping to understand how customer risk profiles impact process complexity, duration, and pathways.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This is a core output of the risk assessment modules.
- Examples: 'Low', 'Medium', 'High', 'Sanctioned'
SLA Target Date (SlaTargetDate)
The date by which the customer onboarding process is expected to be completed.

- Description: The SLA Target Date defines the service level agreement for completing an application. This date is often determined based on factors like the application type, customer segment, or jurisdiction. This attribute is essential for the 'SLA Target Adherence Monitoring' dashboard and the 'SLA Adherence Rate' KPI. By comparing the actual completion date with the SLA Target Date, organizations can measure their performance against commitments, identify cases at risk of breaching SLAs, and investigate the root causes of delays.
- Why it matters: Enables performance measurement against service level agreements, highlighting process inefficiencies that cause SLA breaches.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This may be stored on the case or calculated based on business rules.
- Examples: 2023-11-10T17:00:00Z, 2023-11-15T17:00:00Z, 2023-12-01T17:00:00Z
Application Type (ApplicationType)
The type of customer application, such as 'Individual' or 'Business'.

- Description: This attribute categorizes applications based on the type of entity being onboarded. Different application types often follow distinct process paths and have different risk profiles and SLA targets. Analyzing the process by Application Type allows for segmentation of the data to compare the efficiency and complexity of onboarding different kinds of customers. It is a common filter used across most dashboards to provide a more granular view of performance.
- Why it matters: Allows for powerful segmentation of the process, revealing how different types of applications are handled and where type-specific bottlenecks exist.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This is typically a core field on the application or case object.
- Examples: 'Individual', 'Business', 'High Net Worth Individual', 'Trust'
Channel (Channel)
The channel through which the application was submitted, such as 'Web', 'Mobile', or 'In-Branch'.

- Description: The Channel attribute identifies the submission source of the application. The channel can influence data quality, customer behavior, and the types of issues encountered during onboarding. This attribute is used to compare process performance across different channels. For instance, the 'Onboarding Funnel Conversion Rates' dashboard can be filtered by channel to see if mobile applicants drop off at a higher rate than web applicants, informing channel-specific process improvements.
- Why it matters: Helps analyze process performance by submission channel, identifying variations that can inform channel strategy and user experience improvements.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This information is typically captured at the start of the application process.
- Examples: 'Web Portal', 'Mobile App', 'In-Branch', 'API'
Compliance Reviewer (ComplianceReviewer)
The user or agent specifically assigned to the compliance review activities.

- Description: While 'AssignedUser' captures the user for any activity, this attribute specifically identifies the compliance specialist involved in critical review steps. This allows for a more focused analysis of the compliance function. This attribute is key for the 'Compliance Review Duration and Backlog' dashboard. It helps analyze the workload and performance of the compliance team, identifying whether specific reviewers are bottlenecks or the overall team is under-resourced.
- Why it matters: Provides focused insight into the compliance function, allowing for detailed analysis of reviewer workload and performance in this critical, often-delayed process stage.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This can be derived by filtering 'AssignedUser' for compliance-related activities.
- Examples: c.jones, s.patel, system_escalation
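The "Where to get" note above suggests deriving this attribute by filtering `AssignedUser` down to compliance-related activities. A minimal sketch of that derivation, using activity and user values from this template's examples (the string-prefix filter is an assumption about how compliance activities are named):

```python
import pandas as pd

events = pd.DataFrame({
    "CustomerApplication": ["APP-1", "APP-1", "APP-2"],
    "ActivityName": ["Documents Received",
                     "Compliance Review Completed",
                     "Compliance Review Completed"],
    "AssignedUser": ["j.doe", "c.jones", "s.patel"],
})

# Keep only compliance-related activities, then relabel the user column
# as the dedicated ComplianceReviewer attribute.
reviewers = (
    events[events["ActivityName"].str.startswith("Compliance Review")]
    .rename(columns={"AssignedUser": "ComplianceReviewer"})
    [["CustomerApplication", "ComplianceReviewer"]]
)
```

In a real extraction the filter would match whatever naming convention the source system uses for compliance steps.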
Customer Country (CustomerCountry)
The country of residence or incorporation for the customer.

- Description: This attribute specifies the customer's country, which is a critical factor in KYC due to varying international regulations and risk levels associated with different jurisdictions. Analyzing the process by Customer Country can reveal significant differences in cycle times and process complexity. For example, applications from high-risk jurisdictions may require additional compliance checks, leading to longer durations. This analysis helps in resource planning and setting realistic SLAs for different regions.
- Why it matters: Enables jurisdictional analysis, which is crucial for understanding how regional regulations and risk factors impact process performance.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This is a standard field in customer master data.
- Examples: USA, GBR, DEU, SGP
Cycle Time (CycleTime)
The total end-to-end duration of a customer application, from submission to final decision.

- Description: Cycle Time measures the total time elapsed from the very first event (e.g., 'Application Submitted') to the very last event (e.g., 'Customer Onboarding Completed' or 'Application Rejected') for a single case. This is a primary KPI for measuring overall process health and is visualized in the 'Onboarding End-to-End Cycle Time' dashboard. Tracking average cycle time allows organizations to monitor the impact of process improvements and identify how different factors, like risk level or application type, affect the overall customer experience.
- Why it matters: This is a key performance indicator that measures the total time-to-value for the customer, directly impacting customer satisfaction and operational efficiency.
- Where to get: This is a calculated metric, derived by subtracting the timestamp of the first event from the timestamp of the last event for each case.
- Examples: 5 days 4 hours, 22 days 8 hours, 1 day 2 hours
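The derivation described above (last event timestamp minus first event timestamp, per case) can be sketched in a few lines of pandas. The column names follow this template's suggestions; the rows are illustrative.

```python
import pandas as pd

events = pd.DataFrame({
    "CustomerApplication": ["APP-2023-001234", "APP-2023-001234",
                            "APP-2023-005678", "APP-2023-005678"],
    "EventTimestamp": pd.to_datetime([
        "2023-10-26T10:00:00Z", "2023-10-31T14:00:00Z",
        "2023-11-01T09:00:00Z", "2023-11-23T17:00:00Z",
    ]),
})

# First and last event per case, then their difference is the cycle time.
span = events.groupby("CustomerApplication")["EventTimestamp"].agg(["min", "max"])
cycle_time = (span["max"] - span["min"]).rename("CycleTime")
```

The same grouping can be extended with `.mean()` over `cycle_time` to feed an average-cycle-time KPI.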
Is Automated (IsAutomated)
A flag indicating whether an activity was performed automatically by the system or manually by a user.

- Description: This boolean attribute distinguishes between tasks executed by system automation (e.g., an initial screening check) and those requiring human intervention (e.g., a manual document review). 'Is Automated' is used to calculate the 'Manual Activity Proportion' KPI and to analyze the effectiveness of automation initiatives. In the process map, it can highlight the interface between automated and manual steps, helping to identify opportunities for further automation to reduce costs and processing times.
- Why it matters: Distinguishes between manual and automated tasks, which is key for identifying automation opportunities and measuring their impact.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. This might be a flag in the event log or inferred from the 'AssignedUser' (e.g., a 'system' user).
- Examples: true, false
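When the source log lacks an explicit flag, the inference suggested above (treating events executed by system accounts as automated) might look like this. The set of system account names is a loud assumption; each deployment would substitute its own.

```python
import pandas as pd

# Hypothetical system account names; replace with your environment's actual
# service-account identifiers.
SYSTEM_USERS = {"system", "system_escalation", "risk_engine"}

events = pd.DataFrame({
    "ActivityName": ["Initial Screening Performed", "Document Review Performed"],
    "AssignedUser": ["system", "j.doe"],
})

# An event is considered automated when its executor is a system account.
events["IsAutomated"] = events["AssignedUser"].str.lower().isin(SYSTEM_USERS)
```

The 'Manual Activity Proportion' KPI then falls out as `1 - events["IsAutomated"].mean()`.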
Is Rework (IsRework)
A flag that identifies activities that are part of a rework loop.

- Description: This boolean flag is set to true when an activity is repeated within the same case, such as a 'Document Review Performed' occurring a second time after 'Additional Information Requested'. It signifies that the process has moved backward. 'Is Rework' is crucial for the 'Rework and Repetition Analysis' dashboard and the 'Rework Loop Percentage' KPI. It allows for the quantification of wasted effort and helps identify the root causes of rework, such as unclear instructions or poor data quality, enabling targeted process improvements.
- Why it matters: Directly quantifies inefficiency and wasted effort in the process, highlighting activities that are frequently repeated, driving up costs and cycle times.
- Where to get: This is a calculated attribute, typically derived within the process mining tool by detecting repeated sequences of activities within a case.
- Examples: true, false
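A simple version of the detection described above flags any event whose activity has already occurred earlier in the same case. This is a sketch, not the exact logic of any particular process mining tool; real tools often use more nuanced loop detection.

```python
import pandas as pd

events = pd.DataFrame({
    "CustomerApplication": ["APP-1"] * 4,
    "ActivityName": ["Documents Received",
                     "Document Review Performed",
                     "Additional Information Requested",
                     "Document Review Performed"],
    "EventTimestamp": pd.to_datetime([
        "2023-10-26T10:00:00Z", "2023-10-26T11:00:00Z",
        "2023-10-26T12:00:00Z", "2023-10-27T09:00:00Z",
    ]),
}).sort_values(["CustomerApplication", "EventTimestamp"])

# Mark every repeat of a (case, activity) pair after its first occurrence.
events["IsRework"] = events.duplicated(
    subset=["CustomerApplication", "ActivityName"], keep="first")
```

Here only the second 'Document Review Performed' is flagged, matching the rework-loop example in the description.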
Last Data Update (LastDataUpdate)
The timestamp indicating the last time the data was refreshed or extracted from the source system.

- Description: This attribute provides a timestamp for when the dataset was last updated. It is typically applied to the entire dataset during the data extraction and loading process. This information is critical for dashboard users to understand the freshness of the data they are analyzing. It ensures that decisions are based on data that is as current as required and helps manage expectations about the timeliness of the insights.
- Why it matters: Provides crucial context about data freshness, ensuring that analyses are relevant and that decisions are not based on outdated information.
- Where to get: This is usually generated and stamped onto the dataset during the ETL (Extract, Transform, Load) process.
- Examples: 2024-01-15T02:00:00Z, 2024-01-16T02:00:00Z, 2024-01-17T02:00:00Z
Processing Time (ProcessingTime)
The duration of time spent actively working on an activity.

- Description: Processing Time is the duration calculated from an activity's start and end timestamps (EndTime - StartTime). It represents the time a resource was actively engaged with a task, as opposed to waiting time. This calculated metric is a cornerstone of performance analysis and directly feeds the 'Activity Processing & Waiting Times' dashboard. It helps pinpoint which specific activities are the most time-consuming, indicating targets for process improvement, training, or automation. For example, it feeds the 'Avg. Doc. Review Processing Time' KPI.
- Why it matters: Measures the active work time for activities, helping to distinguish value-adding time from wasteful waiting time and so identify true bottlenecks.
- Where to get: This is a calculated field, derived from the difference between the 'EndTime' and 'EventTimestamp' (start time) attributes.
- Examples: 25 minutes, 1 hour 15 minutes, 3 days
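The calculation described above is a straight per-event subtraction of the two timestamp attributes. A minimal sketch, with illustrative rows:

```python
import pandas as pd

events = pd.DataFrame({
    "ActivityName": ["Initial Screening Performed", "Document Review Performed"],
    "EventTimestamp": pd.to_datetime(["2023-10-26T10:00:00Z",
                                      "2023-10-26T11:30:00Z"]),
    "EndTime": pd.to_datetime(["2023-10-26T10:45:10Z",
                               "2023-10-26T12:55:30Z"]),
})

# Active work time per event: completion minus start.
events["ProcessingTime"] = events["EndTime"] - events["EventTimestamp"]

# Average per activity, e.g. for an 'Avg. Processing Time' style KPI.
avg_by_activity = events.groupby("ActivityName")["ProcessingTime"].mean()
```

Anything not covered by this interval (the gap before the next activity starts) is waiting time, which keeps the two metrics cleanly separated.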
Rejection Reason (RejectionReason)
A code or description explaining why an application was rejected.

- Description: When an application's final status is 'Rejected', this attribute provides the specific reason. Examples include 'Failed Identity Verification', 'Sanctions Match', or 'Incomplete Documentation'. This data is the primary input for the 'Application Rejection Reasons & Stages' dashboard. Analyzing rejection reasons helps identify common failure points in the process, which can inform improvements to application guidelines, customer communication, or internal review criteria. Understanding why applications are rejected is key to improving the overall approval rate.
- Why it matters: Provides direct insight into why onboarding fails, enabling targeted improvements to increase the application approval rate.
- Where to get: Consult LexisNexis Risk Solutions documentation or your system administrator. Often found in a field that is populated when the Application Status is set to 'Rejected'.
- Examples: 'Sanctions List Match', 'Incomplete Documentation', 'Failed ID&V', 'High Risk Profile'
SLA Status (SlaStatus)
Indicates whether the completed application met its SLA target.

- Description: This attribute categorizes each completed case based on its adherence to the 'SlaTargetDate'. Typical values are 'Met' or 'Breached'. This calculated field is the foundation of the 'SLA Target Adherence Monitoring' dashboard and the 'SLA Adherence Rate' KPI. It provides a clear, high-level view of performance against service commitments and allows for drill-down analysis to understand the common characteristics of cases that breach their SLAs.
- Why it matters: Provides a clear, binary outcome for SLA performance, making it easy to track, report, and analyze compliance with service level targets.
- Where to get: This is a calculated attribute, derived by comparing the timestamp of the final activity to the 'SlaTargetDate' for each case.
- Examples: 'Met', 'Breached', 'At Risk'
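The derivation above reduces to comparing each case's final event timestamp with its SLA target. A sketch for completed cases (open cases would additionally need an 'At Risk' rule, which this snippet omits):

```python
import pandas as pd

cases = pd.DataFrame({
    "CustomerApplication": ["APP-1", "APP-2"],
    # Timestamp of the final activity for each completed case.
    "LastEventTimestamp": pd.to_datetime(["2023-11-08T16:00:00Z",
                                          "2023-11-20T10:00:00Z"]),
    "SlaTargetDate": pd.to_datetime(["2023-11-10T17:00:00Z",
                                     "2023-11-15T17:00:00Z"]),
})

# Met when the case finished on or before its target, Breached otherwise.
cases["SlaStatus"] = (
    (cases["LastEventTimestamp"] <= cases["SlaTargetDate"])
    .map({True: "Met", False: "Breached"})
)
```

The 'SLA Adherence Rate' KPI is then the share of cases with status 'Met'.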
Source System (SourceSystem)
The system or application from which the event data originated.

- Description: This attribute identifies the source system that generated the event data, such as LexisNexis Risk Solutions or an integrated third-party tool. In complex environments, data for a single process may come from multiple systems. Understanding the source system is useful for data validation, troubleshooting, and analyzing process variations that may be specific to a particular system. It helps ensure data integrity and provides context about how and where an activity was recorded.
- Why it matters: Identifies the origin of the data, which is crucial for data governance, validation, and understanding process execution across different IT systems.
- Where to get: This information may be stored as a static value or in a specific field within the data export or API response.
- Examples: 'LexisNexis Risk Solutions', 'ThreatMetrix', 'Bridger Insight XG'
KYC Customer Onboarding Activities
Application Approved
The final decision to approve the customer's application is made and recorded in the system. This is a critical business outcome and is almost always captured as an explicit status change.

- Why it matters: This milestone marks the successful conclusion of the decision-making process. Analyzing paths leading to approval helps identify best practices.
- Where to get: Look for a final status update in the application case record where the status is set to 'Approved' or a similar terminal state.
- Capture: Recorded as a final, definitive status change in the main application or case table.
- Event type: explicit
Application Rejected
The final decision to reject the customer's application is recorded. This is a terminal event and is captured via a definitive status change in the system.

- Why it matters: This is the primary failure-state end event. Analyzing the stages where rejections occur and the associated reasons is crucial for process improvement.
- Where to get: Captured from the final status field of the application record being set to 'Rejected', 'Declined', or a similar terminal state.
- Capture: Recorded as a final, definitive status change in the main application or case table.
- Event type: explicit
Application Submitted
Marks the beginning of the KYC onboarding process when a customer's application is first received by the system. This event is typically captured explicitly when the application form is submitted through a customer portal or internal data entry system integrated with LexisNexis.

- Why it matters: This is the primary start event for the process. Analyzing the time from this activity to completion is crucial for measuring end-to-end cycle time and SLA adherence.
- Where to get: Captured from system logs or an application table that records the initial creation timestamp of a new customer application record.
- Capture: Event is logged upon creation of a new application case or entry in the core application table.
- Event type: explicit
Compliance Review Completed
The compliance officer completes their review and makes a recommendation, moving the case to the next stage. This can be captured explicitly when a task is marked 'complete' or inferred when the status changes from 'Pending Compliance' to another state.

- Why it matters: This is a key milestone that concludes a critical, and often manual, part of the process. It is the endpoint for measuring compliance review duration.
- Where to get: Captured from a compliance task completion timestamp or a status change out of 'Under Compliance Review'.
- Capture: Inferred from a status change indicating the review is finished, such as moving to 'Approved', 'Rejected', or 'Final Decision'.
- Event type: inferred
Compliance Review Initiated
A case is assigned to a compliance officer or team for manual review, typically for high-risk applications. This is often inferred from a status change to 'Pending Compliance Review' or from a task assignment log.

- Why it matters: This marks the start of a manual, often lengthy, review step. Measuring the time from this point to its completion helps quantify compliance-related bottlenecks.
- Where to get: Captured from a task assignment log, a change in case ownership to a compliance team, or a status update in the case history.
- Capture: Inferred from a status change like 'Under Compliance Review' or when the case is assigned to a compliance-related user queue.
- Event type: inferred
Customer Onboarding Completed
This event marks the successful end of the entire onboarding process, confirming the customer is fully active. It may be an explicit final status or inferred from the 'Account Activated' event.

- Why it matters: This is the primary success-state end event. It is essential for calculating the end-to-end cycle time for all successfully onboarded customers.
- Where to get: Inferred from the 'Account Activated' timestamp or captured from a final, terminal status like 'Onboarding Complete' in the case file.
- Capture: Inferred from the last significant positive event, such as account activation, or a final status update.
- Event type: inferred
Documents Received
Confirms that the customer has uploaded or provided the required documents to the system. This is typically an explicit event generated by the document submission portal or a manual entry by an agent.

- Why it matters: This activity concludes a period of waiting time and triggers subsequent review activities. It is a key milestone in the data collection phase.
- Where to get: Captured from document management system logs or a timestamped entry in the application case file when new documents are attached.
- Capture: Event is logged when a document is successfully uploaded or manually marked as received in the system.
- Event type: explicit
Risk Assessment Performed
The system calculates a risk score for the customer based on the information gathered and checks performed. This is a core feature of LexisNexis and is typically captured as an explicit, automated event in the case history.

- Why it matters: The outcome of this assessment often determines the subsequent process path, such as requiring enhanced due diligence. It is a critical decision point in the workflow.
- Where to get: Look for an event in the application's audit log or workflow history that records the completion of the risk scoring or assessment module.
- Capture: A specific event is logged when the risk engine completes its analysis and assigns a risk profile or score.
- Event type: explicit
Account Activated
Following approval, the customer's account is formally created and activated in the core banking or services platform. This activity is often logged in an audit trail or inferred from the account creation date.

- Why it matters: This is the final value-delivery step for the customer. Delays between 'Application Approved' and this step can indicate system integration issues.
- Where to get: Captured from an account creation log, an API call to another system, or the creation timestamp of the account record itself.
- Capture: Logged as a separate event post-approval or identified by the presence of an activation timestamp on the customer record.
- Event type: explicit
Additional Information Requested
A compliance officer or reviewer requests more information or clarification from the customer. This event is a primary driver of rework and is typically captured as an explicit status change or communication log entry.

- Why it matters: This activity creates rework loops that extend the onboarding cycle time. Tracking its frequency helps identify unclear requirements or common application deficiencies.
- Where to get: Look for a status change to 'Pending Customer Information' or an outbound communication event log. This is often a user-triggered action.
- Capture: Logged when an agent uses a 'Request Information' feature, which changes the case status and may log a communication event.
- Event type: explicit
Document Review Performed
A user or an automated tool reviews the submitted documents for authenticity, validity, and completeness. This activity can be inferred from a status change from 'Documents Received' to 'Review Complete' or from an explicit log entry.

- Why it matters: This is a common source of bottlenecks and rework. Analyzing its processing time and repetitions is key to improving efficiency and identifying automation opportunities.
- Where to get: Inferred by tracking the time between a 'Documents Received' status and a subsequent status like 'Verification Passed' or 'Additional Info Required'.
- Capture: Calculated as the time between the document receipt event and the event marking the completion of the review.
- Event type: inferred
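The inference described above can be sketched as follows: take, per case, the gap between the 'Documents Received' event and the first review-outcome event. The outcome activity names used here are the examples from this entry, and the snippet assumes at most one occurrence of each per case; real logs with repeats would need a more careful pairing of events.

```python
import pandas as pd

events = pd.DataFrame({
    "CustomerApplication": ["APP-1", "APP-1", "APP-2", "APP-2"],
    "ActivityName": ["Documents Received", "Verification Passed",
                     "Documents Received", "Additional Information Requested"],
    "EventTimestamp": pd.to_datetime([
        "2023-10-26T10:00:00Z", "2023-10-26T15:30:00Z",
        "2023-10-27T09:00:00Z", "2023-10-28T09:00:00Z",
    ]),
})

# Activities that mark the end of a document review (template examples).
REVIEW_OUTCOMES = {"Verification Passed", "Additional Information Requested"}

received = (events[events["ActivityName"] == "Documents Received"]
            .set_index("CustomerApplication")["EventTimestamp"])
outcome = (events[events["ActivityName"].isin(REVIEW_OUTCOMES)]
           .set_index("CustomerApplication")["EventTimestamp"])

# Inferred duration of the review step, per case.
review_duration = outcome - received
```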
Documents Requested
The system or a user requests specific documentation from the customer, such as a driver's license or utility bill. This event can be captured from system-generated communication logs or a status change indicating the case is awaiting documents.

- Why it matters: This activity often introduces significant waiting time into the process. Analyzing its frequency and duration helps identify delays caused by customer response times.
- Where to get: Check for an event in communication logs sent to the customer or a status change on the application, for example, 'Pending Customer Documents'.
- Capture: Inferred from a status change to 'Awaiting Documents' or from an outbound communication log timestamp.
- Event type: inferred
Identity Verification Initiated
Represents the point where the system begins the core identity verification process using LexisNexis services, such as checking against databases. This is usually captured as an explicit event log when the verification service is called.

- Why it matters: This activity marks the start of a critical and often time-consuming sub-process. Tracking its duration helps isolate bottlenecks related to identity checks.
- Where to get: Captured from API call logs to the identity verification module or an audit trail entry showing the start of the verification task.
- Capture: An event is logged when the system's identity verification module or API is triggered for the application.
- Event type: explicit
Initial Screening Performed
An automated check performed by the system immediately after submission to validate basic data completeness and run preliminary checks. This activity is often logged as an explicit, automated step in the process workflow history.

- Why it matters: Identifies applications that fail at the earliest stage, helping to understand data quality issues. It also marks the first automated value-add step in the process.
- Where to get: Look for automated rule execution logs or a status change in the application workflow history indicating the completion of the initial screening step.
- Capture: Logged as a completed automated task or a specific status update in the case history.
- Event type: explicit