Your Quality Management Data Template
- Recommended attributes to collect
- Key activities to track
- Extraction guidance for MasterControl
Quality Management Attributes
**Quality Event** (`QualityEvent`)

The unique identifier for a single quality event, such as a non-conformance, deviation, or complaint. This ID links all related activities and documents together.

- **Description:** The Quality Event ID serves as the primary case identifier for the entire quality management process. It is typically an alphanumeric value generated by MasterControl when a new quality event is initiated. In process mining, this attribute is fundamental for reconstructing the end-to-end journey of each quality issue. By grouping all related activities under a single Quality Event ID, analysts can visualize the complete process flow, measure cycle times from creation to closure, and identify variations or bottlenecks specific to individual cases.
- **Why it matters:** This is the essential case identifier that connects all process steps, enabling the analysis of the entire lifecycle of each quality issue from initiation to resolution.
- **Where to get:** This is the primary key for a quality event record. Consult MasterControl documentation or system configuration for the specific table and field name.
- **Examples:** `QE-2023-00123`, `NC-2023-0456`, `CAPA-2023-0078`
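As a sketch of how this case identifier drives analysis, the snippet below groups a toy pandas event log by `QualityEvent` to compute each case's end-to-end cycle time. The sample rows are invented for illustration; real MasterControl extracts will differ.

```python
import pandas as pd

# Invented sample event log; real MasterControl extracts will differ.
log = pd.DataFrame({
    "QualityEvent": ["QE-001", "QE-001", "QE-001", "QE-002", "QE-002"],
    "ActivityName": [
        "Quality Event Created", "Root Cause Analysis Performed",
        "Quality Event Closed", "Quality Event Created", "Quality Event Closed",
    ],
    "EventTime": pd.to_datetime([
        "2023-10-01T09:00:00Z", "2023-10-05T11:00:00Z", "2023-10-20T16:00:00Z",
        "2023-10-02T08:00:00Z", "2023-10-12T08:00:00Z",
    ]),
})

# Grouping on the case identifier reconstructs each case; the span between
# the first and last event is the end-to-end cycle time.
cycle_time = (
    log.groupby("QualityEvent")["EventTime"]
       .agg(lambda s: s.max() - s.min())
       .rename("CycleTime")
)
print(cycle_time)
```

The same grouping is the basis for the throughput, backlog, and resolution-time dashboards mentioned throughout this template.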
**Activity Name** (`ActivityName`)

The name of the specific task or event that occurred within the quality event lifecycle, such as 'Investigation Initiated' or 'Corrective Action Implemented'.

- **Description:** The Activity Name describes a distinct step or milestone in the quality management process. These activities are logged with timestamps and form the sequence of events that constitute the process flow for each quality event. Analyzing these activities is the core of process mining. It allows for the discovery of the actual process map, identification of common pathways, deviations from the standard procedure, and bottlenecks. The sequence and frequency of activities like 'Root Cause Analysis Performed' are critical for dashboards focused on rework and compliance.
- **Why it matters:** This attribute defines the steps in the process, allowing for the visualization of the process map, detection of deviations, and analysis of process flow.
- **Where to get:** This information is typically logged in an audit trail or history table associated with each quality event record in MasterControl.
- **Examples:** Quality Event Created, Root Cause Analysis Performed, Corrective Action Plan Approved, Effectiveness of Action Verified
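To illustrate process-map discovery from this attribute, the hedged sketch below orders a toy log by `EventTime` within each case and concatenates the `ActivityName` values into a process variant. The sample data is invented; note that QE-002 deviates from the expected path.

```python
import pandas as pd

# Invented sample log; QE-002 deviates (RCA logged after closure).
log = pd.DataFrame({
    "QualityEvent": ["QE-001", "QE-001", "QE-001", "QE-002", "QE-002", "QE-002"],
    "ActivityName": [
        "Quality Event Created", "Root Cause Analysis Performed",
        "Quality Event Closed", "Quality Event Created",
        "Quality Event Closed", "Root Cause Analysis Performed",
    ],
    "EventTime": pd.to_datetime([
        "2023-10-01", "2023-10-05", "2023-10-20",
        "2023-10-02", "2023-10-12", "2023-10-15",
    ]),
})

# Order activities within each case, then join them into a variant string.
variants = (
    log.sort_values(["QualityEvent", "EventTime"])
       .groupby("QualityEvent")["ActivityName"]
       .agg(" -> ".join)
)
print(variants.value_counts())  # frequency of each distinct process path
```

Counting distinct variants like this is a quick way to spot deviations from the standard procedure before building full process maps.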
**Event Time** (`EventTime`)

The precise date and time when a specific activity or event occurred. This serves as the start time for each activity.

- **Description:** This timestamp marks when a specific task or event occurred in the quality process. It provides the chronological order necessary to reconstruct the process flow for each case. Timestamps are essential for all time-based process mining analyses. They are used to calculate cycle times, processing times, and waiting times between activities. This data is fundamental for building dashboards that analyze resolution times, identify bottlenecks, and measure handoff delays between departments.
- **Why it matters:** This timestamp provides the chronological sequencing of events, which is critical for calculating process durations, identifying bottlenecks, and understanding process performance.
- **Where to get:** This is a standard field in the audit trail or history log for quality event records in MasterControl, capturing when each action was recorded.
- **Examples:** `2023-10-26T10:00:00Z`, `2023-10-27T14:35:10Z`, `2023-11-05T09:15:00Z`
**Effectiveness Status** (`EffectivenessStatus`)

The outcome of the verification check to determine if the implemented corrective and preventive actions were effective.

- **Description:** This attribute records the result of the effectiveness verification, which is a critical final step in the CAPA process. The status indicates whether the actions taken have successfully resolved the issue and prevented its recurrence. This data is the primary input for the 'CAPA Effectiveness Monitoring' dashboard and the 'CAPA Effectiveness Rate' KPI. Analyzing this outcome helps the organization understand the success rate of its problem-solving efforts and identify areas where corrective actions are failing.
- **Why it matters:** Measures the success of implemented actions, which is crucial for calculating the CAPA effectiveness rate and driving continuous improvement.
- **Where to get:** This is a result field within the effectiveness verification or closure section of the CAPA form, which is often linked to the quality event in MasterControl.
- **Examples:** Effective, Ineffective, Requires Monitoring
**End Time** (`EndTime`)

The timestamp indicating when an activity was completed. It is often the same as the `EventTime` (start time) for atomic events.

- **Description:** EndTime marks the conclusion of an activity. For many events logged in an audit trail, the start and end times are identical, representing a single point in time when the event occurred. However, for activities that have a measurable duration, this field can capture that information. This attribute is used in conjunction with `EventTime` to calculate the processing time of individual activities. This is crucial for identifying which steps consume the most time in the process, supporting the Quality Process Bottleneck Identification dashboard.
- **Why it matters:** Enables the calculation of precise activity durations, helping to pinpoint which specific tasks are the most time-consuming within the quality management process.
- **Where to get:** May be available in the MasterControl audit trail or history log. If not available, it can be derived as the `EventTime` of the next activity in the sequence.
- **Examples:** `2023-10-26T10:05:12Z`, `2023-10-27T15:00:00Z`, `2023-11-05T11:20:30Z`
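Where no explicit end timestamp exists, the derivation described above can be sketched in pandas as follows. The sample rows are invented; column names follow this template.

```python
import pandas as pd

# Invented sample log for one case, sorted chronologically.
log = pd.DataFrame({
    "QualityEvent": ["QE-001", "QE-001", "QE-001"],
    "ActivityName": [
        "Quality Event Created", "Investigation Initiated", "Quality Event Closed",
    ],
    "EventTime": pd.to_datetime([
        "2023-10-01T09:00:00Z", "2023-10-03T10:00:00Z", "2023-10-09T10:00:00Z",
    ]),
}).sort_values(["QualityEvent", "EventTime"])

# Approximate EndTime as the next event's start within the same case;
# the final activity keeps its own timestamp (an atomic event).
log["EndTime"] = (
    log.groupby("QualityEvent")["EventTime"].shift(-1).fillna(log["EventTime"])
)
print(log[["ActivityName", "EventTime", "EndTime"]])
```

This approximation folds waiting time into the preceding activity, so treat durations derived this way as upper bounds on actual work time.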
**Quality Event Status** (`QualityEventStatus`)

The current state of the quality event in its lifecycle, such as 'Open', 'Under Investigation', 'Pending Approval', or 'Closed'.

- **Description:** This attribute provides a snapshot of where a quality event stands at any given time. It is typically updated as the case progresses through major milestones. In process mining, the status can be used to filter for open or closed cases, which is essential for the 'Quality Event Throughput & Backlog' dashboard. Analyzing the time spent in each status can also help identify stages where events tend to stall.
- **Why it matters:** Indicates the current state of a quality event, enabling analysis of backlogs, throughput, and the time spent in different stages of the lifecycle.
- **Where to get:** This is a standard status field on the main quality event record in MasterControl.
- **Examples:** Open, In Progress, Pending CAPA Approval, Closed, Cancelled
**Quality Event Type** (`QualityEventType`)

The classification of the quality event, such as 'Non-Conformance', 'Customer Complaint', 'Audit Finding', or 'Deviation'.

- **Description:** This attribute categorizes the quality event based on its nature. Different types of events may follow distinct process paths or have different compliance requirements and target resolution times. Analyzing the process by Quality Event Type is fundamental to understanding variations in performance. Dashboards like the 'Quality Event Resolution Time Analysis' rely on this attribute to compare cycle times for different categories, helping to identify which types of issues are more complex or take longer to resolve.
- **Why it matters:** Categorizes quality events, allowing for comparative analysis of process flows, cycle times, and outcomes across different issue types.
- **Where to get:** This is a primary classification field on the quality event initiation form in MasterControl.
- **Examples:** Non-Conformance Report (NCR), Customer Complaint, Internal Audit Finding, Supplier Corrective Action Request (SCAR)
**Responsible Department** (`ResponsibleDepartment`)

The department or functional area responsible for the quality event or the current activity.

- **Description:** This attribute indicates which department, such as 'Manufacturing', 'Quality Assurance', or 'Engineering', is assigned ownership of the quality event or is performing a specific task. This is a critical dimension for analysis, as it allows for filtering and comparing process performance across different business units. It is essential for the Handoff Delay Analysis dashboard, as changes in this attribute between activities signify a handoff. Analyzing time gaps at these handoffs can reveal communication or coordination issues between departments.
- **Why it matters:** Helps identify inter-departmental bottlenecks and analyze process performance by functional area, which is crucial for understanding handoff delays.
- **Where to get:** This information is typically stored on the main quality event form in MasterControl and may be updated as the event moves through its lifecycle.
- **Examples:** Quality Assurance, Manufacturing, Research & Development, Regulatory Affairs
**Root Cause Category** (`RootCauseCategory`)

The classification of the identified root cause of the quality event, such as 'Human Error', 'Equipment Failure', or 'Process Deficiency'.

- **Description:** After a root cause analysis is performed, the findings are typically categorized. This attribute stores that classification, providing structured data on why problems are occurring. This is a key attribute for strategic quality improvement. The 'Root Cause Analysis Consistency' dashboard uses this data to correlate root cause categories with the effectiveness of corrective actions and rework rates. This helps determine if certain types of root causes are more difficult to address or if the analysis itself is inconsistent.
- **Why it matters:** Categorizes the underlying reason for quality issues, enabling targeted analysis to prevent recurrence and improve the effectiveness of corrective actions.
- **Where to get:** This field is typically part of the Root Cause Analysis or Investigation section of the quality event form in MasterControl.
- **Examples:** Equipment Malfunction, Inadequate Training, Material Defect, Procedure Not Followed
**Target Resolution Date** (`TargetResolutionDate`)

The planned or required date by which the quality event should be closed.

- **Description:** This attribute defines the expected completion date for a quality event, often determined by its type, severity, or associated regulatory requirements. It serves as a deadline for the resolution process. This date is used to measure on-time performance and compliance with service level agreements (SLAs). It is essential for calculating the 'Compliance Adherence Rate' KPI and for creating a calculated 'SLA Status' attribute (e.g., 'On Time', 'Late'). This allows for analysis of which types of events or departments are most likely to miss their deadlines.
- **Why it matters:** Sets a deadline for resolution, enabling the measurement of on-time performance and the calculation of SLA compliance rates.
- **Where to get:** This is likely a date field on the main quality event form in MasterControl, which may be automatically calculated or manually entered.
- **Examples:** 2023-11-30, 2024-01-15, 2023-12-22
**User Name** (`UserName`)

The name of the user or resource who performed the activity.

- **Description:** This attribute identifies the individual responsible for executing a specific task in the process, such as approving a corrective action plan or closing a quality event. Analyzing the process by user helps in understanding workload distribution, identifying training needs, and discovering user-specific variations in process execution. It can be used to see if certain individuals are associated with rework or delays, providing insights for performance management and resource allocation.
- **Why it matters:** Attributes activities to specific individuals, enabling analysis of workload, performance, and resource-related bottlenecks.
- **Where to get:** This information is a standard part of the audit trail or history log for any quality event in MasterControl, typically captured as 'User' or 'Performed By'.
- **Examples:** `j.doe`, `s.smith`, `r.williams`
**Associated Regulation** (`AssociatedRegulationStandard`)

The specific regulation or quality standard, like ISO 13485 or 21 CFR Part 820, that applies to the quality event.

- **Description:** This attribute links a quality event to a specific external regulation or internal quality standard. This is particularly important for companies in regulated industries like life sciences or manufacturing. For the 'Quality Process Compliance Adherence' dashboard, this attribute is critical. It allows analysts to filter for events related to a specific regulation and verify that the required process steps were followed. Deviations can be flagged as potential compliance risks, making this a key attribute for audit and risk management purposes.
- **Why it matters:** Links quality events to specific compliance requirements, enabling analysis of adherence to regulatory standards and internal policies.
- **Where to get:** This may be a selectable field on the quality event form in MasterControl, allowing users to tag events with applicable regulations.
- **Examples:** ISO 13485, 21 CFR Part 820, ICH Q10, SOP-QA-001
**Corrective Action ID** (`CorrectiveActionId`)

The unique identifier for the corrective action plan (CAPA) linked to the quality event.

- **Description:** This attribute provides a direct link between a quality event and the specific corrective action(s) created to address its root cause. A single quality event may be linked to one or more corrective actions. Having this ID allows for a more detailed analysis that can join quality event data with data from the CAPA module. This enables a deeper investigation into the types of actions taken for different root causes and their subsequent effectiveness, supporting the CAPA Effectiveness Monitoring dashboard.
- **Why it matters:** Links the quality event to its specific corrective actions, enabling a more granular analysis of action effectiveness and resolution strategies.
- **Where to get:** This would be stored in a related records or linked objects section on the quality event form in MasterControl.
- **Examples:** `CA-2023-0199`, `CA-2023-0204`, `CA-2023-0210`
**Handoff Wait Time** (`HandoffWaitTime`)

The idle time between two consecutive activities performed by different departments or teams.

- **Description:** This metric calculates the duration a case spends waiting after one department completes its task and before the next department begins theirs. It is calculated by identifying consecutive activities where the 'Responsible Department' changes and measuring the time gap between them. This calculated metric is the core of the 'Handoff Delay Analysis' dashboard and the 'Average Handoff Wait Time' KPI. It isolates waiting time caused by coordination or communication issues from active processing time, helping to pinpoint specific cross-functional bottlenecks that prolong the overall cycle time.
- **Why it matters:** Isolates and quantifies waiting time between departments, directly exposing communication gaps and coordination bottlenecks in the process.
- **Where to get:** This is calculated during data transformation by analyzing the timestamps and 'ResponsibleDepartment' values of consecutive activities within each case.
- **Examples:** `172800`, `259200`, `604800` (durations in seconds, i.e. 2, 3, and 7 days)
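The calculation described above might look like this in pandas. This is a sketch with invented data: a handoff is detected wherever `ResponsibleDepartment` changes between consecutive activities of the same case.

```python
import pandas as pd

# Invented sample log with department ownership per activity.
log = pd.DataFrame({
    "QualityEvent": ["QE-001"] * 4,
    "ActivityName": [
        "Quality Event Created", "Investigation Initiated",
        "Corrective Action Plan Approved", "Quality Event Closed",
    ],
    "ResponsibleDepartment": [
        "Manufacturing", "Quality Assurance",
        "Quality Assurance", "Regulatory Affairs",
    ],
    "EventTime": pd.to_datetime([
        "2023-10-01T09:00:00Z", "2023-10-03T09:00:00Z",
        "2023-10-05T09:00:00Z", "2023-10-12T09:00:00Z",
    ]),
}).sort_values(["QualityEvent", "EventTime"])

# Compare each activity with its predecessor in the same case: a department
# change marks a handoff, and the time gap is the handoff wait time.
log["PrevDept"] = log.groupby("QualityEvent")["ResponsibleDepartment"].shift(1)
log["HandoffWaitTime"] = log["EventTime"] - log.groupby("QualityEvent")["EventTime"].shift(1)
handoffs = log[log["PrevDept"].notna() & (log["PrevDept"] != log["ResponsibleDepartment"])]
print(handoffs[["PrevDept", "ResponsibleDepartment", "HandoffWaitTime"]])
```

Note this gap includes any active work before the next event was logged; with only one timestamp per activity, wait and work time cannot be fully separated.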
**Is Rework** (`IsRework`)

A boolean flag indicating if an activity or a sequence of activities represents rework.

- **Description:** This flag is set to 'true' when an activity is repeated within the same case, or when the process loops back to an earlier stage. For example, if 'Root Cause Analysis Performed' happens twice for the same quality event, the second occurrence would be flagged as rework. This attribute directly supports the 'Rework and Re-investigation Overview' dashboard and the 'Root Cause Re-Investigation Rate' KPI. It simplifies the quantification of rework, making it easy to filter for and analyze cases with inefficient process flows.
- **Why it matters:** Explicitly flags activities that are being repeated, making it simple to quantify and analyze the frequency, causes, and impact of rework.
- **Where to get:** This flag is derived during data transformation by analyzing the sequence of activities for each case to detect repeated steps or process loops.
- **Examples:** true, false
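A minimal sketch of deriving this flag, assuming a pandas event log sorted by case and timestamp: repeat occurrences of an activity within a case are marked as rework.

```python
import pandas as pd

# Invented sample log containing a rework loop (RCA performed twice).
log = pd.DataFrame({
    "QualityEvent": ["QE-001"] * 4,
    "ActivityName": [
        "Quality Event Created", "Root Cause Analysis Performed",
        "Effectiveness Deemed Ineffective", "Root Cause Analysis Performed",
    ],
    "EventTime": pd.to_datetime([
        "2023-10-01", "2023-10-05", "2023-10-20", "2023-10-22",
    ]),
}).sort_values(["QualityEvent", "EventTime"])

# Any repeat of the same activity within a case is flagged as rework;
# the first occurrence stays False (duplicated keeps the first by default).
log["IsRework"] = log.duplicated(subset=["QualityEvent", "ActivityName"])
print(log[["ActivityName", "IsRework"]])
```

Simple duplication catches repeated steps; detecting loops back to an earlier stage (without exact repetition) would need a more elaborate sequence check.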
**Last Data Update** (`LastDataUpdate`)

The timestamp indicating when the data was last extracted or refreshed from MasterControl.

- **Description:** This attribute records the date and time of the most recent data pull from the source system. It is a metadata field that applies to the entire dataset rather than individual events. This information is vital for understanding the freshness of the data being analyzed. It provides transparency to business users about how current the process dashboards and KPIs are, ensuring that decisions are based on data of a known age.
- **Why it matters:** Provides crucial context on data freshness, ensuring users know how up-to-date the analysis is and when the next data refresh is expected.
- **Where to get:** This timestamp is generated and added during the data extraction, transformation, and loading (ETL) process from MasterControl.
- **Examples:** `2024-05-20T02:00:00Z`, `2024-05-21T02:00:00Z`
**Preventive Action ID** (`PreventiveActionId`)

The unique identifier for the preventive action plan linked to the quality event.

- **Description:** Similar to the Corrective Action ID, this attribute links the quality event to any preventive actions that were created. Preventive actions are proactive measures intended to prevent the occurrence of similar potential issues in other areas. This ID is crucial for the 'Preventive Action Optimization' dashboard and the 'Preventive Action Rate' KPI. It allows for tracking how often quality events lead to proactive improvements and analyzing the impact of those preventive measures over time.
- **Why it matters:** Connects the quality event to proactive preventive measures, allowing for analysis of the organization's ability to learn from issues and prevent future occurrences.
- **Where to get:** This would be found in a related records section on the quality event or CAPA form in MasterControl, linking to a preventive action object.
- **Examples:** `PA-2023-0051`, `PA-2023-0052`, `PA-2023-0053`
**Processing Time** (`ProcessingTime`)

The duration of time spent actively working on an activity.

- **Description:** Processing Time, also known as activity duration, is the time elapsed between the start and end of an activity. It represents the actual work time, as opposed to waiting time between steps. This calculated metric is a cornerstone of bottleneck analysis. By aggregating processing times for each activity, the 'Quality Process Bottleneck Identification' dashboard can clearly show which steps are the most time-consuming. This helps focus optimization efforts on the parts of the process where the most time is being spent.
- **Why it matters:** Quantifies the active work duration for each task, directly highlighting the most time-consuming steps and key areas for efficiency improvements.
- **Where to get:** This is calculated during data transformation by subtracting the `EventTime` (start) from the `EndTime` of an activity. It requires both timestamps to be available.
- **Examples:** `86400`, `3600`, `604800` (durations in seconds)
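The transformation described above can be sketched as follows, assuming both timestamps are available (sample rows invented):

```python
import pandas as pd

# Invented sample rows with both start (EventTime) and EndTime available.
log = pd.DataFrame({
    "ActivityName": ["Investigation Initiated", "Root Cause Analysis Performed"],
    "EventTime": pd.to_datetime(["2023-10-03T09:00:00Z", "2023-10-05T09:00:00Z"]),
    "EndTime": pd.to_datetime(["2023-10-03T10:00:00Z", "2023-10-12T09:00:00Z"]),
})

# Processing time in seconds: end minus start for each activity.
log["ProcessingTime"] = (log["EndTime"] - log["EventTime"]).dt.total_seconds()

# Aggregating per activity ranks the most time-consuming steps.
print(log.groupby("ActivityName")["ProcessingTime"].mean().sort_values(ascending=False))
```

The per-activity aggregation at the end is the kind of ranking a bottleneck-identification dashboard would surface.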
**Product Affected** (`ProductAffected`)

The product, product line, or component that is the subject of the quality event.

- **Description:** This attribute identifies the specific product or material associated with the quality issue. This provides essential context for understanding the impact of quality events. Analyzing process data by product allows for the identification of products that have recurring quality issues. This can help prioritize improvement efforts, inform product design changes, and assess the performance of different manufacturing lines or suppliers.
- **Why it matters:** Provides critical business context, allowing for analysis of quality issues by product line to identify recurring problems and trends.
- **Where to get:** This would be a field on the quality event form where users can specify the product, often by selecting from a predefined list or entering a part number.
- **Examples:** Product A - Lot 54321; Component XYZ; API-001
**Site Location** (`SiteLocation`)

The manufacturing site, plant, or facility where the quality event originated.

- **Description:** This attribute specifies the physical location or site associated with the quality event. It provides a geographical or organizational context for the issue. This is a valuable dimension for comparative analysis. By filtering or grouping by site, management can compare the performance of the quality management process across different locations. This can highlight sites that are performing well and can share best practices, as well as sites that may need additional support or process improvements.
- **Why it matters:** Allows for performance comparison across different manufacturing sites or facilities, helping to identify location-specific issues or best practices.
- **Where to get:** This information is typically captured on the quality event initiation form, often as a dropdown list of company sites.
- **Examples:** Austin, TX; Dublin, Ireland; Singapore Plant; Site A
**SLA Status** (`SlaStatus`)

Indicates whether the quality event was closed within its target resolution date.

- **Description:** This attribute is derived by comparing the actual closure date of a quality event with its 'Target Resolution Date'. The status is typically set to 'On Time' or 'Late'. This provides a clear, at-a-glance indicator of performance against deadlines. It is the basis for the 'Compliance Adherence Rate' KPI and allows for easy filtering and analysis of late cases. Understanding the drivers of late resolutions is key to improving overall process timeliness and meeting compliance goals.
- **Why it matters:** Provides a simple indicator of whether cases are meeting their deadlines, which is essential for measuring and improving on-time performance.
- **Where to get:** This is calculated during data transformation by comparing the timestamp of the 'Quality Event Closed' activity with the 'TargetResolutionDate' field for each case.
- **Examples:** On Time, Late, At Risk
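A minimal sketch of the derivation, assuming a case-level table where a hypothetical `ClosedTime` column holds the timestamp of the 'Quality Event Closed' activity:

```python
import pandas as pd

# Invented case-level table; "ClosedTime" is a hypothetical column holding
# the timestamp of the 'Quality Event Closed' activity.
cases = pd.DataFrame({
    "QualityEvent": ["QE-001", "QE-002"],
    "TargetResolutionDate": pd.to_datetime(["2023-11-30", "2023-12-15"]),
    "ClosedTime": pd.to_datetime(["2023-11-20", "2023-12-20"]),
})

# On Time if closure happened on or before the target date, otherwise Late.
cases["SlaStatus"] = (cases["ClosedTime"] <= cases["TargetResolutionDate"]).map(
    {True: "On Time", False: "Late"}
)
print(cases[["QualityEvent", "SlaStatus"]])
```

An 'At Risk' status for still-open cases approaching their target date would need the current date as an extra input, not shown here.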
**Source System** (`SourceSystem`)

Identifies the system from which the data was extracted, which in this case is MasterControl.

- **Description:** This attribute provides context about the origin of the data. For this process, it will consistently hold a value indicating 'MasterControl'. While it may seem static, this attribute is crucial in enterprise environments where data from multiple systems might be combined for a broader analysis. It ensures data lineage and helps in isolating system-specific behaviors or data quality issues.
- **Why it matters:** Clearly attributes the data to its source system, which is essential for data governance and for analyses that combine data from multiple applications.
- **Where to get:** This is typically a static value added during the data extraction, transformation, and loading (ETL) process to label the origin of the dataset.
- **Examples:** MasterControl, MasterControl QMS
Quality Management Activities
**Corrective Action Plan Approved**

Signifies that the proposed CAPA plan has been formally reviewed and approved by the required stakeholders. This is a critical milestone often captured through an explicit electronic signature event in MasterControl's audit trail.

- **Why it matters:** Approval steps are common bottlenecks. Analyzing the time taken for this activity helps identify delays in the approval process and supports the Handoff Delay Analysis dashboard.
- **Where to get:** This is an explicit event captured in the audit trail when a user with approval authority applies an electronic signature to the CAPA plan or the relevant workflow step.
- **Capture:** Captured from electronic signature logs associated with the approval workflow step.
- **Event type:** explicit
**Effectiveness of Action Verified**

Marks the completion of the verification step, where evidence is gathered and reviewed to confirm that the implemented actions were effective. This is captured when the effectiveness check task is completed, often with an electronic signature.

- **Why it matters:** This activity is crucial for calculating the CAPA Effectiveness Rate and the Avg Action Verification Time. It confirms whether the solution worked, preventing issue recurrence.
- **Where to get:** This can be an explicit event captured via an electronic signature for the verification step or inferred from the completion of the 'Effectiveness Check' task in the workflow.
- **Capture:** Captured from electronic signature logs or task completion timestamps for the verification step.
- **Event type:** explicit
**Investigation Initiated**

Marks the formal start of the investigation phase to determine the scope and immediate impact of the quality event. This is typically captured when the event is officially assigned to an investigator and the status is updated to 'Under Investigation'.

- **Why it matters:** This is a key milestone for tracking the duration of the investigation phase. Delays here can significantly impact the overall resolution time and compliance deadlines.
- **Where to get:** Inferred from a status change in the quality event record to a state like 'Investigation in Progress'. This change is logged in the MasterControl audit trail.
- **Capture:** Inferred from the timestamp when the event status changes to 'Under Investigation'.
- **Event type:** inferred
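Inferring activities like this from status changes might be sketched as follows. The audit-trail column names `Field`, `NewValue`, and `ChangedAt` are hypothetical, not actual MasterControl schema; the pattern of mapping status transitions to activity names is what matters.

```python
import pandas as pd

# Hypothetical audit-trail rows; the columns Field, NewValue, and ChangedAt
# are illustrative, not actual MasterControl schema.
audit = pd.DataFrame({
    "QualityEvent": ["QE-001", "QE-001", "QE-001"],
    "Field": ["Status", "Severity", "Status"],
    "NewValue": ["Under Investigation", "High", "Closed"],
    "ChangedAt": pd.to_datetime(["2023-10-03", "2023-10-03", "2023-10-20"]),
})

# Map status transitions to the inferred activities they represent;
# changes to other fields (e.g. Severity) are ignored.
status_to_activity = {
    "Under Investigation": "Investigation Initiated",
    "Closed": "Quality Event Closed",
}
events = audit[
    audit["Field"].eq("Status") & audit["NewValue"].isin(status_to_activity)
].copy()
events["ActivityName"] = events["NewValue"].map(status_to_activity)
print(events[["QualityEvent", "ActivityName", "ChangedAt"]])
```

The same mapping approach extends to the other inferred activities in this section, such as 'Initial Triage Completed' or 'Quality Event Cancelled'.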
**Quality Event Closed**

This is the final activity, marking the successful resolution and formal closure of the quality event in MasterControl. This event is captured when the record's status is changed to 'Closed', which is recorded with a timestamp in the audit trail.

- **Why it matters:** This activity marks the end of the process, which is essential for calculating the overall cycle time and throughput. It confirms the successful completion of the quality management case.
- **Where to get:** This event is inferred from the final status change of the quality event record to 'Closed'. The timestamp of this status change is recorded in the audit log.
- **Capture:** Inferred from the timestamp when the event status is updated to 'Closed'.
- **Event type:** inferred
**Quality Event Created**

This is the starting point of the quality management process, where a new quality event, such as a deviation, non-conformance, or complaint, is first logged in MasterControl. This activity is typically captured explicitly when a user creates a new quality event record, generating an audit trail entry with a timestamp.

- **Why it matters:** This activity marks the beginning of the case lifecycle, which is essential for measuring the overall Quality Event Cycle Time and analyzing event submission volumes.
- **Where to get:** This is an explicit event recorded in the audit trail tables for the quality event module. It corresponds to the creation timestamp of the quality event record.
- **Capture:** Recorded in the audit trail upon creation of a new Quality Event object.
- **Event type:** explicit
**Corrective Action Implemented**

Indicates that the tasks defined in the approved corrective action plan have been executed and completed. This is typically captured when the user responsible for the action marks the implementation task as complete in the system.

- **Why it matters:** This is a key milestone for tracking the start of the Avg Action Verification Time KPI. The duration of implementation reflects the complexity and efficiency of the corrective actions.
- **Where to get:** Inferred from the audit trail when the status of the assigned corrective action tasks is changed to 'Completed' or 'Implemented'.
- **Capture:** Derived from the completion timestamp of the linked CAPA implementation tasks.
- **Event type:** inferred
**Corrective Action Plan Proposed**

Represents the creation and submission of a Corrective Action and Preventive Action (CAPA) plan for review. This event is typically captured when a CAPA record is formally linked to the quality event and its status is set to 'Pending Approval'.

- **Why it matters:** This activity marks the transition from investigation to resolution planning. The duration between RCA and this step can indicate planning efficiency.
- **Where to get:** Likely inferred from the creation of a linked CAPA object or a status change within the quality event to a state like 'CAPA Plan Proposed' or 'Pending Plan Approval'.
- **Capture:** Derived from the creation of a linked CAPA plan record or a status change in the parent event.
- **Event type:** inferred
**Effectiveness Deemed Ineffective**

This activity represents a negative outcome of the verification step, where the implemented actions were found to be ineffective. This event triggers rework and is inferred when the verification step is failed, leading to a status change that re-opens the investigation or CAPA planning.

- **Why it matters:** This event is critical for identifying rework loops and calculating the First-Pass Resolution Rate. It highlights failures in the problem-solving process that need to be addressed.
- **Where to get:** Inferred from a status change indicating verification failure, such as 'Effectiveness Check Failed' or a transition back to 'Under Investigation'.
- **Capture:** Derived from a workflow transition that moves the case back to an earlier stage after verification.
- **Event type:** inferred
**Final Review Conducted**

Signifies that a final review of the entire quality event record, including all documentation and actions, has been completed by a quality assurance manager. This is typically captured through an explicit electronic signature event before the case can be closed.

- **Why it matters:** This is the final quality gate before closure. Delays at this stage can leave events open unnecessarily, affecting throughput metrics.
- **Where to get:** Captured explicitly from the audit trail when a user applies an electronic signature for the 'Final Review' or 'QA Approval' step in the workflow.
- **Capture:** Captured from electronic signature logs for the final approval workflow step.
- **Event type:** explicit
**Initial Triage Completed**

Represents the completion of the initial assessment where the quality event is categorized, assigned a severity level, and prioritized. This is often inferred from a status change in the system, for example, when the event status moves from 'New' to 'Under Assessment' or 'Investigation'.

- **Why it matters:** Analyzing the time taken for triage helps identify delays in the initial response to quality issues. It is a critical step that dictates the subsequent workflow and resource allocation.
- **Where to get:** Inferred from the audit trail when the event's status field is updated to a post-triage state, or when required triage fields like 'Severity' and 'Priority' are first populated and saved.
- **Capture:** Derived from a change in the event status field, e.g., from 'Submitted' to 'Assigned'.
- **Event type:** inferred
**Preventive Action Implemented**

Represents the completion of tasks defined in the preventive action plan, aimed at preventing future occurrences. This is captured when the assigned preventive action tasks are marked as complete in the system.

- **Why it matters:** Tracking this activity is essential for the Preventive Action Rate KPI, helping to evaluate how proactively the organization is addressing potential future issues.
- **Where to get:** Inferred from the completion timestamp of tasks specifically designated as 'Preventive Actions' within the CAPA plan.
- **Capture:** Derived from the completion timestamp of linked preventive action tasks.
- **Event type:** inferred
**Quality Event Cancelled**

Represents an alternative end state where a quality event is deemed invalid, a duplicate, or entered in error, and is therefore cancelled. This is captured by a status change to 'Cancelled' or 'Void'.

- **Why it matters:** Distinguishing between closed and cancelled events is important for accurate reporting on process outcomes. A high cancellation rate may indicate issues with event reporting or triage.
- **Where to get:** Inferred from a terminal status change to a state like 'Cancelled' or 'Voided', which is logged in the audit trail.
- **Capture:** Inferred from the timestamp when the event status is updated to 'Cancelled'.
- **Event type:** inferred
**Root Cause Analysis Performed**

This activity signifies the completion of the root cause analysis (RCA) and the documentation of the findings. It can be inferred when the 'Root Cause' or related analysis fields are filled and the associated task is marked as complete.

- **Why it matters:** Tracking this activity helps measure the time and quality of the RCA phase. It is crucial for KPIs like Root Cause Re-Investigation Rate, which identifies rework.
- **Where to get:** Inferred from the audit trail entry showing the completion of the RCA task or the population and finalization of root cause category fields within the quality event form.
- **Capture:** Inferred from the completion of the 'Root Cause Analysis' workflow step or task.
- **Event type:** inferred
**Stakeholders Notified**

Represents the formal communication of the quality event's resolution to all relevant stakeholders. This might be an explicit, logged action or inferred from the completion of a 'Final Notification' task in the workflow.

- **Why it matters:** Analyzing this step helps understand communication efficiency. It is also a key marker for calculating the Final Review & Closure Time KPI.
- **Where to get:** This could be an explicit notification event logged in the system or inferred from the completion of a 'Notify Stakeholders' task before final closure.
- **Capture:** Inferred from the completion of a dedicated notification task within the workflow.
- **Event type:** inferred