IX.2.2 Accuracy - Quality Control

Accuracy is the extent to which the data submitted match the information in the medical record and have been correctly coded. It encompasses accurate abstracting, correct application of coding rules, and correct entry into and retrieval from the computer.

Accuracy is evaluated using various methods:

The CCR's regional registries perform visual editing on a percentage of the abstracts submitted by hospital registries.  Feedback is provided to hospitals on the results of visual editing.

A visual editing accuracy standard of 97% was established in January 2000.  This standard applies to cancer reporting facilities, not to individual cancer registry abstractors.  The reporting facility, rather than specific individuals, is responsible for meeting cancer reporting requirements; therefore, the accuracy rate reflects the facility's compliance with regulations.  Please refer to the CCR web site at www.ccrcal.org for the current list of visually edited data items.

Non-analytic cases are included in the accuracy rate.  The regions visually edit them, although not as extensively as the analytic cases.  Review is limited to verifying that there is supporting documentation to validate the coded data fields.

Computer edits are also used to assess the quality of data submitted.  The CCR provides a standard set of edits for abstracting software; these edits are applied to data at the time of abstracting.  The measure used to evaluate accuracy is the percentage of a facility's cases that fail an edit.  All cases submitted to the CCR must pass the interfield edits specified in Cancer Reporting in California: Data Standards for Regional Registries and California Cancer Registry (California Cancer Reporting System Standards, Volume III).
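An interfield edit verifies that two or more coded fields are mutually consistent.  The following is a minimal illustrative sketch only; the field names, codes, and edit logic here are hypothetical simplifications, and the authoritative edit specifications are those published in Volume III.

```python
# Hypothetical sketch of an interfield edit (Primary Site vs. Sex).
# Field names and code values are illustrative, not the CCR's actual
# record layout; real edits come from the standard edit set.

def interfield_edit_site_sex(record):
    """Return a list of edit failures for site/sex conflicts.

    Prostate (C619) requires sex code 1 (male); cervix (C530-C539)
    requires sex code 2 (female).
    """
    failures = []
    site = record["primary_site"]
    sex = record["sex"]
    if site == "C619" and sex != 1:
        failures.append("Primary Site/Sex conflict: prostate requires male")
    if site.startswith("C53") and sex != 2:
        failures.append("Primary Site/Sex conflict: cervix requires female")
    return failures

# The accuracy measure: percentage of a facility's cases failing an edit.
cases = [
    {"primary_site": "C619", "sex": 1},
    {"primary_site": "C530", "sex": 1},  # fails: cervix coded on a male
    {"primary_site": "C509", "sex": 2},
]
failing = sum(1 for c in cases if interfield_edit_site_sex(c))
print(f"Edit failure rate: {100 * failing / len(cases):.1f}%")  # 33.3%
```

In practice these checks run inside the facility's abstracting software at data-entry time, so conflicts are caught before the abstract is submitted to the regional registry.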

The CCR's edit set contains a number of edits that require review.  After review and confirmation that the abstracted information is correct, an over-ride flag must be set so that repeated review is not necessary and the case can be marked complete.  See Appendix T for a list of these over-rides.  Please follow the instructions provided by your facility's abstracting software vendor for using these flags.

Another method of assessing accuracy is reabstracting cases at the facilities.  A sample of cases from each facility is reabstracted by specially trained personnel.  The measure used is the number of discrepancies found within related categories of data items.
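The comparison step of a reabstraction study can be sketched as follows.  This is a hypothetical illustration only: the field names and the grouping of data items into categories are assumptions for the example, not the CCR's actual category definitions.

```python
# Hypothetical sketch of a reabstraction comparison; the category
# groupings and field names below are illustrative assumptions.

CATEGORIES = {
    "demographic": ["sex", "race", "birth_date"],
    "tumor": ["primary_site", "histology", "stage"],
}

def count_discrepancies(original, reabstracted):
    """Count mismatches per category between the facility's original
    abstract and the independently reabstracted record."""
    counts = {cat: 0 for cat in CATEGORIES}
    for cat, fields in CATEGORIES.items():
        for field in fields:
            if original.get(field) != reabstracted.get(field):
                counts[cat] += 1
    return counts

original = {"sex": 1, "race": "01", "primary_site": "C619", "histology": "8140"}
reabstracted = {"sex": 1, "race": "02", "primary_site": "C619", "histology": "8140"}
print(count_discrepancies(original, reabstracted))
```

Tallying discrepancies by category, rather than as a single total, indicates which parts of the abstracting process need targeted training or feedback.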
