CALIFORNIA STATE AUDITOR
Bureau of State Audits
Data Reliability
State Agencies’ Computer-Generated Data Varied in
Their Completeness and Accuracy

August 2010 Report 2010-401

The first five copies of each California State Auditor report are free. Additional copies are $3 each, payable by
check or money order. You can obtain reports by contacting the Bureau of State Audits at the following address:
California State Auditor
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, California 95814
916.445.0255 or TTY 916.445.0033
OR
This report is also available on the World Wide Web http://www.bsa.ca.gov
The California State Auditor is pleased to announce the availability of an on-line subscription service. For
information on how to subscribe, please contact the Information Technology Unit at 916.445.0255, ext. 456,
or visit our Web site at www.bsa.ca.gov.
Alternate format reports available upon request.
Permission is granted to reproduce reports.
For questions regarding the contents of this report,
please contact Margarita Fernández, Chief of Public Affairs, at 916.445.0255.

Elaine M. Howle
State Auditor

Doug Cordiner
Chief Deputy

CALIFORNIA STATE AUDITOR
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, CA 95814
916.445.0255 | 916.327.0019 fax
www.bsa.ca.gov

August 10, 2010

2010‑401

The Governor of California
President pro Tempore of the Senate
Speaker of the Assembly
State Capitol
Sacramento, California 95814
Dear Governor and Legislative Leaders:
This letter report presents a summary of the results of the State Auditor’s Office assessments
of the reliability of data in a wide variety of databases and automated spreadsheets used by the
bureau for the purposes of its audits. Data reliability refers to the accuracy and completeness of
the data, given our intended purposes for the data’s use. The State uses these data in many ways,
which include reporting on its programs, tracking licensees and recipients of funds, disbursing
funds, and making program decisions. Although we disclosed these data reliability assessments
in 19 audit reports that we issued during 2008 and 2009, this report is intended to call attention
both to areas of concern, where important data are not always reliable, and to instances in
which information has been reliable. We have conducted our assessments in accordance with
the provisions of the federal Government Accountability Office’s Assessing the Reliability of
Computer‑Processed Data, which require us to assess and report on the reliability of the data
from automated systems we use to reach our audit conclusions. This report is the second in an
anticipated series of periodic reports on the subject.
Many systems had reliable data for our purposes, but some important systems did not. During
the 19 audits we assessed the reliability of specific data for 84 different purposes in 36 separate
database and spreadsheet systems. For 34 audit purposes, we concluded that the data were reliable
and that using the data would not weaken our analyses or lead us to incorrect or unintentional
messages. We found, for example, that the California Housing Finance Agency had reliable data,
allowing us to determine the amount of awards and disbursements for the School Facility Fee
Downpayment Assistance, California Homebuyer’s Downpayment Assistance, Homeownership
in Revitalization Areas, and Extra Credit Teacher programs.
However, for 31 audit purposes, we reported the data were not sufficiently reliable, meaning
that using the data would most likely lead to incorrect or unintentional messages and that the
data have significant or potentially significant limitations, given the audit topics and intended
uses of the data. For instance, at the California Department of Corrections and Rehabilitation
(Corrections), the Division of Addiction and Recovery Services’ database had some obviously
unreliable information. Specifically, we identified errors when attempting to trace data back to
a sample of source documents for the purpose of identifying the number of sex offenders that
Corrections placed in licensed and unlicensed facilities.
For 18 audit purposes, we were unable to determine the reliability of the data; therefore, we
concluded that use of the data could lead to incorrect or unintentional messages and that the data
have significant or potentially significant limitations, given the research questions and intended
uses of the data. In some cases, our conclusions that data were
of undetermined reliability arose from issues that either were
beyond the control of the audited agencies or were not causes
for concern. For instance, the conclusion that Corrections’
accounting records had undetermined reliability was not a cause for
concern because we did not find material errors in our electronic
testing of required data elements. However, we did not conduct
accuracy or completeness testing because the source documents
required for this testing were stored at seven regional offices or
the 33 institutions located throughout the State, and it would not
have been cost‑effective to conduct such testing. Nevertheless,
without hard‑copy documentation, we were unable to assess the
accuracy of the accounting data. We also determined that the sex
offender registry of the California Department of Justice (Justice)
had undetermined reliability. However, we did not report a finding
because the registered sex offenders are responsible for contacting
their local law enforcement office to determine if they are required
to register, to provide registration information, and to update
their registration when needed. Thus, we were not able to direct a
recommendation to Justice.
For the remaining audit purpose that we reviewed, we did not
assess data reliability. Specifically, we did not assess the reliability
for the Department of General Services’ State Contract and
Procurement Registration System (SCPRS)—in which state
agencies are required to enter all contracts valued at $5,000 or
more—because our intent was only to use the data to provide
background information on the number of information technology
contracts. Therefore, a data reliability assessment was not required.
However, we needed to gain assurance that the population of
contracts from which we selected our sample was complete. For
this purpose, we found SCPRS to be incomplete.
The table on pages 9 through 13 summarizes selected information
from the pages referenced in the Appendix. The data reliability
assessment relates to the purpose for which we tested the system’s
data during the audit, as described in the Appendix. The agency’s
use of the system’s data usually, but not always, is similar to our use
of the system’s data.

Introduction
Information technology (IT) systems are increasingly important for
efficient and effective business practices. The State has an ongoing
need for its IT systems to keep pace with technological changes and
to develop and use systems and databases where none existed in
the past. Equally important, however, is state agencies’
day‑to‑day use of existing IT systems for purposes that can have
significant impacts on the State’s operations, such as reporting on
programs, tracking and monitoring licensees, disbursing funds, and
reaching program decisions. In October 2008 we issued a report
titled Data Reliability: State Agencies’ Computer‑Generated Data
Varied in Its Reliability (Report 2008‑401) that addressed the
reliability of the data from systems we tested as part of audits issued
in 2006 and 2007. The reliability of the data from systems tested
during audits issued in 2008 and 2009 is the subject of this report. (We also include data reliability information for one report issued in 2007 because it was not included in the prior report.)
The federal Government Accountability Office
(GAO), whose standards we follow, requires
us to assess and report on the reliability of
computer‑processed data that we use during our
audits. Data reliability refers to the accuracy
and completeness of the data, given the intended
purposes for their use. The GAO defines the
three possible assessments we can make—
sufficiently reliable data, not sufficiently reliable
data, and data of undetermined reliability. (See the
text box for definitions.) In assessing data reliability,
we take several factors into consideration, including
the degree of risk involved in the use of the data
and the strength of corroborating evidence. A
single database may have different assessments
because information that we propose to use for
one purpose is accurate and complete, whereas data
fields needed for a separate purpose are not.

Definitions Used in Data Reliability Assessments
Sufficiently Reliable Data—Based on audit work,
an auditor can conclude that using the data would
not weaken the analysis or lead to an incorrect or
unintentional message.
Not Sufficiently Reliable Data—Based on audit work, an
auditor can conclude that using the data would most likely
lead to an incorrect or unintentional message and that the
data have significant or potentially significant limitations,
given the research question and the intended use of the data.
Data of Undetermined Reliability—Based on audit work,
an auditor can conclude that use of the data could lead
to an incorrect or unintentional message and that the data
have significant or potentially significant limitations, given
the research question and intended use of the data.

We may employ various procedures for determining the reliability
of computer‑processed data we report and use to reach audit
conclusions. For example, if we want to use data to determine
whether the State Bar of California processed disciplinary cases
promptly, we might test the disciplinary tracking system in the
following ways:
•	 Reviewing the system for illogical data. If we find entries listing
dates for completion preceding the dates that the cases were
received, we would question the adequacy of system controls.
•	 Scanning the database for completeness of key data fields. If we
find numerous case files that omit the dates that the department
received the case, we might conclude that the data are so
incomplete that drawing conclusions would lead to an incorrect
or unintentional message.
•	 Comparing database records to source documents. Using
a sample of actual cases with original documents, we could
determine whether the corresponding database information,
such as entries for the dates received, is consistent with
such information as the date‑received stamps on the
original documents.
In the case of the State Bar of California, we tested its disciplinary
tracking system for all these elements and found it to be reliable for
the purposes of our audit.
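To make these procedures concrete, the following is a minimal sketch in Python (using the pandas library) of the three kinds of checks just described. The file name, column names, and sample size are invented for illustration; they are not the State Bar’s actual schema, and a real assessment would involve more extensive testing.

    # Hypothetical reliability checks on an illustrative case extract.
    import pandas as pd

    cases = pd.read_csv("disciplinary_cases.csv",
                        parse_dates=["date_received", "date_completed"])

    # 1. Logic testing: completion dates must not precede receipt dates.
    illogical = cases[cases["date_completed"] < cases["date_received"]]
    print(f"{len(illogical)} records completed before they were received")

    # 2. Completeness testing: key fields should rarely be blank.
    missing_rate = cases["date_received"].isna().mean()
    print(f"{missing_rate:.1%} of records are missing a received date")

    # 3. Accuracy testing: trace a random sample back to source documents.
    sample = cases.sample(n=25, random_state=1)
    for _, row in sample.iterrows():
        # Here an auditor would compare row["date_received"] against the
        # date-received stamp on the original paper file.
        pass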
To give the appropriate perspective about information derived from
computer‑based systems, GAO standards require us to disclose the
results of our data reliability testing and the limitations of the data
we use.

Audit Results
Many Automated Systems Had Reliable Data for Our Purposes
In assessing 84 audit purposes for data reliability, we determined
that the data for 34 were reliable. Therefore, in these instances,
we were able to use the data to draw conclusions and to quote the
data without qualifications about the accuracy of the information.
For example, we were able to use the California Department of
Veterans Affairs’ Mitas database to identify the number of veterans
who receive benefits from the CalVet Home Loans program and to
identify recent trends in veterans’ participation in the program. We
also concluded that the Department of Alcohol and Drug Programs’
licensure data were sufficiently reliable for us to identify the number
of residential alcohol and substance abuse treatment facilities that
operate in the State. At the Department of Fish and Game, we were
able to calculate revenues from sales of the Bay‑Delta Sport Fishing
Enhancement Stamp Program because we found this department’s
License Agent System sufficiently reliable.
Many Automated Systems Were Not Sufficiently Reliable for Us to Use
the Information Recorded
For 31 data reliability assessments, we concluded that the data were
not sufficiently reliable. One primary reason for this conclusion
was that the errors caused by incomplete data exceeded the
acceptable number of errors we established for the audit data
to be deemed reliable for our purposes. For instance, we found
several errors during our testing of the radioactive materials
database (RAM2000), which the Department of Public Health’s
(Public Health) Radiologic Health Branch (branch) uses to track
its inspections of entities that possess radioactive material.
Specifically, we noted that data values in the priority‑code field were
incorrect for two of 16 sample items. Because this field defines the
required inspection interval for a given licensee, errors in these
data could cause the branch to schedule inspections too frequently
or too infrequently. Without sufficiently reliable data within
its RAM2000 database, we could not use the branch’s data to
determine the size and extent of any backlog of radioactive
materials inspections. At the Victim Compensation and Government Claims
Board (Victim Compensation Board), an inoperable reporting
system in the Compensation and Restitution System (CaRES)
prevented the Victim Compensation Board from providing us
with any useful reports that would enable us to identify the extent
to which a backlog of applications and bills awaiting a decision
exists. We also concluded that CaRES was not sufficiently reliable
to assess how long the Victim Compensation Board and the joint
powers units took to process completed applications and bills.
Nevertheless, we present the results of the analysis in that
report because the data represented the best available source of
information. Further, because the reporting function in CaRES is
not working yet, the Victim Compensation Board is forced to use ad
hoc reports that are unreliable and that lack important information
that the board needs to manage its workload effectively. Without
such data, the Victim Compensation Board cannot ensure that
victims receive prompt assistance.
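As a minimal illustration of the sample-based arithmetic behind such conclusions, the sketch below computes an observed error rate and compares it to a tolerance. The two-of-16 figure comes from the RAM2000 priority-code testing described above; the 5 percent tolerance is an invented example, not the bureau’s actual threshold.

    # Hypothetical error-rate check against an audit tolerance.
    errors_found = 2      # priority-code errors observed in the sample
    sample_size = 16      # records traced to source documents
    tolerance = 0.05      # invented acceptable error rate, for illustration

    error_rate = errors_found / sample_size
    print(f"Observed error rate: {error_rate:.1%}")   # 12.5%
    if error_rate > tolerance:
        print("Conclude: not sufficiently reliable for this purpose")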
In some circumstances—when the audited agency is responsible
for the data problems and uses the data for purposes similar to
those we intended—we recommended that the audited agency take
corrective action. For example, to improve the accuracy of its data,
we recommended that the branch within Public Health compare
its existing files to the information recorded in the data systems.
In addition, we recommended that it improve its internal controls
over data entry so that it can maintain accurate data on an ongoing
basis. Furthermore, to ensure that the branch uses sufficiently
reliable data from its future data system to manage its workload, we
recommended that Public Health develop and maintain adequate
documentation related to data storage, retrieval, and maintenance.
Public Health stated that it plans to replace the systems it uses to
manage its inspection workload.
We Were Unable to Determine the Reliability of Data for Some of
Our Purposes
For 18 of the 84 purposes we reported, we concluded that the
data had undetermined reliability—that is, we were not able to
determine the extent of any inaccuracies or omissions. As a result,
either we were not able to use the data or we reported qualifications
about the data’s reliability. As with data that are not sufficiently
reliable, we recommend corrective action when the department is
responsible for the data problems and uses the data for purposes
similar to those we intended, because such use could result in
undesirable outcomes. In some instances, we concluded that the
undetermined reliability arose from problems with audited agencies’
practices, but at other times the causes were either beyond the
agencies’ control or not reasons for concern.
For example, data from three California Department of Corrections
and Rehabilitation (Corrections) systems had undetermined
reliability. We found no material errors in our electronic
testing of required data elements; however, we did not conduct
completeness testing for the three databases because, depending
on the data involved, the source documents required for this
testing are stored at seven regional offices or 33 institutions
located throughout the State, making such testing cost‑prohibitive.
For the same reason, testing two of the databases for accuracy
was too expensive. Because no other sources exist for obtaining
the information, we used the data from all three databases. We
used one database to determine the additional cost of striker
inmates (those incarcerated under the Three Strikes law) currently
housed in Corrections’ adult institutions and the controlling, or
longest, offenses for individual inmates—if the offenses related
to a Three Strikes case. Another database enabled us to calculate
the cost of incarcerating an inmate and to analyze and categorize
overtime‑related expenditure data. Finally, we calculated the
average daily population of inmates at a particular institution using
data from a third system.
At the Department of Health Care Services, we found data systems
utilized by Electronic Data Systems (EDS) to have undetermined
reliability for providing information on the amounts paid for
medical equipment by the California Medical Assistance Program
(Medi‑Cal) during fiscal year 2006–07. We performed electronic
testing of selected data elements to ensure that they contained
logical values, and we tested the accuracy of the data by tracing a
sample of records to supporting documentation. However, we were
unable to obtain assurance regarding the completeness of the data
because EDS indicated that it incorrectly extracted the data from
its records. The corrected data were not available in time for us to
verify their accuracy and to perform our planned procedures before
issuing our report.
For the remaining audit purpose we reported, we did not assess
data reliability. Specifically, we did not assess the reliability for the
Department of General Services’ State Contract and Procurement
Registration System (SCPRS)—in which state agencies are required
to enter all contracts valued at $5,000 or more—because we
intended only to use the data to provide background information
on the number of information technology (IT) contracts. Therefore,
a data reliability assessment was not required. However, we needed
to gain assurance that the population of contracts from which we
selected our sample was complete. For this purpose, we found
SCPRS to be incomplete. For example, our review of a sample of
29 contracts for Public Health found that three were not in the
SCPRS database. Further, during our audit we discovered an active
$3.9 million IT contract for the Department of Health Care Services
that initially did not appear to be in the SCPRS database. We later
found that SCPRS incorrectly identified the contract as grants and
subventions instead of IT.
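A completeness check of this kind reduces to verifying that every sampled contract appears in the system’s extract. The following is a minimal, hypothetical sketch; the contract IDs are invented for illustration and are not actual SCPRS records.

    # Hypothetical completeness check: verify that contracts sampled
    # from agency files appear in the SCPRS extract.
    scprs_ids = {"C-1001", "C-1002", "C-1004"}      # IDs in the SCPRS extract
    sampled_ids = ["C-1001", "C-1002", "C-1003"]    # IDs sampled from agency files

    missing = [c for c in sampled_ids if c not in scprs_ids]
    print(f"{len(missing)} of {len(sampled_ids)} sampled contracts "
          f"missing from SCPRS: {missing}")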

The Appendix Provides Specific Information About Each of the Data
Assessments That We Reported
The Appendix to this report contains tables that summarize the
results of the data reliability assessments for state‑administered
programs we discuss in audit reports issued in 2008 and 2009.
The tables in the Appendix are preceded by brief summaries of
their related reports and are organized by oversight agency, if
applicable, and by the date order of the reports issued. They indicate the
agency audited and either the name of the database we examined or
a description of the data for those databases or spreadsheets with
no formal names. The tables also include the following:
•	 Our purpose (or intended use) in using the data, our assessment
based on our intended use, the audited agency’s purpose for
the data, and recommendations for corrective actions, if any.
Although our purpose is sometimes the same as that of the
agency, our purpose differs occasionally. When purposes differ,
we may have found that data had undetermined or insufficient
reliability for our purposes, but we made no recommendations
because our concerns do not affect the agency’s use of the
data. Nevertheless, we report the results of these assessments
as a caution to others who may try to use the data in the same
manner as we originally intended.
•	 The agency’s response to our recommendations. The response
date listed corresponds to the date noted in the annual report
to legislative subcommittees about the corrective actions that
the agency took to address our recommendations. We issued
our most recent report to the subcommittees in February 2010.
Therefore, since that time, some agencies may have taken
additional corrective actions that we do not report here.
Finally, when possible, the tables disclose information that
provides context about the significance of the data we have
assessed. For example, the Department of Veterans Affairs’ Mitas
database, which we used to identify the number of California
veterans who receive benefits from the CalVet Home Loans
Program, indicated that 12,518 veterans were participating in the
program as of March 31, 2009.
At the beginning of the Appendix we have included a table that
summarizes the data reliability assessments. This table lists the
agency and department associated with each database, our data
reliability assessment, the agency or department’s purpose for the
database, and the page number for each database’s data reliability
assessment table. In many cases we used a database for more
than one testing purpose and therefore tested the reliability of the
database for each purpose. If a database with multiple testing uses
received the same rating more than once, we list that rating only
once in the summary table. For example, we found the McGeorge
School of Law case management database insufficiently reliable
for all four of our testing purposes; thus, in the table we list the
assessment simply as “No” to summarize that the database was not
reliable for our four audit purposes.
Table
Summary of Reliability Assessments for Audits Issued in 2008 and 2009
(Each entry lists the system, whether its data were reliable for our audit purposes, the agency’s purpose for the data, and the page of the detailed assessment.)

BUSINESS, TRANSPORTATION AND HOUSING

California Housing Finance Agency (CHFA)
•	 Lender Access System | Reliable for audit purposes? Yes | Page 16
Agency purpose of data: To reserve, track, and purchase the CHFA’s subordinate loans. The subordinate loan programs include School Facility Fee Downpayment Assistance Program, California Homebuyer’s Downpayment Assistance Program, Homeownership in Revitalization Areas Program, and Extra Credit Teacher Program.
•	 Residential Development Loan Program spreadsheet | Reliable for audit purposes? Yes | Page 17
Agency purpose of data: To track total commitments, disbursements, loan maturity dates, payments received, status report deadlines, and other data related to the program.
•	 School Facility Fee System | Reliable for audit purposes? Undetermined | Page 17
Agency purpose of data: To track the review, approval, and disbursement of School Facility Fee funds.

Housing and Community Development, Department of
•	 California State Accounting and Reporting System (CalSTARS) | Reliable for audit purposes? Yes | Page 17
Agency purpose of data: To satisfy the basic accounting needs of most agencies of the State.
•	 Spreadsheet of cumulative bond awards under propositions 46 and 1C | Reliable for audit purposes? Yes | Page 17
Agency purpose of data: To list the cumulative summary information—including award information—of the programs funded under the Housing and Emergency Shelter Trust Fund acts (propositions 46 and 1C).

CORRECTIONS AND REHABILITATION, CALIFORNIA DEPARTMENT OF

Corrections and Rehabilitation, California Department of (Corrections)
•	 Division of Addiction and Recovery Services database | Reliable for audit purposes? No | Page 19
Agency purpose of data: To track and evaluate the delivery of substance abuse services to inmates and parolees in an accurate, timely, and efficient manner throughout all phases of the correctional intervention.
•	 Division of Adult Parole Operations database | Reliable for audit purposes? No | Page 19
Agency purpose of data: To track parolees and to maintain a complete parolee history. The current system delivers real‑time, local, and statewide parolee data from a single source.
•	 The Youthful Offender Database Application | Reliable for audit purposes? No | Page 20
Agency purpose of data: To track ward office assignments, duties, and tasks of the Division of Juvenile Justice parole agents and agent caseload and to help ensure that parole agents are not overassigned.
•	 Offender Based Information System (OBIS) | Reliable for audit purposes? Yes | Page 25
Agency purpose of data: To capture and maintain all adult offender information from the time that the offenders are committed to Corrections through the time of their discharge. OBIS subsystems track the following: commitments at the receiving centers, offender demographics, offender movements, and release dates.
•	 Database for contracts for goods | Reliable for audit purposes? No | Page 27
Agency purpose of data: To track information related to all contracts for goods that Corrections executes using state contracting processes, including the ones for information technology (IT) initiated by California Prison Health Care Services (Prison Health Services).
•	 Database for contracts for services | Reliable for audit purposes? No | Page 27
Agency purpose of data: To track information related to all contracts for services that Corrections executes using state contracting processes, including the ones for IT initiated by Prison Health Services.
•	 Cadet database | Reliable for audit purposes? Yes | Page 29
Agency purpose of data: To track cadets who graduate from the correctional officer training academy.
•	 Corrections accounting records data for fiscal years 2003–04 through 2007–08 (CalSTARS) | Reliable for audit purposes? Undetermined | Page 29
Agency purpose of data: To satisfy the basic accounting needs of most state agencies.
•	 Distributed Data Processing System | Reliable for audit purposes? Undetermined | Page 30
Agency purpose of data: To track the day‑to‑day operation of several facilities in the prisons, including the following: the Automated Visiting Information System, the Clark Developmentally Disabled Automated Tracking System, the Inmate Job Assignment System, the Inmate Medical Alert Application, the Inmate Mental Health Identifier System, the Inmate Roster Classification System, and the Inmate Roster Movement System.
•	 OBIS | Reliable for audit purposes? Undetermined | Page 30
Agency purpose of data: To capture and maintain all adult offender information from the time the offenders are committed to Corrections through the time of their discharge. OBIS subsystems track the following: commitments at the receiving centers, offender demographics, offender movements, and release dates.
•	 State Controller’s Office (State Controller) payroll system | Reliable for audit purposes? Yes | Page 31
Agency purpose of data: To process the State’s payroll and personnel transaction documents.

EDUCATION, DEPARTMENT OF
•	 Department of General Services Office of Administrative Hearings (Administrative Hearings) case management database | Reliable for audit purposes? Yes, No, Undetermined | Page 33
Agency purpose of data: To compile the data included in quarterly reports required by the Department of Education. State law requires Administrative Hearings to report on such factors as the number of complaints, mediations unrelated to hearing requests, and requests for special education hearings.
•	 Administrative Hearings Practice Manager database | Reliable for audit purposes? No | Page 33
Agency purpose of data: To compile quarterly reports required by the Department of Education, including information related to whether it is meeting the 45‑day state and federal requirement to issue a decision after a hearing is held, unless an extension is granted.
•	 McGeorge School of Law case management database | Reliable for audit purposes? No | Page 35
Agency purpose of data: To compile data included in quarterly reports.

HEALTH AND HUMAN SERVICES

Alcohol and Drug Programs, Department of
•	 Facilities Licensure data | Reliable for audit purposes? Yes | Page 21
Agency purpose of data: To track licensing and certification provider data.

Developmental Services, Department of
•	 Client Master File | Reliable for audit purposes? No | Page 21
Agency purpose of data: To list all consumers whom the 21 regional centers placed into various residential facilities. The regional centers are responsible for providing developmental services to their consumers.
•	 State Controller payroll system | Reliable for audit purposes? Yes | Page 37
Agency purpose of data: To process the State’s payroll and personnel transaction documents.

Health Care Services, Department of (Health Care Services)
•	 California Medicaid Management Information System | Reliable for audit purposes? Undetermined, No | Page 39
Agency purpose of data: To process—through Electronic Data Systems, a Health Care Services contractor—reimbursements for the California Medical Assistance Program (Medi-Cal).
•	 State Controller payroll system | Reliable for audit purposes? Yes | Page 41
Agency purpose of data: To process the State’s payroll and personnel transaction documents.

Mental Health, Department of
•	 State Controller payroll system | Reliable for audit purposes? Yes | Page 37
Agency purpose of data: To process the State’s payroll and personnel transaction documents.

Public Health, Department of
•	 A database that compiles data from numerous sources on child fatalities due to abuse and neglect | Reliable for audit purposes? No | Page 43
Agency purpose of data: To gather the best available information on child fatalities due to abuse and neglect and, as a result, to reduce the number of preventable child deaths.
•	 California Mammography Information System data on inspections of mammography equipment | Reliable for audit purposes? No | Page 45
Agency purpose of data: To track the Radiologic Health Branch’s mammography machine inspections.
•	 Health Application Licensing system data on inspections of radiation‑emitting machines other than mammography equipment | Reliable for audit purposes? Undetermined | Page 46
Agency purpose of data: To record the Radiologic Health Branch’s inspections of radiation‑emitting machines—such as x‑ray machines—other than mammography equipment.
•	 Radioactive materials database data related to the Radiologic Health Branch’s inspections of entities that possess radioactive material | Reliable for audit purposes? No | Page 47
Agency purpose of data: To track the Radiologic Health Branch’s inspections of entities it has licensed to possess radioactive materials.
•	 State Controller payroll records | Reliable for audit purposes? Yes | Page 41
Agency purpose of data: To process the State’s payroll and personnel transaction documents.

Social Services, Department of
•	 Licensing Information System (LIS) | Reliable for audit purposes? Undetermined | Page 22
Agency purpose of data: To track information about the facilities, facilities personnel, caseloads of licensing program analysts, criminal record clearances, facility fee payments, and statistical reports related to the facilities and about updates or changes on LIS.

NATURAL RESOURCES

Fish and Game, Department of (Fish and Game)
•	 CalSTARS data | Reliable for audit purposes? Yes | Page 49
Agency purpose of data: To satisfy the basic accounting needs of most state agencies.
•	 CalSTARS data | Reliable for audit purposes? Yes, Undetermined | Page 51
Agency purpose of data: To satisfy the basic accounting needs of most state agencies.
•	 License Agent System | Reliable for audit purposes? Yes, Undetermined | Page 51
Agency purpose of data: To record, among other things, Fish and Game’s revenues from fish stamp sales.

STATE AND CONSUMER SERVICES

General Services, Department of
•	 State Contract and Procurement Registration System | Reliable for audit purposes? Incomplete | Page 41
Agency purpose of data: To provide a centralized location for tracking the State’s contracting and purchasing transactions.

OTHER DEPARTMENTS, BOARDS & COMMISSIONS

California Unemployment Insurance Appeals Board (Appeals Board)
•	 Spreadsheets known as blue‑slip logs, which list personnel transactions | Reliable for audit purposes? Yes | Page 53
Agency purpose of data: To summarize the Appeals Board’s hires, promotions, and transfers.
•	 State Controller management information retrieval system | Reliable for audit purposes? Yes | Page 53
Agency purpose of data: To generate various California Human Resources staff reports, including position inventory and employment history reports.

Employment Development Department
•	 Employment Development Department accounting system | Reliable for audit purposes? Yes | Page 53
Agency purpose of data: To process payments for the Appeals Board, including reimbursements of travel claims and payments for the procurement of goods. In addition, the system maintains the Appeals Board’s operating and equipment expense records.

Justice, California Department of
•	 State Controller DNA Identification Fund database | Reliable for audit purposes? Yes | Page 55
Agency purpose of data: To record the dollar amount of DNA Identification Fund penalties that counties and courts transfer to the State.
•	 Sex offender registry | Reliable for audit purposes? No | Page 23
Agency purpose of data: To track certain information, including the addresses of all sex offenders required to register in California, as state law mandates.

State Bar of California
•	 Disciplinary tracking system | Reliable for audit purposes? Yes | Page 57
Agency purpose of data: To track cases brought against attorneys from the public and other sources.

California Board of Chiropractic Examiners (Chiropractic Board)
•	 Consumer Affairs System | Reliable for audit purposes? Undetermined, No | Page 59
Agency purpose of data: To record information about the Chiropractic Board’s case files (complaints and licensing).

Veterans Affairs, California Department of (Veterans Affairs)
•	 Mitas database maintained by Veterans Affairs | Reliable for audit purposes? Yes | Page 61
Agency purpose of data: To originate and service loans and to account for bonds that Veterans Affairs has issued through the CalVet Home Loans program.

State of California Victim Compensation and Government Claims Board
•	 Compensation and Restitution System | Reliable for audit purposes? No | Page 63
Agency purpose of data: To process victim compensation applications and bills.

Respectfully submitted,

ELAINE M. HOWLE, CPA
State Auditor

Appendix
The tables on the following pages detail the results of the Bureau
of State Audits’ assessments of the reliability of data discussed
in audits issued during 2008 and 2009, and in related follow‑up
reports. In addition, the tables briefly summarize the main
conclusions of each assessment.
Index
(Each entry lists the agency, the audit number, and the page number.)

Business, Transportation and Housing
•	 Housing and Community Development, Department of | 2009‑037 | 16

Corrections and Rehabilitation, California Department of
•	 Corrections and Rehabilitation, California Department of | 2007‑115 | 18
•	 Corrections and Rehabilitation, California Department of | 2008‑104 | 24
•	 Prison Health Care Services | 2008‑501 | 26
•	 Corrections and Rehabilitation, California Department of | 2009‑107.1 | 28

Education
•	 Education, Department of | 2008‑109 | 32

Health and Human Services
•	 Mental Health, Department of | 2009‑608 | 36
•	 Developmental Services, Department of | 2009‑608 | 36
•	 Health Care Services, Department of | 2007‑122 | 38
•	 Health Care Services, Department of | 2009‑103 | 40
•	 Public Health, Department of | 2009‑103 | 40
•	 Social Services, Department of | 2007‑124 | 42
•	 Public Health, Department of | 2007‑114 | 44

Natural Resources
•	 Fish and Game, Department of | 2008‑102 | 48
•	 Fish and Game, Department of | 2008‑115 | 50

Other Departments, Boards, and Commissions
•	 Unemployment Insurance Appeals Board | 2008‑103 | 52
•	 Justice, Department of | 2007‑109 | 54
•	 State Bar of California | 2009‑030 | 56
•	 Chiropractic Examiners, Board of | 2007‑117 | 58
•	 Veterans Affairs, Department of | 2009‑108 | 60
•	 Victim Compensation and Government Claims Board | 2008‑113 | 62

DEPARTMENT OF HOUSING AND COMMUNITY DEVELOPMENT
Housing Bond Funds Generally Have Been Awarded Promptly and in Compliance With Law,
but Monitoring Continues to Need Improvement
Date: November 10, 2009 | Report: 2009‑037

BACKGROUND

In an effort to aid low‑ to moderate‑income and homeless populations in securing housing and shelter, the Legislature
proposed, and voters approved, nearly $5 billion in housing bonds—Housing and Emergency Shelter Trust Fund Act
bonds. These bond funds provide for the development of affordable rental housing, emergency housing shelters, and
down‑payment assistance to low‑ to moderate‑income home buyers. The Department of Housing and Community
Development (HCD) has final responsibility for the housing bond funds and directly administers the majority of the
housing bond programs. The California Housing Finance Agency (CHFA) also manages some of the programs funded
by the housing bonds.
KEY FINDINGS

During our review of the Housing and Emergency Shelter Trust Fund acts of November 2002 and 2006, we noted
the following:
•	 As of December 2008 HCD and CHFA had awarded nearly all of the November 2002 bond funds. Although HCD
and CHFA awarded housing bond funds authorized in November 2006 for eight programs, they have not issued any
awards for two other programs.
•	 Both agencies generally have processes in place to ensure that recipients, primarily individuals and local entities
that ultimately receive the funds awarded, meet legal requirements before disbursing housing bond awards to them.
However, as we reported in September 2007, HCD continues to advance funds to recipients at amounts greater
than the established limit for this program.
•	 Because of state budget difficulties, HCD restricted the amount of travel for performing on‑site visits beginning
in July 2008; thus, it has not met the goals it established for conducting on‑site visits for its CalHome, Emergency
Housing and Assistance, and Supportive Housing programs.
•	 Finally, HCD has not yet completed its verification of data transferred to its new Consolidated Automated Program
Enterprise System (CAPES), which it uses to administer and manage the housing bond programs.
KEY RECOMMENDATIONS

We made several recommendations to HCD, including that it follow its procedures on restrictions of advances and
ensure that it receives and reviews required status reports for its CalHome Program. In addition, we recommended
that HCD adopt a risk‑based, on‑site monitoring approach for two of its programs. We also recommended that HCD
complete its review of the accuracy of the data transferred to CAPES.

California Housing Finance Agency

Description of Data: California Housing Finance Agency (CHFA) Lender Access System
Agency Purpose of Data: To reserve, track, and purchase the CHFA’s subordinate loans. The subordinate loan programs include the California Homebuyer’s Downpayment Assistance Program—School Facility Fee, the California Homebuyer’s Downpayment Assistance Program, the Homeownership in Revitalization Areas Program, and the Extra Credit Teacher Program.
Purpose of Testing: To determine the amount of awards and disbursements by program.
Data Reliability Determination: Sufficiently reliable.

Description of Data: CHFA Residential Development Loan Program (RDLP) spreadsheet
Agency Purpose of Data: To track total commitments, disbursements, loan maturity dates, payments received, status report deadlines, and other data related to the RDLP. As of December 31, 2008, the RDLP had $44 million allocated from Proposition 46 bond funds.
Purpose of Testing: To determine the amount of awards and disbursements by program.
Data Reliability Determination: Sufficiently reliable.

Description of Data: CHFA School Facility Fee System
Agency Purpose of Data: To track the review, approval, and disbursement of School Facility Fee funds. The Homebuyer Downpayment Assistance Program—School Facility Fee—had $50 million allocated from Proposition 46 bond funds as of December 31, 2008.
Purpose of Testing: To determine the amount of awards and disbursements by program.
Data Reliability Determination: Undetermined reliability—We were unable to fully test the data for completeness because we were unable to select a sample of awards to trace into the system and could not identify another method that we could use to test completeness.
Agency Response Date: N/A
Corrective Action Recommended: We did not recommend corrective action because we did not identify a problem with the system. Additionally, we were unable to test the data’s completeness because we could not select a sample of awards to trace into the system.
Status of Corrective Action: N/A

Housing and Community Development, Department of

Description of Data: Department of Housing and Community Development (HCD) California State Accounting and Reporting System (CalSTARS)
Agency Purpose of Data: To satisfy the basic accounting needs of most state agencies.
Purpose of Testing: To determine the amount of disbursements by program.
Data Reliability Determination: Sufficiently reliable.

Description of Data: Spreadsheet of cumulative propositions 46 and 1C bond awards under HCD
Agency Purpose of Data: To list the cumulative summary information—including award information—of the programs funded under the Housing and Emergency Shelter Trust Fund acts (propositions 46 and 1C). Proposition 46 authorizes $2.1 billion for housing bond programs. Proposition 1C authorizes $2.85 billion for housing and development programs.
Purpose of Testing: To determine the amount of awards by program.
Data Reliability Determination: Sufficiently reliable.

SEX OFFENDER PLACEMENT
State Laws Are Not Always Clear, and No One Formally Assesses the Impact
Sex Offender Placement Has on Local Communities
Date: April 17, 2008 | Report: 2007‑115

BACKGROUND

Fifty‑nine thousand registered sex offenders live in California communities, yet only 8,000 are supervised and
monitored by the California Department of Corrections and Rehabilitation (Corrections). Laws dictate where and
with whom paroled sex offenders can reside and when they must register with local law enforcement agencies. Some
registered sex offenders reside in residential facilities, licensed by the Department of Social Services (Social Services)
and the Department of Alcohol and Drug Programs, but most reside in facilities that do not require a license. The
Department of Justice (Justice) maintains a registry that contains addresses of sex offenders; however, it is not required
to, nor does it, indicate whether the address is a licensed facility.
KEY FINDINGS

Our review of the placement of registered sex offenders in communities found that:
•	 Departments responsible for licensing residential facilities are not required to consider, and do not consider, the criminal backgrounds of the potential clients they serve, including sex offenders; nor do they track whether individuals residing at these facilities are registered sex offenders.
•	 Our comparison of the databases from the two licensing departments with Justice’s database of registered sex
offenders showed that at least 352 licensed residential facilities housed sex offenders.
•	 We also found 49 instances in which the registered addresses in Justice’s database for sex offenders were the same as
the official addresses of facilities licensed by Social Services that serve children, such as family day care homes.
•	 State law prohibits a paroled sex offender from residing with other sex offenders unless they reside in a “residential
facility.” However, we found more than 500 instances in which two or more sex offenders on parole were listed as
residing at the same address. At least 332 of these addresses appear to belong to hotels or apartment complexes, and
2,038 sex offenders were listed as residing at those addresses. Further, it is unclear whether “residential facilities”
includes those that do not require licenses, such as sober living facilities.
•	 Local law enforcement agencies told us they have not performed formal assessments of the impact sex offenders
have on their resources or communities. Further, Corrections does not always notify local law enforcement about
paroled sex offenders.
KEY RECOMMENDATIONS

We recommend the Legislature consider clarifying the laws related to where registered sex offenders may reside.
Further, we recommend that Corrections monitor the addresses of paroled sex offenders and that departments
collaborate to ensure proper residence. In addition, Justice and Social Services should share information to ensure that
registered adult sex offenders are not residing in licensed facilities that serve children.
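The database comparisons behind these findings amount to matching address lists across systems. The sketch below illustrates the idea in Python with invented addresses; the crude normalization shown here understates how much address standardization real matching of this kind requires.

    from collections import Counter

    def norm(addr: str) -> str:
        # Crude normalization: uppercase and collapse whitespace.
        return " ".join(addr.upper().split())

    # Tiny invented extracts standing in for registry and licensing data.
    registry = ["123 Main St, Sacramento", "123 main st,  Sacramento",
                "9 Oak Ave, Fresno"]
    licensed = {norm("9 Oak Ave, Fresno")}

    # Registrants whose address matches a licensed facility.
    in_licensed = [a for a in registry if norm(a) in licensed]

    # Addresses shared by two or more registrants.
    counts = Counter(norm(a) for a in registry)
    shared = {a: n for a, n in counts.items() if n >= 2}
    print(in_licensed, shared)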

Corrections and Rehabilitation, California Department of

Description of Data: California Department of Corrections and Rehabilitation (Corrections) Division of Addiction and Recovery Services database (database)
Agency Purpose of Data: To track and evaluate the delivery of substance abuse services to inmates and parolees in an accurate, timely, and efficient manner throughout all phases of correctional interventions. Corrections’ Division of Addiction and Recovery Services’ community‑based continuing care program had 33 participating sex offenders as of September 30, 2007.
Purpose of Testing: To identify the number of sex offenders whom Corrections placed in licensed and unlicensed facilities by obtaining data on individuals placed by Corrections and comparing the addresses for these sex offenders to the addresses of licensed facilities.
Data Reliability Determination: Not sufficiently reliable—We identified errors when tracing data back to a sample of source documents. Data are qualified because we concluded that Corrections’ Division of Addiction and Recovery Services database was not sufficiently reliable.
Purpose of Testing: To identify the number of adult and juvenile sex offenders on parole residing at the same residence by identifying duplicate addresses in the database obtained from Corrections’ Adult Parole and the Division of Juvenile Justice (Juvenile Division).
Data Reliability Determination: Not sufficiently reliable—See above.
Agency Response Date: N/A
Corrective Action Recommended: We did not recommend corrective action because our audit purpose required use of an insignificant amount of data from the database and because the system is used for an entirely different purpose than what we used it for as part of the audit. While the database contained 137,000 records, we limited our sample for data reliability testing to 33 sex offender registrant parolees. Therefore, we did not believe it was appropriate to develop a finding based on this limited testing.
Status of Corrective Action: N/A

Description of Data: Corrections’ Division of Adult Parole Operations database (database)
Agency Purpose of Data: To maintain a complete parolee history. The current parole tracking system delivers real‑time, local, and statewide parolee data from a single source. Corrections’ Division of Adult Parole Operations was responsible for supervising 8,000 sex offenders on parole as of November 5, 2007.
Purpose of Testing: To identify the number of sex offenders whom Corrections placed in licensed and unlicensed facilities by obtaining data on individuals placed by Corrections, the Department of Mental Health, and the Department of Developmental Services and comparing the addresses for these sex offenders to the addresses of facilities licensed by the Department of Social Services (Social Services) and the Department of Alcohol and Drug Programs.
Data Reliability Determination: Not sufficiently reliable—We identified errors when tracing data back to a sample of source documents. These data are qualified because we concluded that the database had undetermined reliability.
Purpose of Testing: To identify the number of adult and juvenile sex offenders on parole residing at the same residence by identifying duplicate addresses in the databases obtained from Corrections’ Adult Parole and the Juvenile Division.
Data Reliability Determination: Not sufficiently reliable—See above.
Agency Response Date: N/A
Corrective Action Recommended: We did not recommend corrective action because we believed Corrections was taking the necessary steps to make the database as accurate as possible. We also concluded that because Corrections stores documents at various facilities throughout the State, we were unable to pull a haphazard sample of source documents for completeness testing. Thus, we decided not to pursue completeness testing.
Status of Corrective Action: N/A

Description of Data: Corrections’ Juvenile Division Youthful Offender Database Application (YODA) database
Agency Purpose of Data: To track ward office assignments, duties, and tasks of the Juvenile Division parole agents, and agent caseload and to help ensure that agents are not overassigned. The Juvenile Division was responsible for 154 sex offenders on parole as of November 29, 2007.
Purpose of Testing: To identify sex offenders who are parolees under the Juvenile Division’s supervision by comparing Social Security numbers in the Juvenile Division’s database with the Department of Justice’s (Justice) sex offender registry.
Data Reliability Determination: Not sufficiently reliable—The Juvenile Division listed no Social Security number for over 22 percent of the active parolees in its database, and 6 percent did not have a criminal investigation and identification number listed. The data are qualified because we concluded that Corrections’ Juvenile Division YODA database was not sufficiently reliable.
Purpose of Testing: To identify the number of sex offenders Corrections placed in licensed and unlicensed facilities by obtaining data on individuals placed by Corrections, the Department of Mental Health, and the Department of Developmental Services and comparing the addresses for these sex offenders to the addresses of facilities licensed by Social Services and the Department of Alcohol and Drug Programs.
Data Reliability Determination: Not sufficiently reliable—See above.
Purpose of Testing: To identify the number of adult and juvenile sex offenders on parole residing at the same residence by identifying duplicate addresses in the databases obtained from Corrections’ Adult Parole and the Juvenile Division.
Data Reliability Determination: Not sufficiently reliable—See above.
Agency Response Date: April 2009
Corrective Action Recommended: To ensure that it maintains all necessary data to carry out its functions, Corrections’ Juvenile Division should update its YODA database to include the Social Security numbers and criminal investigation and identification numbers for all juvenile offenders under its jurisdiction.
Status of Corrective Action: Corrective action taken—Corrections noted that it issued a memorandum requiring supervisors to review the Juvenile Division’s YODA database to determine which parolees are missing criminal investigation and identification numbers. Corrections indicated that this process was completed by December 30, 2008.

Alcohol and Drug Programs, Department of

Description of Data: Department of Alcohol and Drug Programs’ (Alcohol and Drug) facilities licensure data
Agency Purpose of Data: To track licensing and certification provider data. Alcohol and Drug had 906 licensed residential facilities as of November 1, 2007.
Purpose of Testing: To identify the number of residential alcohol and substance abuse treatment facilities that operate in the State.
Data Reliability Determination: Sufficiently reliable.
Purpose of Testing: To identify the number of sex offenders Corrections placed in licensed and unlicensed facilities by obtaining data on individuals placed by Corrections and comparing the addresses for these sex offenders to the addresses of facilities licensed by Alcohol and Drug.
Data Reliability Determination: Sufficiently reliable.

Developmental Services, Department of

Description of Data: Department of Developmental Services’ (Developmental Services) Client Master File database
Agency Purpose of Data: To list all consumers whom the 21 regional centers placed into various residential facilities. The regional centers are responsible for providing the developmental services to their consumers. Developmental Services had 395 clients who were also sex offenders who were living in a community setting as of November 1, 2007.
Purpose of Testing: To identify the sex offenders who are receiving services from Developmental Services, we attempted to use Social Security numbers by comparing Developmental Services’ data to Justice’s sex offender registry.
Data Reliability Determination: Not sufficiently reliable—Developmental Services listed no Social Security numbers for 16 percent of the individuals in its database. The data are qualified because we concluded that Developmental Services’ database was not sufficiently reliable.
Purpose of Testing: To identify the sex offenders placed in licensed and unlicensed facilities by Developmental Services by comparing addresses for these sex offenders to the addresses of facilities licensed by Social Services and Alcohol and Drug.
Data Reliability Determination: Not sufficiently reliable—See above.
Agency Response Date: N/A
Corrective Action Recommended: We did not recommend corrective action because, based on our understanding of discussions with Developmental Services’ staff, we did not believe that Social Security numbers were essential to the database.
Status of Corrective Action: N/A

Social Services, Department of

Description of Data: Social Services’ Licensing Information System (LIS)
Agency Purpose of Data: To track information about the facilities, facilities personnel, caseloads of licensing program analysts, criminal record clearances, facility fee payments, and statistical reports related to the facilities and about updates or changes on LIS. Social Services had 14,555 licensed residential facilities as of November 28, 2007.
Purpose of Testing: To identify the number of sober living facilities, residential care facilities serving six or fewer individuals, and group homes operating in the State.
Data Reliability Determination: Undetermined reliability—We were not able to verify the completeness of the data. Because Social Services stores source documents at various facilities throughout the State, we were unable to pull a haphazard sample of source documents for completeness testing. Therefore, we were unable to determine if the LIS data included a complete listing of licensed facilities. Data are qualified because we concluded that Social Services’ LIS was of undetermined reliability.
Purpose of Testing: To identify the number of sex offenders that Corrections placed in licensed and unlicensed facilities by obtaining data on individuals placed by Corrections and comparing the addresses for these sex offenders to the addresses of facilities licensed by Social Services.
Data Reliability Determination: Undetermined reliability—See above.
Agency Response Date: N/A
Corrective Action Recommended: We did not recommend corrective action because the designation of undetermined reliability was not due to a weakness in the database; rather, the designation was due to our decision not to test the database for completeness because Social Services stores documents at various facilities throughout the State. Thus, we were unable to pull a haphazard sample of source documents for completeness testing.
Status of Corrective Action: N/A

Justice, Department of

Description of Data: Justice’s sex offender registry
Agency Purpose of Data: To track certain information, including the addresses of all sex offenders required to register in California, as mandated by state law. More than 59,000 sex offenders were registered in Justice’s database as of December 13, 2007.
Purpose of Testing: To determine the number of sex offenders residing at licensed facilities by comparing the databases containing the addresses of such facilities to Justice’s sex offender registry database.
Data Reliability Determination: Not sufficiently reliable—Records may be outdated and might not contain accurate address information. Five percent of registrants had unknown addresses, and an additional 14 percent identified as possibly living in California communities were in violation of requirements to update their registration information annually. Finally, Justice’s sex offender registry lacked Social Security numbers for more than 4 percent of the registrants who may have been living in California communities. The data are qualified because we concluded that Justice’s sex offender registry was not sufficiently reliable.
Agency Response Date: N/A
Corrective Action Recommended: We did not report a finding related to data reliability because registered sex offenders are responsible for contacting their local law enforcement office to determine if they are required to register, for providing the registration information, and for updating their registration when needed. Thus, we were not able to direct a recommendation to Justice.
Status of Corrective Action: N/A

CALIFORNIA DEPARTMENT OF CORRECTIONS AND REHABILITATION
It Does Not Always Follow Its Policies When Discharging Parolees
Date: August 26, 2008
Report: 2008‑104

BACKGROUND

The California Department of Corrections and Rehabilitation (Corrections) is generally required to release on parole its
prison inmates upon completion of their prison terms. Subsequently, parolees must be discharged from parole within
30 days of completing their required period of continuous parole unless Corrections’ Board of Parole Hearings (board)
approves retaining the parolee. Adult parole is divided into four regions within California, and the regions encompass 25 districts and 179 parole units. The parole agent responsible for supervising a parolee recommends whether to retain or discharge the parolee. The agent's supervisor can discharge parolees in many cases; in other cases, the district administrator or the board must do so. Corrections discharged 38,565 felon parolees during 2006 and 44,078 during 2007.
KEY FINDINGS

In our review of Corrections’ adult parole discharge practices between January 1, 2007, and March 31, 2008, we
found that:
•	Of the 56,329 parolees discharged, parole agents did not submit discharge review reports for 2,458 deported parolees and 2,523 other parolees. Thus, Corrections lost jurisdiction over these individuals and the opportunity
to recommend that the board retain these parolees, including 775 individuals originally convicted of violent or
serious offenses.
Corrections does not require:
»» Discharge review reports for deported parolees even though parole staff may recommend that these individuals
be retained because of certain case factors based on their review. Without the review reports, we could not
confirm if staff reviewed criminal history reports and other case factors before relinquishing jurisdiction.
»» Unit supervisors to verify that parole agents complete discharge review reports for eligible parolees.
•	Of the 503 central files containing discharge review reports that we reviewed to determine whether appropriate personnel prepared a discharge review, district administrators participated in only 156 discharge reviews. In
20 percent of these cases, district administrators discharged parolees against both the parole agents’ and unit
supervisors’ recommendations to retain them and often did not provide written justification for discharging parolees
contrary to staff recommendations.
•	 Corrections did not always ensure that the appropriate authority participated in discharge decisions. District
administrators or the board should have evaluated six of 83 discharge reviews that we examined for compliance with
policies, yet due to staff errors, the appropriate authority did not participate in these discharges and ultimately all
six were discharged despite staff recommendations to retain three of the parolees.
•	 As a result of internal investigations and findings since December 2007, Corrections stated it plans to implement a
number of changes to improve its discharge processes. However, it did not provide us any evidence to demonstrate
that it has implemented any of its draft policies and regulations.
KEY RECOMMENDATIONS

We made several recommendations to Corrections including that it ensure discharge review reports are completed
promptly for all eligible parolees to prevent their automatic discharge, and that it ensure the appropriate authority is
involved in discharging or retaining parolees. Further, we recommended that Corrections finalize and implement its
new draft policies, procedures, and regulations governing its parole discharge process and that staff handling case
records receive additional training on discharge practices to ensure compliance with discharge policies.


Corrections and Rehabilitation, California Department of
Description of Data: The California Department of Corrections and Rehabilitation (Corrections) Offender Based Information System (OBIS)

Agency Purpose of Data: To capture and maintain all adult offender information from the time that the offenders are committed to Corrections through the time of their discharge. OBIS subsystems track the following: commitments at the receiving centers, offender demographics, offender movements, and release dates.
Corrections discharged 56,329 parolees between January 1, 2007, and March 31, 2008.

Purpose of Testing: To determine whether district administrators discharged parolees in accordance with staff recommendations.
Data Reliability Determination: Sufficiently reliable.

Purpose of Testing: To assess the frequency with which parolees were discharged contrary to staff recommendations.
Data Reliability Determination: Sufficiently reliable.

Agency Response Date: August 2009

Corrective Action Recommended: Although we found the data sufficiently reliable, we recommended that Corrections more accurately determine whether its staff completed discharge reports by ensuring that staff members properly code in its database the reasons for parolees' discharges. Further, to better identify the entities that make final discharge decisions for given cases, we recommended Corrections establish a more precise method for maintaining information about which entity made the final discharge decision, such as a new discharge reason code or a new data field that will track this information.

Status of Corrective Action: Corrective action taken—Corrections reported that its Case Records Office redefined the manner in which discharged cases are entered into its database. According to Corrections, Case Records Office staff have also been trained on new recording procedures for entering the appropriate discharge reason and code into the database.


CALIFORNIA PRISON HEALTH CARE SERVICES
It Lacks Accurate Data and Does Not Always Comply With State and Court‑Ordered Requirements
When Acquiring Information Technology Goods and Services
Date: January 29, 2009
Report: 2008‑501

BACKGROUND

State law gives the Bureau of State Audits (bureau) the authority to audit contracts entered into by public entities that
involve the expenditure of public funds in excess of $10,000 whenever the public entities request such an audit to be
performed. The United States District Court appointed a receiver to administer, control, manage, operate, and finance
the health care system in California prisons. California Prison Health Care Services (Prison Health Services), the entity
created by the receiver to perform those duties, requested that the bureau conduct an audit of contracts that it initiated
for information technology (IT) goods and services. Prison Health Services, working with the California Department
of Corrections and Rehabilitation (Corrections), is required to make such acquisitions either in compliance with state
contracting laws or by using one of three alternative contracting methods prescribed by the federal court.
KEY FINDINGS

Our review of Prison Health Services’ IT contracts revealed the following:
•	 It may not be able to identify all IT contracts it initiates because it lacks reliable data—the databases that
Corrections maintains often contain inaccurate and incomplete data.
»» We found that two IT contracts that together were valued at $735,000 were incorrectly recorded as being for
non‑IT services. In another instance, a contract’s value was underreported by $425,000.
»» The new enterprise‑wide business information system may contain inaccurate and incomplete data since it
includes data from the existing databases we found were not sufficiently reliable.
•	 It failed to consistently adhere to state contracting requirements when entering into contracts for IT goods and
services. Of the 21 contracts we reviewed, we found 24 instances of noncompliance in 16 of the contracts.
»» Eight contracts, or 38 percent of the contracts we reviewed, lacked required certifications justifying the purchase.
»» Four contracts did not comply with applicable bidding and evaluation requirements.
»» We could not determine that the appropriate individuals reviewed and approved 11 of the contracts.
•	 It has no written policies surrounding the rationale for using alternative contracting methods. Further, Prison
Health Services did not comply with court‑imposed requirements in executing five of six IT‑related contracts,
valued at almost $28 million, which were approved using an alternative contracting method.
KEY RECOMMENDATIONS

We recommended Prison Health Services exercise proper internal controls over data entered into the new business
information system and that it ensure the accuracy of key fields for all contract‑related data that has already been
migrated from its old databases to the new system. Also, we recommended that Prison Health Services ensure
appropriate staff are aware of and adhere to applicable state contracting requirements and related policies and
procedures for IT goods and services. Moreover, Prison Health Services should develop written policies for when and
how to use alternative contracting methods. Further, we recommended that Prison Health Services develop a tracking
system for contracts executed using alternative methods.

Corrections and Rehabilitation, California Department of
Description of Data: The California Department of Corrections and Rehabilitation's (Corrections) information related to all contracts for goods

Agency Purpose of Data: To track information related to all contracts for goods that Corrections executes using state contracting processes, including the ones for information technology (IT) initiated by California Prison Health Care Services (Prison Health Services).
According to Corrections' database, Prison Health Services' acquisitions of IT goods from January 2007 through June 2008 totaled $5.8 million.


Purpose of Testing: To identify all IT contracts for goods executed between January 1, 2007, and June 30, 2008, by Corrections on behalf of Prison Health Services and the related dollar amounts.

Data Reliability Determination: Not sufficiently reliable—We reviewed key data fields for a sample of contracts and found inaccurate data in some of these fields, such as those that would identify whether purchases were for IT‑related goods and services, the amounts paid for the purchases, and the dates that the contracts were approved. In addition, we identified a contract incorrectly listed as a contract for IT goods. The data are qualified because we concluded that Corrections' data were not sufficiently reliable.
Prison Health Services' chief information officer stated that the agency was in the process of implementing a new enterprise‑wide business information system that would house future contract information and that would have appropriate controls to limit inaccurate data.
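Accuracy testing of this kind compares key database fields for a sample of records against values taken from the hard-copy contract files. A schematic sketch follows; the identifiers, field names, and values are hypothetical.

```python
# Schematic accuracy test: compare key fields in a database extract to values
# keyed in from hard-copy contract files. Identifiers and values are hypothetical.
db_records = {
    "C-100": {"is_it": True,  "amount": 50_000, "approved": "2007-03-01"},
    "C-101": {"is_it": False, "amount": 12_000, "approved": "2007-06-15"},
}
hardcopy = {
    "C-100": {"is_it": True,  "amount": 50_000, "approved": "2007-03-01"},
    "C-101": {"is_it": True,  "amount": 12_500, "approved": "2007-06-15"},  # disagrees
}

errors = []
for contract_id, paper in hardcopy.items():
    for field, paper_value in paper.items():
        if db_records[contract_id][field] != paper_value:
            errors.append((contract_id, field, db_records[contract_id][field], paper_value))

for cid, field, db_val, paper_val in errors:
    print(f"{cid}: field '{field}' is {db_val!r} in database but {paper_val!r} on paper")
```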

Agency Response Date: June 2009

Corrective Action Recommended: To ensure that it has complete and accurate information related to its contracts, Prison Health Services should ascertain that the internal controls over the data entered into the new enterprise‑wide business information system work as intended. For contract‑related data that staff have already migrated from old contract databases to the new system, it needs to ensure the accuracy of key fields, such as the ones for contract amounts, service types, and the data fields that identify contracts initiated by Prison Health Services, by comparing the data stored in its new database to existing hard‑copy files.

Status of Corrective Action: Corrective action taken—Prison Health Services stated that it had implemented the processes required to ensure complete and accurate contract information. It had also established one certified trainer and two certified power users to ensure the new enterprise‑wide system is used to its highest potential. Further, according to Prison Health Services, to ensure that staff have migrated complete and accurate IT contract information to the new enterprise‑wide system, it had established various internal controls, such as comparing the hard‑copy contracts to an internal tracking log in the enterprise‑wide system and reviewing key fields in the new enterprise‑wide system upon receiving a copy of an executed agreement.

Description of Data: Corrections' database for contracts for services

Agency Purpose of Data: To track information related to all contracts for services that Corrections executes using state contracting processes, including the ones for IT initiated by Prison Health Services.
According to Corrections' database, Prison Health Services' acquisitions of IT services from January 2007 through June 2008 totaled $4.3 million. However, data are qualified because we concluded that Corrections' data were not sufficiently reliable.

Purpose of Testing: To identify all IT contracts for services executed between January 1, 2007, and June 30, 2008, by Corrections on behalf of Prison Health Services and related dollar amounts.

Data Reliability Determination: Not sufficiently reliable—We reviewed key data fields for a sample of contracts and found inaccurate data in some fields, such as those that identify whether purchases were for IT‑related goods and services, the amounts of the purchases, and the dates that the contracts were approved. In addition, we identified a contract incorrectly listed as a contract for IT goods.
Prison Health Services' chief information officer stated that it was in the process of implementing a new enterprise‑wide business information system that would house future contract information and would have appropriate controls to limit inaccurate data.

Agency Response Date: June 2009

Corrective Action Recommended: To ensure that it has complete and accurate information related to its contracts, Prison Health Services should ascertain that the internal controls over the data entered into the new enterprise‑wide business information system work as intended. For contract‑related data that staff have already migrated from old contract databases to the new system, it needs to ensure the accuracy of key fields, such as those for contract amounts, service types, and the data fields that identify contracts initiated by Prison Health Services, by comparing the data stored in its new database to existing hard‑copy files.

Status of Corrective Action: Corrective action taken—Prison Health Services stated that it had implemented the processes required to ensure complete and accurate contract information. It had also established one certified trainer and two certified power users to ensure the new enterprise‑wide system is used to its highest potential. Further, according to Prison Health Services, to ensure that staff have migrated complete and accurate IT contract information to the new enterprise‑wide system, it established various internal controls, such as comparing the hard‑copy contracts to an internal tracking log in the enterprise‑wide system and reviewing key fields in the new enterprise‑wide system upon receiving a copy of an executed agreement.


CALIFORNIA DEPARTMENT OF CORRECTIONS AND REHABILITATION
It Fails to Track and Use Data That Would Allow It to More Effectively Monitor and Manage Its Operations
Date: September 8, 2009
Report: 2009‑107.1

BACKGROUND

With annual expenditures at nearly $10 billion—10 percent of the State’s General Fund—the California Department
of Corrections and Rehabilitation (Corrections) is responsible for nearly 168,000 inmates, 111,000 parolees, and more
than 1,600 juvenile wards of the State. Corrections oversees 33 adult correctional institutions, conservation camps,
community correctional facilities, and contracts to house inmates in out‑of‑state facilities. Further, Corrections
provides health care to inmates at each adult facility and through external contractors. The inmate health care function
transitioned to a federal court‑appointed receiver and is now known as California Prison Health Care Services (Health
Care Services). Corrections is also responsible for implementing rehabilitative strategies to successfully reintegrate
offenders into communities.
KEY FINDINGS

During our evaluation of the effect of California’s prison population on the State’s budget and review of Corrections’
operations, we noted the following:
•	While the inmate population decreased by 1 percent over the last three years, Corrections' expenditures increased by almost 32 percent during the same period.
•	 Corrections lacks the data necessary to determine how factors such as overcrowding, the transition of the inmate
health care function, escalating overtime, or aging inmates impact the cost of its operations.
•	 The cost per inmate varied significantly among institutions. For example, although the average cost per inmate
was $49,300 in fiscal year 2007–08, for two institutions having additional medical and mental health units the
per‑inmate cost exceeded $80,000.
•	 Nearly 25 percent of the inmate population is incarcerated under the three strikes law—which requires individuals
to serve longer terms. We estimate the cost to the State of the increase in sentence length for these inmates will
total $19.2 billion over the duration of their sentences.
•	 Overtime for custody staff—correctional officers, sergeants, and lieutenants—totaled $431 million in fiscal
year 2007–08 largely due to vacant positions and increases in custody staff salaries. Overtime was so prevalent that
we identified more than 8,400 correctional officers whose total pay for fiscal year 2007–08 exceeded the top pay
rate of supervisors two levels above them.
•	Hiring a new correctional officer costs slightly more than paying overtime to existing staff because of the cost of training new officers and the increase in the State's contribution for correctional officers' retirement benefits.
•	 Although Corrections’ budget for academic and vocational programs totaled more than $208 million for fiscal
year 2008–09, it is unable to assess the success of these programs in reducing inmate recidivism.
KEY RECOMMENDATIONS

To be more cost‑effective and improve its management, we recommended that Corrections collect and use data
associated with factors that affect the cost of its operations. We also recommended that Corrections develop a staffing
plan allocating teacher and instructor positions for its education and vocational programs at each institution based on
inmates’ needs and to track and use historical inmate program assignment and waiting list data to measure program
success. Additionally, we recommended that Corrections encourage the Department of Personnel Administration to exclude provisions in bargaining unit agreements that would permit any type of leave to be counted as time worked for the purpose of computing overtime compensation and to negotiate a reduction in the amount of voluntary overtime correctional officers are allowed to work.


Corrections and Rehabilitation, California Department of
Description of Data: California Department of Corrections and Rehabilitation's (Corrections) cadet database

Agency Purpose of Data: To track cadets who graduate from the correctional officer training academy.
In fiscal year 2007–08, the Bureau of State Audits calculated a cadet equivalent of 2,950. This information was not specifically cited in the report but was used in calculating estimates of training costs and turnover.

Purpose of Testing: Although we found the cadet database to be reliable, because Corrections stated that it was unable to provide us with complete information on turnover, we calculated our own estimate by first identifying the number of filled correctional officer positions through a comparison of the number of authorized and vacant positions in the governor's budget. We also used the number of correctional officers whom Corrections informed us that it had appointed.
Data Reliability Determination: Sufficiently reliable.

Purpose of Testing: To allocate training and recruiting costs, we obtained information on the number of correctional officers who graduated from the correctional academy.
Data Reliability Determination: Sufficiently reliable.
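The turnover estimate rests on a simple identity: filled positions equal authorized positions minus vacancies. The audit does not state the exact formula it used, so the sketch below shows one plausible combination of the inputs it describes, with hypothetical figures throughout.

```python
# Minimal sketch of a turnover estimate built from the inputs described above.
# All figures are hypothetical, not the audit's actual inputs.
authorized_positions = 25_000   # from the governor's budget (hypothetical)
vacant_positions = 2_000        # from the governor's budget (hypothetical)
appointments = 2_500            # officers appointed during the year (hypothetical)

filled_positions = authorized_positions - vacant_positions
turnover_rate = appointments / filled_positions
print(f"Estimated turnover: {turnover_rate:.1%} of {filled_positions:,} filled positions")
```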

Description of Data: Corrections' accounting records data for fiscal years 2003–04 through 2007–08 (CALSTARS)

Agency Purpose of Data: To satisfy the basic accounting needs of most state agencies.
The average cost to incarcerate an inmate in fiscal year 2007–08 was $49,300. Corrections spent $431 million on overtime for custody staff in fiscal year 2007–08.

Purpose of Testing: To calculate the cost of incarcerating an inmate.
Data Reliability Determination: Undetermined reliability—This determination is based on the fact that we found no material errors in our electronic testing of required data elements. However, we did not conduct accuracy or completeness testing because the source documents required for this testing are stored at seven regional offices or the 33 institutions located throughout the State. To obtain some assurance regarding the completeness of this information, we compared the total expenditures in the records we received for fiscal years 2006–07 and 2007–08 to paper records. However, we did not perform this procedure for earlier fiscal years in our analysis because we were unable to obtain the relevant information for prior fiscal years. The data are qualified because we concluded that Corrections' accounting records data were of undetermined reliability for our audit purposes.

Purpose of Testing: To analyze and categorize overtime‑related expenditure data.
Data Reliability Determination: Undetermined reliability—See above.
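The partial completeness check described above amounts to reconciling totals from the electronic extract against independently maintained paper records. A minimal sketch, with hypothetical figures and tolerance:

```python
# Minimal sketch of a totals reconciliation between an electronic extract and
# paper records. All figures are hypothetical.
electronic_expenditures = {  # fiscal year -> total from the data extract
    "2006-07": 8_750_000_000,
    "2007-08": 9_600_000_000,
}
paper_expenditures = {       # same years, totals from paper records
    "2006-07": 8_750_000_000,
    "2007-08": 9_600_250_000,
}

TOLERANCE = 0.001  # allow a 0.1% difference before flagging

for year, extract_total in electronic_expenditures.items():
    paper_total = paper_expenditures[year]
    diff = abs(extract_total - paper_total) / paper_total
    status = "OK" if diff <= TOLERANCE else "INVESTIGATE"
    print(f"{year}: extract {extract_total:,} vs paper {paper_total:,} -> {status}")
```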


Description of Data: Corrections' Distributed Data Processing System (DDPS)

Agency Purpose of Data: To track the day‑to‑day operation of several facilities in the prisons, including the following: the Automated Visiting Information System, the Clark Developmentally Disabled Automated Tracking System, the Inmate Job Assignment System, the Inmate Medical Alert Application, the Inmate Mental Health Identifier System, the Inmate Roster Classification System, and the Inmate Roster Movement System.
In fiscal year 2007–08, the average daily population of male inmates was 152,359 and the average daily population of female inmates was 10,831.

Purpose of Testing: To calculate the average daily population of inmates at a particular institution.
Data Reliability Determination: Undetermined reliability—This determination is based on the fact that we found no material errors in our electronic testing of required data elements. However, we did not conduct accuracy or completeness testing because the source documents required for this testing are stored at the 33 institutions located throughout the State. The data are qualified because we concluded that Corrections' DDPS was of undetermined reliability for our purposes.

Description of Data: Corrections' Offender Based Information System (OBIS)

Agency Purpose of Data: To capture and maintain all adult offender information from the time that the offenders are committed to Corrections through the time of their discharge. OBIS subsystems track the following: commitments at the receiving centers, offender demographics, offender movements, and release dates.
As of April 2009 Corrections housed more than 43,500 inmates incarcerated under the Three Strikes law (striker inmates).

Purpose of Testing: To determine the additional cost of striker inmates, we used OBIS to identify those currently housed in Corrections' adult institutions and the sentence for the controlling offense—if it was related to a Three Strikes case or the longest sentence related to a Three Strikes case—and compared the estimated sentence length for the offenses to an estimated sentence length if the inmates had not been sentenced under Three Strikes, including applicable enhancements. Based on this comparison, we calculated the average number of additional years striker inmates were sentenced to and multiplied that by the average cost of incarceration for fiscal year 2007–08.

Data Reliability Determination: Undetermined reliability—We assessed the reliability of OBIS by performing electronic testing of key data elements and by testing the accuracy of the data. To test the accuracy of the data we selected a random sample of inmates and traced key data elements to source documents. However, we did not conduct completeness testing because the source documents required for this testing are stored at the 33 institutions located throughout the State. Therefore we concluded that these data were of undetermined reliability for the purposes of this audit.
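The $19.2 billion estimate is the product of three quantities: the number of striker inmates, the average additional years of sentence, and the average annual cost of incarceration. The report states two of the three, so the average additional years it implies can be back-solved as an arithmetic consistency check:

```python
# Back-of-envelope consistency check on the report's $19.2 billion estimate.
# The average additional years is derived here; it is not stated in the report.
striker_inmates = 43_500          # inmates housed under Three Strikes (April 2009)
avg_annual_cost = 49_300          # average cost per inmate, fiscal year 2007-08
total_estimate = 19_200_000_000   # reported cost of the added sentence length

implied_extra_years = total_estimate / (striker_inmates * avg_annual_cost)
print(f"Implied average additional years per striker inmate: {implied_extra_years:.1f}")
# Roughly nine additional years per inmate is consistent with the reported
# total: 43,500 inmates x ~9 years x $49,300/year is about $19.2 billion.
```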


State Controller’s Office
Description of Data: State Controller's Office payroll system

Agency Purpose of Data: To process the State's payroll and personnel transaction documents.

Purpose of Testing: To present data on overtime and the cost of a new correctional officer. In reviewing the amount of overtime worked by correctional officers, we determined that more than 4,700 correctional officers were each paid for more than 80 hours of overtime in at least one month during fiscal year 2007–08 and that more than 8,400 correctional officers each received more in gross pay than did a correctional lieutenant—the level that is two ranks above a correctional officer—at the lieutenant's top pay rate. However, we also determined that due to the costs of benefits and training, hiring new correctional officers to reduce overtime would actually increase Corrections' total costs.

Data Reliability Determination: Sufficiently reliable.
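Both screens reduce to threshold tests over monthly payroll records. A minimal sketch, with hypothetical records and a hypothetical lieutenant top pay rate (the actual rate is not stated in this summary):

```python
# Minimal sketch of the two overtime screens described above. Records and the
# lieutenant top pay rate are hypothetical.
monthly_payroll = [  # (officer_id, month, overtime_hours, gross_pay)
    ("CO-1", "2007-07", 95, 11_200),
    ("CO-1", "2007-08", 40, 7_900),
    ("CO-2", "2007-07", 60, 9_100),
]
LIEUTENANT_TOP_ANNUAL = 98_000  # hypothetical top pay rate, two ranks up

# Screen 1: officers with more than 80 overtime hours in at least one month.
over_80 = {oid for oid, _, ot_hours, _ in monthly_payroll if ot_hours > 80}

# Screen 2: officers whose fiscal-year gross pay exceeds the lieutenant rate.
annual_gross = {}
for oid, _, _, gross in monthly_payroll:
    annual_gross[oid] = annual_gross.get(oid, 0) + gross
above_lieutenant = {oid for oid, total in annual_gross.items()
                    if total > LIEUTENANT_TOP_ANNUAL}

print("Over 80 OT hours in a month:", sorted(over_80))
print("Gross pay above lieutenant top rate:", sorted(above_lieutenant))
```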


CALIFORNIA DEPARTMENT OF EDUCATION
Although It Generally Provides Appropriate Oversight of the Special Education
Hearings and Mediations Process, a Few Areas Could Be Improved
Date: December 16, 2008
Report: 2008‑109

BACKGROUND

The special education programs within California schools serve nearly 680,000 children, from birth to 22 years of age, who have disabilities that include speech or language impairments, autism, and specific learning disabilities.
To ensure that these children receive a free appropriate public education as required by federal and state laws, the
California Department of Education (Education) established procedures by which a school district, the parents of such
a student, or—in certain cases—a person assigned as a surrogate for such parents can present a complaint related to
the disabled student’s education. Education, through a June 2005 interagency agreement, currently uses the Office of
Administrative Hearings (Administrative Hearings) in the Department of General Services to administer the hearings and
mediations process for special education cases. Between 1989 and December 2005, the University of the Pacific’s McGeorge
School of Law (McGeorge) administered this process.
KEY FINDINGS

Our review of Education’s oversight of the special education hearings and mediations process from fiscal years 2002–03
through 2007–08 revealed the following:
•	Administrative Hearings spent an average of $3,272 per special education case while McGeorge spent an average of $2,867 on each case, yet Administrative Hearings took less time, on average, to close a case in the special education hearings and mediations process—118 days, compared with McGeorge's 185 days.
•	 Neither Education nor any other entity consistently tracks the number and cost of special education appeals, and the law
does not require them to do so.
•	 Education could tighten its oversight of Administrative Hearings. We found that Administrative Hearings:
»» Did not consistently include all information in its quarterly reports to Education as required by its interagency
agreement and state law—some of which is needed for annual reporting to the federal government.
»» Could not demonstrate that its administrative judges were receiving all the required training. We reviewed training
records for 15 administrative judges for two classes and could only verify that five administrative judges had attended
both required courses.
»» Has not always issued hearing decisions within the legally required time frame. It reported that it issued only
29 percent and 57 percent of its hearing decisions on time in the third and fourth quarters of fiscal year 2005–06,
respectively, and 72 percent in the first quarter of fiscal year 2006–07. Untimely hearing decisions could lead to
sanctions by the federal government.
KEY RECOMMENDATIONS

To ensure Administrative Hearings complies with state and federal laws and the interagency agreement, we recommended
that Education provide stronger oversight and ensure Administrative Hearings submits all the required information in its
reports, require training information to be maintained and periodically review the information, and continue to monitor
Administrative Hearings to ensure decisions are timely.


Education, California Department of
Description of Data: Department of General Services' Office of Administrative Hearings (Administrative Hearings) case management database

Agency Purpose of Data: To compile the data included in quarterly reports required by the California Department of Education (Education). Education requires Administrative Hearings to provide quarterly reports so that Education can manage and report to the federal government all of the State's hearing and mediation activities related to special education. In addition, Education is required to report certain data and information to the federal government regarding the progress of special education hearings and mediations. Accordingly, state law requires Administrative Hearings to report on such factors as the number of complaints, mediations unrelated to hearing requests, and requests for special education hearings.
Administrative Hearings closed a total of 5,482 cases during fiscal years 2006–07 and 2007–08. Data came from unaudited quarterly reports and invoices from Administrative Hearings.

Data Reliability Determination: We assessed the reliability of Administrative Hearings' data by performing electronic testing of key data elements, by tracing a statistically random sample of 29 cases to supporting documents, and by ensuring that a haphazardly selected sample of hard‑copy case files was found in the data. We found logic errors in several data fields needed for our analysis and inaccurate entries in the reason‑for‑closure field. Additionally, we found that the case‑open date for some sampled cases could not be tested.

Purpose of Testing: To identify the number of cases closed.
Data Reliability Determination: Sufficiently reliable—We used alternative audit procedures to assess the reliability of these data.

Purpose of Testing: To identify the number of cases closed before administrative judges issued hearing decisions.
Data Reliability Determination: Not sufficiently reliable—See above.

Purpose of Testing: To identify the number of hearing decisions in favor of each party.
Data Reliability Determination: Not sufficiently reliable—See above.

Purpose of Testing: To identify the average time taken to close cases.
Data Reliability Determination: Undetermined reliability—See above.
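The "electronic testing of key data elements" and "logic errors" mentioned above usually refer to automated screens, such as checking that dates are present and internally consistent. A minimal sketch, assuming hypothetical field names:

```python
# Minimal sketch of electronic logic testing on case records. Field names and
# records are hypothetical.
from datetime import date

cases = [
    {"case_id": "SE-01", "opened": date(2007, 1, 5), "closed": date(2007, 3, 1),
     "closure_reason": "decision"},
    {"case_id": "SE-02", "opened": date(2007, 4, 2), "closed": date(2007, 2, 1),
     "closure_reason": "withdrawn"},   # closed before opened: logic error
    {"case_id": "SE-03", "opened": None, "closed": date(2007, 5, 9),
     "closure_reason": ""},            # missing open date and closure reason
]

for case in cases:
    problems = []
    if case["opened"] is None:
        problems.append("missing open date")
    if not case["closure_reason"]:
        problems.append("missing closure reason")
    if case["opened"] and case["closed"] and case["closed"] < case["opened"]:
        problems.append("closed before opened")
    if problems:
        print(case["case_id"], "->", "; ".join(problems))
```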

Agency Response Date: N/A

Corrective Action Recommended: We did not recommend corrective action to address Administrative Hearings' case management database because Administrative Hearings began using a new database, the Practice Manager System, on August 13, 2007.

Status of Corrective Action: N/A

Description of Data: Administrative Hearings' Practice Manager System database

Agency Purpose of Data: To compile quarterly reports required by Education, including information related to whether Education is meeting the 45‑day state and federal requirement to issue a decision after a hearing is held, unless an extension is granted.
Administrative Hearings closed 5,482 cases from fiscal years 2006–07 through 2007–08. Data came from unaudited quarterly reports and invoices from McGeorge School of Law (McGeorge) and from Administrative Hearings.


Purpose of Testing: To determine whether the information included in Administrative Hearings' new database—the Practice Manager System, which it began using on August 13, 2007—contained reliable data for the purpose of determining the percentage of cases that were closed within the legally required time frame of 45 days, excluding any extensions.

Data Reliability Determination: Not sufficiently reliable—We assessed the reliability of the data for cases closed between October 1, 2007, and June 30, 2008. We found inaccuracies in the sampled records in the fields for the dates that the cases were opened, the dates that the cases were closed, the reasons for closure, and whether extensions were granted.

Agency Response Date: December 2009

Corrective Action Recommended: Education, in its oversight role, should continue to work with Administrative Hearings to ensure that it reports all the required information in its quarterly reports and that its database contains accurate and complete information for reporting purposes.

Status of Corrective Action: Partial corrective action taken—According to Education, it was working with Administrative Hearings to ensure that the required information is included in the quarterly reports. Education indicated that it compared information from the electronic reporting Practice Manager System with hard‑copy files at Administrative Hearings on January 22, 2009, June 3, 2009, and November 24, 2009. According to Education, its review of a sample of 20 records found that Administrative Hearings accurately and completely reported information in the following fields: (1) student name, (2) case name, (3) subject matter type, (4) subject matter number, (5) date case opened, and (6) case jurisdiction.


Description of Data: McGeorge case management database

Agency Purpose of Data: To compile data included in quarterly reports.
McGeorge closed a total of 6,360 cases during fiscal years 2002–03 through 2003–04. Data came from unaudited quarterly reports and invoices from McGeorge.

Data Reliability Determination: We assessed the reliability of McGeorge's data by performing electronic testing of key data elements, tracing a statistically random sample of 29 records to supporting documents, and ensuring that data for a haphazardly selected sample of hard‑copy case files appeared in the McGeorge database. We performed these procedures for cases that followed the standard hearing process and on the data for cases that were filed for mediations only. We found logic errors in both sets of data and inaccurate entries in the closure‑date field in the data for cases that followed the standard hearing process. We also found instances in which the supporting documentation could not be located for the filing‑date and closure‑date fields in the data for cases that followed the standard hearing process.

Purpose of Testing: To identify the number of cases closed.
Data Reliability Determination: Not sufficiently reliable—See above.

Purpose of Testing: To identify the number of cases closed before administrative law judges issued hearing decisions.
Data Reliability Determination: Not sufficiently reliable—See above.

Purpose of Testing: To identify the number of hearing decisions in favor of each party.
Data Reliability Determination: Not sufficiently reliable—See above.

Purpose of Testing: To identify the average time taken to close cases.
Data Reliability Determination: Not sufficiently reliable—See above.

Agency Response Date: N/A

Corrective Action Recommended: We did not recommend corrective action because Education ceased contracting with McGeorge for special education hearings in 2005 and mediations in 2006.

Status of Corrective Action: N/A


HIGH RISK UPDATE—STATE OVERTIME COSTS
A Variety of Factors Resulted in Significant Overtime Costs at the
Departments of Mental Health and Developmental Services
Date: October 20, 2009
Report: 2009‑608

BACKGROUND

In a February 2009 report on areas that present high risk to the State, the State Auditor's Office identified the state budget as a high‑risk area; the significant amount of overtime compensation the State pays its employees contributes to this risk. We identified five state entities, excluding the Department of Corrections and Rehabilitation, that paid $1.3 billion of the more than $2.1 billion in overtime payments to state employees during fiscal years 2003–04 through 2007–08. We selected the departments of Mental Health (Mental Health) and Developmental Services (Developmental
Services) to test since they had numerous employees in two job classifications who earned a large portion of their total
earnings in overtime. Mental Health and Developmental Services provide services to their patients and consumers
24 hours a day, seven days a week.
KEY FINDINGS

During our review of Mental Health’s and Developmental Services’ overtime costs, we noted the following:
•	 Since the bargaining unit agreements (agreements) do not provide a method for distributing voluntary overtime,
a disproportionate amount of overtime can be worked by a relatively small number of employees, a situation we
observed at Napa State Hospital (Napa) and Sonoma Developmental Center (Sonoma).
•	 The Department of Finance concluded that Mental Health’s current staffing model might not adequately reflect the
hospitals’ workload and noted that some level‑of‑care staff were performing administrative functions not directly
related to patient care that could be performed by lower‑paid staff.
•	 California Government Code, Section 19844.1, enacted in February 2009, permits new agreements to once again
contain provisions that allow employees’ leave time to be counted as time worked when computing overtime.
•	 Annual authorized positions for Mental Health and Developmental Services do not account for circumstances that
necessitate an increased level of care for patients and consumers.
•	 Based on our analysis, it appears that the hourly overtime rates paid to registered nurses–safety at Napa and
psychiatric technician assistants at Sonoma are comparable to the cost of hiring a new employee for either of
those positions.
KEY RECOMMENDATIONS

We made numerous recommendations to Mental Health and Developmental Services to ensure that overtime hours are necessary and to protect the health and safety of their employees and patients or consumers. Some of the steps
we recommended included that the departments should encourage the Department of Personnel Administration
(Personnel Administration) to include provisions in future agreements to cap the number of voluntary overtime
hours an employee can work and/or to require employee overtime hours be distributed more evenly among
staff. We also recommended that the departments encourage Personnel Administration to resist the inclusion of
provisions in agreements that permit any type of leave to be counted as time worked for the purpose of computing
overtime compensation.


Developmental Services and Mental Health, Departments of
Description of Data: State Controller's Office payroll system

Agency Purpose of Data: To process the State's payroll and personnel transaction documents.

Purpose of Testing: To present data on overtime and the cost of a new nurse and psychiatric technician assistant. Between fiscal years 2003–04 and 2007–08, the State paid more than $2.1 billion in overtime to state employees at 141 state entities. Of this amount, $1.3 billion was paid to the employees of five entities, including the Department of Mental Health and the Department of Developmental Services.

Data Reliability Determination: Sufficiently reliable.


DEPARTMENT OF HEALTH CARE SERVICES
Although Notified of Changes in Billing Requirements, Providers of Durable Medical
Equipment Frequently Overcharged Medi‑Cal
Date: June 17, 2008
Report: 2007‑122

BACKGROUND

The California Medical Assistance Program (Medi‑Cal), administered by the Department of Health Care Services
(Health Care Services), provides medical assistance to more than six million beneficiaries each month. Medi‑Cal
covers health care needs including durable medical equipment (medical equipment), such as wheelchairs, bathroom
equipment, and hospital beds that are prescribed by licensed practitioners. For fiscal year 2007–08, the State’s General
Fund provided roughly 40 percent of Health Care Services’ budget for Medi‑Cal expenditures, with the remainder
coming mostly from federal funds. Health Care Services is responsible for reimbursing Medi‑Cal providers for supplying
medical equipment using a system designed by both federal and state governments.
KEY FINDINGS

In our review of Health Care Services’ Medi‑Cal billing system for medical equipment, we reported the following:
•	Although Health Care Services' policies and procedures regarding reimbursement methodologies for medical equipment appear to comply with state law and federal requirements and are adequately communicated to providers, providers often do not bill at the allowable amounts, which are the lowest‑cost options.
•	 Health Care Services has not identified a practical means to monitor and enforce billing and reimbursement
procedures it implemented in 2003. As such, Health Care Services has overpaid providers. In its review of
21 providers of wheelchairs and accessories with listed Medicare prices, Health Care Services determined that
it had overpaid about $1.2 million, or 25 percent of the $4.9 million billed during September 1, 2005, through
August 31, 2006.
•	 Although Health Care Services has recovered almost $960,000 of the $1.2 million in overpayments, it does not know
the extent to which other providers may have overbilled for medical equipment. Further, its review did not include
billings for equipment without listed Medicare prices. In our review of billings without listed prices, we found that
providers of wheelchairs and accessories typically charged (and Health Care Services reimbursed) the manufacturer’s
suggested price without sufficient evidence to support it was the lowest‑priced option.
•	 Although Health Care Services intends to use post‑payment audits to enforce price controls, its current payment
error rate studies of overall Medi‑Cal payments do not provide adequate audit coverage of medical equipment
payments to effectively ensure compliance. Further, while its 21 audits in 2007 and 2008 focusing on providers of
wheelchairs and accessories with listed Medicare prices effectively identified noncompliance with the billing and
reimbursement procedures, Health Care Services has not identified plans or resources to conduct additional focused
audits of medical equipment providers.
KEY RECOMMENDATIONS

We recommended that Health Care Services take the following actions:
•	 Develop a means of monitoring and enforcing its current billing and reimbursement procedures for medical
equipment, including giving consideration to developing reimbursement caps in order to maintain control over
reimbursement costs.
•	 Design and implement a cost‑effective approach to address the risk of overpayment and ensure all providers are
potentially subject to an audit in order to provide a deterrent for noncompliance.


Health Care Services, Department of
Description of Data: The Department of Health Care Services (Health Care Services) uses the California Medicaid Management Information System (CA‑MMIS) to maintain health care codes and reimbursement rates for medical purchases, including payments to providers for supplying medical equipment.

Agency Purpose of Data: To process—through Electronic Data Systems (EDS), a Health Care Services' contractor—reimbursements for the California Medical Assistance Program (Medi‑Cal).

In federal fiscal year 2006–07, Health Care Services reimbursed $92.8 million for medical equipment supplied to Medi‑Cal beneficiaries, the majority of which was paid through medical type claims.

Data Reliability Determination: We performed electronic testing of selected data elements to ensure they contained logical values and tested the accuracy of the data by tracing a sample of records to supporting documentation. We were unable to obtain assurance regarding the completeness of the data.

Purpose of Testing: To provide information on the amount paid for medical equipment by Medi‑Cal during federal fiscal year 2006–07.
Data Reliability Determination: Undetermined reliability—See above.

Purpose of Testing: To provide information on the amount reimbursed for all medical equipment with and without listed Medicare prices.
Data Reliability Determination: Undetermined reliability—See above.

Purpose of Testing: To provide information on the amount of medical equipment reimbursed by type.
Data Reliability Determination: Undetermined reliability—See above.

Purpose of Testing: To select a sample of medical equipment reimbursements without listed Medicare prices for additional review.
Data Reliability Determination: Undetermined reliability—See above.

Purpose of Testing: To evaluate the existence of fraud in Medi‑Cal claims by using recipient identification information to determine whether recipients had obtained medical equipment for which they were not eligible.
Data Reliability Determination: Not sufficiently reliable—We found that the recipient identification information had inaccurate values.

Agency Response Date: N/A

Corrective Action Recommended: We did not recommend corrective action. EDS indicated that it incorrectly extracted the data from its records; therefore, we were unable to determine if data weaknesses were due to the incorrect extraction of the data or due to intrinsic problems with the data. After repeated attempts to obtain correct data, Health Care Services offered to provide it. However, the corrected data were not available in time for us to verify their accuracy and to perform our planned procedures before issuing our report.

Status of Corrective Action: N/A


DEPARTMENTS OF HEALTH CARE SERVICES AND PUBLIC HEALTH
Their Actions Reveal Flaws in the State’s Oversight of the California Constitution’s Implied Civil Service
Mandate and in the Departments’ Contracting for Information Technology Services
Date: September 10, 2009
Report: 2009‑103

BACKGROUND

The Department of Health Care Services (Health Care Services), previously known as the Department of Health Services, and the Department of Public Health (Public Health)—established on July 1, 2007—have similar goals in
preserving, improving, or optimizing the health of Californians. Both departments use various forms of information
technology (IT) to carry out their programs and responsibilities, and enter into personal services contracts with private
consulting firms to assist in developing and supporting their IT systems. State agencies are prohibited from contracting
with private entities to perform work the State has historically and customarily performed and can do so adequately
and competently. However, under certain circumstances, state agencies may enter into personal services contracts with
private vendors, but these contracts are subject to review by the State Personnel Board (board).
KEY FINDINGS

During our review of Health Care Services’ and Public Health’s use of IT consulting and personal services contracts (IT
contracts), we noted the following:
•	A state employees' union challenged 23 executed IT contracts over the past five years; however, two contracts expired before the union challenged them. The board's executive officer disapproved 17 of the 21 remaining IT contracts she reviewed.
»» Of those contracts disapproved:
—	 Eleven expired either prior to the board’s executive officer’s decision or the board’s appeal decisions. The
board’s executive officer took between 64 and 152 days to review the 21 contracts—much longer than
the 45 days established by the regulations.
—	The departments terminated only three of the six disapproved IT contracts still active at the time of the decisions. The departments experienced no repercussions because the State does not have a mechanism for determining whether state agencies carry out board decisions.
»» For nine of the 17 disapproved contracts, the departments entered into subsequent contracts for substantially the
same services as those in the disapproved contracts.
•	 Although Health Care Services saved more than an estimated $1.7 million between October 2006 and July 2009
by replacing IT consultants with state employees, it did not have budget approval to create any new, permanent IT
positions and inappropriately funded the new positions with funds intended for temporary positions.
•	Although the departments generally complied with procurement requirements for the 14 IT contracts we reviewed, they did not obtain some required approvals, and some employees who engaged in contracting activities did not file financial interest statements.
KEY RECOMMENDATIONS

We made several recommendations to the Legislature for creating more substantive results from the reviews
conducted by the board, such as clarifying that state agencies must terminate disapproved contracts and prohibiting
them from entering into subsequent contracts for substantially the same services without first notifying the board
and unions. We also made numerous recommendations to the departments including changes to ensure timely
communication to contract managers regarding decisions rendered on contracts challenged, and for the departments’
legal services to review proposed personal services contracts deemed high risk. Other recommendations were aimed
at ensuring compliance with procurement requirements and contract provisions.


State Controller’s Office
Description of Data: State Controller's Office payroll system

Agency Purpose of Data: To process the State's payroll and personnel transaction documents.

Purpose of Testing: We obtained the Social Security numbers of the consultants who worked on the information technology (IT) contracts in our sample and compared the numbers against payroll records maintained by the State Controller to identify whether either the Department of Public Health (Public Health) or the Department of Health Care Services (Health Care Services) previously employed these consultants as state employees.

Data Reliability Determination: Sufficiently reliable.
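The comparison described above is a set intersection between consultant identifiers and payroll identifiers. A minimal sketch follows; the numbers are fabricated placeholders, and real data of this kind would be handled under strict confidentiality controls.

```python
# Minimal sketch of matching consultant Social Security numbers against state
# payroll records. The numbers below are fabricated placeholders.
consultant_ssns = {"000-00-0001", "000-00-0002", "000-00-0003"}
payroll_ssns = {"000-00-0002", "000-00-0004"}  # from State Controller records

former_state_employees = consultant_ssns & payroll_ssns
print(f"{len(former_state_employees)} consultant(s) previously on the state payroll")
```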

General Services, Department of
Description of Data: The State Contract and Procurement Registry System (SCPRS) of the Department of General Services (General Services)

Agency Purpose of Data: To provide a centralized location for tracking the State's contracting and purchasing transactions. The State Contracting Manual requires that state agencies enter into SCPRS all contracts valued at $5,000 or more.
As of March 13, 2009, Health Care Services had 52 active IT service contracts that exceeded $5,000. The total amount of these contracts was $56 million. Public Health had 32 such contracts totaling $24.2 million.

Purpose of Testing: To identify all active IT contracts at Public Health and Health Care Services.

Data Reliability Determination: Incomplete—It was our intent to use SCPRS to select a sample of IT contracts and to provide background on the number of IT contracts; therefore, a data reliability assessment was not required. Instead, we needed to gain assurance that the population of contracts from which we selected our sample was complete. For this purpose, we found SCPRS to be incomplete.
Our review of a sample of 29 Public Health contracts found that three were not in SCPRS. Further, although we were able to locate our sample of 29 Health Care Services' contracts in SCPRS, during our audit we discovered an active $3.9 million IT contract that did not initially appear in SCPRS. We subsequently found that in SCPRS the contract type was incorrectly identified as grants and subventions instead of IT.
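Completeness testing runs in the opposite direction from accuracy testing: rather than tracing database records out to paper, a sample of known hard-copy contracts is traced into the database. A minimal sketch, with hypothetical contract numbers and type codes:

```python
# Minimal sketch of a completeness test: trace a sample of hard-copy contracts
# into the database and flag any that are absent or miscoded. Identifiers and
# type codes are hypothetical.
scprs = {  # contract number -> recorded contract type
    "HCS-001": "IT",
    "HCS-002": "grants and subventions",  # actually an IT contract, miscoded
}
hardcopy_sample = {  # contract number -> true type per the paper file
    "HCS-001": "IT",
    "HCS-002": "IT",
    "PH-003": "IT",  # not in SCPRS at all
}

for number, true_type in hardcopy_sample.items():
    if number not in scprs:
        print(f"{number}: missing from SCPRS (incomplete population)")
    elif scprs[number] != true_type:
        print(f"{number}: recorded as '{scprs[number]}', should be '{true_type}'")
```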

Agency Response Date: November 2009

Corrective Action Recommended: To ensure that reporting into General Services' contracts database is accurate and complete, Health Care Services and Public Health should establish a review and approval process for entering their contract information into the database.

Status of Corrective Action:
Health Care Services' Action: Partial corrective action taken—Health Care Services stated that it reiterated to staff the importance of entering accurate information into General Services' database, provided additional instruction, and performed spot checks of data entered into the system in August and September 2009. Health Care Services indicated that because the latter activity resulted in the detection of a few errors, it implemented a new procedure that involves the preparation of a data‑entry form by supervisory or analytical staff. Further, Health Care Services stated that it plans to continue to perform spot checks to ensure the accuracy of the data in General Services' database.
Public Health's Action: Partial corrective action taken—Public Health stated that it established a new procedure for staff to enter information into General Services' database and will have a staff person conduct a review to ensure that the procedure is reliable.


SAFELY SURRENDERED BABY LAW
Stronger Guidance From the State and Better Information for the Public Could Enhance Its Impact
Date: April 29, 2008
Report: 2007‑124

BACKGROUND

California’s Safely Surrendered Baby Law (safe‑surrender law) allows parents or other persons with lawful custody to
surrender an infant 72 hours old or younger to safe‑surrender sites without facing prosecution for child abandonment.
Statistics from the Department of Social Services (Social Services) indicate a general increase in the number of babies
surrendered under this law each year since its inception. State agencies have limited responsibilities associated with
the safe‑surrender law. State law required Social Services to report data annually from 2003 to 2005; the Department
of Health Care Services is to instruct counties on the process to be used on behalf of surrendered babies to
determine their eligibility for Medi‑Cal benefits; and since 2003, school districts are to include information about the
safe‑surrender law if they choose to provide comprehensive sexual health education.
KEY FINDINGS

We reported numerous concerns about the State’s implementation of the safe‑surrender law including:
•	 Since 2006 state agencies have had virtually no legal obligations under the safe‑surrender law—Social Services’
only involvement is compiling information that counties must submit when their designated sites accept
surrendered babies.
•	No state agency currently publicizes the safe‑surrender law, nor has consistent funding been provided for raising the public's awareness of the law. Social Services conducted a media campaign from October 2002 to December 2003, but has not developed any further goals for conducting additional activities.
•	 Safe‑surrender sites are violating state law by disclosing confidential information on parents who surrendered
babies. Of the 218 babies surrendered since 2001, county files contained confidential information in 24 cases,
including 16 of the 176 cases occurring after the Legislature amended the law to protect personal identifying
information on persons who surrender babies.
•	 Counties have incorrectly classified babies as safely surrendered or abandoned. Children improperly classified as
safely surrendered may not be allowed access to information on their parents even though they may have the legal
right to the information.
•	 The majority of surrendered babies—72 percent—may not have access to key medical information later in life
because safe‑surrender sites have difficulties in obtaining vital information on their families’ medical histories.
•	 All 15 counties surveyed reported that they have taken steps to implement the safe‑surrender law, but their efforts
vary widely.
KEY RECOMMENDATIONS

We made recommendations to the Legislature and Social Services, including:
•	 The Legislature consider amending the law to specify the agency that should administer the safe‑surrender law
and provide direction as to its responsibilities. Further, the Legislature consider providing or identifying funding to
support efforts to promote awareness of the law.
•	 Social Services should clarify directions provided to counties to ensure that individuals who surrender babies
receive proper protection under the safe‑surrender law. Moreover, Social Services should work with counties to
leverage existing models and tools to enhance the safe‑surrender law currently in use in California.


Public Health, Department of

Description of Data: Department of Public Health (Public Health) database that compiles data from numerous sources on child fatalities due to abuse and neglect.

Agency Purpose of Data: To gather the best available information on child fatalities due to abuse and neglect and, as a result, to reduce the number of preventable child deaths.

Purpose of Testing: To determine if the Department of Social Services (Social Services) had underreported the number of deceased or abandoned babies.

Data Reliability Determination: Not sufficiently reliable—We found missing and duplicative information. For example, we discovered that certain records related to our analysis of deceased or abandoned babies contained blank fields for the birth dates of the children. Without knowing the birth dates, we could not determine whether children listed in the database met our age criterion of one year old or younger.
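
The completeness screening described above can be expressed mechanically. The following Python sketch sets aside records with blank birth dates as untestable and screens the rest against the age criterion; the record layout, field names, identifiers, and dates are hypothetical, and a 365-day cutoff stands in for the one-year criterion.

from datetime import datetime

# Hypothetical extract from a child-fatality database; field names and
# values are illustrative only.
records = [
    {"record_id": 101, "birth_date": "2006-03-14", "death_date": "2006-09-02"},
    {"record_id": 102, "birth_date": "", "death_date": "2006-11-20"},
    {"record_id": 103, "birth_date": "2004-01-05", "death_date": "2006-05-17"},
]

untestable = []  # blank birth date: the age criterion cannot be evaluated
in_scope = []    # meets the one-year-old-or-younger criterion

for rec in records:
    if not rec["birth_date"]:
        untestable.append(rec["record_id"])
        continue
    born = datetime.strptime(rec["birth_date"], "%Y-%m-%d")
    died = datetime.strptime(rec["death_date"], "%Y-%m-%d")
    if (died - born).days <= 365:
        in_scope.append(rec["record_id"])

print("Blank birth dates:", untestable)  # [102]
print("Met age criterion:", in_scope)    # [101]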

Agency Response Date: April 2009

Corrective Action Recommended: To ensure that it is aware of and can appropriately react to changes in the number of abandoned babies, Social Services should work with Public Health and county agencies to gain access to the most accurate and complete statistics on abandoned babies.

Status of Corrective Action: Social Services' Action: Partial corrective action taken—According to Social Services, a Safely Surrendered Baby Law subcommittee continues to meet on a regular basis with representatives from Public Health and county agencies to determine areas to improve the quality of data on safely surrendered babies. Topics discussed at these meetings include the following:
•	 Analysis of existing data on safely surrendered and abandoned babies extracted from the Child Welfare Services/Case Management System.
•	 Identifying other data sources for abandoned babies.
•	 Clarifying the feasibility and resources needed to collect additional data on abandoned babies.
•	 Developing a memorandum of understanding to share data between Social Services and Public Health.


LOW‑LEVEL RADIOACTIVE WASTE
The State Has Limited Information That Hampers Its Ability to Assess the Need for a Disposal Facility
and Must Improve Its Oversight to Better Protect the Public
Date: June 12, 2008	Report: 2007‑114

BACKGROUND

Hospitals, industry, and other institutions use radioactive materials that produce low‑level radioactive waste (waste).
Federal law requires these waste generators to dispose of the waste at licensed facilities. The Department of Public
Health (department) plays an important role in licensing those who use radioactive materials or radiation‑emitting
machines in their work and overseeing the proper disposal of low‑level radioactive waste. This oversight includes
the decommissioning of equipment or facilities where radioactive materials have been used so that the location may
be used for other purposes. In 1987 California joined a four‑state compact governed by the Southwestern Low‑Level
Radioactive Waste Commission (Southwestern Commission), which is charged with ensuring that low‑level radioactive
waste is safely disposed of and managed within the compact region. As the “host” state, California is charged with
establishing a licensed low‑level radioactive waste disposal facility that will accommodate the disposal needs of the
compact region.
KEY FINDINGS

In our review of the State’s approach to managing low‑level radioactive waste, we reported the following:
•	 Despite joining the compact in 1987, California has yet to establish a low‑level radioactive waste disposal facility for
use by the compact region. In the absence of such a facility:
»» Generators must export low‑level radioactive waste for disposal or store it on site. In June 2008 waste generators
in California will lose access to one of the two disposal facilities that are currently in use.
»» The Southwestern Commission’s role is largely one of approving requests to export low‑level radioactive waste
out of the compact region.
•	 The Southwestern Commission’s processes for approving requests to export waste do not comply with
federal law. For example, rather than approving the exportation of low‑level waste by a two‑thirds vote of the
Southwestern Commission as mandated, the Southwestern Commission impermissibly delegates this authority
to the executive director. Further, it allows waste generators to determine whether their low‑level waste meets
recycling requirements.
•	 The department has some serious shortcomings in its oversight of low‑level radioactive material and waste:
»» More than five years after being directed to do so, the department has yet to adopt certain decommissioning
standards that define when a physical location is sufficiently clean from harmful radiation.
»» The department’s Radiologic Health Branch (branch) cannot demonstrate that its inspections of those that
possess radioactive material and radiation‑emitting machines are performed timely in accordance with federal
and state requirements.
»» More than five years after the effective date of the law, the branch is still unable to provide required information
on the amount of low‑level waste generated in California.
KEY RECOMMENDATIONS

The report provided many recommendations to the department regarding its oversight responsibilities. These
recommendations included improvements to its planning processes, data collection, and inspections, as well as
providing the Legislature with needed information.


Public Health, Department of

Description of Data: The California Mammography Information System (CAMIS) maintains data about inspections of mammography equipment.

Agency Purpose of Data: To track mammography machine inspections by the Radiologic Health Branch (branch) of the Department of Public Health (Public Health).

Purpose of Testing: To evaluate whether the branch had backlogged and untimely inspections of mammography equipment.

Data Reliability Determination: Not sufficiently reliable—Our review of a sample of 30 inspection records for mammography equipment found that the branch was unable to provide five inspection records that were still within its 10‑year record retention policy. Additionally, we identified an instance in which an inspection record did not include an entry for the inspection date. Interviews of data‑entry staff further suggested weak controls over data entry. We did not present data from CAMIS in the audit report because the data were not sufficiently reliable for our intended purpose.

Agency Response Date: June 2009

Corrective Action Recommended: To ensure that the branch uses sufficiently reliable data from its future data system to manage its inspection workload, Public Health should develop and maintain adequate documentation related to data storage, retrieval, and maintenance.

Status of Corrective Action: Partial corrective action taken—Public Health ultimately plans to replace the systems it uses to manage its inspection workload with an Enterprise‑wide Online Licensing (EOL) system. Public Health stated that it had received administrative and legislative approval for the EOL system and that it expects to award a contract for the new system in July 2011.

Corrective Action Recommended: To make certain that the branch uses sufficiently reliable data from its current systems to manage its inspection workload, Public Health should do the following:
•	 Improve the accuracy of the branch's data for inspection timeliness and priority level. The branch can do so by comparing existing files to the information recorded in the data systems.
•	 Improve its internal controls over data entry so that it can maintain accurate data on an ongoing basis. Such controls might include developing a quality assurance process that periodically verifies the contents of licensee files against the data recorded electronically. Other controls might include formalizing data‑entry procedures to include managerial review or directing the information technology staff to perform periodic logic checks of the data.

Status of Corrective Action: Public Health indicated that it had instituted additional quality control procedures over data entry into CAMIS. The branch has limited users' access to CAMIS, specifying which user groups have the ability to make changes to the data and which have "read‑only" status. Further, the branch requires that any change to CAMIS data be approved beforehand. The branch provided a CAMIS Change Request form that its staff use to request specific changes to CAMIS data, to explain the reason for a change, and to document the branch's approval.


Description of Data: The Health Application Licensing (HAL) system records data on inspections of radiation‑emitting machines other than mammography equipment.

Agency Purpose of Data: To record the branch's inspections of radiation‑emitting machines—such as X‑ray machines—other than mammography equipment.

Purpose of Testing: To evaluate whether the branch of Public Health had backlogged and untimely inspections of radiation‑emitting machines other than mammography equipment.

Data Reliability Determination: Undetermined reliability—We were unable to obtain assurance about the reliability of the system because of Public Health's outdated documentation for the HAL system, staff members' inability to fully explain which data they extracted from the system and why, and the lack of coordination between the branch and its information technology support staff. Moreover, we were unable to obtain the information necessary to use the system to identify late inspections. We did not present data from the HAL system in the audit report because we could not obtain assurance about the system's reliability or determine how to identify late inspections in it.

Agency Response Date: June 2009

Corrective Action Recommended: To ensure that the branch uses sufficiently reliable data from its future data system to manage its inspection workload, Public Health should develop and maintain adequate documentation related to data storage, retrieval, and maintenance.

Status of Corrective Action: Partial corrective action taken—Public Health ultimately plans to replace the data systems it uses to manage its inspection workload with an Enterprise-wide Online Licensing (EOL) system. Public Health stated that it received administrative and legislative approval for the EOL system and that it expects to award a contract for the new system in July 2011.

Corrective Action Recommended: To make certain that the branch uses sufficiently reliable data from its current systems to manage its inspection workload, Public Health should do the following:
•	 Improve the accuracy of the branch's data for inspection timeliness and priority level. The branch can do so by comparing existing files to the information recorded in the data systems.
•	 Improve its internal controls over data entry so that it can maintain accurate data on an ongoing basis. Such controls might include developing a quality assurance process that periodically verifies the contents of licensee files against the data recorded electronically. Other controls might include formalizing data‑entry procedures to include managerial review or directing the information technology staff to perform periodic logic checks of the data.

Status of Corrective Action: For the HAL system, Public Health formed a Quality Assurance Unit (QAU), which is responsible for tracking inspection‑related data and ensuring that staff enter inspection‑related data into HAL accurately. Public Health provided documentation showing that it is actively tracking errors found through the QAU process and that the error rate is declining. For example, in the third quarter of 2008 the QAU found errors in 21 of every 100 inspection files it reviewed; by the third quarter of 2009 this error rate had dropped to 15 per 100. Finally, Public Health is engaged in bimonthly meetings with its Information Technology Services Division, which have helped to resolve problems with certain data fields while identifying other needs that still require evaluation and implementation.
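
The QAU's error rate is a simple per-period ratio of error files to files reviewed. A minimal Python sketch follows, using the quarterly figures cited above; the tally structure is assumed for illustration.

# Quarterly QA tallies; the counts echo the figures cited above.
qa_reviews = {
    "2008-Q3": {"reviewed": 100, "with_errors": 21},
    "2009-Q3": {"reviewed": 100, "with_errors": 15},
}

for quarter in sorted(qa_reviews):
    tally = qa_reviews[quarter]
    rate = tally["with_errors"] / tally["reviewed"]
    print(f"{quarter}: {rate:.0%} of reviewed inspection files had errors")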


Description of Data: The radioactive materials database (RAM2000) contains data related to inspections by the branch at Public Health of entities that possess radioactive material.

Agency Purpose of Data: To track the branch's inspections of entities that it has licensed to possess radioactive materials.

Purpose of Testing: To evaluate whether the branch had backlogged and untimely inspections of entities that possess radioactive materials.

Data Reliability Determination: Not sufficiently reliable—To determine the accuracy of the data in this system, we selected a sample of 29 inspections from the RAM2000 database to validate the information in key fields. The supporting documentation for 13 licenses had been destroyed in accordance with record retention policies; however, for two of our remaining sample items, we found that the RAM2000 database contained inaccurate data in the priority code field, which notes the inspection frequency standard applied to a given licensee. Given these inaccuracies, other errors such as missing inspection dates, and poor management controls over data entry, we concluded that these data were not sufficiently reliable for our intended purpose.
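
The field validation the determination describes, drawing a sample of inspections and comparing key fields such as the priority code to source documentation, can be sketched briefly. This is a minimal illustration in Python; the license numbers, field names, and values are invented, and in practice the source-file values come from manual review of paper records rather than a second dataset.

import random

# Hypothetical RAM2000-style extract; identifiers and fields are illustrative.
ram2000 = {
    "L-001": {"priority_code": 2, "inspection_date": "2006-07-11"},
    "L-002": {"priority_code": 5, "inspection_date": None},
    "L-003": {"priority_code": 3, "inspection_date": "2007-02-28"},
}
# Stand-in for values confirmed by reviewing the paper licensee files.
source_files = {
    "L-001": {"priority_code": 2},
    "L-002": {"priority_code": 3},  # disagrees with the database
    "L-003": {"priority_code": 3},
}

sample = random.sample(sorted(ram2000), k=2)  # random validation sample
for lic in sample:
    if ram2000[lic]["priority_code"] != source_files[lic]["priority_code"]:
        print(lic, "priority code in database disagrees with licensee file")
    if ram2000[lic]["inspection_date"] is None:
        print(lic, "missing inspection date")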

Agency Response Date: June 2009

Corrective Action Recommended: To ensure that the branch uses sufficiently reliable data from its future data system to manage its inspection workload, Public Health should develop and maintain adequate documentation related to data storage, retrieval, and maintenance.

Status of Corrective Action: Partial corrective action taken—Public Health ultimately plans to replace the systems it uses to manage its inspection workload with an EOL system. Public Health stated that it had received administrative and legislative approval for the EOL system and that it expects to award a contract for the new system in July 2011.

Corrective Action Recommended: To make certain that the branch uses sufficiently reliable data from its current systems to manage its inspection workload, Public Health should do the following:
•	 Improve the accuracy of the branch's data for inspection timeliness and priority level. The branch can do so by comparing existing files to the information recorded in the data systems.
•	 Improve its internal controls over data entry so that it can maintain accurate data on an ongoing basis. Such controls might include developing a quality assurance process that periodically verifies the contents of licensee files against the data recorded electronically. Other controls might include formalizing data‑entry procedures to include managerial review or directing the information technology staff to perform periodic logic checks of the data.

Status of Corrective Action: To address specific problems we identified in the RAM2000 data, Public Health stated that it conducted a 100 percent quality assurance review to validate inspection data shown in the system. After finding few errors, the branch now performs a quality assurance review of 50 percent of the data entered into the system. The branch indicated that it is tracking the data‑entry error rate and will consider performing more reviews if this rate increases. The branch provided examples of its quality assurance reviews.


OFFICE OF SPILL PREVENTION AND RESPONSE
It Has Met Many of Its Oversight and Response Duties, but Interaction With Local Government,
the Media, and Volunteers Needs Improvement
Date: August 28, 2008	Report: 2008‑102

BACKGROUND

Marine oil spills, such as the November 2007 spill that occurred when an outbound container ship—the Cosco
Busan—hit a support on the San Francisco–Oakland Bay Bridge and released 53,600 gallons of oil into the bay, are
multijurisdictional events and typically require a coordinated response by federal, state, and private entities. The
Department of Fish and Game’s Office of Spill Prevention and Response (spill office), along with contingency plans
it oversees, fits into a national framework for preventing and responding to oil spills, with entities at every level
of government, as well as private entities, handling some aspect of the planning effort. Thus, a three‑part unified
command consisting of representatives from the spill office, the party responsible for the spill, and the U.S. Coast
Guard responded to the Cosco Busan oil spill.
KEY FINDINGS

Our review of the planning, oversight, and administrative activities of the spill office and the coordinated response of
the spill office, Office of Emergency Services (Emergency Services), and private entities to the Cosco Busan oil spill in
the San Francisco Bay revealed the following:
•	 The spill office maintains a state plan for responding to oil spills, but it has not updated the plan since 2001.
Moreover, the plan is missing required elements and does not contain references to regional and area contingency
planning documents that contain those elements.
•	 Few local governments participate in oil spill contingency planning activities. While 21 counties and one city with
marine waters have oil spill contingency plans, 10 plans have not been updated for 10 to 15 years. Further, local
governments have attended few oil spill response drills or planning meetings over the last few years.
•	 Although the spill office, Emergency Services, and private entities responding to the Cosco Busan oil spill met their
fundamental responsibilities, there were weaknesses in the spill office’s immediate response efforts.
»» A shortage of communications equipment during the critical second and third days limited
communication efforts.
»» Lack of trained liaison officers and public information officers experienced in oil spill response during the early
days of the response hindered the spill office's efforts to communicate specific and timely information to local
governments and volunteers.
»» The spill office’s lack of urgency in reporting its measurement of the oil spill quantity, as well as understated
spill amounts reported by others, may have delayed deployment of additional resources and notification of
local governments.
•	 We found several instances in which certain staff performed activities unrelated to oil spill prevention, yet were paid
almost entirely from fees assessed for the Oil Spill Prevention and Administration Fund.
KEY RECOMMENDATIONS

We made numerous recommendations in our report, including that the spill office update the state plan and
incorporate references to the regional and area contingency plans. Moreover, we recommended the spill office work
with local governments to improve participation and to keep local plans up to date and better integrated with
response activities. Further, the spill office should ensure it has adequate procedures and a sufficient number of trained
staff for all activities, including performing liaison duties, spill volume calculations, and other recovery activities.
Additionally, the spill office should ensure the proper use of its funds earmarked for oil spill prevention activities.


Fish and Game, Department of

Description of Data: California State Accounting and Reporting System data from Department of Fish and Game financial reports.

Agency Purpose of Data: To satisfy the basic accounting needs of most state agencies. The fund reserve as of June 30, 2007, was $17.6 million, which equates to about 50 percent of budgeted expenditures for fiscal year 2007–08.

Purpose of Testing: To examine and trend the sources and uses of the Office of Spill Prevention's Oil Spill Prevention and Administration Fund since 2001, determining the reasons for any significant fluctuations and whether any surpluses exist.

Data Reliability Determination: Sufficiently reliable.


DEPARTMENT OF FISH AND GAME
Its Limited Success in Identifying Viable Projects and Its Weak Controls Reduce the Benefit
of Revenues From Sales of the Bay‑Delta Sport Fishing Enhancement Stamp
Date: October 16, 2008	Report: 2008‑115

BACKGROUND

Since January 2004, a person must first purchase a fish stamp—the Bay‑Delta Sport Fishing Enhancement Stamp
(fish stamp)—to sportfish in the San Francisco Bay and Delta. Fees collected from fish stamp sales are deposited in
a restricted account within the preservation fund, which is administered by the Department of Fish and Game (Fish
and Game), and can only be used for activities that promote sportfishing opportunities or that provide long‑term,
sustainable benefits either to the primary sportfishing population or to anglers in the areas defined as bay‑delta
regulated waters. A fish stamp advisory committee (committee) identifies and recommends projects, while Fish and
Game administers all the fees, recommends and approves projects for funding, and funds and monitors the projects.
KEY FINDINGS

In our review of Fish and Game’s administration of the fish stamp program, we reported the following:
•	 Fish and Game has been slow in using the fees collected from fish stamp sales.
»» During the first two years of the program, fish stamp sales generated $2.9 million, yet Fish and Game did not
seek authority to use the funds in those two years.
»» Fish and Game was slow in identifying and approving projects—by the end of the third year of the program, it
had approved only three projects and spent just $160,000 of the $4.3 million in total fish stamp fees collected at
that time.
»» As of June 2008, Fish and Game had generated $8.6 million in revenue and interest since the inception of the
program, yet it had approved only 17 projects and spent just $1.6 million—leaving a surplus of $7 million.
•	 Fish and Game does not adequately monitor fish stamp project activity. Project expenditures are difficult to
reconcile and have been incorrectly charged to other funding sources. Further, periodic reports that Fish and Game
provides to the committee do not include project expenditures or detailed information on project status.
•	 During fiscal years 2005–06 through 2007–08, Fish and Game inappropriately charged an estimated $201,000 in
costs to the fish stamp account for activities unrelated to the fish stamp program.
KEY RECOMMENDATIONS
We made several recommendations to Fish and Game, including that it work with the committee in developing a spending
plan to identify, approve, and fund viable projects. We also recommended that Fish and Game adequately track and report
project costs within its accounting system and ensure that its project managers reconcile their files to the accounting records.
Moreover, Fish and Game should provide the committee with accurate financial and project information, such as actual
project costs, detailed information on project status, and administrative expenditures. Finally, Fish and Game should ensure
only appropriate activities are paid for with fish stamp revenue, and it should correct inappropriate charges it previously made.


Fish and Game, Department of

Description of Data: California State Accounting and Reporting System data for the Department of Fish and Game (Fish and Game).

Agency Purpose of Data: To satisfy the basic accounting needs of most state agencies. Of the $8.6 million in revenues and interest generated from Bay-Delta Sport Fishing Enhancement Stamp (fish stamp) sales through fiscal year 2007–08, Fish and Game had approved 17 projects and spent only $1.6 million in funding.

Purpose of Testing: To calculate expenditures from the fish stamp account.

Data Reliability Determination: Sufficiently reliable—We assessed the accuracy of the financial information presented through February 29, 2008. Undetermined reliability—We did not test the data presented for the period of March 1, 2008, through June 30, 2008, because the data were not available at the time of our testing; therefore, we cannot conclude on the reliability of those data.

Description of Data: Fish and Game License Agent System.

Agency Purpose of Data: To record revenues from fish stamp sales, among other purposes. From the inception of the fish stamp program in 2004 through fiscal year 2007–08, Fish and Game sold nearly 1.5 million annual fish stamps, generating $8.6 million in revenue and interest.

Purpose of Testing: To calculate revenues from fish stamp sales.

Data Reliability Determination: Sufficiently reliable—We assessed the accuracy of the financial information presented through February 29, 2008. Undetermined reliability—We did not test the data presented for the period of March 1, 2008, through June 30, 2008, because the data were not available at the time of our testing; therefore, we cannot conclude on the reliability of those data.


CALIFORNIA UNEMPLOYMENT INSURANCE APPEALS BOARD
Its Weak Policies and Practices Could Undermine Employment Opportunity and Lead
to the Misuse of State Resources
Date: November 20, 2008	Report: 2008‑103

BACKGROUND

Created in 1953 to conduct hearings and issue decisions to resolve disputed unemployment and disability
determinations and tax‑liability assessments made by the Employment Development Department (department),
the California Unemployment Insurance Appeals Board (appeals board), a quasi‑judicial agency, operates fairly
independently. According to statute, the appeals board hires/appoints, directs, and controls its own employees and
prepares its own budget, while receiving some business support from the department. Further, a seven‑member
full‑time board or its authorized deputies or agents oversee the appeals board and its staff.
KEY FINDINGS

Our review of the appeals board’s hiring, procurement, and administrative practices revealed the following:
•	 Managers did not consistently document the basis for their hiring decisions, leaving the appeals board vulnerable to
allegations that its hiring decisions are unfair and exclusive. We found several deficiencies in the hiring process for
the 27 advertised positions we reviewed, such as:
»» No explanation as to why the appeals board selected the candidate in 21 cases.
»» No evidence that reference checks occurred for 19 hires.
»» No documentation that eight hiring interviews took place.
•	 Nearly half of the employees who responded to our survey believe that familial relationships or employee favoritism
compromised hiring and promotion practices. Further, the appeals board’s past practice of hiring board members for
civil service jobs could undermine its employees’ faith in the civil service selection process. Moreover, new policies
related to nepotism and hiring former board members are not fully enforceable because the appeals board did not
obtain approval from the State’s Office of Administrative Law.
•	 Weak controls over travel expenses resulted in questionable uses of state resources.
»» For seven of the 20 travel expense reimbursements we reviewed, the business purpose of the trip was not
sufficiently documented, and thus we could not determine if the travel was in the best interest of the State.
»» We noted instances in which the former executive director may have inappropriately claimed and received
more than $2,200 in reimbursements for expenses that appear to be associated with travel between his
home and headquarters.
•	 The appeals board maintains 35 parking spaces at a cost of approximately $5,000 per month, yet has no policies or
procedures to ensure that these spaces are used only for appropriate purposes.
KEY RECOMMENDATIONS

We made numerous recommendations to the appeals board to ensure its hiring decisions are, and are perceived to
be, fair. Some of the steps we recommended include adopting a comprehensive hiring manual and documenting the
basis for the appeals board’s hiring decisions. We also recommended that the appeals board strengthen its travel
manual by requiring supervisors to preapprove travel plans and ensure that all travel is in the State’s best interest and
in compliance with regulations. Moreover, the appeals board should review travel‑related payments made to its former
executive director and seek recovery for any travel reimbursements that do not comply with state regulations.


Unemployment Insurance Appeals Board

Description of Data: Unemployment Insurance Appeals Board (appeals board) spreadsheets known as blue‑slip logs, which list personnel transactions.

Agency Purpose of Data: To summarize the appeals board's hires, promotions, and transfers. The appeals board hired, promoted, or transferred 265 employees from April 2006 through April 2008.

Purpose of Testing: To select our sample of hires, promotions, and transfers and to determine if each one complied with applicable laws, regulations, policies, and procedures.

Data Reliability Determination: Sufficiently reliable.

Description of Data: A complete listing of staff employed by the appeals board as of April 23, 2008, based on a report that it generated from the management information retrieval system of the State Controller's Office.

Agency Purpose of Data: To generate various reports for California Human Resources staff, including position inventory and employment history reports. The appeals board had 639 employees and seven board members as of April 23, 2008.

Purpose of Testing: To ensure that we had a complete listing of all staff employed by the appeals board as of April 23, 2008.

Data Reliability Determination: Sufficiently reliable.

Employment Development Department

Description of Data: Employment Development Department accounting system reports.

Agency Purpose of Data: To process payments for the appeals board, including reimbursements of travel claims and payments for the procurement of goods. In addition, the system maintains the appeals board's operating and equipment expense records. The appeals board requested from the Employment Development Department the accounting system reports that we used to pull our sample of equipment, furniture, and travel expenses. From July 2005 through March 2008, the appeals board's operating and equipment expenses totaled $35 million, of which $25 million, or 71 percent, was for travel costs, office space rent, office equipment, and information technology and communications equipment. Travel expenses totaled $2.5 million.

Purpose of Testing: To select a sample of office equipment and furniture procurements and travel expense reimbursements and test their compliance with applicable laws and other requirements, Department of Personnel Administration regulations, and the appeals board's travel policies and procedures.

Data Reliability Determination: Sufficiently reliable.


DNA IDENTIFICATION FUND
Improvements Are Needed in Reporting Fund Revenues and Assessing and Distributing DNA Penalties, but
Counties and Courts We Reviewed Have Properly Collected Penalties and Transferred Revenues to the State
Date: November 29, 2007	Report: 2007‑109

BACKGROUND

The voter‑approved DNA act of 2004 expanded the existing statewide program that created a database and data bank
of DNA samples for certain qualifying offenses. State, county, and municipal law enforcement agencies identify persons
qualifying for entry into the state DNA database and data bank, collect DNA samples, and send the samples to the
Department of Justice (Justice) to process and store the information. To offset the cost of increased DNA testing, the
DNA act also levies a penalty on all fines, penalties, or forfeitures imposed and collected by the courts for all criminal
offenses and traffic violations. Counties collect the revenue and deposit the payments into a DNA Identification Fund
(DNA fund) and quarterly transfer the appropriate percentage, plus interest earned, to the state DNA fund.
KEY FINDINGS

Our review of the DNA fund revealed that the counties we visited appropriately used their DNA funds. Our audit did
identify several issues, including:
•	 Reporting data on county DNA funds needs to be improved.
»» Counties are not required to include all DNA fund revenues in their annual report; thus, the State cannot be fully
assured that counties are assessing and collecting all required DNA penalties.
»» Many counties—22 in 2005 and 24 in 2006—failed to submit annual reports, yet Justice did not follow up with
those nonreporting counties.
•	 Justice’s Web site is incorrect—it indicates that nonreporting counties did not collect and transfer DNA fund money
to the State when, in fact, the counties transferred $1.6 million and $3.8 million, respectively, in those years.
•	 Judicial discretion and state laws can affect the amount and timing of DNA penalties assessed and collected.
»» The State does not receive DNA fund money for every criminal and traffic violation—courts can waive the
penalties under certain circumstances, and in others the penalty does not apply.
»» Court decisions and state law can allow several months to lapse before fines must be paid and transferred to the
State—it took between 114 and 250 days from the date of the citation to the date the county transferred the funds
to the State in our sample of 48 items.
•	 Weaknesses exist in some courts' automated case management systems and internal controls.
KEY RECOMMENDATIONS

We recommended that the Legislature consider revising state law to require counties to report on all DNA penalties
as part of their annual report. Additionally, we made numerous recommendations to Justice to ensure data on county
DNA fund activities are accurate. We also made other recommendations to the Administrative Office of the Courts,
which is developing a statewide case management system for all counties.


Justice, California Department of

Description of Data: State Controller's Office (State Controller) DNA Identification Fund (DNA fund) database.

Agency Purpose of Data: To record the amount of DNA fund penalties that counties and courts transfer to the State. Each county must make a quarterly transfer of money from its DNA fund to the State Treasurer's Office for deposit in the state DNA fund. At the same time, each county must submit a Report to State Controller of Remittance to State Treasurer to notify the State Controller of the amount transferred. Counties contributed $8 million to the state DNA fund for 2005 and $14.6 million to the fund for 2006.

Purpose of Testing: To determine if counties were transferring DNA fund money to the State.

Data Reliability Determination: Sufficiently reliable.

Purpose of Testing: To ensure that counties were correctly transferring DNA fund money to the State and reporting the appropriate amounts in their annual reports.

Data Reliability Determination: Sufficiently reliable.


STATE BAR OF CALIFORNIA
It Can Do More to Manage Its Disciplinary System and Probation Processes
Effectively and to Control Costs
Date: July 21, 2009	Report: 2009‑030

BACKGROUND

With a membership of more than 217,000 attorneys, the State Bar of California (State Bar) is responsible for admitting
new members, investigating and resolving complaints against members, disciplining attorneys who violate laws
or rules, and performing various administrative and support duties. Each year the State Bar collects an annual
membership fee plus additional fees that fund specific programs—in 2009, each active member paid $410 in required
fees. Approximately 80 percent of the State Bar’s general fund revenue goes toward financing the costs of the attorney
disciplinary system: receiving complaints, investigating cases, prosecuting a case, and trying a case in the State Bar
Court. The Office of Probation (probation office) monitors disciplined attorneys.
KEY FINDINGS

During our review of the State Bar’s attorney disciplinary system, we noted the following:
•	 It does not track its discipline costs by key disciplinary function and thus cannot determine how efficiently it
operates or what impact salary increases or policy changes have on each function.
•	 The total costs for its disciplinary system have increased by 30 percent, or $12 million, from 2004 through 2008—
outpacing both inflation and growth in the State Bar’s active membership—while the number of inquiries that the
State Bar opened declined.
»» Salaries for staff have risen significantly over the past five years.
»» The number of cases that proceeded to trial has increased.
»» The investigation processing time has increased from an average of 168 days in 2004 to 202 days in 2007.
•	 Information it reports annually regarding case processing time and the backlog of disciplinary cases is misleading. Its
methodology for calculating average processing time has led it to understate that measure, and its approach for
determining the backlog has resulted in incomplete and inconsistent information from year to year.
•	 It has not updated its formula to bill for discipline costs since 2003 despite the 30 percent increase in costs. Further,
it does not consistently include due dates when billing disciplined attorneys. In 2007 and 2008, the State Bar
reported that it collected an average of 63 percent of the amount it billed for those years. However, only an average
of 17 percent of the amount received was billed in that same year.
•	 The number of attorney disciplinary cases the probation office monitors has grown nearly 10 percent in the
five‑year period ending in 2008, yet the number of probation deputies was only recently increased by one.
•	 It still needs to fully implement recommendations made in a consultant’s report, in the periodic audits conducted
by its internal audit and review unit, and in our 2007 audit.
KEY RECOMMENDATIONS

We made numerous recommendations to the State Bar to separately track expenses associated with its disciplinary
system to allow it to explain and justify cost increases and measure the efficiency of the system. We also outlined
several changes to improve its billing process and to maximize the amounts that it could recover to defray the expense
of disciplining attorneys. Further, we identified other improvements for its probation office and control processes.


State Bar of California

Description of Data: State Bar of California (State Bar) disciplinary tracking system.

Agency Purpose of Data: To track cases brought against attorneys from the public and other sources. The State Bar processes most cases from the intake stage through the investigation stage within six months. The number of inquiries opened at the intake stage declined slightly from 2004 to 2007, and the average intake case processing time has decreased in recent years. The State Bar had 867 probation cases at the end of 2008.

Purpose of Testing: To review case processing times and the disciplinary case backlog. Our analysis demonstrates that the length of time to process cases proceeding beyond intake is generally increasing.

Data Reliability Determination: Sufficiently reliable.


STATE BOARD OF CHIROPRACTIC EXAMINERS
Board Members Violated State Laws and Procedural Requirements, and Its Enforcement, Licensing, and
Continuing Education Programs Need Improvement
Date: March 25, 2008	Report: 2007‑117

BACKGROUND

The State Board of Chiropractic Examiners (chiropractic board) was created in December 1922 through an initiative
measure approved by the voters of California. In general, the chiropractic board is a policy‑making and administrative
review body consisting of seven members (board members)—five professional and two public members, each
appointed by the governor. The board’s paramount responsibility is to protect California consumers from fraudulent,
negligent, or incompetent practices among providers of chiropractic care.
KEY FINDINGS

We reported numerous concerns about board members’ actions and the chiropractic board’s administration of its
enforcement, licensing, and continuing education programs, including:
•	 Board members violated some Bagley‑Keene Open Meeting Act requirements.
•	 Board members invited ex parte communication and inappropriately inserted themselves into the
enforcement process.
•	 Board members inappropriately delegated responsibility to approve or deny licenses to chiropractic board staff.
•	 The enforcement program has significant weaknesses:
»» Lack of standard procedures and management oversight resulted in unexplained and unreasonable delays in
processing and resolving complaints and may have contributed to staff processing complaints inconsistently.
»» The chiropractic board’s prioritization system for its complaint review process is seriously flawed. It frequently
fails to designate complaints as having priority or to process them promptly. Of 11 complaints we reviewed that
should have been classified as having priority, only one received such a designation, and staff took from one to
three years to investigate and close nine, including the single case designated as having priority.
•	 The chiropractic board did not ensure that its designated employees, including board members, complied with the
financial reporting requirements of the Political Reform Act.
•	 Although the chiropractic board has some effective regulations and processes to ensure the quality of continuing
education, it does not follow them.
KEY RECOMMENDATIONS

We recommended that the chiropractic board do the following:
•	 Continue to work with legal counsel to ensure compliance with applicable state laws and regulations.
•	 Establish benchmarks and more structured procedures for processing complaints.
•	 Establish a process to properly categorize complaints, promptly resolve them, and ensure that management
monitors the status of open complaints.
•	 Ensure that its continuing education program complies with current regulations.


California Board of Chiropractic Examiners

Description of Data: California Board of Chiropractic Examiners (Chiropractic Board) data related to complaints entered into the Consumer Affairs System.

Agency Purpose of Data: To record information about the Chiropractic Board's case files (complaints and licensing). In fiscal year 2006–07, 708 complaints were opened and 576 were closed. The Chiropractic Board issued 292 new chiropractic licenses in fiscal year 2006–07.

Purpose of Testing: To select a sample of complaints closed in fiscal year 2006–07 and one complaint closed in fiscal year 2007–08.

Data Reliability Determination: Undetermined reliability—We could not review the accuracy of some records. Thus, a potential existed for errors that could have a material effect on the number of complaints that the data indicate were opened, closed, or referred to an investigator in fiscal years 2005–06 and 2006–07 and on the number of complaints opened and closed against board members in fiscal years 2005–06, 2006–07, and 2007–08 (through August 31, 2007). Because the data could have led to incorrect or unintentional messages, these weaknesses were potentially significant. Therefore, the data are qualified because we concluded that the Chiropractic Board's data were of undetermined or insufficient reliability for our purposes.

Purpose of Testing: To determine the number of complaints opened, complaints closed, complaints opened and referred to contracted investigators, and complaints that board staff referred to contracted investigators in fiscal years 2005–06 and 2006–07 that were closed.

Data Reliability Determination: Undetermined reliability—See above.

Purpose of Testing: To select samples of licenses for testing, to determine the number and types of licenses issued in fiscal year 2006–07, and to determine the number and types of licenses active as of June 30, 2007.

Data Reliability Determination: Not sufficiently reliable—Our testing identified errors that could have had a material effect on the number of licenses that the data indicated were issued in fiscal year 2006–07 or the number of licenses active as of June 30, 2007; therefore, the data could have led to incorrect or unintentional messages.

Agency Response Date: N/A

Corrective Action Recommended: The Chiropractic Board uses the Consumer Affairs System to record information about its complaint and licensing case files. However, it does not own that system; therefore, we did not pursue data issues further.

Status of Corrective Action: N/A


CALIFORNIA DEPARTMENT OF VETERANS AFFAIRS
Although It Has Begun to Increase Its Outreach Efforts and to Coordinate With Other Entities, It
Needs to Improve Its Strategic Planning Process, and Its CalVet Home Loan Program Is Not Designed
to Address the Housing Needs of Some Veterans
Date: October 27, 2009	Report: 2009‑108

BACKGROUND

As of September 2008 the U.S. Department of Veterans Affairs (federal VA) estimated that approximately 2.1 million
veterans resided in California, making up nearly 9 percent of the total estimated national veteran population. The
mission of the California Department of Veterans Affairs (department) is to serve these veterans and their families,
and it generally organizes its efforts into three divisions—the Veterans Homes division (Veterans Homes), the CalVet
Home Loan program (CalVet program), and the Veterans Services division (Veterans Services). The department
receives funding from various sources, including the State’s General Fund, federal funds, and special funds, and spends
approximately 98 percent of the funding that it receives on its Veterans Homes and CalVet program.
KEY FINDINGS

During our review of the department’s efforts to address the needs of California veterans, we noted the following:
•	 The department relies on other entities to provide many of the direct services that veterans need, such as homeless
or mental health services, and has only recently decided that Veterans Services should take a more active role in
informing veterans about available benefits and coordinating with other entities that provide such services.
•	 With the State’s participation in federal disability compensation and pension benefits (C&P benefits) below the
national average, the department has made increasing veterans’ participation in these benefits a primary goal for
Veterans Services. However, Veterans Services’ ability to meet this goal is hampered by various barriers, including
veterans’ lack of awareness of the benefits, the complexity of the claims process, and its lack of coordination with
the County Veterans Service Officer programs (CVSOs).
•	 The department has not formally assessed veterans’ needs or included key stakeholders such as the CVSOs in its
strategic planning process, and it has not effectively measured its progress towards meeting the goals and objectives
in its strategic plan.
•	 As of March 2009 the CalVet program served 12,500 veterans; however, the program is generally not designed to
serve homeless veterans or veterans in need of multifamily or transitional housing.
KEY RECOMMENDATIONS

•	 The department should ensure that Veterans Services continues its various initiatives related to gathering veterans'
contact information and increasing veterans' awareness of the benefits available to them. It should also ensure that
Veterans Services continues its efforts to collaborate with other entities and implements a more systematic process
for identifying and prioritizing the entities with which it collaborates.
•	 Veterans Services should formally communicate its goal to increase veterans’ participation in C&P benefits to the
CVSOs. It should also require the CVSOs to submit information on the number of C&P benefit claims filed in their
offices, and use this and other available data to better coordinate outreach efforts with the CVSOs.
•	 The department should conduct a formal assessment of veterans' needs, including soliciting input from the CVSOs,
and should develop measurable strategic plan goals and objectives that are directly aligned with veterans’ needs.
•	 If the Legislature believes that the department should play a larger role in funding multifamily housing for
veterans, providing transitional housing to veterans, or addressing the housing needs of homeless veterans
through the CalVet program, it should modify or clarify state law to authorize the department to provide
such services.


Veterans Affairs, California Department of

Description of Data: Mitas database maintained by the California Department of Veterans Affairs (Veterans Affairs).

Agency Purpose of Data: To originate and service loans and to account for bonds issued through the CalVet Home Loan program. As of March 31, 2009, 12,518 veterans were participating in the CalVet Home Loan program.

Purpose of Testing: To identify the number of California veterans who receive benefits from the CalVet Home Loan program and to identify recent trends in veterans' participation in the program.

Data Reliability Determination: Sufficiently reliable.


VICTIM COMPENSATION AND GOVERNMENT CLAIMS BOARD
It Has Begun Improving the Victim Compensation Program, but More Remains to Be Done
Date: December 9, 2008	Report: 2008‑113

BACKGROUND

Medical and dental care, mental health services, and lost wages or support are just some of the eligible services the
Victim Compensation Program (program) can cover for victims of crime. Administered by the Victim Compensation and
Government Claims Board (board), the program is financed through restitution fines, penalty assessments, and other
amounts collected by the State and counties and through a federal grant. The board contracts with 21 joint powers (JP)
units throughout the State to aid in approving or denying applications and bills. The JP units are located within the victim
witness assistance centers (assistance centers), which oversee a variety of services to victims and provide outreach for the
board and the program. Verifying entities, such as law enforcement, physicians, or hospitals, provide proof of a crime or
an injury resulting from a crime.
KEY FINDINGS

Our review of the board and program’s funding structure and accessibility of services to victims of crimes revealed
the following:
•	 Total payments to victims and/or service providers decreased by 50 percent from fiscal years 2001–02 through
2004–05—from $123.9 million to $61.6 million. Despite this significant decline, the cost the board incurs to support
the program increased.
•	 The board did not always process applications and bills promptly. Specifically, the board:
»» Did not make a determination within its own maximum deadline of 180 days for two of the 49 applications
that we tested.
»» Took more than 250 days to resolve appeals for four of five denied applications that we reviewed and, as of
October 2008, had yet to resolve the fifth after more than one year.
»» Took more than 90 days to pay 23 of the 77 paid bills that we reviewed.
•	 The board’s follow‑up procedures for and communications with verifying entities lack detail and lead to
inconsistencies. Moreover, at times verifying entities did not cooperate in providing prompt responses to the
board and JP units.
•	 The board has experienced numerous problems with its new system for processing applications and bills, including:
»» Processing delays led to a reported increase in complaints.
»» Unbeknownst to the board, data in the system related to payments appeared erroneous.
»» Needed documentation for the new system has yet to be created, hampering efforts to resolve problems
cost‑effectively.
•	 The board’s current process for managing workload lacks benchmarks, performance measures, or any
written procedures.
•	 The board has not established a comprehensive outreach plan to assist in focusing on those in need of
program services.
KEY RECOMMENDATIONS

We made various recommendations to the board that include establishing goals that create a target fund balance and are
designed to measure its success in maximizing assistance to victims and their families. We also recommended that the
board develop specific procedures for following up with verifying entities. Moreover, the board should continue to correct
system problems, develop and maintain system documentation, and develop written procedures for managing workload.
Further, to develop a comprehensive and focused outreach plan, the board should seek input from key stakeholders
regarding underserved and vulnerable populations.


Victim Compensation and Government Claims Board

Description of Data: The Compensation and Restitution System (CaRES) of the Victim Compensation and Government Claims Board (Victim Compensation Board), which includes data on application and bill processing.

Agency Purpose of Data: To process victim compensation applications and bills. The joint powers units and the Victim Compensation Board made an eligibility determination for 47,260 applications processed solely through CaRES between June 30, 2006, and June 30, 2008.

Purpose of Testing: To determine whether the Victim Compensation Board has a backlog of applications and bills awaiting decisions.

Data Reliability Determination: Not sufficiently reliable—The reporting function in CaRES is not operable. As a result, the Victim Compensation Board was unable to provide us with any useful reports that would enable us to identify the extent to which a backlog exists. Although we attempted to present inventory information for fiscal year 2007–08 using the board's electronic data from both its old system, VOX, and CaRES, some applications existed in both systems, and determining the total population of applications without duplicating them was not possible. Therefore, the data are qualified because we concluded that the board's CaRES data were not sufficiently reliable.

Purpose of Testing: To assess how long the Victim Compensation Board and joint powers units took to process completed applications and bills that had been entered into CaRES.

Data Reliability Determination: Not sufficiently reliable—We assessed the reliability of the Victim Compensation Board's data entered into CaRES by performing electronic testing of selected data elements and testing the accuracy and completeness of the data. To test the completeness of the data, we reviewed them to identify gaps in the sequence of application numbers. To test the accuracy of the application and billing data, we traced key data elements to source documentation for 29 items. Based on that testing, we concluded that the data were not sufficiently reliable for determining the length of time taken to process applications and bills.
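
The completeness test described above, scanning for gaps in the sequence of application numbers, reduces to a single pass over the sorted numbers. A minimal sketch in Python follows, with invented application numbers; the actual CaRES numbering scheme may differ.

# Hypothetical application numbers from a CaRES-style extract.
app_numbers = sorted([50001, 50002, 50003, 50006, 50007, 50010])

gaps = []  # inclusive ranges of missing application numbers
for prev, curr in zip(app_numbers, app_numbers[1:]):
    if curr - prev > 1:
        gaps.append((prev + 1, curr - 1))

print(gaps)  # [(50004, 50005), (50008, 50009)]

Each gap flags applications that may be missing from the extract and warrant follow‑up, though a gap can also reflect voided or never‑assigned numbers.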

Agency Response Date: May and December 2009

Corrective Action Recommended

Status of Corrective Action

To ensure that the Victim Compensation Board has accurate
information to measure its success in meeting statutory
deadlines for processing applications, it should correct the
problems with the accepted‑date field in CaRES.

Corrective action taken—In its one‑year response, the Victim Compensation Board
stated that programming for the accepted‑date field had been completed, tested, and
installed in CaRES.

To ensure that it maximizes its use of CaRES, the Victim
Compensation Board should do the following:

•	Develop goals, objectives, and benchmarks related to the functions it carries out under CaRES that will allow it to measure its progress in providing prompt, high‑quality service.

•	Continue identifying and correcting problems with the system as they arise.

•	Address the structural and operational flaws that prevent identification of erroneous information and implement edit checks and other system controls sufficient to identify errors (an illustrative sketch of such edit checks appears at the end of this section).

•	Seek input from and work with relevant parties, such as assistance centers and joint powers units, to resolve issues with the transition.

•	Develop and maintain system documentation sufficient to allow the board to address modifications and questions about the system more efficiently and effectively.

Partial corrective action taken—In its six‑month response, the Victim Compensation
Board reported that it implemented monitoring tools to measure key performance
indicators of CaRES system health and that the measures are tracked daily to provide
real‑time and trend information on CaRES performance. Additionally, the board
reported that it completed the data dictionary for CaRES.

In its one‑year response, the Victim Compensation Board stated that it was continuing
its effort to maximize its use of CaRES. It stated that it had developed a corrective
action plan it uses for identifying issues that must be addressed and that it was
tracking the progress of issues. Additionally, the board stated that it hired a database
architect to identify structural problems and to provide detailed recommendations
on how to address these issues in CaRES. It expected the architect’s final assessment
and recommendations in December 2009. The Victim Compensation Board further
stated that it established a CaRES Change Control Board to review and prioritize
modifications and that this is an ongoing process. The board also reported that it
is in the process of developing system documentation and dependency diagrams
of CaRES.

Finally, the Victim Compensation Board reported that it continues to work closely with
joint powers office staff to resolve CaRES issues as they arise. It stated that it conducts
regular conference calls with county joint powers offices and that problems related to
CaRES are communicated and tracked in a biweekly operational meeting. The board
also stated that it actively solicits feedback from a cross‑section of representatives
about CaRES performance problems.

The joint powers units and the Victim Compensation Board made an eligibility
determination for 47,260 applications processed solely through CaRES between
June 30, 2006, and June 30, 2008.

To ensure that the Victim Compensation Board effectively
manages the program workload and can report useful
workload data, it should do the following:

•	Develop written procedures for its management of workload.

•	Implement the reporting function in CaRES as soon as possible.

•	Establish benchmarks and performance measures to evaluate whether it is effectively managing its workload.

•	Review the applications and bills converted to CaRES from VOX that are showing excessively lengthy processing periods and determine whether problems with the data exist or whether the board has significant time‑processing problems.

Corrective action taken—In its one‑year response, the Victim Compensation Board
reported that it had developed an inventory monitoring system that identifies the
minimum and maximum acceptable workload at each processing center and the
steps to take if any of the centers are outside of the normal processing parameters.
The board stated that program managers meet periodically to discuss the workload
and to transfer work among centers using established transfer criteria. Additionally,
the board stated that its joint powers offices and its headquarters staff are
monitoring the number of applications and bills processed and that, beginning in
early November 2009, management has met weekly to evaluate the inventory
and production across the entire program. The board also reported that CaRES is
now capable of producing reports and is doing so as needed. Finally, the board stated
that it identified 1,655 bills converted from VOX that needed additional review after the
conversion to CaRES and that all of these bills have been addressed.
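The report does not specify which edit checks the Victim Compensation Board should implement. The following minimal Python sketch shows the kind of field‑level edit check the recommendation describes, using the accepted‑date problem noted earlier in this section; the record layout and validation rules here are hypothetical, not CaRES's actual design.

# Hypothetical sketch of field-level edit checks for one
# application record (field names assumed for illustration).
from datetime import date

def edit_check(application):
    # Return a list of error messages for one application record.
    errors = []
    if not application.get("application_number"):
        errors.append("missing application number")
    accepted = application.get("accepted_date")
    received = application.get("received_date")
    if accepted is None:
        errors.append("missing accepted date")
    elif received is not None and accepted < received:
        errors.append("accepted date precedes received date")
    if accepted is not None and accepted > date.today():
        errors.append("accepted date is in the future")
    return errors

# Example: flag an application whose accepted date precedes its
# received date.
app = {"application_number": "A-1001",
       "received_date": date(2008, 3, 10),
       "accepted_date": date(2008, 3, 1)}
print(edit_check(app))  # ['accepted date precedes received date']

Checks of this kind, run at data entry and as periodic batch reports, would surface erroneous records of the sort that made the accepted‑date field unreliable.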


cc:	

Members of the Legislature
Office of the Lieutenant Governor
Milton Marks Commission on California State
Government Organization and Economy
Department of Finance
Attorney General
State Controller
State Treasurer
Legislative Analyst
Senate Office of Research
California Research Bureau
Capitol Press
