Juvenile Justice Realignment - Limited Information Prevents a Meaningful Assessment of Realignment’s Effectiveness, California State Auditor, 2012

Juvenile Justice Realignment
Limited Information Prevents a Meaningful
Assessment of Realignment’s Effectiveness
September 2012 Report 2011-129

Independent, Nonpartisan, Transparent Accountability

The first five copies of each California State Auditor report are free. Additional copies are $3 each, payable by
check or money order. You can obtain reports by contacting the Bureau of State Audits at the following address:
California State Auditor
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, California 95814
916.445.0255 or TTY 916.445.0033
OR
This report is also available on the World Wide Web http://www.auditor.ca.gov
The California State Auditor is pleased to announce the availability of an on-line subscription service. For
information on how to subscribe, please contact the Information Technology Unit at 916.445.0255, ext. 456,
or visit our Web site at www.auditor.ca.gov.
Alternate format reports available upon request.
Permission is granted to reproduce reports.
For questions regarding the contents of this report,
please contact Margarita Fernández, Chief of Public Affairs, at 916.445.0255.

Elaine M. Howle
State Auditor

CALIFORNIA STATE AUDITOR

Doug Cordiner
Chief Deputy

Bureau of State Audits

555 Capitol Mall, Suite 300

Sacramento, CA 95814

September 11, 2012	

916.445.0255

916.327.0019 fax

www.auditor.ca.gov

2011-129

The Governor of California
President pro Tempore of the Senate
Speaker of the Assembly
State Capitol
Sacramento, California 95814
Dear Governor and Legislative Leaders:
As requested by the Joint Legislative Audit Committee, the California State Auditor (state auditor)
presents this audit report concerning juvenile justice realignment and the Youthful Offender Block
Grant (block grant) that state law established to compensate counties for the increased costs related
to detaining and providing services to realigned juvenile offenders.
The report concludes that limited information and a lack of clear goals prevent a meaningful
assessment of the outcomes of juvenile justice realignment. In particular, as part of the realignment
law, the Board of State and Community Corrections (board) is required to issue annual reports
regarding counties’ use of block grant funds. Although not specifically required by state law, we
would expect the reports to allow the Legislature to make assessments regarding the outcomes of
realignment. However, the board’s reports are based on a flawed methodology and, therefore, should
not be used for this purpose. Moreover, the board’s reports could mislead decision makers about the
effectiveness of realignment by making it appear that realignment has not been effective when this
may not be the case. Because of the problems we identified with the board’s reports, we did not use
them to assess the outcomes of realignment. Instead, we attempted to use juvenile justice data from
the counties as well as from the Department of Justice and the California Department of Corrections
and Rehabilitation; however, we discovered limitations to these data that further impeded our ability
to draw conclusions about realignment.
Furthermore, the realignment law did not clearly specify the goals or intended outcomes of
realignment. Without clear goals, measuring whether realignment has been successful is challenging.
Nonetheless, the chief probation officers of the four counties we visited all believe that realignment
has been effective based on various indicators, such as a reduction in juvenile crime, new and
enhanced services, and reduced state costs. In support of these assertions, we found evidence
suggesting that realignment may have had positive outcomes for many juvenile offenders and thus
for the State. Although these indicators are encouraging, the limited—and potentially misleading—
juvenile justice data that are currently available make any measurement of realignment outcomes
arbitrary and may not fully represent the impact realignment has had on juvenile offenders and the
State as a whole.
Respectfully submitted,

ELAINE M. HOWLE, CPA
State Auditor


California State Auditor Report 2011-129

September 2012

Contents
Summary	1
Introduction	7
Chapter 1
Available Data Related to Realignment Are Limited and Could Be Misleading	21
Recommendations	37
Chapter 2
Because State Law Does Not Clearly Define the Goals of Realignment, Measuring Its Effectiveness Is Challenging	41
Recommendations	53
Appendix A
Youthful Offender Block Grant Funding Formula	55
Appendix B
Juvenile Justice Statistics by County	59
Responses to the Audit
Board of State and Community Corrections	65
California State Auditor's Comments on the Response From the Board of State and Community Corrections	77
California Department of Corrections and Rehabilitation	83
Department of Justice	85
California State Auditor's Comments on the Response From the Department of Justice	89


Summary

Results in Brief

Audit Highlights . . .

Our audit of juvenile justice realignment and the Youthful Offender Block Grant (block grant) highlighted the following:

»	The number of juvenile offenders under the Division of Juvenile Justice's supervision decreased from about 5,400 in June 2007 to nearly 2,500 in June 2011.
»	The Board of State and Community Corrections' (board) annual reports to the Legislature regarding outcomes are based on a flawed methodology and should not be used to assess outcomes.
	•	The reports focus primarily on the counties' use of block grant funds rather than on their juvenile justice systems as a whole.
	•	The reports could mislead decision makers and the public because outcomes for juvenile offenders cannot always be directly correlated to the block grant and also because counties use other sources of funds to serve them.
»	The board does not give sufficient guidance to counties and does not adequately verify the accuracy of the information it collects from them.
»	We discovered several limitations in using juvenile justice data from the counties and state departments in attempting to assess outcomes of realignment.
»	The four chief probation officers of Los Angeles, Sacramento, San Diego, and Yuba counties all believe realignment has been effective based on a reduction in juvenile crime, improved services, and reduced costs.

The California Department of Corrections and Rehabilitation's (Corrections) Division of Juvenile Justice (Juvenile Justice) has historically operated secure detention facilities for many of California's juvenile offenders. However, in 2007 the Legislature enacted a law that required the State to transfer all nonviolent juvenile offenders to county facilities, a process referred to as realignment in this report. As a result, the number of juvenile offenders under Juvenile Justice's supervision decreased from about 5,400 in June 2007 to nearly 2,500 in June 2011. To compensate counties for the increased costs related to detaining and providing services to these realigned juvenile offenders, state law established the Youthful Offender Block Grant (block grant). According to the Board of State and Community Corrections (board), counties can use their share of the approximately $90 million annual allocation for nearly any activity related to their juvenile justice systems.

The realignment law does not establish clear goals, and the limited information that is currently available regarding the outcomes of realignment can be misleading. State law authorizes the board to monitor programs supported by block grant funds and requires the board to issue annual reports to the Legislature regarding the outcomes for juveniles who receive block grant‑funded services and programs. However, the board's reports are based on a flawed methodology and, therefore, should not be used for this purpose. Although the law does not specifically require the board's reports to include an assessment of the outcomes of realignment, because the board is the only state administering body referenced in the law that realigned juvenile offenders, we would expect that its annual reports would give the Legislature information with which to make such an assessment.

Specifically, the board's reports, as required by law, focus primarily on the counties' use of block grant funds rather than on their juvenile justice systems as a whole. Attempting to assess the outcomes of realignment through an examination of counties' use of block grant funds is not meaningful for several reasons. First, according to the board, state law does not require counties to spend block grant funds only for juvenile offenders who might have been sentenced to Juvenile Justice prior to realignment; rather, the board allows the counties to use the funds to serve nearly all juvenile offenders or potential offenders. In addition, counties use other sources of funds in addition to the block grant to serve realigned juvenile offenders. As a result, the outcomes of realignment cannot be directly correlated to the block grant. The board could address some of these weaknesses and improve the usefulness of its reports by working with the counties and relevant stakeholders to determine the data that counties should report.
Because of the methodology the board employs, its reports could
mislead decision makers about the effectiveness of realignment
by making it appear that realignment has not been effective when
this may not be the case. For example, the board indicated, in
both of the reports it has issued regarding block grant outcomes,
that a significantly higher percentage of juvenile offenders who
receive block grant‑funded services had a new felony adjudication
compared to those who did not receive block grant‑funded services.
This statement implies that the block grant actually increases
the likelihood that a juvenile offender will reoffend, when a more
plausible explanation is that some counties have focused their block
grant funds on high‑risk offenders. Although the reports state
that caution must be taken in drawing conclusions regarding the
differences in the outcomes for juvenile offenders who receive block
grant services and those who do not, we question why the board
chose to present this sort of comparison at all.
The usefulness of the board’s reports is further diminished because the
board does not ensure that the data it receives from counties are
consistent or accurate. For example, the board asks counties to
report the services that they provided to a sample of juvenile
offenders over a one‑year period but does not specify how counties
should determine when a juvenile offender has received a service.
As a result, our review revealed that Sacramento County reports
that a juvenile offender receives a service—such as a drug treatment
program—if he or she participates for at least one day, while
San Diego County reports that a juvenile offender receives a service
only if he or she successfully completes that service. Further, even
though the board attempts to verify some of the data it collects
from counties, we found that three of the four counties we visited
had submitted inaccurate information, suggesting that the board’s
efforts are not very effective. If these types of inconsistencies
and inaccuracies occur frequently, the board’s reports could be
significantly misinforming readers about key criminal justice
outcomes. According to the board’s field representative, the board
has not received approval for funding to monitor the counties’ use
of block grant funds.
Because of the problems we identified with the board’s reports, we
did not use them to assess the outcomes of realignment. Instead,
we attempted to use juvenile justice data from the counties as
well as from state departments; however, we discovered several
limitations to these data that further impeded our ability to
draw conclusions about realignment. Specifically, three of the
four counties we visited are not easily able to provide data that can
be used to measure realignment outcomes. Further, although the
Department of Justice (Justice) maintains two systems that track
juvenile justice‑related data—the Juvenile Court and Probation
Statistical System (JCPSS) and the Automated Criminal History
System (criminal history system)—we could not use either to fully
assess certain outcomes of realignment because of the limitations
we observed with both of them.1
Moreover, the law does not clearly specify the goals or intended
outcomes of realignment. Rather, the law asserts that local juvenile
justice programs are better suited to provide rehabilitative services
than state‑operated facilities. In addition, a Senate floor analysis,
written while the realignment law was being considered by the
Legislature, noted that a projected impact of the law would be
to decrease the number of juvenile offenders housed in Juvenile
Justice. However, neither of these statements amounts to a clear, specific goal.
Without clear goals, measuring whether realignment has been
successful is challenging.
Despite the limitations we encountered in attempting to determine
whether realignment has been effective, the four chief probation
officers of the counties we visited—Los Angeles, Sacramento,
San Diego, and Yuba—all believe that realignment has been
effective based on various indicators, such as a reduction in
juvenile crime, improved services, and reduced costs, suggesting
that it is possible to develop goals that would indicate the success
or failure of realignment. These indicators could be used to assess
realignment’s effectiveness. One potential indicator could be the
reduction in offenses committed by juveniles. When we analyzed
the JCPSS’s data using this indicator, we found evidence suggesting
that realignment may have had positive outcomes for many juvenile
offenders and thus for the State. However, because we did not
assess the reliability of the JCPSS’s data, we cannot be certain of
our conclusions. For example, the JCPSS data show that counties
may have reduced the number of juvenile offenders who receive
dispositions2 by over 21 percent from fiscal year 2007–08—the year
realignment began—to fiscal year 2010–11.
1	Please refer to the Introduction's Scope and Methodology for the California State Auditor's assessment of the reliability of these data.
2	A disposition is an action taken by a probation officer or juvenile court, such as committing the juvenile to probation or to incarceration in a local or state facility, after a juvenile has been referred to the probation department for an alleged behavior such as truancy. Our analysis included those juveniles who received the following types of dispositions: direct file in adult court, diversion, probation, remanded to adult court, or wardship. Some offenders could be counted more than once if they received dispositions for multiple referrals.

Another means of measuring outcomes could be to consider the number and types of services that counties have been able to provide since realignment. Subsequent to realignment and
the infusion of block grant funds, the four counties we reviewed
reported having generally been able to provide new or enhanced
services to juvenile offenders compared to the services they
provided previously. For instance, San Diego County uses its block
grant funds for a program that rehabilitates high‑risk offenders,
and Yuba County uses its funds to target at‑risk youth. At the same
time that counties began providing new or enhanced services
to juvenile offenders, Juvenile Justice’s expenditures significantly
decreased, another potential measure of the effectiveness of
realignment. Specifically, Juvenile Justice’s expenditures for fiscal
year 2006–07—the year prior to realignment—were $481 million
compared to $294 million for fiscal year 2010–11, a reduction of
about $187 million. Furthermore, if all other factors remain constant
and the State continues to spend at levels similar to the fiscal
year 2010–11 amount, including the annual block grant allocation,
realignment could result in an annual savings of $93 million.
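As a rough check of the arithmetic above, using the rounded figures cited in this summary (netting the ongoing block grant allocation against the expenditure reduction is our reading of the savings estimate; because the inputs are rounded, the net differs slightly from the $93 million cited):

```python
# Rounded figures cited in this summary, in millions of dollars
jj_spending_fy2006_07 = 481  # Juvenile Justice expenditures, year prior to realignment
jj_spending_fy2010_11 = 294  # Juvenile Justice expenditures, fiscal year 2010-11
block_grant_annual = 93      # approximate annual block grant allocation to counties

reduction = jj_spending_fy2006_07 - jj_spending_fy2010_11
print(reduction)  # 187, the "about $187 million" reduction in expenditures

# Net ongoing savings to the State: the expenditure reduction less the
# annual block grant paid to counties (roughly the $93 million cited,
# given that the inputs are rounded).
net_savings = reduction - block_grant_annual
print(net_savings)  # 94 with these rounded inputs
```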
Although these indicators are encouraging, the limited—and
potentially misleading—juvenile justice data that are currently
available prevented us from providing a meaningful assessment of
realignment outcomes. Until the Legislature and the board take
steps to refine the information collected from counties and to
define the goals of realignment, any measurement of realignment
outcomes is arbitrary and may not fully represent the impact
realignment has had on juvenile offenders and the State as a whole.
Recommendations
To ensure that it has the information necessary to meaningfully
assess the outcomes of juvenile justice realignment, the Legislature
should consider amending state law to require counties to collect
and report countywide performance outcomes and expenditures
related to juvenile justice as a condition of receiving block grant
funds. In addition, the Legislature should require the board to
collect and report these data in its annual reports, rather than
outcomes and expenditures solely for the block grant.
To maximize the usefulness of the information it makes available to
stakeholders and to increase accountability, the board should do
the following:
•	 Create policies and procedures that include clear, comprehensive
guidance to counties about all aspects of performance outcome
and expenditure reporting.
•	 Consider verifying the counties’ data by conducting regular site
visits on a rotating basis or by employing other procedures to
verify data that counties submit.


Justice should take additional steps to ensure the accuracy and
completeness of data the counties enter into the JCPSS.
To assess the outcomes of realignment, the Legislature should
consider revising state law to specify the intended goals of
juvenile justice realignment. To assist the Legislature in this effort,
the board should work with relevant stakeholders to propose
performance outcome goals that can be used to measure the
success of realignment.
Agency Comments
Although the board generally agreed with our observations and
stated that it would address the shortcomings we identified if
additional resources were available, it disagreed with several of our
conclusions and recommendations. In addition, Corrections agreed
with our conclusions and stated that it will take steps to implement
our recommendation. Finally, although Justice disagreed with our
assessment of the data limitations associated with the JCPSS, it
generally agreed with our recommendations.


Introduction
Background
Juveniles who enter California’s juvenile justice
system follow a path that can result in a variety of
outcomes. Depending upon the juvenile’s age and
the severity of his or her offenses, the probation
department and the district attorney may file a
petition against the juvenile offender in either
juvenile or adult courts. Both courts conduct
hearings of the juvenile offenders’ cases and decide
their dispositions. As shown in Figure 1 on the
following page, these dispositions can result in
committing juveniles to probation, or incarceration
in a county facility, state juvenile justice facility, or
adult prison.
Among other options, both juvenile and adult
courts can send juvenile offenders to the California
Department of Corrections and Rehabilitation’s
(Corrections) Division of Juvenile Justice (Juvenile
Justice), formerly known as the California Youth
Authority.3 Juvenile Justice operates secure
detention facilities and provides education and
treatment to offenders under the age of 25.4 It also
houses juvenile offenders under 18 years of age who
are convicted in adult court because Corrections’
practice is not to house any juvenile under the age
of 18 in an adult institution, which we verified by
reviewing Corrections’ records. Further, state law
mandates that prosecutors must try juveniles for
certain offenses in adult courts, while prosecutors
have the discretion to determine where they wish
to try certain other offenses. Although hearings in
either type of court can lead to the same types of
outcomes for juvenile offenders, juvenile and adult
courts use different terminology to describe similar
concepts. The text box defines the juvenile court
terminology we use in this report and relates it to
similar terms used in adult courts.


Juvenile Court Terminology
Petition: The formal presentation to a juvenile court of
information related to a juvenile’s alleged offense.
Adult court term: Criminal complaint
True Finding: A finding by a judge that there is adequate
evidence to prove that a juvenile did what he or she is
accused of doing.
Adult court term: Guilty verdict
Adjudicated: The judge’s decision concluding that the
juvenile committed the act for which he or she is charged.
Adult court term: Convicted
Disposition: An action taken by a probation officer or
juvenile court because of a referral.
Adult court term: Sentencing
Direct File: The transfer of a juvenile offender who is
alleged to have committed certain serious violent or
sexual offenses to adult court.
Adult court term: Not applicable
Remand to Adult Court: A disposition resulting from a
fitness hearing that finds a juvenile unfit for the juvenile
system and transfers that juvenile to the adult system.
Adult court term: Not applicable
Delinquent Act: An act committed by a juvenile for which
an adult could be prosecuted in a criminal court.
Adult court term: Criminal act
Referral: A juvenile who is brought to the attention of the
probation department for alleged behavior such as truancy,
failure to obey reasonable and proper orders of his or her
parents, or a violation of the law.
Adult court term: Not applicable
Sources:  California Division of Juvenile Justice’s Juvenile Justice
in 2010 report, the California Courts Web site, U.S. Department of
Justice’s Office of Juvenile Justice and Delinquency Prevention
Web site, San Diego County Probation Department’s Guide
to Understanding the Juvenile Justice System brochure, and
California Welfare and Institutions Code.

3	As a result of a reorganization of California correctional agencies in 2005, the California Youth Authority became Juvenile Justice.
4	For juveniles committed to Juvenile Justice on or after July 1, 2012, this is lowered to age 23.


Figure 1
California's Juvenile Justice System

New case (arrest or referral): The following entities may make referrals to a county probation department: law enforcement, public agencies, private agencies, schools, and parents. The juvenile is transferred or referred to a county probation department.

County: The county probation department and the district attorney decide how to handle the case. Possible dispositions include: closed at intake, informal probation, diversion,* transfer,† petitions filed in juvenile court, and direct file in adult court.

Juvenile court: The juvenile court conducts hearings for the juvenile. Possible dispositions include: dismissal; diversion,* deferred entry of judgment, or transfer†; informal probation; nonward probation; remanded to adult court‡; and wardship. Types of wardship in juvenile court include placement in the juvenile's own or a relative's home, a secure county facility, a nonsecure county facility, or another public or private agency (county), or commitment to the Division of Juvenile Justice (state).

Adult court: The adult court conducts hearings for the juvenile. Possible dispositions include: acquitted, dismissed, diversion,* certified to juvenile court, and convicted. Types of adult court convictions include probation, probation with adult jail, or adult jail (county), or commitment to the Division of Juvenile Justice or adult prison (state).

Source:  Office of the Attorney General's 2010 Report for Juvenile Justice in California.
*	Diversion services, such as community service or counseling, are an alternative to more formal actions within the juvenile justice and education systems.
†	Transfers include cases in which the juvenile is deported or sent to traffic court.
‡	A juvenile can be sent to adult court if a judge determines at a fitness hearing that the juvenile will not benefit from juvenile court services.


Juvenile Justice Realignment
In June 1996 the number of juvenile offenders in institutions or on
parole at Juvenile Justice reached a high point of 16,300. In that
year the Legislature amended state law to increase the amounts
counties pay to house lower‑level juvenile offenders in Juvenile
Justice facilities. According to the Legislative Analyst’s Office
(legislative analyst), this legislation was designed to give counties
an incentive to manage less serious offenders locally. Despite
the drop in Juvenile Justice’s population that resulted from the
1996 law, in 2003 the Farrell5 lawsuit alleged that Juvenile Justice
failed to provide adequate care and services for juvenile offenders
in its facilities. The State entered into a consent decree in 2004 in
which it agreed to address the issues raised in the lawsuit regarding
confinement conditions in Juvenile Justice facilities.
According to the legislative analyst, as a result of this lawsuit and others, the cost to house a juvenile offender at Juvenile Justice increased to $245,000 per juvenile in fiscal year 2008–09.

As costs continued to rise, in 2007 Senate Bill 81 was enacted as the original juvenile justice realignment law. Under this law, the State transferred, or realigned, the responsibility and expense for housing certain nonserious and nonviolent juvenile offenders who are not registered sex offenders to the counties. The Legislature declared that local communities were better suited than the State to provide certain juvenile offenders with the programs they need. Under the law, juvenile courts are prohibited from sending juveniles adjudicated on or after September 1, 2007, to Juvenile Justice facilities unless the adjudication was for certain serious, violent, or sexual offenses. The text box lists examples of these types of offenses. Current law also allows juvenile courts, upon recommendation by the county's chief probation officer, to transfer to county supervision juveniles previously sent to Juvenile Justice.

Examples of Offenses That Result in Juvenile Offenders Being Admitted to the Division of Juvenile Justice
•	Murder
•	Arson
•	Robbery
•	Rape with force, violence, or threat of great bodily harm
•	A lewd or lascivious act
•	Certain kidnapping offenses
•	Attempted murder
•	Certain offenses committed with the use of a firearm
•	A violent felony committed in association with criminal street gang activity
•	Carjacking while armed with a dangerous or deadly weapon
•	Voluntary manslaughter
•	Offenses requiring the person to register as a sex offender
Sources:  California Penal Code, Section 290.008, and Welfare and Institutions Code, sections 707(b) and 733(c).
Note:  Under state law, any person 14 years of age or older who is alleged to have committed murder or certain types of sex offenses must be prosecuted in adult court. Further, under certain circumstances, minor offenders 16 years or older may also be directly filed and tried in adult court at the discretion of the prosecutor's office for an alleged felony violation.

As shown in Figure 2 on the following page, the number of juvenile offenders supervised by Juvenile Justice decreased significantly after 2003 and has continued to decline after realignment from about 5,400 in June 2007 to nearly 2,500 in June 2011. Further, about 670 nonserious and nonviolent juvenile offenders who are not sex offenders have transferred from Juvenile Justice to counties since realignment.6 According to Juvenile Justice's Web site, as of July 2012, the population in Juvenile Justice's facilities represents less than 1 percent of the 225,000 juveniles arrested in California each year.

5	Case number RG03079344, Superior Court for the State of California, County of Alameda.
Figure 2
Number of Juvenile Offenders Supervised by the Division of Juvenile Justice, 2003 Through 2011
[Line chart: number of juveniles (0 to 6,000) for each year from 2003 through 2011, with separate series for serious, violent, or sex offenses and for other offenses; realignment is marked at 2007, and the total declines over the period.]
Source:  California State Auditor's (state auditor) analysis of data obtained from the California Department of Corrections and Rehabilitation's Offender‑Based Information Tracking System. Please refer to the Introduction's Scope and Methodology for the state auditor's assessment of the reliability of these data.
Notes:  Data are as of June 30th of each year.
The total number of juvenile offenders includes those in institutions and on parole.
The data did not show juvenile offenders in Alpine County under the Division of Juvenile Justice's supervision during our reporting period.

Youthful Offender Block Grant
To compensate counties for the increased costs related to
the supervision of juvenile offenders, state law established the
Youthful Offender Block Grant (block grant). State law requires
that counties use block grant funds to enhance the capacity of
various county departments to provide appropriate rehabilitative
and supervision services to juvenile offenders who are transferred
from Juvenile Justice facilities, who are prohibited from being sent
to Juvenile Justice facilities, or who are on parole from Juvenile
Justice facilities for certain offenses. State law also requires counties, in expending block grant funds, to provide all necessary services related to the custody and parole of these offenders. State law generally directs the Department of Finance (Finance) to calculate the amount of the block grant allocated to each county based on a formula using data obtained from the Department of Justice (Justice) and from Finance. This formula requires Finance to base 50 percent of the block grant amount on the number of each county's juvenile felony court dispositions and the remaining 50 percent on the total number of each county's population of juveniles between the ages of 10 and 17. The text box in Appendix A gives an example of this calculation. Under state law, each county received a minimum of $58,500 for fiscal year 2007–08 and a minimum of $117,000 for each fiscal year thereafter. The State Controller's Office issues block grant payments to counties. Table A in Appendix A shows the block grants by county for fiscal years 2007–08 through 2010–11. The counties received a total of $93 million in block grants in fiscal year 2010–11.

6	The original realignment law took effect on September 1, 2007. Our analysis of the data contained in Corrections' Offender‑Based Information Tracking System is based on fiscal year. Therefore, some of our analysis may contain data for July and August 2007, the two months prior to realignment's effective date.
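The 50/50 weighting just described can be sketched as follows. The function name and the handling of the per-county minimum are our illustration, not the statute's exact mechanics (for example, the statute may redistribute amounts absorbed by the minimum differently):

```python
def block_grant_share(county_dispositions, county_juveniles,
                      total_dispositions, total_juveniles,
                      total_grant, minimum=117_000):
    """Illustrative sketch of the block grant formula: 50 percent of a
    county's share is weighted by its juvenile felony court dispositions
    and 50 percent by its population of juveniles aged 10 to 17, subject
    to a per-county minimum ($117,000 for fiscal years after 2007-08)."""
    weight = (0.5 * county_dispositions / total_dispositions
              + 0.5 * county_juveniles / total_juveniles)
    return max(minimum, weight * total_grant)

# Hypothetical county with 10 percent of statewide dispositions and
# 5 percent of the statewide juvenile population, drawing on a
# $93 million statewide pot: weight = 0.5*0.10 + 0.5*0.05 = 0.075,
# so its share is about 0.075 * 93,000,000 = 6,975,000.
share = block_grant_share(1_000, 50_000, 10_000, 1_000_000, 93_000_000)

# A very small county whose weighted share falls below the floor
# simply receives the $117,000 minimum.
small = block_grant_share(1, 100, 10_000, 1_000_000, 93_000_000)
```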
As shown in Table 1 on the following page, the four counties we
visited provided us with information demonstrating that block
grant funds make up a small portion of the total funds they have
available to spend on juvenile offenders. In fact, the block grant
made up only about 14 percent of the four counties’ four major
juvenile justice‑related funding sources in fiscal year 2010–11.
The four counties we reviewed held varying opinions about whether their block grant funding is sufficient. Specifically, Sacramento County (Sacramento) and Yuba County believe their block grant funds are not adequate; San Diego County (San Diego) believes its funds are adequate; and Los Angeles County believes its funds are sufficient for its current needs but will not be once the juvenile offender population increases. For example, according
to Sacramento’s chief probation officer, the block grant funds are
not adequate to support the county’s only long‑term secure juvenile
offender commitment facility, which it closed in fiscal year 2010–11
due to budget constraints. He stated that the closure of the facility
resulted in the county placing on probation in the community some
juvenile offenders who should have been housed in secure facilities.
Conversely, according to San Diego’s probation department financial
officer, the block grant funding is sufficient to support its Youthful
Offender Unit. The county established this unit to serve its high‑risk
juvenile offenders who probably would have been sent to Juvenile
Justice prior to realignment.


Table 1
Top Four Funding Sources for Services to Juvenile Offenders in Los Angeles, Sacramento, San Diego, and Yuba Counties
Fiscal Year 2010–11

FUNDING SOURCE: Social Security Act, Title IV‑E
DESCRIPTION: The Adoption Assistance Program provides funds to states to facilitate the timely placement of children whose special needs or circumstances would otherwise make it difficult to place them with adoptive families. The Foster Care Program helps states provide safe, stable out‑of‑home care for children until they are safely returned home, placed permanently with adoptive families, or placed in other planned permanent arrangements. The Guardianship Assistance Program helps states, Indian tribes, tribal organizations, and tribal consortia that provide guardianship assistance payments for the care of children by relatives who have assumed legal guardianship of children for whom they previously cared as foster parents. Unlike the Adoption Assistance and Foster Care programs, this is an optional program.
FUNDING RECEIVED: Los Angeles County $68,019,000; Sacramento County $13,394,000; San Diego County $10,823,000; Yuba County $197,000

FUNDING SOURCE: Youthful Offender Block Grant
DESCRIPTION: The Youthful Offender Block Grant allocates funds to counties to enhance the capacity of their various departments to provide appropriate rehabilitative and supervision services to youthful offenders.
FUNDING RECEIVED: Los Angeles County $21,572,000; Sacramento County $4,522,000; San Diego County $7,711,000; Yuba County $180,000

FUNDING SOURCE: Juvenile Justice Crime Prevention Act
DESCRIPTION: The Juvenile Justice Crime Prevention Act supports juvenile probation programs with a record of reducing crime and delinquency among at‑risk youth and young offenders.
FUNDING RECEIVED: Los Angeles County $24,883,000; Sacramento County $3,452,000; San Diego County $7,693,000; Yuba County $175,000

FUNDING SOURCE: Juvenile Probation and Camps Funding Program
DESCRIPTION: The Juvenile Probation and Camps Funding Program allocates funds to counties to support the delivery of 23 categories of services to juveniles authorized by state law.
FUNDING RECEIVED: Los Angeles County $62,338,000; Sacramento County $2,799,000; San Diego County $9,779,000; Yuba County $316,000

TOTALS: Los Angeles County $176,812,000; Sacramento County $24,167,000; San Diego County $36,006,000; Yuba County $868,000

Sources:  California State Auditor’s analysis of data from the State Controller’s Office, the Board of State and Community Corrections, and accounting records from the counties of Los Angeles, Sacramento, San Diego, and Yuba.

Oversight of the Block Grant
The Board of State and Community Corrections (board)7 is a
12‑member independent state agency. Formerly affiliated with
Corrections and known as the Corrections Standards Authority, the
board is responsible for administering the block grant in addition to
overseeing other federal and state juvenile justice grants. According
to the board’s field representative, state law allows counties to spend
7 Chapter 36, Statutes of 2011, which became effective July 1, 2012, renamed the Corrections Standards Authority as the Board of State and Community Corrections.

block grant funds not only on juvenile offenders but also on services
and programs designed to prevent offenses by juveniles.8 State law
requires the board to collect certain block grant data from counties
and to prepare and publish annual reports for the Legislature and
the public. State law also specifies that it is the duty of the board to
collect and maintain available information and data about, among
other things, state and community correctional policies, practices,
capacities, and needs related to juvenile justice. In fulfilling this
duty, the board must seek to collect and make publicly available
up‑to‑date data and information reflecting the impact of juvenile
justice policies and practices enacted in the State, as well as
information and data concerning promising and evidence‑based
practices from other jurisdictions. Further, the state law authorizing
the block grant allows the board to monitor and inspect any
programs or facilities supported by these funds and to enforce
violations of grant requirements with suspensions or cancellations
of grant funds.
The law originally authorizing the block grant also required the
State Commission on Juvenile Justice and the counties to complete
specific objectives related to realignment. Specifically, state law
required the State Commission on Juvenile Justice to develop
a Juvenile Justice Operational Master Plan by January 1, 2009.
Further, the law that established the block grant required each
county to prepare and submit to the board, by January 1, 2008, a
Juvenile Justice Development Plan (development plan) regarding
the programs, placements, services, and strategies it intended to
fund using the block grant.
In July 2008, following realignment, the Little Hoover Commission’s
juvenile justice reform report, along with other reports, highlighted
the shortcomings of the original realignment law. In an attempt to
bring accountability to the block grant, the Legislature amended
the law in 2009 and created additional requirements that counties
must complete annually. The original juvenile justice realignment
legislation required a county to submit a development plan only once,
whereas the 2009 law requires counties to submit a development
plan for their block grant allocations by May of each year identifying
their proposed expenditures for the upcoming fiscal year. The
law also requires counties to report their actual expenditures and
performance outcomes for the previous fiscal year each October.

8 A law that took effect on June 30, 2011, and that provides a general funding mechanism for various grants including the block grant, specifies that block grant funds should be used solely to provide services to youthful offenders who are transferred from Juvenile Justice facilities, are prohibited from being sent to Juvenile Justice facilities, or are on parole from Juvenile Justice facilities for certain offenses. According to the board’s field representative, the board is seeking assistance to obtain a legal opinion to determine whether any change in policy is needed as a result of this law. We did not review any block grant expenditures made after June 30, 2011.

To oversee the implementation of these new accountability
measures, the board established an executive steering
committee (committee) to guide the design and development
of forms and processes necessary to implement the statutory
changes to the block grant program. The committee
was composed of a cross‑section of stakeholders when it was
established in 2009. The committee’s members included
subject matter experts, researchers, chief probation officers,
and members of the public. The committee worked to
clarify and streamline the new statewide reporting requirements.
Given the flexibility counties have in their use of block grant
funds, the board and the committee determined that reporting
on the outcomes designated in state law would be difficult for
counties. The board therefore decided to modify these outcome
measures as permitted by state law.
Table 2 shows the state law’s original performance outcomes and the
board’s adopted modified performance outcomes. Most significantly,
to minimize the burden on counties that may lack the ability to track
certain data, the committee and the board chose to gather performance
outcome data for a sample of juvenile offenders each year rather than
requiring counties to report data on all the juveniles that may have
received services. The board elected to use data from Justice’s
Juvenile Court and Probation Statistical System to select a random
statewide sample of approximately 1,000 juvenile offenders with
felony adjudications for nonviolent, nonsexual offenses in the
previous fiscal year because it believed this sample was most likely
to include the juvenile offenders who would have been sent to
Juvenile Justice prior to realignment. The committee believed this
sampling strategy would allow the board to measure the impact of
block grant‑funded programs, placements, services, and strategies
in a streamlined fashion.
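The board’s sampling approach described above can be sketched as follows. This is a minimal illustration, not the board’s actual procedure: the record field names (“felony,” “violent,” “sexual”) are hypothetical stand‑ins, and the real sample is drawn from Justice’s Juvenile Court and Probation Statistical System.

```python
# Minimal sketch of the board's sampling strategy: filter the prior
# fiscal year's felony adjudications to nonviolent, nonsexual offenses,
# then draw a simple random statewide sample of about 1,000 offenders.
# The field names used here are hypothetical.
import random

def select_sample(adjudications, sample_size=1000, seed=None):
    """Randomly sample eligible felony adjudications."""
    eligible = [a for a in adjudications
                if a["felony"] and not a["violent"] and not a["sexual"]]
    rng = random.Random(seed)
    # If fewer eligible records exist than requested, take them all.
    return rng.sample(eligible, min(sample_size, len(eligible)))
```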
The board also decided to require counties to report a variety of
information about the sample of juvenile offenders at the time
of their dispositions as well as the year following their dispositions.
Specifically, counties must submit data about the juvenile offenders’
characteristics as of the date of the disposition, including whether
they were enrolled in school or employed, the types of services
the juvenile offenders received during the year following their
dispositions, and the funding sources the counties used to pay
for these services. In addition, the board requires the counties to
report juvenile offenders’ educational and criminal justice outcomes
during this period.
The board gathers the performance outcomes and expenditure data
from the counties, analyzes these data, and presents the results in its
annual report to the Legislature. The report compares the juvenile
offenders who received services funded by the block grant with those who did not receive services funded by the block grant. For example,
it includes comparisons of the rate of school enrollment and the
rate of new felony adjudications in juvenile court or convictions in
adult court between juvenile offenders who received services paid
for by the block grant and those who did not. The board has issued
two annual reports since the statutory changes in 2009 that imposed
additional accountability measures.
Table 2
Board of State and Community Corrections’ Original and Revised Performance Outcomes
ORIGINAL PERFORMANCE OUTCOMES AND
REPORTING REQUIREMENTS IN STATE LAW

EXECUTIVE STEERING COMMITTEE’S MODIFIED PERFORMANCE OUTCOMES AND
REPORTING REQUIREMENTS ADOPTED BY THE
BOARD OF STATE AND COMMUNITY CORRECTIONS

Performance Outcome Reports
Reported population

Youth served by the Youthful Offender
Block Grant (block grant) funds.

A sample of juveniles with sustained felony offenses in the prior
fiscal year.

Reporting period

Preceding fiscal year.

One year following each juvenile’s disposition date for sustained
felonies. The reporting period varies for each juvenile in the sample.

Demographic characteristics

The number of youth the county
served using the block grant. Reported
characteristics include offense, age,
gender, race, and ethnicity.

The number of youth in the sample. Reported characteristics include
offense, age, gender, race, ethnicity, school enrollment, graduation
status, employment, case plan (if any), substance abuse history,
mental health history, and child welfare dependency.

Programs, placements, services

The rate of successful completion by
juvenile offenders of relevant programs,
placements, services, or strategies.

The number of juveniles in the sample who received placements
or services by the various juvenile justice funding sources that
are available.

Performance outcomes

The arrest, rearrest, incarceration, and
probation violation rates of youth in any
program or placement supported by
block grant funds.

The number of juveniles in the sample who, during and at the end
of the reporting period, were enrolled in school or were placed
on probation, graduated, received new felony adjudications or
convictions in juvenile court or adult court, and were committed to
Juvenile Justice facilities.

Quantification of the annual per capita
cost of any program, placement,
strategy, or activity.

Quantification of the total annual per capita cost of any program,
placement, strategy, or activity paid for with any block grant funding.

Actual Expenditure Reports
Financial information

Number and type of juveniles served by any program, placement, strategy, or activity paid for with any block grant funding.
Sources:  California Welfare and Institutions Code, Section 1961(c)(2), and the Board of State and Community Corrections’ annual report, performance
outcome report template, and actual expenditure report template.


Scope and Methodology
The Joint Legislative Audit Committee (audit committee)
directed the California State Auditor to conduct an audit of the
juvenile justice realignment. We conducted fieldwork at the board,
Corrections, Justice, and at the county probation departments in
Los Angeles, Sacramento, San Diego, and Yuba counties. Table 3
outlines the audit committee’s objectives and our methodology for
addressing each objective.
Table 3
Methods Used to Address Audit Objectives
AUDIT OBJECTIVE

METHOD

1	Review and evaluate the laws, rules, and regulations significant to the audit objectives.
We reviewed relevant laws, regulations, and other background materials applicable to juvenile justice realignment and the Youthful Offender Block Grant (block grant).

2	For each fiscal year, beginning in 2007–08, determine how much Youthful Offender Block Grant (block grant) funding counties have received and expended. Specifically, determine the amount of unexpended block grant funds.
•  We identified and documented legal and procedural criteria regarding block grant allocations and expenditures.
•  We obtained and documented the total block grant allocations for all counties from the State Controller’s Office (State Controller) records for fiscal years 2007–08 through 2010–11.
•  We identified total block grant expenditures by county and by service for fiscal years 2009–10 and 2010–11, using data the Board of State and Community Corrections (board) collected from counties.
•  We were unable to determine counties’ total expenditures and the total amount of their unexpended block grant funds for two fiscal years as requested because the board collected and reported county data only for fiscal years 2009–10 and 2010–11. Also, the board’s reports do not reflect any unexpended funds that counties may have retained for fiscal years 2007–08 and 2008–09. For the four counties we selected, we performed the following:
–  Obtained total block grant allocations and expenditures for fiscal years 2007–08 through 2010–11 from county financial records.
–  Determined unexpended block grant funds for fiscal years 2007–08 through 2010–11, using county financial records.
–  Identified counties’ planned uses of any unexpended block grant funds.

3	For each year since the passage of Senate Bill 81 (SB 81) in fiscal year 2007–08, determine the State’s juvenile population in the Division of Juvenile Justice (Juvenile Justice) as well as the number of juveniles within the adult prison population.
•  We obtained and analyzed data from the California Department of Corrections and Rehabilitation’s (Corrections) Offender‑Based Information Tracking System (OBITS).
•  Using OBITS, we determined the total juvenile population supervised by Juvenile Justice for fiscal years 2003–04 through 2010–11. We considered data for the four fiscal years prior to realignment to better assess trends and the impact of realignment on the juvenile offender population.
•  We did not determine the number of juveniles within the adult prison population because Corrections’ practice is not to house any juvenile offender under the age of 18 in an adult institution, which we verified by reviewing Corrections’ records.

4	Assess the trends in the number of juveniles tried as adults and sent to prison for each year subsequent to the passage of SB 81.
•  We attempted to use Corrections’ Offender-Based Information System (OBIS) to identify the number of juveniles tried as adults and sent to prison. However, as described in Table 4 on page 19, we identified an area of concern that precluded us from identifying this population.
•  We attempted to use the Department of Justice’s (Justice) Automated Criminal History System (criminal history system) to determine the number of juveniles tried as adults and sent to prison. However, as described in Table 4, we identified data limitations that precluded us from doing so.
•  We interviewed the chief probation officers at the four counties we visited regarding their perceptions of the relationship between realignment and juveniles being tried as adults.


5	To the extent data are available, determine how many juveniles in Juvenile Justice were transferred from state to local control subsequent to the passage of SB 81.
Using OBITS, we obtained and analyzed the number of juveniles transferred from the State to counties for fiscal years 2003–04 through 2010–11. We considered data for the four fiscal years prior to realignment to better assess trends and the impact of realignment on the juvenile offender population.

6	For a sample of counties, determine the following:
a)	Whether they accurately accounted for their block grant allocations and expenditures.
•  For the four counties we selected, we performed the following:
–  Obtained accounting reports of block grant allocations and compared them to the State Controller’s block grant allocation records, including remittance advices.
–  Selected five expenditures, including a payroll expenditure, from each fiscal year from 2007–08 through 2010–11. We reviewed the expenditures to determine whether they related to juvenile justice, as required by state law.
–  Traced the expenditures from the original invoices to the accounting system to determine whether they were properly recorded.
–  Compared expenditure data from accounting records to data collected by the board for fiscal years 2009–10 and 2010–11 to ensure that counties accurately reported expenditures to the board.
•  We determined that three of the four counties we visited accurately reported their block grant allocations and expenditures. Yuba County was not able to separately identify its block grant allocations or expenditures or its total juvenile justice expenditures.

b)	What types of services they provided with block grant funds and whether these services are similar to those provided by Juvenile Justice.
•  We obtained and documented a list of services provided to juveniles in Juvenile Justice.
•  For the four counties we selected, we performed the following:
–  Obtained and documented a list of services counties provided with block grant funds.
–  Determined the number of juveniles the counties served using the block grant.
–  Compared Juvenile Justice’s and the counties’ lists of services to determine whether the counties’ services were similar to those provided by Juvenile Justice.

c)	Whether they supplement block grant funds with other funding sources to provide services to juvenile offenders.
•  For the four counties we selected, we performed the following:
–  Obtained accounting records to determine the amount supplemented from other funding sources.
–  Interviewed relevant staff to determine whether they believe the amount of block grant funding is adequate to provide services to their juvenile offenders.
•  The board has interpreted state law to mean that counties can spend block grant funds not only on juvenile offenders but also on services and programs designed to prevent offenses by juveniles. As a result, counties can spend block grant funds on any aspect of their entire juvenile justice systems. Because of this, the fact that counties support their juvenile justice systems with other funds in addition to block grant funds is not an issue.

d)	The rates of admission to Juvenile Justice and to adult prison facilities for each year since the passage of SB 81.
•  Using Justice’s Juvenile Court and Probation Statistical System (JCPSS), we obtained and analyzed data regarding the number of dispositions admitting juvenile offenders to Juvenile Justice since fiscal year 2003–04. We considered data for the four fiscal years prior to realignment to better assess trends and the impact of realignment on the juvenile offender population.
•  We interviewed county staff for explanations of causes of trends in the number of juveniles admitted to Juvenile Justice.

e)	 Whether they are meeting block
grant requirements, including
those related to the annual
application process and the timely
reporting of expenditure and
performance outcomes.

•  The board has interpreted state law to mean that counties can spend block grant funds not only
on juvenile offenders but also on services and programs designed to prevent offenses by juveniles.
Based on the work we performed for objective 2, we determined that all four of the counties we reviewed
appropriately expended block grant funds on juvenile justice activities.
•  State law requires counties to submit block grant applications by May 1 and performance
outcome and expenditure reports by October 1. The four counties we reviewed submitted the required
reports and generally did so on time. The board indicates that it does not take any adverse action
against counties that fail to submit their reports on time; thus, timely submission of reports has no
effect on counties’ block grant funding.

7

Determine the extent to which
block grant information, including
performance outcomes and county
financial data, is available to the public.

We reviewed the board and county Web sites and interviewed relevant staff to determine the amount
and type of information that is available to the public.

8

Determine the State’s and counties’
level of oversight and monitoring of
the block grant.

We interviewed board and county staff for each of the four counties we reviewed and obtained
relevant criteria or policies and procedures related to monitoring the block grant.

9	Determine what enforcement actions the board can take against counties that do not meet block grant requirements, and if the board has ever taken any enforcement action.
•  We interviewed relevant board staff and documented criteria related to enforcement actions.
•  State law allows the board to enforce violations of block grant requirements by withholding counties’ block grant payments. The board’s field representative stated that the board has never taken any enforcement action against counties because it has not needed to do so and because the board believes that the State Controller is the fiduciary agent for the block grant. During our review, we found that the four counties we reviewed generally met block grant requirements.

10	Review and assess the performance standards used and the outcomes reported, including the reasonableness of the methods used to develop the standards and whether the standards are applied consistently among counties. Additionally, determine how the board measures counties’ success.
•  We interviewed board staff and reviewed board reports to determine the performance standards and outcomes the State currently uses.
•  We determined whether counties consistently report performance outcomes by interviewing board and county staff and by reviewing performance reports for the four counties we selected.
•  We assessed the reasonableness of the performance standards by analyzing limitations to the data the board currently collects and uses. In addition, we reviewed the board’s and counties’ data collection and reporting practices to determine whether they are reasonable. Finally, we interviewed members of the board and staff from the four counties we visited regarding the reasonableness of the performance standards.
•  For the four counties we selected, we performed the following:
–  Obtained and documented practices related to developing and/or tracking performance outcomes.
–  Requested and documented other methods the counties use to measure success, such as recidivism.
–  Reviewed a selection of case files to determine the accuracy of performance outcomes the counties reported to the board.

11	Based on the data the board has collected, determine which counties have most significantly increased or decreased the rates of admission to Juvenile Justice and to adult prison facilities since the passage of SB 81.
•  We did not use the data that the board collected to determine which counties have most significantly increased or decreased the rates of admission to Juvenile Justice and to adult prison facilities because the board collects data for only a sample of juveniles. Additionally, the board has collected information from counties only for fiscal years 2009–10 and 2010–11.
•  Using Justice’s JCPSS, we determined the statewide number of dispositions admitting juvenile offenders to Juvenile Justice and the number of dispositions sending juvenile offenders to adult court for all counties since realignment.

12	Determine what happens to programs found to be successful and unsuccessful and how much block grant funding the best performing and poorest performing counties receive.
•  We interviewed board staff to determine how the board measures counties’ success and to determine what, if anything, happens to counties found to be successful and unsuccessful.
•  For each of the four counties we selected, we interviewed chief probation officers to obtain their perspectives about the success of the realignment and block grant.

13	Review and assess any other issues that are significant to the realignment of juvenile offenders from state to local control. During the course of the audit, we identified additional issues that we believe are significant. Therefore, we conducted additional testing to address the following objectives:
a)	Determine whether the block grant funding formula promotes long‑term juvenile rehabilitation.
•  We identified opportunities to improve the formula through interviews of county and board staff and a review of the Juvenile Justice Operational Master Plan.
•  We reviewed fluctuations in block grant funding allocations to determine whether a decrease in counties’ juvenile felony dispositions affected their subsequent funding allocations.


b)	 Determine whether quality
juvenile justice‑related
data exists.

We reviewed Corrections’ OBIS and OBITS, and Justice’s JCPSS and criminal history system. In
addition, we reviewed the availability of data from the four counties we selected.

c)	 Assess whether fluctuations
in crime statistics could
represent an outcome
of realignment.

•  Using Justice’s JCPSS, we obtained and analyzed data for the total number of juveniles who received
dispositions for all counties for fiscal years 2003–04 through 2010–11.
•  We classified juvenile offenders as repeat and first‑time offenders. We defined repeat offenders as
any juvenile who has received two or more dispositions that were not dismissals.

d)	 Determine whether Juvenile
Justice’s expenditures
have decreased from fiscal
year 2006–07—prior to
realignment—to fiscal
year 2010–11.

We obtained and trended statewide total expenditure data for Juvenile Justice for fiscal years 2006–07
through 2010–11.

e)	 Determine counties’ other
significant sources of juvenile
justice funding.

•  For the four counties we selected, we obtained and reviewed accounting records and other
documentation to determine the major funding sources and expenditures related to providing
services to juvenile offenders.
•  We identified other juvenile justice grant programs available to counties and the performance
outcomes for those grants.

Sources:  The California State Auditor’s analysis of the Joint Legislative Audit Committee request number 2011‑129, planning documents, and analysis
of information and documentation identified in the column titled Method.

In performing this audit, we relied upon electronic data files extracted
from a variety of information systems. The U.S. Government
Accountability Office, whose standards we follow, requires us to
assess the sufficiency and appropriateness of computer‑processed
information. Table 4 shows the results of this analysis.
Table 4
Methods Used to Assess Data Reliability
INFORMATION SYSTEM

California
Department of
Corrections and
Rehabilitation’s
(Corrections)
Offender‑Based
Information Tracking
System (OBITS).
Data as of
March 2012.

PURPOSE

•  To identify the number
of juvenile offenders
supervised by the
Division of Juvenile
Justice (Juvenile Justice)
who committed certain
serious, violent, or sexual
offenses and the number
of juvenile offenders who
committed nonserious
and nonviolent offenses
for fiscal years 2003–04
through 2010–11.
•  To calculate the number of
juvenile offenders within
the adult prison population
for fiscal years 2003–04
through 2010–11.
•  To determine the number
of juveniles tried as adults
and sent to state prison
for fiscal years 2003–04
through 2010–11.

METHODS AND RESULTS
•  We performed data‑set verification procedures and electronic testing of key data elements and did not identify any issues.
•  The OBITS data capture information on all juvenile offenders who have been supervised by Juvenile Justice. However, OBITS’ data do not capture information on juvenile offenders who were tried as adults and sent directly to an adult prison without spending time in Juvenile Justice facilities. As a result, we were not able to use OBITS to identify the number of juvenile offenders within the adult prison population and categorize them by the type of crime committed, nor could we use OBITS to identify the number of juveniles tried as adults and sent to state prison for fiscal years 2003–04 through 2010–11.
•  We recently conducted a separate review of selected Corrections’ system controls, which included general and business process application controls. During this review, we identified significant deficiencies in Corrections’ general controls over its information systems. General controls support the functioning of business process application controls; both are needed to ensure complete and accurate information processing. If the general controls are inadequate, the business process application controls are unlikely to function properly and could be overridden. Due to pervasive weaknesses in Corrections’ general controls, we did not perform any testing of the business process application controls. Consequently, until Corrections implements adequate general controls over its information systems, the completeness, accuracy, validity, and confidentiality of its data will be at risk.

CONCLUSION
Not sufficiently reliable for the purposes of this audit.

continued on next page . . .

19

20

California State Auditor Report 2011-129

September 2012

INFORMATION SYSTEM

Corrections’
Offender‑Based
Information System
(OBIS).
Data as of
March 2012.

Department of
Justice’s (Justice)
Automated Criminal
History System
(criminal history
system).

PURPOSE

•  To identify the number
of juvenile offenders
within the adult
prison population for
fiscal years 2003–04
through 2010–11.
•  To determine the number
of juvenile offenders
tried as adults and sent
to state prison for fiscal
years 2003–04 through
2010–11.

To determine the
dispositions for juveniles
tried as adults for
fiscal years 2003–04
through 2010–11.

Data as of April 2012.

METHODS AND RESULTS

CONCLUSION

•  We performed data‑set verification procedures and did not identify any issues.

Not sufficiently
reliable for the
purposes of
this audit.

•  We performed electronic testing of key data elements and identified an  
area of concern related to our audit objectives. Specifically, we identified
a significant number of incomplete offense dates for offenders in the OBIS
system. Without complete offense date information, we were unable to
determine whether offenders were juveniles at the time they committed
their offenses. As a result, we were not able to use OBIS to identify the
number of juvenile offenders within the adult prison population or to
identify the number of juveniles tried as adults and sent to state prison for
fiscal years 2003–04 through 2010–11.
•  We conducted a separate review of selected Corrections’ system
controls, which included general and business process application controls.
During this review, we identified significant deficiencies in Corrections’
general controls over its information systems. General controls support the
functioning of business process application controls; both are needed to
ensure complete and accurate information processing. If the general controls
are inadequate, the business process application controls are unlikely to
function properly and could be overridden. Due to persuasive weaknesses in
Corrections’ general controls, we did not perform any testing of the business
process application controls. Consequently, until Corrections implements
adequate general controls over its information systems, the completeness,
accuracy, validity, and confidentiality of its data will continue to be at risk.
•  We experienced trouble in completing data‑set verification procedures due
to the fact that Justice counted the records in the criminal history system
tables that it provided to us before it actually extracted the files from the
system. Since the criminal history system is a live system, it is constantly
updating. As a result, many of the record counts Justice provided to us do
not match the number of records in the tables we used. They are, however,
reasonably close.

Not sufficiently
reliable for the
purposes of
this audit.

•  We did not conduct accuracy and completeness testing on the data
because Justice receives the data in the criminal history system from local
law enforcement agencies, district attorney offices, and courts throughout
California, making such testing impractical.
•  Many juvenile offenders in the criminal history system were erroneously
categorized as having received adult dispositions. During our fieldwork,
the assistant bureau chief in Justice’s Bureau of Criminal Information
and Analysis indicated that inconsistencies in training and procedural
documentation, coupled with a heavy workload and high turnover, have led
to Justice’s technicians assigning adult disposition codes to juvenile records.
In addition, Justice indicated that some local law enforcement agencies
submit juvenile dispositions using adult disposition codes. However, Justice
was unable to provide us with an example of an incorrect submission. We
discuss this issue in more detail in Chapter 1.

Justice’s Juvenile
Court and Probation
Statistical System
(JCPSS).
Data as of April 2012.

•  To determine the total
number of first‑time and
repeat juvenile offenders
in each county for fiscal
years 2003–04 through
2010–11.

•  We performed data‑set verification procedures and electronic testing of key
data elements and did not identify any issues.
•  We did not conduct accuracy and completeness testing on the data
because Justice receives the data in the JCPSS from 57 of 58 counties’
probation departments located throughout California, making such
testing impractical.

•  To determine the number
of first‑time and repeat
juvenile offenders in
each county who had
their cases direct filed or
remanded to adult court
for fiscal years 2003–04
through 2010–11.
Sources:  Various documents and data from Corrections and Justice.

Undetermined
reliability for
the purposes of
this audit.

California State Auditor Report 2011-129

September 2012

Chapter 1
AVAILABLE DATA RELATED TO REALIGNMENT ARE
LIMITED AND COULD BE MISLEADING
Chapter Summary
Despite the significant potential human consequences and financial
impact of the State’s decision to shift the care of thousands of
juvenile offenders from the California Department of Corrections
and Rehabilitation’s (Corrections) Division of Juvenile Justice
(Juvenile Justice) to the counties, very limited data exist to measure
whether this realignment has been successful. Although state law
requires the Board of State and Community Corrections (board) to
submit an annual report to the Legislature that contains Youthful
Offender Block Grant (block grant) funds performance outcomes
and county expenditure data, the board currently collects and
reports county data that may not accurately represent the outcomes
related to either the block grant or realignment as a whole. These
reports, which are based on a flawed methodology, could lead
decision makers and the public to draw misleading conclusions
about the effectiveness of the block grant and realignment. For
example, the board’s reports focus primarily on the counties’ use
of block grant funds even though outcomes for juvenile offenders
cannot always be directly correlated to the block grant using the
board’s current methodology. The usefulness of the reports is
further eroded because the board does not give adequate guidance
to counties and does not adequately verify the accuracy of the
information it collects from them. As a result of these problems,
decision makers should not use the reports to assess the success or
failure of either realignment or the block grant.
Because the board’s reports cannot be used to assess the outcomes
of realignment, we had hoped that we could rely on county or
statewide data for this purpose; however, we discovered data
limitations at both the county and state level. For example,
three of the four counties we visited are not able to easily report
realignment outcomes. Further, the Department of Justice (Justice)
cannot ensure the reliability of the state‑level data within its
systems, a problem that is further exacerbated by inherent technical
shortcomings in one of its databases. Finally, Corrections does not
have a system that is capable of identifying the number of juvenile
offenders tried as adults and sent to adult prison. As a result of
these limitations, neither we nor decision makers can meaningfully
assess the outcomes of realignment at this time.

21

22

California State Auditor Report 2011-129

September 2012

The Board’s Annual Reports on the Block Grant Could Mislead
Decision Makers
State law does not require the board to include an assessment of
the outcomes of realignment in its annual report on block grant
outcomes. However, the board’s reports include data that could
mislead decision makers and the public. In addition, the board
aggregates data and presents trends only on a statewide level.
Therefore, the trends within a given county may be obscured by
the data reported by other counties in the State.
The Board Reports on Outcomes That May Not Be Representative
of Realignment

The board’s reports are based
on a flawed methodology and
could mislead decision makers to
potentially inaccurate conclusions,
including making it appear that the
block grant and realignment are
not effective.

The outcomes developed and reported by the board may not
accurately represent the outcomes of realignment. As discussed
in the Introduction, the Legislature amended the realignment
law in 2009 to include accountability mechanisms, including a
requirement that counties submit block grant performance and
expenditure data to the board. Because the law requires the board
to compile these data into annual reports that it submits to the
Legislature, we would expect the reports to allow the Legislature to
draw conclusions regarding the success or failure of realignment.
However, the board’s reports are based on a flawed methodology
and could mislead decision makers to potentially inaccurate
conclusions, including making it appear that the block grant and
realignment are not effective.
The board’s reports may not reflect the outcomes of realignment
in part because the law requires the board to focus its reports
primarily on the counties’ use of block grant funds rather than on
their entire juvenile justice systems. Although state law requires
the board to submit annual reports regarding counties’ use of block
grant funds, attempting to assess the outcomes of realignment
through an examination focused on the use of these funds is not
meaningful for several reasons. First, the board has interpreted state
law to mean that counties can spend block grant funds to enhance
their juvenile justice systems as a whole rather than requiring them
to spend the funds on specific juvenile offenders who might have
been sent to Juvenile Justice prior to realignment. In addition,
counties may use a number of different sources of funds to serve
realigned juvenile offenders, including those shown in Table 1 in the
Introduction, rather than solely using block grant funds. As a result,
the outcomes of realignment cannot be directly correlated to the
block grant using the board’s current methodology.

California State Auditor Report 2011-129

September 2012

Nonetheless, the board’s reports compare outcomes for juvenile
offenders who receive block grant services and those who do
not. Such a comparison implies that one can identify block grant
outcomes—and thus realignment outcomes—by examining
the outcomes for juvenile offenders that receive block grant
services. According to the board’s field representative, the board
chose to present a comparison of these two groups in order
to assess whether juveniles’ outcomes improved when they
participated in programs and services funded by the block grant.
However, the field representative also noted that presenting
comparisons of outcomes is valid only if no one attempts to derive
any conclusions from the comparisons because counties select which
juvenile offenders receive block grant‑funded services. The board’s
reports similarly disclose that caution must be taken in drawing
conclusions because the board has no information from counties
concerning the juveniles who receive these services. If the board
did not intend for the Legislature to draw conclusions from these
comparisons, we question why it elected to present the comparisons
at all, especially given that the results can be misleading.
The misleading nature of these results is caused in part by the board’s
decision not to report on the type of juvenile offenders upon whom
counties choose to spend their block grant funds, such as high‑risk
offenders or juvenile offenders at various risk levels. Among the
four counties we visited, Los Angeles County (Los Angeles) and
San Diego County (San Diego) reported focusing their block grant
funds on their high‑risk or higher‑risk offenders—such as those who
are considered most likely to reoffend—while Sacramento County
(Sacramento) and Yuba County (Yuba) reported spending block
grant funds on nearly all types of juvenile offenders. It is reasonable
to assume that criminal justice outcomes for juvenile offenders within
counties that spend block grant funds only on high‑risk offenders
would be worse than outcomes in counties that spend block grant
funds on lower‑risk offenders because high‑risk offenders are more
likely to be convicted of new offenses.
The data we reviewed generally support this assumption. For
example, as shown in Table 5 on the following page, in fiscal
year 2010–11 Los Angeles, a county that focuses its spending
on higher‑risk offenders, reported to the board that 6 percent
of the juvenile offenders in its sample who received block
grant‑funded services were convicted of new felonies in adult court,
compared to only 1 percent of the juvenile offenders who did not
receive block grant‑funded services. Conversely, Sacramento, which
uses block grant funds to serve juvenile offenders at various risk
levels, reported to the board that 6 percent of its juvenile offenders
who received block grant services were convicted of new felonies in
adult court, compared to 14 percent of the juvenile offenders who
did not receive block grant‑funded services. These results likely

The misleading nature of these
results is caused in part by the
board’s decision not to report on
the type of juvenile offenders upon
whom counties choose to spend
their block grant funds.

23

24

California State Auditor Report 2011-129

September 2012

do not signify that realignment is succeeding in Sacramento and failing
in Los Angeles; rather, they present different outcomes for counties
that elected to spend their block grant funds in different ways.
Table 5
Selected Performance Outcomes for the Counties We Reviewed as Reported to the
Board of State and Community Corrections
Fiscal Year 2010–11
JUVENILE OFFENDERS WHO RECEIVED
BLOCK GRANT SERVICES
SUMMARY OF PERFORMANCE OUTCOMES

JUVENILE OFFENDERS WHO DID NOT
RECEIVE BLOCK GRANT SERVICES

YES

NO

YES

NO

61%

39%

70%

30%

Los Angeles County
Was the youth enrolled in school at the end of the one year
reporting period?
Did the youth graduate from high school or achieve a General
Education Development (GED) test or equivalent?
Did the youth receive a new felony adjudication (juvenile court)?
Did the youth receive a new felony conviction (adult court)?

4

96

5

95

17

83

11

89

6

94

1

99

Sacramento County
Was the youth enrolled in school at the end of the one year
reporting period?
Did the youth graduate from high school or achieve a GED test
or equivalent?
Did the youth receive a new felony adjudication (juvenile court)?
Did the youth receive a new felony conviction (adult court)?

56%

44%

57%

43%

6

94

0

100

16

84

0

100

6

94

14

86

81%

19%

San Diego County
Was the youth enrolled in school at the end of the one year
reporting period?

81%

19%

Did the youth graduate from high school or achieve a GED test
or equivalent?

13

87

9

91

Did the youth receive a new felony adjudication (juvenile court)?

6

94

8

92

Did the youth receive a new felony conviction (adult court)?

6

94

4

96

NA

NA

Yuba County
Was the youth enrolled in school at the end of the one year
reporting period?

100%

0%

Did the youth graduate from high school or achieve a GED test
or equivalent?

0

100

NA

NA

Did the youth receive a new felony adjudication (juvenile court)?

0

100

NA

NA

Did the youth receive a new felony conviction (adult court)?

0

100

NA

NA

Sources:  California State Auditor’s analysis of data from the Board of State and Community Correction’s performance outcome reports submitted by
counties for fiscal year 2010–11.
Note:  The one‑year reporting period is the one year following the juvenile offender’s adjudication date, which varies for each juvenile offender in
the sample.
NA = Not applicable. All of the juvenile offenders in the Yuba County sample received block grant services.

California State Auditor Report 2011-129

September 2012

The board’s reports may also be misleading because they do not
include outcomes for every type of juvenile offender who receives
block grant services. Rather, the board focuses only on juvenile
offenders who have committed felonies and who might have been
sentenced to Juvenile Justice before realignment. As a result,
outcomes may be skewed toward more negative results than if
the board included outcomes for all types of juvenile offenders.
As mentioned previously, the board allows counties to spend
block grant funds in many different ways, including services for
juveniles who have not committed felonies. For example, according
to the board’s data, 17 counties reported that they had elected to
spend block grant funds on juveniles who were at‑risk but had not
yet become involved in the juvenile justice system. In addition,
44 counties reported spending money on programs, placements,
or services that serve offenders with misdemeanors as opposed to
serving only those offenders with felonies. However, the board does
not incorporate outcomes for juveniles who are at‑risk or who have
committed only misdemeanors into its reports.
Because of these deficiencies, the board’s reports could suggest
that realignment has been ineffective, which may misrepresent the
facts. In particular, the board stated in the executive summaries of
both of its reports that a significantly higher percentage of juvenile
offenders who had received block grant‑funded services statewide
had new felony adjudications compared to those who had not
received block grant‑funded services. Based on this information,
decision makers could conclude that the block grant is actually
increasing the likelihood that a juvenile will reoffend, when it would
be more accurate to conclude simply that some counties have
focused their use of block grant funds on high‑risk offenders.
Although the board cautions against drawing conclusions from
the results that it presents in its reports, we question why it would
create and issue reports that do not allow decision makers to make
determinations regarding whether realignment is working. The
four chief probation officers of the counties we visited believe that
the outcomes the board collects and reports do not accurately
reflect the outcomes of realignment.9 In addition, one of the
executive steering committee’s (committee) co‑chairs stated that
although the decrease in crime statewide demonstrates that juvenile
realignment has been effective, the board’s reports do not reflect
this trend. By issuing misleading reports, the board is missing an
opportunity to inform decision makers and the public about the
impact of realignment.
9	

For example, the chief probation officer of San Diego believes that the measures collected by the
board are more process than outcome focused. He suggested that to better track the impact of
the block grant, the measures should include data such as arrests during probation supervision,
employment rates of the targeted population, or recidivism of juvenile offenders served in block
grant-funded programs.

The board’s reports could suggest
that realignment has been
ineffective, which may misrepresent
the facts.

25

26

California State Auditor Report 2011-129

September 2012

The board could address some of the weaknesses in its reports
by collecting and reporting additional information. For example,
it could ask counties to report outcomes for all juveniles they
served using block grant funds, including those who commit
misdemeanors only or who have not yet committed offenses.
This would also address a problem identified by Sacramento’s chief
probation officer, who stated that the board’s sample is too small
and thus may not accurately reflect the entire population within the
county. The board’s field representative agreed that it would make
sense to have counties report countywide statistics such as rearrest
rates, new juvenile adjudications, and new convictions in adult
court; further, she acknowledged that these statistics would allow
decision makers to see a county’s overall philosophy and approach to
juvenile justice. In addition, one of the committee’s co‑chairs stated
that asking counties for more global information, such as the number
of minors in detention and the juvenile crime rate, would provide a
more accurate representation of the outcomes of realignment.
The board’s field representative
believes that asking counties to
report additional data regarding
the outcomes of realignment would
be onerous and may constitute a
state mandate.

However, the field representative also believes that asking counties
to report these data would be onerous and may constitute a state
mandate. While we acknowledge that data collection could be
challenging because, as we discuss later, counties generally do not
have systems capable of producing outcome data, counties already
report several pieces of key information, such as arrest rates and
the rates of successful completion of probation, for the 10 other
state and federal grant programs the board administers. Thus, the
changes we are suggesting may not necessarily require counties to
collect additional information. Further, to minimize the potential
for creating a state mandate, the committee and board could work
with counties to determine what outcome information is already
collected and reported by counties.
The Board Primarily Reports Aggregate Data, Making It Difficult to
Assess Each County’s Performance
State law requires the board to prepare and make available to the public
on its Web site summaries of the annual performance outcome reports
that counties submit. However, the board aggregates the data the
counties submit and presents trends only on a statewide level. When
presented in aggregate, statewide data may obscure the trends within
a given county. For example, using the data that counties submitted
to the board, we determined that the percentage of juvenile offenders
enrolled in school increased by 2 percent statewide between fiscal
years 2009–10 and 2010–11. However, when we looked at the data for
each county, we noted that the enrollment numbers for 10 counties
contradicted this trend. For example, the percentage of juvenile
offenders enrolled in school in Sacramento reportedly decreased by
10 percent between fiscal years 2009–10 and 2010–11.

California State Auditor Report 2011-129

September 2012

Moreover, the board does not provide expenditure information for
individual counties in its annual reports to the Legislature. As a
result, stakeholders cannot use the annual reports to determine how
a specific county’s spending on juvenile justice compares to that of
other counties in the State. Currently, for each placement or service
that receives block grant funds, the board reports per capita costs
statewide rather than by county. For example, in fiscal year 2010–11,
15 counties reported spending between $104 and $309,000 of
block grant and other funds per juvenile offender for juvenile halls.
However, it is unclear what services are included within these costs.
In addition, the board did not disclose which county spent the
least per juvenile offender (Inyo) and which county spent the most
(San Mateo). Because variances in funding can provide insights into
how a county manages its juvenile justice system, we believe decision
makers should know which county spent $104 per juvenile offender
and which spent 3,000 times that amount, if those amounts are
accurate. Opportunities such as this allow stakeholders to identify
potential efficiencies or inefficiencies and to understand the different
approaches that counties employ for the services they provide.
Finally, because the board aggregates data, its reports generally
reflect only the outcomes for large counties. To evaluate how well
the statewide performance outcome data represents all counties,
we divided the counties into three groups—small, medium, and
large—based on the size of their annual block grant expenditures.10
Our review found that the vast majority of the juvenile offenders
in the sample analyzed by the board—883 of the 1,011 juveniles,
or 87 percent—were from large counties. However, in some cases,
smaller counties’ reported outcomes differed markedly from
those of the large counties and thus from the statewide trends.
For example, the board’s compiled data for fiscal year 2009–10
shows that only 9 percent of the juvenile offenders statewide
graduated from high school or obtained equivalent diplomas.
However, 17 percent of the offenders in the sample from the small
counties achieved this educational goal. Because the board reports
predominantly on the outcomes in large counties, neither decision
makers nor the public have enough information to assess the
performance of small or medium counties.
According to the field representative, the board does not present
county‑level data because it believes such a presentation could
violate the confidentiality of that data. For example, she stated
that if the board reported county‑level performance outcomes,
the counties with very small juvenile offender populations would
be forced to disclose performance outcome data that might allow

10	

We classified counties receiving over $1 million annually in block grant funds as large, those
receiving $200,000 to $1 million as medium, and those receiving less than $200,000 as small.

Because the board reports
predominantly on the outcomes
in large counties, neither decision
makers nor the public have
enough information to assess
the performance of small or
medium counties.

27

28

California State Auditor Report 2011-129

September 2012

identification of certain juveniles because many of the counties
had only one or two juveniles in their samples. However, when
we discussed with the board the option of presenting certain
information, such as the information we display in Table 5 on page 24,
including the number of juvenile offenders who receive a new
conviction, the staff agreed that this would not violate confidentiality.
The Board Could Improve the Quality of the Information That It
Collects From Counties
Because the board has provided the counties with insufficient
guidance, they at times submit inaccurate and inconsistent
information. Although the board takes some limited steps to verify
the accuracy of the information the counties submit, these could
be improved to ensure that it detects and addresses problems.
According to the board’s field representative, the block grant did
not provide the board funding for its administration, and thus its
block grant oversight is limited. However, we believe that correcting
the weaknesses we noted in the board’s existing process for data
collection would not significantly increase the board’s current
oversight efforts. In addition, with even minimally improved data, the
board’s annual reports to the Legislature would allow policy makers
to make more informed decisions about juvenile justice, including
outcomes related to realignment and the block grant program.
The Board Provides Insufficient Guidance to the Counties
The counties report inconsistent
information to the board, which
further limits the usefulness
of the board’s annual reports to
the Legislature.

Although the board provides some instructions to the counties
regarding the information it requires them to track and submit,
the guidance it provides is not sufficient. As a result, the counties
report inconsistent information to the board, which further limits
the usefulness of the board’s annual reports to the Legislature. The
board includes instructions, frequently asked questions, and contact
information on the annual reports it requires counties to submit.
For example, the board requires the counties to report the services
they have provided to juveniles during the previous one‑year
period. However, it does not specify how counties should define
what constitutes receiving a service. As a result, the counties we
visited take different approaches for reporting this information. For
example, Sacramento reports a juvenile offender as having received
a service such as a drug treatment program, which frequently
takes a month or more to complete, if he or she participated for at
least one day, but San Diego reports a juvenile offender as having
received a service only if he or she successfully completed it.
According to the board’s limited guidance, both interpretations
could be correct, yet any analysis that attempted to derive
conclusions using these numbers would yield questionable results.

California State Auditor Report 2011-129

September 2012

Furthermore, the board has knowingly allowed counties to continue
submitting inconsistent information. As mentioned previously,
each year counties must submit expenditure reports with their
actual expenditures of block grant funds as well as the number
and type of juveniles served by those expenditures. However, we
found that counties use different criteria in reporting the number
of juveniles who received the same type of block grant‑supported
services. For example, 10 counties indicated on their expenditure
reports for fiscal year 2010–11 that they spent a total of $492,000 of
block grant funding on staff training and professional development.
Although eight of these counties did not indicate that any juveniles
were served by these expenditures, two counties, Inyo and Sierra,
specified that all 118 of their juveniles benefited from this service.
When we asked about this, the board’s field representative
acknowledged the inconsistency. She stated that although she
encourages counties either to omit the number of juvenile
offenders served indirectly by expenditures such as staff training
or to reclassify the expenditures to a direct service category,
some counties insist that certain expenditures serve their juvenile
offenders. Yet despite its knowledge of such inconsistencies, the
board uses this information in its reports. For example, the board
attempts to quantify the number of juvenile offenders who receive
block grant services by adding the number of juvenile offenders
each county reports as having received these services. In this
instance, the board’s lack of explicit guidance could cause the
reports to imply that no juveniles benefited from staff training and
professional development in at least eight counties.
According to the field representative, the board does not have any
policies or procedures related to the administration and oversight of
the block grant. In addition, we found that none of the four counties
we visited have policies or procedures specifically related to
administering the block grant. However, San Diego maintains
policies for its Youthful Offender Unit that it believes are sufficient
for governing the use of block grant funds. Nonetheless, because
the board has not provided sufficient guidance to counties, it
cannot ensure the information it receives and reports is useful.
The Board Does Not Adequately Verify the Accuracy of the Data the
Counties Submit
The usefulness of the board’s reports is further compromised
by the fact that it does not perform sufficient reviews to ensure the
accuracy of the data it collects from counties. As a result, three of
the four counties we visited submitted inaccurate data, yet their
errors went undetected by the board. For example, Sacramento
reported that four of the 57 juvenile offenders in the board’s

The board does not have any
policies or procedures related to
the administration and oversight
of the block grant.

29

30

California State Auditor Report 2011-129

September 2012

fiscal year 2010–11 sample had received new felony convictions in
adult court. However, when we examined the case file of one of
these offenders, we found that the juvenile offender had not
received a new felony conviction. The other three juvenile
offenders with new felony adjudications were not among the
five case files we reviewed, so we do not know whether they
received new felony convictions. Even so, Sacramento's reporting
error is of concern because fluctuations in the number of repeat
offenders could be considered a key outcome of realignment. If these types of
inaccuracies occur frequently, the board’s reports may be significantly
misinforming users about key criminal justice outcomes. In our
testing, we also found that Los Angeles reported similar inaccurate
information for one juvenile offender and San Diego reported similar
inaccurate information for two juvenile offenders.
Although we did not find evidence that Yuba has submitted
the same sort of inaccurate data to the board as the other
three counties, our review revealed accounting and reporting
practices that make it likely Yuba will submit inaccurate data to
the board in the future. In particular, Yuba could not separately
identify its block grant expenditures because it records all of these
expenditures along with other expenditures occurring within its
general fund. Beginning in fiscal year 2012–13, however, Yuba
accounts for its block grant funds in a separate account. Moreover, we
discovered that Yuba deliberately submitted incorrect performance
outcome data for every juvenile offender in its sample. Despite the
board’s instructions to report on the one‑year period following
each juvenile offender’s disposition, Yuba chose to report on a
different time period—the fiscal year during which the disposition
occurred—which it believed was more meaningful because the
board’s time frame would not have captured all the outcomes for
the juvenile offenders in Yuba’s sample. 

While Yuba’s reported results are unlikely to skew the board’s data
because it is a small county, the board’s reports will lack accuracy
if other counties also report incorrect expenditure or performance
outcome data. Although the primary responsibility for submitting
accurate information rests with the counties, we would expect
the board to conduct sufficient reviews to assure the accuracy of the
data it reports. However, we found that even though the board does
attempt to verify some of the data collected, its current efforts are
limited in nature. According to the board’s field representative, the
board imports performance outcome data from the counties into
a statistical analysis program and produces reports to determine
whether they contain incomplete or potentially erroneous data.
It then follows up with counties if the program identifies any
errors. However, to find the types of errors that we identified,
the board would need to verify the data it received from counties
by conducting regular site visits or providing similar monitoring

such as reviewing documentation from counties to validate the
sample data that they submitted to the board. The board indicated
that it has performed only one site visit since the establishment of
the block grant program in 2007, which it conducted because a
member of its governing body requested that it visit Los Angeles.
According to the board’s field representative, the board has not
received approval for funding to monitor the counties’ use of
block grant funds and instead has decided to visit counties only by
request. In addition, the field representative stated that the board
has assigned only one part‑time employee to work on the block
grant. Nevertheless, the board has not explored ways to minimize
the costs of verifying county data, such as adopting a risk‑based
approach using the results of its statistical analysis or reviewing
data for a sample of counties each year.
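The kind of automated screening described above, in which county submissions are flagged for incomplete or potentially erroneous entries, can be illustrated with a short sketch. The field names, allowed values, and sample records below are hypothetical illustrations, not the board's actual program or data.

```python
# Illustrative sketch of automated screening of county-submitted records.
# Field names and validation rules are hypothetical, not the board's actual checks.

REQUIRED_FIELDS = ("county", "offender_id", "disposition_date", "new_felony_conviction")

def screen_record(record):
    """Return a list of problems found in one county-submitted record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    # Values outside the expected domain count as "potentially erroneous."
    if record.get("new_felony_conviction") not in (None, "", "yes", "no"):
        problems.append("invalid new_felony_conviction value")
    return problems

def screen_submission(records):
    """Map record index -> problems, for follow-up with the county."""
    return {i: p for i, r in enumerate(records) if (p := screen_record(r))}

sample = [
    {"county": "Sacramento", "offender_id": "S-001",
     "disposition_date": "2010-09-01", "new_felony_conviction": "yes"},
    {"county": "Yuba", "offender_id": "Y-014",
     "disposition_date": "", "new_felony_conviction": "maybe"},
]
print(screen_submission(sample))  # only the second record is flagged
```

As the audit notes, record-level checks of this kind can catch only missing or out-of-domain values; they cannot detect an entry that is well formed but substantively wrong, which is why reviewing county documentation or conducting site visits is needed to find errors such as Sacramento's.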
A Meaningful Assessment of Realignment Outcomes Is Difficult
Because of the Poor Quality of Available Data
As already discussed, we identified numerous problems with
the board’s reports that raise questions concerning the accuracy
of the data the reports present and their usefulness in drawing
meaningful conclusions. For this reason, quality county‑ and
state‑level juvenile justice data are even more essential for
stakeholders to determine the outcomes of realignment. However,
we discovered data limitations at both the county and state level.
Specifically, three of the four counties we reviewed do not have data
systems that are capable of generating reports on the outcomes
related to realignment. At the state level, Justice cannot provide
assurance that data within its systems are accurate or reliable,
and Corrections’ data are not always complete. Until these data
limitations are addressed, any assessments of the outcomes of
realignment may be misleading.
County‑Level Data Related to the Outcomes of Realignment Are Limited
Several key stakeholders have recognized that quality county‑level
data are critical for tracking statewide juvenile justice trends. As
noted in the Introduction, state law required the State Commission
on Juvenile Justice (commission) to develop strategies related
to realignment through the creation of the Juvenile Justice
Operational Master Plan (master plan). The commission concluded
that every county should have a data system that captures the
elements necessary to assess outcomes. In addition, the master
plan emphasizes a statewide need for data reporting and for
a system to measure intermediate‑ and long‑term outcomes.
Two chief probation officers we interviewed also stressed the

importance of quality juvenile justice data to assess realignment
outcomes. Sacramento’s chief probation officer, who was one of
the commission’s co‑chairs, stated that data systems and data
management are very important for recording outcomes and thus
for tracking the effectiveness of programs and services, and the
chief probation officer for Los Angeles made similar observations.
Yet despite the agreement on the need for quality county‑level
juvenile justice data to assess the outcomes of realignment,
Sacramento’s chief probation officer noted that most counties’ data
systems, including Sacramento’s, were not designed for tracking
and reporting outcome information. The chief probation officer
for Los Angeles also acknowledged that data limitations are a
major impediment to assessing the outcomes of realignment. Our
review confirmed this: we identified limitations that hindered our
ability to analyze county‑level data. Specifically, we
found that three of the four counties we visited were not capable
of generating reports that we could use to measure outcomes of
realignment. For example, Sacramento could not generate
historical information about its total juvenile caseloads, so we
were unable to assess certain trends over time. Yuba also could
not generate reports related to certain realignment trends, such
as the fluctuations in the number of juveniles tried as adults over
time. Finally, according to a deputy chief in the Los Angeles County
probation department, Los Angeles’s current system is capable
of reporting data related to outcomes; however, the data are not
easily available. Los Angeles estimated that full implementation of
its system, which will allow it to easily generate applicable data for
reports, will be completed in two years.
Of the four counties, only San Diego was able to generate
performance outcome reports related to realignment using several
data sources. Specifically, San Diego can produce reports on the
programs and services specific juvenile offenders receive, and it can
also generate statistics for all juvenile offenders within the county.
For example, San Diego can create outcome reports related to
juvenile offenders who received employment readiness services,
as well as the recidivism rates of juvenile offenders who terminate
probation. According to the board’s field representative, San Diego
surpasses most other counties in terms of data capabilities.
San Diego’s ability to generate these reports makes it easier for
stakeholders to evaluate the outcomes of realignment within the
county and could be considered a best practice for other counties
to follow if resources permit.
We recognize that budget constraints may limit some counties’
ability to upgrade their data systems to make them capable of
generating such reports. However, the board could do more to
ensure that stakeholders have access to the limited county‑level

information that is already available. For example, the master plan
notes that most counties record certain critical information about
program participation in electronic data systems; however, none
of the four counties we reviewed provided information regarding
performance outcomes or financial data relating to the block grant
on their Web sites. Moreover, as noted previously, counties already
report several key pieces of outcome information for the 10 other
state and federal grant programs the board administers. Thus, the
changes we are suggesting may not require counties
to collect additional information. By working with counties to
determine the data that are currently available and ensuring that
these data are made available to the public, the board could provide
stakeholders with more information that would enable them to
better assess the outcomes of realignment.
Justice Cannot Provide Assurance That Its Juvenile Justice Data Are
Reliable for Assessing Certain Outcomes of Realignment
According to Justice’s Web site, the Office of the Attorney General
has a duty to collect, analyze, and report statistical data that
provide valid measures of crime and the criminal justice process
to the government and the citizens of California. Although Justice
indicates that it did not design its databases for the purpose of
assessing the outcomes of realignment, Justice could do more to
ensure that its data are accurate. By not ensuring that its databases
contain accurate information, Justice limits the usefulness of the
information it collects.
Justice’s primary system for tracking juvenile justice‑related
information is the Juvenile Court and Probation Statistical
System (JCPSS). The JCPSS collects juvenile offender data from
57 counties, including names, birthdates, number of arrests,
referrals to probation departments, and dispositions in juvenile
court.11 According to its JCPSS user manual, Justice compiles these
data into reports that aid decision makers, including the Office of
the Governor (governor’s office) and the Legislature, in allocating
resources, planning for the future, and developing new ways to
deal with juvenile delinquency problems. However, Justice cannot
provide assurance that the data it uses to produce these reports
are reliable, thereby limiting the reports’ usefulness as an aid for
policy making or for assessing the outcomes of realignment.

11	Sierra County does not submit data to the JCPSS. According to its chief probation officer, the
county was unaware of the system's existence. Further, he indicated that Sierra County does not
have an advanced case management system that would allow it to easily capture the related
information and that it has a very small juvenile caseload.

The program manager who oversees the JCPSS indicated that
Justice designed the system to track statistics only within individual
counties rather than statewide trends and that the system was
not intended to track individual juveniles. However, even given
these limitations, Justice could do more to ensure that the system
contains accurate information. According to the program manager,
Justice has several processes in place to ensure the accuracy of
the JCPSS data. For example, he stated that Justice programmed the
JCPSS to perform regular data validation checks and quarterly
quality control checks. In addition, he stated that Justice provides
counties with an annual summary report of the data they have
submitted and conducts a semiannual survey to have counties
confirm the number and completeness of cases for juveniles
sent to adult court. However, while all four counties we visited
confirmed that Justice asks them to verify the number of juveniles
tried in adult court, three of the four counties indicated that they
did not receive summary reports of the data they had submitted
for verification purposes. Los Angeles noted that although Justice
has been sending summary reports to it since 2004, Justice asked
it to verify and confirm the data in the report for the first time in
March 2012, after we began our fieldwork. The program manager
did not know why the three other counties did not confirm
receiving Justice’s summary reports.

Justice also has an internal procedure to conduct occasional audits
of counties’ records. The program manager stated that Justice
does not conduct these audits primarily because its databases
are repositories of the data the counties submit and therefore the
counties are responsible for their accuracy. However, we question
why Justice would have a procedure to conduct occasional audits to
verify the accuracy and completeness of county‑level data if it does
not intend to conduct those audits.
An additional data limitation precludes using the JCPSS to fully
assess certain outcomes of realignment, such as the number
of repeat offenders. Specifically, any analysis of the number of
first‑time and repeat offenders using JCPSS data may misclassify
juvenile offenders because the system does not assign each juvenile
offender a single identification number that is unique statewide,
regardless of where the offenses are committed. Instead, Justice
allows counties to assign their own unique identification numbers
to juveniles within their counties.
Each time a juvenile commits an offense in another county, that
county assigns him or her a new unique identification number,
identifying the juvenile as a first‑time offender in that county no
matter how many offenses he or she may have committed in other
counties. As a result, our analysis of the JCPSS may misclassify
juveniles as first‑time offenders even though they previously
committed offenses in other counties.
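The misclassification risk described above can be illustrated with a small sketch. The records and identification numbers below are hypothetical; the point is that distinct county-assigned identifiers fragment one youth's statewide history.

```python
# Sketch of why county-assigned IDs misclassify repeat offenders.
# Each tuple is (county, county_local_id) for one disposition; the data
# are hypothetical. The same youth offends in Sacramento and then in
# Yuba, receiving a different local identification number in each county.
dispositions = [
    ("Sacramento", 101),  # youth A, first offense
    ("Sacramento", 101),  # youth A, repeat offense in the same county
    ("Yuba", 7),          # youth A again, but Yuba assigns a new ID
]

def count_first_time(records):
    """Count 'first-time' offenders by distinct (county, id) pairs,
    mirroring how county-scoped IDs fragment one person's history."""
    return len({(county, cid) for county, cid in records})

# County-scoped IDs see two 'first-time' offenders; a statewide
# identifier (such as one based on fingerprints) would see one person.
print(count_first_time(dispositions))  # prints: 2
```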

The program manager noted that the modifications to the JCPSS
that would be necessary to track statewide statistics would add
significant costs and require Justice to comply with new state and
federal laws regarding the collection of such data. Specifically, for
the JCPSS to reliably track individual juveniles, Justice would need
to incorporate into the system a method for positively identifying
individuals, such as fingerprints. The program manager also
noted that making these modifications to the JCPSS would be
redundant because Justice designed another one of its systems, the
Automated Criminal History System (criminal history system), to
track individuals.
However, we have concerns about the reliability of the data in
the criminal history system. According to its Juvenile Detention
Disposition Manual, Justice uses the criminal history system to
provide stakeholders with complete and accurate information for
making budgetary decisions and for statistically evaluating crime
prevention programs and evaluating existing and proposed laws.
Like the JCPSS, the criminal history system contains information
including names, birthdates, number of arrests, and dispositions.
According to the program manager, the criminal history system
uses biometric information—fingerprints—to assure that criminal
history information is associated with a specific individual.
Nevertheless, when we analyzed the data in the criminal history
system, we found that we could not reliably determine the
dispositions for juveniles tried as adults, which could be considered
a key outcome of realignment. In particular, for fiscal year 2007–08,
the criminal history system data that we analyzed indicated
that more than 21,000 juvenile offenders received adult court
dispositions. However, the JCPSS indicated that only 1,115 juvenile
cases were referred to adult court in that same fiscal year. When
we discussed this issue with the assistant bureau chief in Justice’s
Bureau of Criminal Information and Analysis, she indicated that
inconsistencies in training and procedural documentation have
led to Justice’s technicians incorrectly assigning adult disposition
codes to juvenile court records in the criminal history system.
In addition, Justice indicated that some local law enforcement
agencies submit juvenile dispositions using adult disposition codes.
However, Justice was unable to provide us with an example of
such an incorrect submission. Because so many juvenile offenders
were erroneously categorized as having received an adult court
disposition, we determined that we could not use Justice’s criminal
history system to reliably determine the dispositions for juveniles
tried as adults.
We acknowledge that Justice did not create the JCPSS or the criminal
history system to track or assess statewide trends or the outcomes
of realignment; however, the JCPSS and criminal history systems are

the only state‑administered databases we identified that can provide
county‑level juvenile justice data. By making improvements to the
JCPSS and ensuring the accuracy of the data in it and in the criminal
history system, Justice can better aid the governor’s office and the
Legislature in allocating resources and assessing the outcomes
of realignment.
Corrections’ Information Systems Cannot Identify Certain
Juvenile Offenders
Corrections has two systems for tracking information about juvenile
and adult offenders: the Offender‑Based Information Tracking
System (OBITS) and the Offender‑Based Information System (OBIS).
However, neither of these systems is able to provide the number of
juvenile offenders tried as adults and sent to adult prisons, making it
difficult to assess certain outcomes of realignment. OBITS primarily
provides information about confinement time, daily movements,
characteristics, behavior, and other activities of juvenile offenders
while in Juvenile Justice or on parole. Therefore, it does not track
juvenile offenders who do not enter Juvenile Justice. OBIS, on
the other hand, captures offender information from the time that
offenders are committed to Corrections until they are discharged.
OBIS contains adult offenders and juvenile offenders tried in adult
court and sent directly to adult prison. Although Corrections’ staff
informed us that we could use OBIS to calculate the total number of
juvenile offenders who were sent directly to adult prisons, we found
that the method Corrections provided us was not always reliable.

Specifically, Corrections’ staff stated that we could obtain the
population of juvenile offenders sent directly to adult prisons by
calculating the age of offenders, using their birthdates and offense
dates. Although Corrections has a policy to obtain necessary
offender information, Corrections did not always obtain the
month and day portion of the offense dates. In fact, we found
that 3.5 percent of all offense records—approximately 112,000 of
3.2 million records—contained incomplete or invalid offense
dates. The incomplete offense dates sometimes included only the
year or the year and month of the offense rather than the day,
month, and year. When we narrowed down the data to include
only records for offenders who were in prison between fiscal
years 2003–04 and 2010–11, we still found more than 500 offenders
who might or might not have been juveniles at the time of their
offenses. Considering that our analysis of the JCPSS data suggests
that an average of fewer than 900 juveniles are even sent to adult
court each year, these 500 offenders potentially represent a large
portion of the juvenile offenders sent directly to adult prisons.
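The underlying problem can be sketched as follows: determining whether an offender was a juvenile at the time of the offense requires a complete birthdate and offense date. The dates and the age calculation below are illustrative assumptions, not Corrections' actual method.

```python
# Sketch of the completeness problem with offense dates in OBIS: computing
# an offender's age at offense requires a full year-month-day date. A
# record with only a year (or year and month) cannot establish juvenile
# status. The dates and age-18 cutoff here are illustrative assumptions.
from datetime import date

def age_at_offense(birthdate, offense_date):
    """Return age in whole years, or None if the offense date is incomplete."""
    if offense_date is None:
        return None
    years = offense_date.year - birthdate.year
    if (offense_date.month, offense_date.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

birth = date(1990, 6, 15)
complete = age_at_offense(birth, date(2008, 5, 1))  # 17: juvenile at offense
incomplete = age_at_offense(birth, None)            # incomplete record: unknown
print(complete, incomplete)  # prints: 17 None
```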

When we asked Corrections why it did not always populate this
field, it explained that the courts sometimes provide incomplete
data. When a date is needed, Corrections’ policy requires staff to
obtain complete dates from the courts, if possible. However,
Corrections stated that in some circumstances exact offense dates
may not be known. Because Corrections did not provide us with
its policy until late August 2012, after our fieldwork had ended, we
were not able to review the records with incomplete offense dates
to determine whether Corrections was following its policy or if the
offense dates were not available.
In an effort to streamline and automate offender management,
Corrections is in the process of implementing the Strategic Offender
Management System (SOMS). According to the SOMS project
director, SOMS will consolidate over 50 existing databases into a
single system. Corrections implemented the first module of SOMS
in 2010 and will continue to implement it in modules. SOMS is
intended to address a variety of issues such as data inconsistencies
in the systems that are being consolidated as well as to replace
unsupported legacy systems. Corrections asserted that when it is
fully implemented, SOMS will streamline and automate processes
such as maintaining commitment information, tracking and
scheduling inmates’ programs, classifying inmates’ security levels,
calculating inmates’ release dates, and planning inmates’ pre‑releases
and transitions. Corrections has chosen not to include data relating
to juvenile offenders in SOMS because the population in Juvenile
Justice is so small that including it would not be cost‑effective.
However, according to the SOMS project manager, Corrections
will fully incorporate OBIS into SOMS and retire OBIS. Currently,
Corrections is projecting an unofficial project completion date of
June 2014, but this date is dependent on approval of proposed changes
to the project.
Although Corrections’ systems cannot identify the population of
juvenile offenders who committed offenses as juveniles and were sent
to adult prison, Corrections could increase the amount of information
available to stakeholders regarding realignment by completely
populating the date field in OBIS. If Corrections completely populates
these fields, the data that it consolidates into SOMS will also be
complete. Stakeholders would then be able to use OBIS or SOMS to
gather information about juvenile offenders tried as adults to help
determine the effectiveness of juvenile justice realignment.
Recommendations
To ensure that it has the information necessary to meaningfully
assess the outcomes of juvenile justice realignment, the Legislature
should consider amending state law to require counties to collect

and report countywide performance outcomes and expenditures
related to juvenile justice as a condition of receiving block
grant funds. In addition, the Legislature should require the board
to collect and report these data in its annual reports, rather than
outcomes and expenditures solely for the block grant.
To improve the usefulness of its reports so that they can be
used to assess the outcomes of realignment, the board should do
the following:
•	 Work with counties and relevant stakeholders, such as the
committee that established performance outcome measures
for the block grant, to determine the data that counties should
report. To minimize the potential for creating a state mandate,
the board should take into consideration the information that
counties already collect to satisfy requirements for other grants.
•	 If the Legislature chooses not to change the law as suggested,
or if the counties are unable to report countywide statistics, the
board should discontinue comparing outcomes for juveniles who
receive block grant services to those who do not in its reports.
To maximize the usefulness of the information it makes available to
stakeholders and to increase accountability, the board should do
the following:
•	 Create policies and procedures that include clear, comprehensive
guidance to counties about all aspects of performance outcome
and expenditure reporting. At a minimum, such guidance should
include specifying how counties should define when a juvenile
has received a service and whether certain services, such as
training, should qualify as serving juveniles.
•	 Publish performance outcome and expenditure data for each
county on its Web site and in its annual reports.
•	 Consider verifying the counties’ data by conducting regular site
visits on a rotating basis or by employing other procedures to
verify data that counties submit.
To increase the amount of juvenile justice data the counties make
available to the public, the board should work with counties on how
best to report these data.
To ensure the accuracy and completeness of the data the counties
submit into the JCPSS, Justice should follow its procedure to send
annual summaries of the JCPSS data to the counties for review and
to conduct occasional field audits of the counties’ records.

To ensure that its criminal history system contains complete
and accurate data related to juvenile offenders, Justice should do
the following:
•	 Implement a process to ensure that staff enter data correctly into
the system.
•	 Implement a procedure similar to the one it employs for the
JCPSS to verify the accuracy of information the counties submit.
To increase the amount of information related to realignment
and to allow stakeholders to identify the population of juvenile
offenders sent directly to adult prison, Corrections should obtain
complete offense dates from the courts, if possible.

Chapter 2
BECAUSE STATE LAW DOES NOT CLEARLY DEFINE
THE GOALS OF REALIGNMENT, MEASURING ITS
EFFECTIVENESS IS CHALLENGING
Chapter Summary
Neither state law nor the Board of State and Community
Corrections (board) has provided clear goals for realignment. As a
result, measuring its success or failure is challenging. Nevertheless,
all four counties we visited asserted that realignment has been
effective, citing reduced juvenile crime and improved services to
juvenile offenders. With these goals in mind, we have identified
several possible indicators that could be used to assess outcomes
of realignment if the goals are defined and the data reliability issues
that we identified in Chapter 1 are resolved.
Our analysis using these indicators suggests that realignment
may be resulting in positive outcomes, although we cannot be
certain of many of our conclusions because of the limitations
we identified with the data. For example, our analysis of the data
currently in the Department of Justice’s (Justice) Juvenile Court
and Probation Statistical System (JCPSS) suggests that counties
have reduced the total number of juvenile offenders who received
dispositions12 by more than 21 percent from fiscal year 2007–08—
the year realignment began—through fiscal year 2010–11, implying
that realignment may have decreased crime. In addition, all
four counties we visited reported being able to provide new or
enhanced services to their juvenile offenders since realignment,
which also could be considered a positive outcome.
In our review of the counties’ performances, we noted that any
assessment of the outcomes of realignment should include an
evaluation of the Youthful Offender Block Grant (block grant)
funding formula to determine whether it effectively supports
counties’ juvenile justice operations. The structure of the block
grant funding formula may have unintended adverse consequences
for counties because it produces fluctuating allocations that may
make it more difficult for counties to plan. In addition, the current
formula may create a disincentive for counties to reduce their
number of felony court dispositions because their block grant funds
decrease to the extent that felony dispositions decrease.
12	Our analysis included those juveniles who received the following types of dispositions: direct
file in adult court, diversion, probation, remanded to adult court, or wardship. Further, offenders
could be counted more than once if they received dispositions for multiple referrals.

Without clear goals and specific ways to consistently measure
those goals, determining the success or failure of realignment with
certainty is not possible. Until such time as the Legislature develops
clear goals and a definition of success for realignment, data related
to the outcomes are subject to misinterpretation.
Although the State Has Not Clearly Defined the Goals of Realignment,
Counties Point to Indicators of Effectiveness
State law does not provide clear goals for realignment, nor does it
require the board to define or assess the outcomes of realignment.
Rather, the law asserts that local juvenile justice programs are
better suited to provide rehabilitative services than state‑operated
facilities. In addition, a Senate floor analysis, written while the
Legislature was considering the realignment law, noted that a
projected impact of the law would be to decrease the number
of juvenile offenders housed in the Division of Juvenile Justice
(Juvenile Justice). However, these goals are vague. Without clear
goals, measuring whether realignment
has been successful is challenging. According to its field
representative, the board has not developed goals or a definition
of success because state law does not require it to do so. However,
as the only state administering body referenced in the law that
realigned juvenile offenders, the board is best positioned to propose
the goals of realignment and the elements of success in meeting
those goals, in the absence of legal or other authoritative criteria.
Despite the fact that the State has not provided clear goals, the
four chief probation officers of the counties we visited—Los Angeles,
Sacramento, San Diego, and Yuba—all believe that realignment has been effective
based on various indicators, suggesting that it is possible to develop
goals that would indicate the success or failure of realignment.
Both Yuba and San Diego believe that one indicator of the success
of realignment is a decrease in juvenile crime, an objective that both
counties believe they have met. Specifically, San Diego indicates
that its overall juvenile arrest rate decreased by 4 percent from
fiscal years 2009–10 through 2010–11. San Diego also compares its
actual performance outcomes to internally established target rates.
For example, in the second quarter of fiscal year 2010–11, San Diego
established a goal that 70 percent of juvenile offenders would
complete their probation without being convicted of a new offense.
Its internal reports show that San Diego achieved a 72 percent rate,
which exceeded its goal. Yuba tracks its caseloads from year to year.
According to Yuba’s program manager, its caseload for juvenile
offenders has declined by more than 50 percent since realignment,
and Yuba’s chief probation officer stated that this decline reflects the
success of realignment.

California State Auditor Report 2011-129

September 2012

The chief probation officers of Sacramento and Yuba also
asserted that realignment has allowed their counties to provide
programs that fit their counties’ individual needs. For example,
Sacramento uses a portion of its block grant funds to provide
community‑based programs to its juvenile offenders. Sacramento’s
chief probation officer believes that community‑based programs
are more effective at rehabilitating juvenile offenders than other
programs, because the juveniles are not removed from potential
participation in school, employment, and other positive social
activities. He further stated that the intent of realignment was
to shift juvenile offenders from the state level to the county level
because counties can better serve juvenile offenders. Further, Yuba
indicated that it uses block grant funds to support preventative
activities, which it believes are effective. For instance, its probation
department partners with schools to identify at‑risk juveniles
and uses early prevention strategies such as substance abuse
counseling and family counseling.
Furthermore, Los Angeles and Sacramento cited financial benefits
as a goal of realignment. Los Angeles’s chief probation officer stated
that realignment has been successful because it has met its primary
goal, which was to reduce the number of juvenile offenders within
Juvenile Justice and thus to save the State money. Additionally,
Sacramento’s chief probation officer stated that realignment
funding allowed Sacramento to continue to provide important
services to juveniles by offsetting its general fund shortfalls due to
budget cuts. For example, as we discuss later, the State eliminated
state funding for Sacramento’s juvenile Mentally Ill Offender Crime
Reduction (MIOCR) grant program in fiscal year 2008–09, but
Sacramento was able to continue to partially fund the program
using block grant funds.
Our Analysis of the JCPSS’s Data Suggests That Certain Juvenile Crime
Statistics Have Decreased Since Realignment, Though These Data
Have Limitations
As some of the counties have asserted, the State could consider
a decrease in juvenile crime to be a potential indicator of the
success of realignment. As described in Chapter 1, Justice’s JCPSS
may be one of two state‑administered databases that collect the
information necessary to determine whether juvenile crime has
decreased because the JCPSS collects a variety of juvenile offender
data from 57 counties. We therefore chose to use these data in our
analysis despite the limitations we identified. Our analysis of the
JCPSS’s data indicates that the total number of juvenile offenders
who received dispositions decreased by more than 21 percent after
realignment,¹³ from nearly 78,900 in fiscal year 2007–08—the
year realignment began—to just over 62,200 in fiscal year 2010–11.
Conversely, during the four years prior to realignment, the number
of juvenile offenders who received dispositions increased by more
than 12 percent, from almost 73,100 in fiscal year 2003–04 to nearly
81,900 in fiscal year 2006–07.
According to San Diego’s chief probation officer, the State could also
use recidivism rates to determine the outcomes of realignment. Our
analysis of the JCPSS’s data showed that although the number of
repeat offenders generally increased during the four‑year period prior
to realignment, the number of repeat offenders¹⁴ in the counties
decreased from roughly 42,400 to 36,600—nearly 14 percent—after
realignment (fiscal years 2007–08 through 2010–11). According to
the JCPSS’s data, some counties had more significant changes in
their numbers of both first‑time and repeat offenders during this
time period. For example, the number of first‑time offenders in
Santa Cruz County decreased from 318 to 120. We display the
results for all counties in Appendix B.
Because several factors could have contributed to the decreases in
the number of juvenile offenders who receive dispositions and the
number of repeat offenders, we cannot conclude from our analysis
that realignment has been successful. For example, a deputy chief in
the Los Angeles County probation department attributed the declining
number of juvenile dispositions in part to a decrease in overall arrests.
Similarly, according to an administrative service officer in Sacramento,
reductions in police staffing due to city and county budget cuts could
have resulted in a decreased number of arrests of juvenile offenders.


As shown in Figure 3, we calculated the statewide percentage of
first‑time and repeat offenders who received dispositions, and
found that the percentage of first‑time offenders decreased by
about 4 percentage points from fiscal years 2003–04 through 2010–11. As we
discuss later in this chapter, the slight decrease in the percentage of
first‑time offenders who received dispositions could be attributed
to some counties’ decisions to spend block grant funds on
preventative services for juveniles who have not yet committed
offenses or who have committed lower‑level offenses. On the other
hand, the percentage of repeat offenders who received dispositions
increased by about 4 percentage points after realignment. Given that counties
now retain more serious offenders who, prior to realignment, were
sent to Juvenile Justice, it may not be surprising that
the percentage of repeat offenders has increased.
¹³ The original realignment law took effect on September 1, 2007. Our analysis of the data contained
in Justice’s JCPSS is based on fiscal years. Therefore, some of our analysis may contain data for
July and August 2007, the two months prior to realignment’s effective date.
¹⁴ For purposes of this analysis, we classified a repeat offender as any offender who has received
two or more dispositions that were not dismissed.
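The classification rule in footnote 14 is mechanical enough to express in code. The sketch below is our own illustration only; it uses a hypothetical, simplified record format (offender identifier plus a dismissed flag) rather than the JCPSS’s actual schema:

```python
from collections import Counter

def classify_offenders(dispositions):
    """Split offenders into first-time and repeat groups.

    Mirrors footnote 14: a repeat offender is any offender with two
    or more dispositions that were not dismissed. `dispositions` is a
    list of (offender_id, was_dismissed) pairs -- a simplified,
    hypothetical stand-in for JCPSS records.
    """
    counts = Counter(
        offender_id
        for offender_id, was_dismissed in dispositions
        if not was_dismissed
    )
    repeat = {oid for oid, n in counts.items() if n >= 2}
    first_time = {oid for oid, n in counts.items() if n == 1}
    return first_time, repeat

# Offender A has two non-dismissed dispositions (repeat); B has one
# (first-time); C's only disposition was dismissed (neither group).
records = [("A", False), ("A", False), ("B", False), ("C", True)]
first_time, repeat = classify_offenders(records)
```

Note that an offender whose only dispositions were dismissed falls into neither group, consistent with the footnote’s wording.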


Figure 3
Statewide Percentages of First‑Time and Repeat Juvenile Offenders Who Received Dispositions
Fiscal Years 2003–04 Through 2010–11

[Line chart: percentage of juvenile offenders with dispositions, from 0 to 70 percent, for fiscal years 2003–04 through 2010–11, with separate lines for repeat offenders and first‑time offenders and a marker indicating when realignment took effect.]

Source:  California State Auditor’s (state auditor) analysis of data obtained from the Department of Justice’s Juvenile Court and Probation Statistical
System (JCPSS). Please refer to the Introduction’s Scope and Methodology for the state auditor’s assessment of the reliability of these data.
Notes:  Sierra County does not submit data to JCPSS.
Our analysis included those juveniles who received the following types of dispositions: direct file in adult court, diversion, probation, remanded to adult
court, or wardship. Further, some offenders could be counted more than once if they received dispositions for multiple referrals.

Realignment does not appear to have caused an increase in
the number of juvenile offenders sent to adult court. The chief
probation officer for Los Angeles expressed a concern that district
attorneys may be more inclined to try juvenile offenders as adults
since realignment if counties do not maintain secure detention
facilities for more serious offenders. If these juvenile offenders
were convicted as adults, they would likely be sent to a state‑run
adult prison. However, based on our analysis of the JCPSS’s data, we
found that the total number of juvenile offenders with dispositions
sending them to adult court decreased from about 1,100 to nearly
900 between fiscal years 2007–08 and 2010–11. The results for
all counties can be found in Appendix B. When considering these
data in terms of percentage of change, the results in Figure 4
on the following page indicate that the statewide percentages of
first‑time and repeat juvenile offenders who received dispositions
that sent them to adult court have generally remained constant
since realignment.
Nevertheless, the individual numbers for some counties suggest
that they have significantly increased the number of juvenile
offenders they send to adult court since realignment. For instance,
Sacramento’s records indicate that the number of juvenile
offenders convicted in adult court increased from 10 to 32 from
fiscal years 2007–08 through 2010–11. However, this increase
does not appear to be reflective of overall statewide trends after
realignment. The chief probation officer for Sacramento told us
that one potential reason that more juvenile offenders are tried
as adults in the county is for public safety reasons because of the
closure of the county’s only long‑term secure commitment facility
for juvenile offenders.
Figure 4
Statewide Percentages of First‑Time and Repeat Juvenile Offenders Who Received Dispositions
Sending Them to Adult Court
Fiscal Years 2003–04 Through 2010–11

[Line chart: percentage of juvenile offenders with dispositions, from 0.0 to 1.4 percent, for fiscal years 2003–04 through 2010–11, with separate lines for total offenders, repeat offenders, and first‑time offenders and a marker indicating when realignment took effect.]

Source:  California State Auditor’s (state auditor) analysis of data obtained from the Department of Justice’s Juvenile Court and Probation Statistical
System (JCPSS). Please refer to the Introduction’s Scope and Methodology for the state auditor’s assessment of the reliability of these data.
Notes:  Sierra County does not submit data to JCPSS.
Some offenders could be counted more than once if they received dispositions for multiple referrals.

Our analysis shown in Figure 4 indicates that the most significant
increase in juvenile offenders with dispositions sending them
to adult court actually occurred prior to realignment: The total
number of juvenile offenders with dispositions that sent them to
adult court increased by 64 percent between fiscal years 2003–04
and 2006–07. This increase may be explained in part by the
passage of Proposition 21¹⁵ in 2000. According to the California
Department of Corrections and Rehabilitation’s (Corrections)
Office of the Inspector General’s 2003 report, this proposition
made it easier for district attorneys to prosecute juvenile offenders
as adults and required that juvenile offenders over the age of
16 who were convicted in adult court be sent to state prison.

¹⁵ Proposition 21 is known as the Gang Violence and Juvenile Crime Prevention Act of 1998.
Moreover, according to Corrections’ associate director of intake
and court services (associate director), counties do not have to
pay the State for juveniles who are sentenced to a Juvenile Justice
facility by an adult court. Conversely, counties must pay a share of
the State’s costs to house juvenile offenders sent to Juvenile Justice
through juvenile court. We found that Justice’s Automated Criminal
History System frequently miscategorized juvenile offenders as
having received adult dispositions and, therefore, was not sufficiently
reliable for the purposes of this audit. As a result, we were not able to
assess whether the number of convictions, acquittals, or dismissals
for the juvenile offenders tried as adults increased or decreased
after realignment.
We caution against drawing conclusions regarding the outcomes
of realignment based on the data we discuss here because many
factors contribute to fluctuations in crime statistics, state law did
not include a goal to reduce crime or recidivism, and we found
the data to be of undetermined reliability. However, should the
Legislature identify reductions in crime and recidivism as potential
goals of realignment, the sort of analyses we have performed might
enable it to determine the effectiveness of realignment, particularly
if Justice resolves the limitations we identified with the JCPSS’s data.
The Four Counties We Reviewed Reported Providing New or
Enhanced Services to Juvenile Offenders Since Realignment
The four counties we reviewed have generally reported being able to
provide new or enhanced services to juvenile offenders compared
to the services they provided before realignment because of the
infusion of block grant funds. These new or enhanced services
may also be considered a positive outcome of realignment. As
discussed in the Introduction, the board allows counties to use
block grant funds to enhance the capacity of local communities
to respond to juvenile crime, and as a result, some counties
designed new or enhanced services based on the needs of juvenile
offenders within their counties. For instance, San Diego used the
block grant funds solely for the support of its Youthful Offender
Unit (YOU program) to lower the risk of recidivism for juvenile
offenders who would previously have been sent to Juvenile Justice.
San Diego designed the YOU program to rehabilitate high‑risk
offenders through programs and intensive supervision while
assisting them in developing and sustaining positive social lifestyles.
The YOU program offers education, aggression replacement
training, parenting classes, substance abuse counseling, work
readiness training, and counseling related to family, gangs, and
mental health. San Diego reported that 73 percent of the offenders

We were not able to assess
whether the number of convictions,
acquittals, or dismissals for
the juvenile offenders tried as
adults increased or decreased
after realignment.

47

48

California State Auditor Report 2011-129

September 2012

who participated in the program after its creation in the fall of
2007 did not commit new offenses during the first six months after
completing probation.
Yuba focuses its block grant funds on prevention and early
intervention services for juveniles who are at risk of entering the
juvenile justice system. Yuba’s prevention and early intervention
services include partnering with schools to identify potential
juvenile offenders before they commit offenses. As an example,
a Yuba victim witness program manager described an instance
in which a juvenile offender expressed interest in joining a dance
class. The juvenile offender’s file specified that participating in an
extracurricular activity could help prevent the individual from
reoffending. Therefore, Yuba agreed to enroll the youth in the
dance class using block grant funds. Yuba’s chief probation officer
indicated that prevention and early intervention services may be
contributing to the significant reduction in the number of first‑time
offenders since realignment. Specifically, according to the JCPSS’s
data, first‑time offenders with dispositions in Yuba have decreased
by nearly 50 percent since realignment.
Sacramento uses some of its block grant funds for programs that it
would otherwise have had to eliminate due to budget reductions.
For example, Sacramento’s juvenile MIOCR program is intended
to reduce the number of mentally ill juvenile offenders in the
justice system. The State previously funded this program but
eliminated Sacramento’s grant in the fiscal year 2008–09 budget.
Because Sacramento believed that the juvenile MIOCR program
was effective, the county used block grant money to fund a portion of
the program at reduced levels. Sacramento also indicated using block
grant funds to save other services that would have been eliminated
by budget cuts, including its risk and needs assessments, its family
counseling services, and its home‑on‑probation placements.


Although the state realignment law asserts that local communities
are better able than the State to provide certain juvenile offenders
with the programs they require, assessing whether counties actually
provide juvenile offenders with better services than the State does
is problematic because the quality of service is difficult to measure.
According to the associate director, Juvenile Justice’s services can
vary greatly from those offered by the counties. Further, although
Juvenile Justice and the counties may provide similar services,
the associate director indicated that the extent or quality of the
services may differ, which will also affect the costs of the services.
For example, Juvenile Justice’s reentry, mental health, and education
services may differ significantly from the same services provided by
the counties. Nonetheless, the four counties we reviewed reported
that they generally provided similar services to those that Juvenile
Justice provides.


Interestingly, although the board allows counties to spend block
grant funding at their own discretion, the size of the county appears
to affect whether it expends most of its block grant funds on
placements, such as juvenile halls, camps, or home on probation,
or on direct services, such as alcohol and drug treatment, family
counseling, and job readiness training. As shown in Figure 5,
small‑ and medium‑sized counties spend the majority of their
block grant funds on direct services, whereas large counties spend
most of their funds on placements. However, we noted that in
large counties, direct services may be incorporated in the cost
of placements. For example, Los Angeles reported that it spent
only approximately $554,000 on direct services compared to
$28.8 million on placements for fiscal year 2010–11; however,
its annual funding application clarified that camps in the county
also offer services such as increased mental health services and
aggression replacement therapy.
Figure 5
Statewide Percentage of Block Grant Funds Counties Reported Spending,
by Expenditure Category
Fiscal Year 2010–11

[Stacked bar chart: percentage of block grant funds, from 0 to 100 percent, spent on placements,* direct services,† and other‡ by small counties (expenditures up to $200,000), medium counties (expenditures between $200,000 and $1,000,000), and large counties (expenditures over $1,000,000).]

Source:  California State Auditor’s analysis of data from expenditure forms counties submitted to the Board of State and Community Corrections for fiscal
year 2010–11.
Note:  There are 24 small counties, 15 medium counties, and 19 large counties.
*	 Placements include juvenile halls, ranches, camps, and home on probation.
†	 Direct services include services such as alcohol and drug treatment, anger management counseling, mental health screening, and job
readiness training.
‡	 Other includes staff salaries and benefits, equipment, and contract services.


State Costs Related to the Juvenile Justice System Have Declined
Since Realignment
Another potential outcome of realignment is a decrease in state
costs related to the juvenile justice system. Specifically, Juvenile
Justice’s expenditures for fiscal year 2006–07—the year prior to
realignment—were $481 million, compared to $294 million for
fiscal year 2010–11. This represents a 39 percent reduction of about
$187 million. Furthermore, if all other factors remain constant and
the State continues to spend at levels similar to the amounts for
fiscal year 2010–11, including the annual block grant allocation,
realignment could result in an annual savings of $93 million as
depicted in Figure 6.
Figure 6
State Juvenile Justice Expenditures
Fiscal Years 2006–07 Through 2010–11

[Stacked bar chart of state costs in millions by fiscal year. Division of Juvenile Justice’s expenditures: $481 (2006–07), $484 (2007–08), $443 (2008–09), $353 (2009–10), and $294 (2010–11). Youthful Offender Block Grant allocations: $24 (2007–08), $67 (2008–09), $93 (2009–10), and $93 (2010–11). Potential state savings from the fiscal year 2006–07 spending level: $35 (2009–10) and $93 (2010–11).]

Sources:  California State Auditor’s analysis of data from the Division of Juvenile Justice’s accounting records and the State Controller’s Office’s block
grant allocation amounts.
*	 Under state law, beginning on September 1, 2007, juvenile courts can only send juveniles adjudicated for serious, violent, or sexual offenses to
state facilities.

The State administers several types of state‑funded juvenile
justice grants as described in the Introduction. Thus, total state
costs related to the juvenile justice system are not limited to
Juvenile Justice’s expenditures and the block grant. Our analysis
revealed that the State’s costs for other major state‑administered
grants have also declined since fiscal year 2006–07. For example,
the combined funding for two of the larger juvenile justice
grants, the Juvenile Justice Crime Prevention Act and the
Juvenile Probation and Camps Funding Program, declined
from $312 million to $223 million between fiscal years 2006–07
and 2010–11.
Although we cannot conclude that realignment is responsible for
the entire decrease in state costs, the significant reduction in the
number of juvenile offenders within Juvenile Justice—as a result
of realignment—is likely a key contributor. Since realignment, the
population of juvenile offenders within Juvenile Justice institutions
has decreased by 51 percent, from 2,665 on June 30, 2007, to
1,298 on June 30, 2011. As a result, Juvenile Justice has reduced
staffing and closed several facilities, including four institutions and
one fire camp. Although state law did not include state savings as
a goal of realignment, the significant decrease in costs to the State
could nevertheless be considered a positive outcome.
The Block Grant Funding Formula May Pose Some Challenges
for Counties
A comprehensive assessment of the outcomes of realignment
should include an evaluation of the block grant funding formula to
determine whether it enhances counties’ juvenile justice operations.
Based on our review, we found that the funding formula established
by state law may have unintended negative consequences on certain
realignment outcomes. Specifically, the funding formula may pose
a financial challenge for counties because it could adversely affect
how counties plan for their juvenile justice programs. As discussed
in the Introduction, the State generally bases block grant allocation
amounts on a formula that weighs equally the number of juveniles
between the ages of 10 and 17 in the county and the number of
juvenile felony court dispositions within the county. However,
counties’ juvenile populations and felony court dispositions do not
remain constant. Consequently, the amount of funds that a county
receives fluctuates from year to year. For example, from fiscal
years 2009–10 through 2010–11, Placer County’s block grant allocation
decreased by more than 20 percent, from $887,000 to $690,000, and
Tulare County’s allocation increased by more than 50 percent,
from $1.1 million to $1.6 million. Appendix A describes how
the Department of Finance calculates the block grant allocation
to counties.
According to a Yuba program manager, such fluctuations make
planning for services or long‑term programs difficult because
the county cannot count on a consistent level of funding. He
indicated that to mitigate these funding fluctuations, Yuba chooses
to save a portion of its block grant funds from one fiscal year
for use in the following fiscal year. State law does not require
counties to spend all the block grant funds allotted to them. As
shown in Table 6, three of the four counties we visited have not
fully expended the block grant funds they received over the past
four fiscal years. Los Angeles plans to use its unexpended funds
on enhanced programs. San Diego indicated that planned projects
and expenditures did not materialize in previous fiscal years, which
caused the balance of unexpended funds. According to the board’s
field representative, the board has recently begun to monitor
counties’ unexpended funds. However, the board does not yet have
procedures in place to follow up with counties in the event that the
balance of unexpended funds becomes abnormally high.
Table 6
Unexpended Youthful Offender Block Grant Funds at Four Counties We Reviewed
Fiscal Years 2007–08 Through 2010–11

                LOS ANGELES COUNTY         SACRAMENTO COUNTY        SAN DIEGO COUNTY           YUBA COUNTY
FISCAL YEARS    RECEIVED     UNSPENT       RECEIVED     UNSPENT     RECEIVED     UNSPENT       RECEIVED   UNSPENT*
2007–08       $ 5,460,396  $5,457,942    $ 1,103,062   $922,877   $ 1,610,147  $  579,470     $ 58,500   undetermined
2008–09        16,394,743   4,257,754      3,069,674  1,231,760     5,020,964     483,622      118,518   undetermined
2009–10        22,008,743   1,997,878      4,355,366       –        7,759,234     866,291      212,473   undetermined
2010–11        21,572,410       –          4,522,433    740,362     7,710,853   1,639,732      179,594   undetermined
Totals        $65,436,292  $3,713,574†   $13,050,535   $655,341†  $22,101,198  $3,569,115     $569,085   undetermined

Sources:  California State Auditor’s analysis of information and documentation provided by Los Angeles (Los Angeles), Sacramento (Sacramento),
San Diego, and Yuba (Yuba) counties. Amounts listed do not include interest.
*	 Yuba was not able to provide us with its annual block grant expenditures because it combines these expenditures with other department
expenditures for juveniles and adults. Therefore, we could not calculate Yuba’s unexpended funds.
†	 Los Angeles reported using $8 million of previous years’ unspent funds in fiscal year 2010–11 and Sacramento reported using $2.2 million of
previous years’ unspent funds in fiscal year 2009–10. Because these amounts are not included in the table, the amount of unspent funds for these
two counties will not add up to the totals shown.

In addition, the funding formula may create an inherent disincentive
for counties to reduce the number of juvenile felony dispositions
because doing so would decrease the amount of block grant
funds they receive. For example, according to our analysis of the
JCPSS’s data, the number of juvenile offender dispositions in
Yuba decreased by nearly 6 percent from fiscal years 2008–09
to 2009–10. This decrease may have contributed to the fact that
Yuba’s allocation of block grant funds declined from $212,000 in
fiscal year 2009–10 to $180,000 in fiscal year 2010–11. As noted
previously, Yuba elected to spend block grant funds on preventative
and early intervention services for juveniles who are at risk of
entering the juvenile justice system. The county’s chief probation
officer believes that the number of felony juvenile dispositions
decreased as a result of these services. Therefore, even though Yuba’s
actions may be effective in reducing crime, the State has reduced
its allocation of block grant funds. If Yuba—or any other county—


continues to reduce felony dispositions, the resulting reduction
in funding could impair the county’s ability to continue offering
the services that led to the reductions. Three of the four chief
probation officers we interviewed expressed particular concern
over the funding formula. For example, the chief probation officers
for Los Angeles and Yuba noted that the formula is ironic because
counties receive more money if they have more felony dispositions
and receive less funding when they are successful in reducing
felony dispositions.
To offset the instability in the formula and to counteract the
disincentives it creates, the chief probation officer for Los Angeles
suggested averaging the number of felony dispositions over a
three‑ or four‑year period to prevent sharp fluctuations in funding.
This approach could help counties plan for services or long‑term
programs. In addition, the Juvenile Justice Operational Master
Plan (master plan) notes that every county needs stable funding
and suggests that the State should tie funding to incentives.
For example, the master plan suggests awarding some funds as
challenge or incentive grants to promote the use of validated
risk‑and‑needs assessments and evidence‑based programs.
However, the master plan cautions that the State needs to develop
standards to determine how to prioritize funding allocations,
what to do when outcomes fall short of expectations, and when to
withdraw state funding and redirect it elsewhere.
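The averaging approach the chief probation officer suggested can be sketched in code. The example below is our own illustration with hypothetical numbers, not a formula drawn from state law; it shows how a trailing multi‑year average of felony dispositions would dampen a sharp single‑year drop in a county’s share of the disposition‑based funding:

```python
def smoothed_disposition_share(county_dispositions, state_dispositions, years=3):
    """County's share of the disposition-based half of the block grant
    if the formula averaged felony dispositions over the last `years`
    fiscal years instead of using a single year's count.

    Both arguments are lists of annual counts, oldest year first.
    These inputs are hypothetical, for illustration only.
    """
    county_avg = sum(county_dispositions[-years:]) / years
    state_avg = sum(state_dispositions[-years:]) / years
    return county_avg / state_avg

# A hypothetical county whose dispositions fell sharply in the latest year:
county = [40, 38, 20]
state = [200, 200, 200]
single_year = county[-1] / state[-1]                  # current formula's share
averaged = smoothed_disposition_share(county, state)  # smoothed share
```

Under the single‑year formula the county’s share drops immediately with its disposition count; under the three‑year average, the decline is spread over several allocation cycles, giving the county time to plan for the lower funding level.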
Recommendations
The Legislature should consider revising state law to specify the
intended goals of juvenile justice realignment. To assist the
Legislature in this effort, the board should work with stakeholders
to propose performance outcome goals to use to measure the
success of realignment.
To offset potential disincentives and provide counties with a more
consistent level of funding from year to year, the Legislature should
consider amending the block grant funding formula. For example,
the formula could be adjusted to use the average number of felony
dispositions over the past several fiscal years instead of using only
annual data.
To ensure that counties do not maintain excessive balances
of unexpended block grant funds, the board should develop
procedures to monitor counties’ unspent funds and follow up with
them if the balances become unreasonable.


We conducted this audit under the authority vested in the California State Auditor by Section 8543
et seq. of the California Government Code and according to generally accepted government auditing
standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives
specified in the scope section of the report. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on our audit objectives.
Respectfully submitted,

ELAINE M. HOWLE, CPA
State Auditor
Date:	

September 11, 2012

Staff:	

John Baier, CPA, Audit Principal
Kathleen Klein Fullerton, MPA
Kelly Christine Chen
Scilla Outcault, MBA
Sandra L. Relat, CPA
Katrina Solorio

Legal Counsel:	

Scott A. Baxter, JD

IT Audit Support:	 Michelle J. Baur, CISA, Audit Principal
Benjamin Ward, CISA, ACD
Ryan P. Coe, MBA
For questions regarding the contents of this report, please contact
Margarita Fernández, Chief of Public Affairs, at 916.445.0255.


Appendix A
YOUTHFUL OFFENDER BLOCK GRANT
FUNDING FORMULA
The Department of Finance (Finance) generally
determines the allocation of Youthful Offender
Block Grant (block grant) funds to counties using
a formula in which 50 percent of the grant amount
is based on the number of a county’s juvenile felony
court dispositions and 50 percent is based on the
county’s total juvenile population between the ages
of 10 and 17. Finance uses the most recent data
compiled by the Department of Justice to determine
the number of felony dispositions. It uses its own
data to determine the county’s juvenile population.

To demonstrate how Finance determines the block
grant allocations, we calculated the block grant
amount it would award to a hypothetical county X,
as shown in the text box. To calculate county X’s
allocation, we needed five variables:

•	 The number of juveniles in County X between the ages of 10 and 17.
•	 The State’s total population of juveniles between the ages of 10 and 17.
•	 The number of juvenile felony court dispositions in County X.
•	 The State’s total number of juvenile felony court dispositions.
•	 The total state grant amount for the current year.

Table A on the following page shows the block grant
amounts Finance allocated to each county for fiscal
years 2007–08 through 2010–11.

Steps to Calculate County X’s
Youthful Offender Block Grant Allocation Amount

Step 1: Obtain variables.
(1) County X’s population of juveniles between the ages of 10 and 17: 2,000 minors.
(2) State’s total population of juveniles between the ages of 10 and 17: 80,000 minors.
(3) County X’s number of juvenile felony court dispositions: 40 felony court dispositions.
(4) State’s number of juvenile felony court dispositions: 200 felony court dispositions.
(5) Total state grant amount: $93 million.

Step 2: Calculate 50 percent of the state grant amount to be allocated based on both the population of minors and the number of felony court dispositions.
$93 million / 2 = $46.5 million

Step 3: Compute county X’s ratio of minors to the State’s total population.
2,000 / 80,000 = 0.025

Step 4: Calculate county X’s allocation for minors based on 50 percent of the state grant amount.
$46.5 million x 0.025 = $1,162,500

Step 5: Compute county X’s ratio of juvenile felony court dispositions to the State’s total juvenile felony court dispositions.
40 / 200 = 0.2

Step 6: Calculate county X’s allocation for juvenile felony court dispositions based on 50 percent of the state grant amount.
$46.5 million x 0.2 = $9,300,000

Step 7: Add the amounts calculated in steps 4 and 6 to obtain county X’s total allocation.
$1,162,500 + $9,300,000 = $10,462,500

Result: County X is allocated $10,462,500 in block grant funds.

Source: Generated by the California State Auditor based on the block grant formula in state law.
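The formula described above can be sketched in a few lines of code. This is an illustration only, using the hypothetical county X figures from the text box; the function name is ours, not Finance’s.

```python
# Sketch of the Youthful Offender Block Grant formula: half the grant
# is allocated by juvenile-population share, half by the share of
# juvenile felony court dispositions.

def block_grant_allocation(county_pop, state_pop,
                           county_dispositions, state_dispositions,
                           total_grant):
    half = total_grant / 2
    population_share = half * (county_pop / state_pop)
    disposition_share = half * (county_dispositions / state_dispositions)
    return population_share + disposition_share

# County X: 2,000 of 80,000 minors; 40 of 200 dispositions; $93 million grant.
allocation = block_grant_allocation(2_000, 80_000, 40, 200, 93_000_000)
print(f"${allocation:,.0f}")  # $10,462,500
```

The two shares correspond to steps 4 and 6 of the text box, and their sum matches the step 7 result.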


Table A
Youthful Offender Block Grant Allocations to Counties
Fiscal Years 2007–08 Through 2010–11

COUNTY               2007–08       2008–09       2009–10       2010–11   TOTAL ALLOCATIONS
Alameda             $730,128    $2,195,631    $3,149,550    $3,087,405    $9,162,714
Alpine                58,500       117,000       117,000       117,000       409,500
Amador                58,500       117,000       117,000       117,000       409,500
Butte                119,232       366,222       533,792       476,058     1,495,304
Calaveras             58,500       117,000       117,000       117,000       409,500
Colusa                58,500       117,000       117,000       117,000       409,500
Contra Costa         443,277     1,355,917     2,026,337     2,055,006     5,880,537
Del Norte            147,357       117,000       117,000       117,000       498,357
El Dorado             94,387       316,670       411,482       359,596     1,182,135
Fresno               689,807     2,124,543     2,602,775     3,106,964     8,524,089
Glenn                 58,500       117,000       117,000       117,000       409,500
Humboldt              58,851       166,321       218,186       234,468       677,826
Imperial              74,364       224,746       347,715       334,239       981,064
Inyo                  58,500       117,000       117,000       117,000       409,500
Kern                 849,966     2,450,911     3,117,491     2,834,568     9,252,936
Kings                 96,499       299,227       468,793       471,070     1,335,589
Lake                 214,500       214,500       166,644       181,057       776,701
Lassen                58,500       117,000       117,000       117,000       409,500
Los Angeles        5,460,396    16,394,743    22,008,743    21,572,410    65,436,292
Madera               101,441       294,698       378,745       480,562     1,255,446
Marin                103,118       380,084       638,412       615,713     1,737,327
Mariposa              58,500       117,000       117,000       117,000       409,500
Mendocino             58,500       133,197       182,797       189,102       563,596
Merced               266,127       736,830       988,330     1,064,119     3,055,406
Modoc                 58,500       117,000       117,000       117,000       409,500
Mono                  58,500       117,000       117,000       117,000       409,500
Monterey             244,627       686,532     1,053,995     1,058,464     3,043,618
Napa                  92,250       310,251       413,781       440,392     1,256,674
Nevada                58,500       150,716       220,562       257,372       687,150
Orange             1,597,593     5,243,451     6,881,391     7,010,986    20,733,421
Placer               147,000       493,700       887,233       690,415     2,218,348
Plumas                58,500       117,000       117,000       117,000       409,500
Riverside          1,814,310     3,577,005     5,839,735     5,387,106    16,618,156
Sacramento         1,103,062     3,069,674     4,355,366     4,522,433    13,050,535
San Benito           100,366       117,000       117,000       117,000       451,366
San Bernardino     1,648,906     5,593,225     8,223,171     8,244,151    23,709,453
San Diego          1,610,147     5,020,964     7,759,234     7,710,853    22,101,198
San Francisco        287,150       763,010     1,054,408       981,461     3,086,029
San Joaquin          602,322     1,600,059     2,299,765     2,283,566     6,785,712
San Luis Obispo      100,274       330,890       462,207       421,516     1,314,887
San Mateo            363,742     1,250,540     1,980,175     2,006,829     5,601,286
Santa Barbara        259,089       805,254     1,086,949     1,002,924     3,154,216
Santa Clara          790,663     2,383,972     3,073,403     3,164,987     9,413,025
Santa Cruz            94,752       270,312       380,512       406,844     1,152,420
Shasta               149,095       406,964       388,790       315,546     1,260,395
Sierra                58,500       117,000       117,000       117,000       409,500
Siskiyou              58,500       117,000       124,787       117,000       417,287
Solano               409,064     1,210,953     1,713,712     1,582,335     4,916,064
Sonoma               261,015       715,568       898,519       904,850     2,779,952
Stanislaus           278,735       719,772       948,505     1,218,626     3,165,638
Sutter                58,568       176,352       287,878       241,691       764,489
Tehama                58,500       117,000       178,372       166,268       520,140
Trinity               58,500       117,000       117,000       117,000       409,500
Tulare               395,455       851,750     1,048,644     1,612,326     3,908,175
Tuolumne              58,500       117,000       134,741       117,000       427,241
Ventura              419,279     1,314,805     1,915,583     2,076,235     5,725,902
Yolo                 213,756       334,436       504,441       507,524     1,560,157
Yuba                  58,500       118,518       212,473       179,594       569,085
Totals           $23,602,170   $67,158,913   $93,323,124   $93,446,631  $277,530,838

Source: State Controller’s Office’s Youthful Offender Block Grant allocations for fiscal years 2007–08 through 2010–11.
Note: State law mandated that each county receive a minimum of $58,500 for fiscal year 2007–08. For each fiscal year beginning in 2008–09, counties received a minimum of $117,000.
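One plausible reading of the note is that the statutory minimum acts as a floor applied after the formula result. The sketch below illustrates that reading; it is our illustration, not Finance’s actual computation.

```python
# Sketch of the statutory minimums noted above: at least $58,500 in
# fiscal year 2007-08, and at least $117,000 in each year thereafter.

def apply_floor(formula_amount, fiscal_year):
    floor = 58_500 if fiscal_year == "2007-08" else 117_000
    return max(formula_amount, floor)

# A small county whose formula amount falls below the minimum is lifted to it.
print(apply_floor(41_200, "2009-10"))  # 117000
```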



Appendix B
JUVENILE JUSTICE STATISTICS BY COUNTY
The Department of Justice (Justice) uses the Juvenile Court and
Probation Statistical System (JCPSS) to report on county‑level
data related to juvenile justice. As we mentioned in Chapter 1,
Justice compiles the JCPSS’s data from 57 counties into reports
that aid decision makers, including the Office of the Governor
and the Legislature, in allocating resources and planning for the
future. However, Justice cannot provide assurance that the data
contained within the system are accurate. Because the JCPSS’s data
are the only data available, we present them in tables B.1 and B.2
on the following pages. However, we caution that the data may not
accurately reflect county‑level statistics and trends.
Table B.1 shows the total number of juvenile offenders across the
counties from fiscal years 2003–04 through 2010–11.16 Prior to
realignment, the State experienced a nearly 12 percent increase in
the total number of first‑time offenders who received dispositions.
After realignment, the total number of first‑time offenders who
received dispositions decreased by nearly 30 percent. Similarly,
the total number of repeat offenders who received dispositions
increased by more than 12 percent prior to realignment and
decreased by nearly 14 percent after realignment. As we discuss in
Chapter 2, the statistics presented in Table B.1 could help measure
outcomes of realignment.
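The percentage changes cited above are ordinary percent-change arithmetic. The sketch below uses hypothetical counts, not the actual JCPSS totals.

```python
# Percent change between a before-realignment and an after-realignment count.

def percent_change(before, after):
    return (after - before) / before * 100

# Hypothetical example: statewide first-time dispositions rising
# from 33,000 to 37,000 is an increase of about 12 percent.
print(round(percent_change(33_000, 37_000), 1))  # 12.1
```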
In Table B.2 on page 62, we provide information about the number
of juvenile offenders with dispositions sending them to adult court
with either a direct file or remanded to adult court disposition
from fiscal years 2003–04 through 2010–11 for each county. Before
realignment,17 the number of juvenile offenders with dispositions
sending them to adult court increased by about 72 percent for
first‑time offenders and approximately 59 percent for repeat
offenders. We discuss some of the possible reasons for this increase
in Chapter 2. However, after realignment, the number of juvenile
offenders with dispositions sending them to adult court decreased
by nearly 25 percent for first‑time offenders and 18 percent for
repeat offenders. This would appear to indicate that realignment
has not caused an increase in the number of juvenile offenders sent
to adult court.

16 For purposes of this analysis, we classified a repeat offender as any offender who has received two or more dispositions that were not dismissed.
17 The original realignment law took effect on September 1, 2007. Our analysis of the data contained in Justice’s JCPSS is based on fiscal years. Therefore, some of our analysis may contain data for July and August 2007, the two months prior to realignment’s effective date.
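The classification in footnote 16 can be sketched as follows. The disposition records and field names here are hypothetical illustrations, not JCPSS’s actual schema.

```python
# An offender counts as "repeat" once they have two or more
# dispositions that were not dismissed (footnote 16's definition).
from collections import Counter

def classify_offenders(dispositions):
    """dispositions: iterable of (offender_id, outcome) pairs."""
    counts = Counter(oid for oid, outcome in dispositions
                     if outcome != "dismissed")
    return {oid: ("repeat" if n >= 2 else "first-time")
            for oid, n in counts.items()}

sample = [(1, "wardship"), (1, "probation"),   # two non-dismissed -> repeat
          (2, "dismissed"), (2, "diversion"),  # one non-dismissed -> first-time
          (3, "probation")]
print(classify_offenders(sample))  # {1: 'repeat', 2: 'first-time', 3: 'first-time'}
```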

Table B.1
Number of First‑Time and Repeat Juvenile Offenders Who Received Dispositions in Each County
Fiscal Years 2003–04 Through 2010–11

[For each county, the table lists the number of first‑time and the number of repeat juvenile offenders who received dispositions in each fiscal year from 2003–04 through 2010–11, together with county totals. The county‑level figures did not survive machine reading of this copy; consult the original report for the data.]

Source: California State Auditor’s (state auditor) analysis of data obtained from the Department of Justice’s Juvenile Court and Probation Statistical System (JCPSS). Please refer to the Introduction’s Scope and Methodology for the state auditor’s assessment of the reliability of these data.
Note: Our analysis included those juveniles who received the following types of dispositions: direct file in adult court, diversion, probation, remanded to adult court, or wardship. Further, some offenders could be counted more than once if they received dispositions for multiple referrals.
N/A = Sierra County does not submit data to JCPSS.

Table B.2
Number of First‑Time and Repeat Juvenile Offenders Who Received Dispositions Sending Them to Adult Court
Fiscal Years 2003–04 Through 2010–11

[For each county, the table lists the number of first‑time and the number of repeat juvenile offenders who received dispositions sending them to adult court in each fiscal year from 2003–04 through 2010–11, together with county totals. The county‑level figures did not survive machine reading of this copy; consult the original report for the data.]

Source: California State Auditor’s (state auditor) analysis of data obtained from the Department of Justice’s Juvenile Court and Probation Statistical System (JCPSS). Please refer to the Introduction’s Scope and Methodology for the state auditor’s assessment of the reliability of these data.
Note: Some offenders could be counted more than once if they received dispositions for multiple referrals.
N/A = Sierra County does not submit data to JCPSS.


(Agency comments provided as text only.)
August 21, 2012
Board of State and Community Corrections
600 Bercut Drive
Sacramento, CA 95811
Ms. Elaine M. Howle, State Auditor*
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, CA 95814
Dear Ms. Howle:
Enclosed you will find the Board of State and Community Corrections’ (Board) response to the Bureau of
State Audits’ (BSA) report entitled “Juvenile Justice Realignment: Limited Information Prevents a Meaningful
Assessment of Realignment’s Effectiveness.”
The BSA identified a number of perceived shortcomings in the methods and procedures used by the Board
to collect and report expenditures and outcomes for the Youthful Offender Block Grant program. In many
instances, I concur with the BSA’s observations and if the resources were available, I would gladly address
the shortcomings, as recommended. In other instances, I respectfully disagree. Please refer to the enclosed
response to the BSA’s report for our complete analysis.
I appreciate the work and insights provided by the BSA as a result of its work on this audit. As the Board
moves forward with both juvenile and criminal justice realignment and brings the full vision of SB 92
(Chapter 36, Statutes of 2011) to fruition, I have no doubt this work will prove useful.
Should you wish to discuss our response, please feel free to contact me at (916) 445-5073.
Respectfully,
(Signed by: Jean L. Scott for Patricia Mazzilli)
PATRICIA MAZZILLI
Executive Director

*  California State Auditor’s comments begin on page 77.


Board of State and Community Corrections
Response to:
Juvenile Justice Realignment:
Limited Information Prevents a Meaningful Assessment of Realignment’s Effectiveness
(prepared by the Bureau of State Audits)
On August 8, 2011, the Joint Legislative Audit Committee approved a request from Assemblymember
Alyson Huber to conduct an audit of the Youthful Offender Block Grant (YOBG) Program. Consistent with
that action, the Bureau of State Audits (BSA) began its YOBG audit in February 2012. On August 15, 2012,
the Board of State and Community Corrections (Board) received BSA’s report of audit findings and
recommendations. The following is our response to that report.


Pursuant to the legislation that created the YOBG Program, the Board of State and Community Corrections1
(Board) has three major responsibilities: (1) to annually collect information regarding each county’s planned
YOBG expenditures; (2) to annually collect and report to the Legislature information regarding each county’s
actual YOBG expenditures; and (3) to annually collect and report to the Legislature on outcomes pertaining
to the youth who receive one or more placements, services, programs, etc. that are funded in whole or
in part by YOBG. The Board has no fiduciary responsibility with regard to YOBG expenditures and, until
July 1, 2012, when the Board was established, had no responsibility to guide policy with regard to the larger
issue of juvenile justice realignment.
The Board uses three different data collection instruments to collect information on planned YOBG
expenditures, actual YOBG expenditures, and outcomes for juveniles who are recipients of YOBG-funded
placements, services, programs, etc. Each consists of one or more EXCEL spreadsheets (formatted to look
like forms) that are sent out annually to each county. The content and format of the forms were developed
in conjunction with an Executive Steering Committee (ESC) consisting of probation chiefs, researchers, and
other juvenile justice stakeholders. Forms were pilot tested in a small number of counties and modified
accordingly prior to operational use.


The Board has met its obligations to annually collect information and report on planned and annual YOBG
expenditures and outcomes despite receiving no resources to do so. Previous board-initiated requests to
obtain administrative resources for this program have been denied. As a result, the Board has met these
obligations to report on a program that allocates approximately $93 million a year to counties by redirecting
a part-time staff person (who spends much of her time on other projects) and approximately 380 hours
annually (approximately one-fifth of a person-year) of a contract researcher. These two individuals have total
responsibility for all activities associated with YOBG data collection and reporting (form development and
maintenance), processing and checking all data submitted on the forms, data analysis, and generating all
required reports on the Board website (county reports of planned YOBG expenditures) and/or in the
annual report to the Legislature (actual expenditures and outcomes). It should be noted the Board has now
submitted two annual reports to the Legislature, yet has received no feedback regarding a desire for more or
different data than what has been provided.
BSA identified a number of perceived shortcomings in the methods and procedures used by the Board
to collect and report on YOBG expenditures and outcomes. In many instances we concur with BSA’s
observations and if the resources were available, would gladly address the shortcomings, as recommended.
In other instances, we do not agree with BSA’s observation or recommendation.
1  Previously known as the Corrections Standards Authority (CSA), California Department of Corrections and Rehabilitation (CDCR).


Planned and Actual YOBG Expenditures and Data Collection and Reporting
BSA Recommendation:
•  Create policies and procedures that include clear, comprehensive guidance to counties about all
aspects of expenditure reporting.
Board Comment:
Standard practice has been to provide training to county personnel who are responsible for submitting
data to the Board. In addition, the Board has typically developed and made available a “users manual” or
similar document to guide those responsible for data submission. This has not been possible for the YOBG
program due to staffing and budgetary constraints. In addition, the forms were disseminated at a time when
staff travel was generally prohibited. As mentioned previously, all instruments were pilot tested prior to
implementation and each year the Board takes into account both the feedback from counties and personal
observations to make enhancements to the form. The feedback received has not indicated significant
confusion on the part of the users of the forms. Also, because it was not possible to develop separate “user’s
manuals” for the forms, attempts were made to incorporate more guidance within the forms or in other
communications that accompanied distribution of the forms. Finally, Board staff manually review each
submitted form to look for missing or questionable data and automated procedures are also used to check
for missing or conflicting information.
The Board recognizes that these steps fall short of providing the kind of scrutiny that is desirable, and given
the necessary resources, the Board would welcome the opportunity to institute the steps and measures it
typically institutes when collecting survey information, as enumerated above.


BSA Recommendation:
•	 Consider verifying the counties’ data by conducting regular site visits on a rotational basis or by
employing other procedures to verify data that counties submit.
Board Comment:
While, as mentioned previously, the Board has no fiduciary responsibility with regard to county expenditures
of YOBG funds, we concur that it would be highly desirable to at least periodically conduct site visits to
verify that local records are consistent with what is reported to the Board with regard to YOBG expenditures.
Again, the Board lacks the resources to conduct such periodic reviews, and for the time being, at least, must
assume county reporting is accurate.


When counties submit their annual expenditure reports, both planned expenditures and actual
expenditures, they are reminded of the Legislative mandate to do so and are also notified that all
information submitted on the forms may be posted on the Board website and/or included in the annual
reports to the Legislature.
We are uncertain as to what BSA has in mind with regard to the Board verifying the data collected by means
of “other procedures to verify the data counties submit.”


BSA Recommendation:
•	 Publish expenditure data for each county on its website and in its annual reports.
Board Comment:

This recommendation appears to be based on the belief that the reporting of county level expenditure data
is important “…Because variances in funding can provide insights into how a county manages its juvenile
justice system…” (see page 24 of the report). The implication appears to be that knowing about funding
variances in per capita costs is particularly important in this regard.
The Board has chosen not to report such information for several reasons. First, we believe that the most
meaningful information to be gleaned from the per capita cost data is the statewide per capita costs for
different placements, services, etc. That is, from a statewide perspective, how do the per capita costs (both
YOBG per capita costs and total per capita costs) differ by type of placement, service, etc., and are YOBG
funds being used for placements, services, etc., with high or low overall per capita costs. Second, for any
given placement, service, etc., there can be considerable differences in county per capita costs. There may be
very legitimate reasons for these differences, but without additional information (information not available
from the data collected by the Board), these reasons cannot be discerned. Thus, in our opinion, to compare
county differences in per capita costs without having this additional information can erroneously lead to
inferences that counties with higher per capita costs are somehow doing a poorer job of managing costs.
Third, to publish county per capita costs across all placements, services, etc., that is, one overall per capita
cost figure for each county, can easily lead to similar erroneous conclusions about county management
of funds, given that some counties may use funds for placements, programs, etc. that have relatively high
overall costs, while other counties may be using funds for placements, programs, etc., that have relatively
low overall costs.
In addition to the above, we take issue with the premise that how a county uses YOBG funds should be
used to draw any conclusions about how a county manages its juvenile justice system (see above quote
from page 24 of the report).
Performance Outcomes
Many of the perceived deficiencies and recommendations for change described above with regard to
collecting and reporting on YOBG expenditure data are similarly made with reference to the performance
outcome data. Before commenting on the specifics of these criticisms and recommendations, we believe
it is necessary to provide some background information on the circumstances and steps leading to the
development of these measures.
The report correctly states that the Board is responsible for reporting to the Legislature regarding outcomes
for juveniles who receive YOBG-funded services and programs (as well as placements and strategies).
To address this mandate, the Board convened the previously mentioned Executive Steering Committee (ESC)
in October 2009. Membership on the committee included
individuals who played an instrumental part in drafting the YOBG legislation. At the initial meeting, the ESC
was given a demonstration of the on-line reporting system developed by the Board pursuant to passage
of the Juvenile Justice Crime Prevention Act in 2000. The reporting system for JJCPA was demonstrated, in
part, because language in the enabling legislation for YOBG (Welfare & Institutions Code (WIC) §1961(c)(2)
(A)-(D)) closely parallels that found in the enabling legislation for JJCPA (Government Code §30061-30064).
Thus, an apparent presumption by the crafters of the YOBG legislation was that the reporting of outcomes
for YOBG would closely parallel that of JJCPA – with respect to both the methods and outcomes that would
be reported upon.
However, following demonstration of the JJCPA reporting system, the ESC concluded that adoption of
this system for YOBG was neither appropriate nor practical. It was deemed inappropriate because JJCPA,
unlike YOBG, requires that all funds be spent exclusively on programs shown previously to be effective in
reducing juvenile crime and delinquency. Furthermore, for all such programs counties are also required to
report on a group of youth comparable to the youth who received program services as a means of assessing
program impact. In contrast, there is no requirement that YOBG funds be spent on programs (and the data
show that the majority of YOBG funds are not used for this purpose), and the total number of juveniles
expected to receive one or more services, programs, placements, or strategies funded in whole or in part
by YOBG in any given year was anticipated to be substantial. The burden on counties to report annually on
such a large group was considered by the ESC members to be unreasonable, unattainable, and of limited
value given the wide range of permissible uses of YOBG funds and the lack of any reference groups for
comparative purposes.
Given all of the above, an alternative to collecting outcome information for all juveniles who
directly benefitted from YOBG funding was sought. The most desirable alternative would have been to collect this
information from a representative sample of all such youth. However, absent any available information on
the individuals in the statewide population of this group, the only way to draw such a sample would be
to have each county identify all the individual juveniles for whom they spent YOBG dollars and submit this
individual case level data to the Board. Upon receipt of this information from all counties, the Board would
then randomly select a representative statewide sample and each county would be notified of the juveniles
within their county for whom they must report outcome data. This process would have to be repeated
each year to comply with the annual reporting responsibilities of the Board. Because of the time and cost of
having each county annually identify all youth within their county for whom YOBG funds were spent, this
approach to sampling, i.e., collecting outcome information for a representative sample of all juveniles who
received some YOBG funding, was also deemed to be unworkable.
As a result, the ESC was informed of the Juvenile Court & Probation Statistical System (JCPSS) database
maintained by the Department of Justice. As described in BSA’s report, this database contains information
on all juveniles who are referred to probation, as well as the disposition of all such juveniles who are
found to have committed a criminal offense. The ESC was further informed that, using this database, the
population of all juveniles with adjudicated offenses could be identified and that, from this group, a
random statewide sample could be drawn. Then each county could be notified of the
individuals from the statewide sample who were from their county, and be asked to provide outcome and
other information for each youth. While it was recognized there was no guarantee that YOBG funds would
have been spent on the youth who were randomly selected, this approach was adopted, and in the hope
of maximizing the potential that any given youth in the sample had benefitted from YOBG funding, the
decision was made to base the sample on the population of youth with adjudicated felonies. The decision
to focus on these offenders was also made in the belief that this population was most likely to be made
up of the types of offenders who previously would have been candidates for DJJ commitment, and in the
knowledge that WIC §1951 requires counties to use YOBG funds, in part, to provide appropriate services to
such offenders.
Given all of the above, and in consideration of the authority given the Board in WIC §1961(5)(e) to modify
the performance outcome measures cited in §1961(c)(2)(A)-(D) upon determination that counties are
substantially unable to provide this information, the ESC approved an approach wherein every year a
random sample of 1,000 juveniles with felony adjudications during the prior year is drawn from the JCPSS
database. Counties then report on a limited number of outcomes for the youth during the one-year period
from the disposition date of the youth’s adjudicated felony. The proportion of youth sampled from each
county is based on the proportion of total statewide YOBG funds received by the county, which, in turn,
closely parallels the proportion of all juveniles in the state with adjudicated felonies. Thus, the total sample
of 1,000 is made up largely of juveniles in larger counties, which tend to have the greatest number of
felony-adjudicated juveniles. This approach to sampling was considered optimum for providing a statewide
approximation of outcomes for juvenile offenders most likely to be subject to DJJ commitment prior to
enactment of the YOBG program, and it was never intended that the resultant data would be used to
examine outcomes at the county level.
The ESC also agreed on collecting data for a limited number of outcomes specific to subsequent felony
adjudications/commitments, educational status and achievement, and completion of probation; as well
as the collection of certain background information items for each youth at the time of felony disposition
(school enrollment, employment, record of substance abuse, etc.). In addition, for each youth, information
was collected on whether they received each of over 40 placements, programs or services during the
one‑year period from date of disposition and whether they received any such placement, program or service
that was funded in part or in whole by YOBG.
In adopting this approach it was acknowledged that the conclusions that could be drawn from the data
would be limited, as elaborated upon in the first two annual reports that have been submitted to the
Legislature, and that year-to-year comparisons of the data would most likely be of greatest value.


The Board never intended, nor does it advocate, that the information contained in the annual reports
it submits to the Legislature for the YOBG program be used to draw conclusions about the “outcomes
of realignment.”
BSA Recommendation:
•	 Create policies and procedures that include clear, comprehensive guidance to counties about all
aspects of outcome reporting.
Board Comment:
The same general comments made in response to this recommendation as it pertained to YOBG
expenditures are applicable here.
In addition, the Board would like to comment on a criticism unique to performance outcome data.
Specifically, the Board is criticized for collecting data on whether a youth received each of a large number
of services without providing guidance to the counties as to the threshold that counties should use in
determining whether a given service was provided (see page 27 of the report). Consequently, it is noted that
counties use different criteria for reporting whether a specific service was received and it is recommended
that the Board provide counties with a standardized definition of what constitutes receipt of each specific
type of service.


The Board is aware that absent standardized definitions, counties can and will use varying criteria for
reporting whether a specific service was received. However, the Board believes the BSA recommendation
to develop such standardized definitions is neither practical nor necessary. It is impractical given that vast
differences can exist with regard to the specific nature and scope of any given category of service. Using
the report’s example of drug treatment programs, significant county variation in program scope and
duration most likely exists as a function of factors such as the type of drug(s) that are the focus of treatment,
the underlying approaches/philosophies of the programs, and location of the programs, e.g., whether or
not provided within a secure detention facility. The Board collects data on 35 different services. To do as
recommended by the BSA would require the Board to develop a separate definition of receipt of service
for each of these 35 different services with the goal of crafting each definition in a manner that would be
meaningful and acceptable to all 58 counties.
Aside from the enormity and potential infeasibility of accomplishing this task, the perceived need for such
definitions seems to be rooted in the desire to use the service received data for purposes of conducting
county-to-county comparisons. The intended use of this data is not to collect detailed
information that can be used to hone in on county differences but rather to obtain very basic information
that can be used to provide a statewide perspective of where YOBG funds are being spent. From this
perspective, the value of developing standardized operational definitions of what is considered receipt
of each of 35 specific types of service is not considered sufficient to justify the time and effort that would
be required.

BSA Recommendation:
•	 Consider verifying the counties’ data by conducting regular site visits on a rotational basis or by employing
other procedures to verify data that counties submit.
Board Comment:
Again, the same general comments in response to this recommendation for YOBG expenditures are also
applicable to performance outcomes.
In addition, it should be noted that there is reason to question whether Board staff could legally review
local files pertaining to any individual juvenile’s mental health, use of medications, or criminal history – all of
which are reported upon in the form used to collect performance outcome data. Confidentiality procedures
are built into the data collection process to preclude the possibility of Board staff being able to associate any
outcome and associated data with the name of a juvenile. Assuming local probation staff would be willing
to provide Board staff with access to local files for purposes of verifying the above referenced information, it
would most likely be necessary for local staff to scrub all names before granting this access.


BSA Recommendation:
•	 Publish performance outcome data for each county on its website and in its annual reports.
Board Comment:
We question the wisdom of this recommendation given the overall size and nature of the annual
performance outcome sample (approximately 1,000 adjudicated felons); the intended use of the
performance outcome data, i.e., to provide a statewide perspective of outcomes and other variables
related to the expenditure of YOBG funds; and the sampling plan used to accomplish this purpose, i.e.,
to sample from each county in a way that reflects the number of juveniles for which it receives YOBG
funds, which in turn takes into account the number of juveniles from each county with felony dispositions.
To do so would in some instances result in publishing outcome results in a given year for a given county
based on one juvenile. It is difficult to comprehend how this information could be useful for purposes
of assessing trends within and between counties, especially when one considers that for a majority of
counties (approximately 30) outcome results are based on five or fewer juveniles. For example, in Table 5 of
BSA’s report, the performance outcome results reported for Yuba County are based on only two juveniles.
Furthermore, on page 21 of BSA’s report the reader is warned about drawing conclusions about the
differences between Los Angeles County and Sacramento County with respect to offenders who received
YOBG-funded services versus offenders who did not receive YOBG-funded services given that Los Angeles
County spends more of its YOBG funds on high risk offenders, while Sacramento County uses YOBG funds
on juvenile offenders at various risk levels. This admonition seems to argue against reporting performance
outcome results by county for purposes of assessing trends within and between counties. Perhaps this
recommendation is predicated on the condition that in the future more detailed performance outcome and
associated data be collected on much larger numbers of youth?
Other BSA Criticisms:
1.  The Board has never taken any enforcement action against counties because the Board believes the State
Controller’s Office is the fiduciary agent for the block grant.
Board Comment:


While it is certainly factual that the Board has no fiduciary responsibility for the YOBG program, that has
nothing to do with whether or not enforcement action has been or would be taken. In fact, the Board has not
pursued enforcement action against any of the counties because it has had no reason to do so. Should the
Board ever become aware that a county is using YOBG funds inappropriately, we would certainly work with
the appropriate control agencies to enforce all provisions of the law.
2.	 Because outcome data are collected only for high risk juveniles, i.e., juveniles with a recent adjudication
for a felony offense, the results reported for outcomes are not reflective of the results one might expect
if based on the full range of youth who receive YOBG services, including less serious offenders and at-risk
youth, and are subject to misinterpretation with regard to effects of realignment.
Board Comment:


As to the first part of this criticism, we have detailed the shortcomings and reasons for adoption of the
sampling process by which youth are selected for purposes of collecting outcome data. We do not purport
that the results reported for outcomes are reflective of the performance outcome results one might obtain
if based on the full range of youth who benefit from YOBG funds. We have been careful to describe that all
results are reported for youth (juveniles with recent felony adjudicated offenses) who would have most likely
been considered candidates for DJJ previously.
With regard to the second part of this criticism, we appreciate the potential for misinterpretation but wish
to reiterate that the focus of the Board’s work has been on collecting and reporting findings specific to the
expenditure of YOBG funds. We have never intended nor purported that the work we have done should be
construed as reflecting on the overall effects of realignment.
3.	 The Board’s annual reports to the Legislature on the YOBG program are characterized in the report as
using a flawed methodology in which outcomes results for youth who were the beneficiaries of some
YOBG funding are compared with the outcome results for youth who were not the beneficiaries of
YOBG funding, which could mislead decision makers about the effectiveness of realignment by making
it appear that realignment has not been effective. As a case in point, the results in the first two annual
reports to the Legislature have shown that a significantly higher percentage of YOBG-funded youth
received a new felony adjudication, “…which implies that the block grant actually increases the likelihood
that a juvenile offender will reoffend when a more plausible explanation is that offenders who pose
a higher risk of recidivism are more likely to receive block grant services.” (see page 4 of the report).
Additionally, “…Although the reports state that caution must be taken in drawing conclusions regarding
outcome difference for juvenile offenders who receive block grant services and those who do not…”
(see page 4 of the report), the Board should cease from making such comparisons because the results
could mislead decision makers regarding the effectiveness of realignment.
Board Comment:
The concern over our methodology is that the results will mislead policy makers into drawing erroneous
conclusions about the effects of realignment. As stated previously, the focus of the Board’s annual reports
to the Legislature is specific to the YOBG program, and not the overall effects of realignment. In fact, this
is acknowledged by BSA on page 3 of its report (“State law authorizes the Board of State and Community
Corrections (board) to … and requires the board to issue annual reports to the Legislature regarding the
outcomes for juveniles who receive block grant-funded services and programs.”) BSA further concludes
on page 3 that the Board’s reports should not be used to draw conclusions about realignment. Specifically,
BSA states that YOBG reports should not be used for this purpose because of, among other things,
the flawed methodology of comparing outcomes for youth supported by YOBG funds with youth not
supported by YOBG funds. In other words, the methodology is flawed because it could mislead the reader
about the outcomes of realignment even though it is acknowledged by both BSA and the Board that the
annual YOBG reports are not intended to be used to draw conclusions about realignment. BSA also seems
to imply that the Board’s annual reports to the Legislature should include an assessment of the outcomes of
realignment: “Although the law does not specifically require the board’s reports to include an assessment
of realignment, because the board is the only state administering body referenced in the law that realigned
juvenile offenders, we would expect its annual reports would give the Legislature information regarding the
outcomes of realignment.”
We concur that there is an expectation that the Board will provide policy makers with information
germane to the overall effects of juvenile realignment. We are in the initial planning process for meeting
this challenge. However, this responsibility just became effective on July 1, 2012, with the establishment of the
Board in Penal Code §6024. Further, this newly assigned responsibility does not take away from the Board’s
ongoing responsibility to report annually on the YOBG program. We do not concur that we should cease
from comparing outcomes for youth supported by YOBG funds with youth who are not supported by YOBG
funds when reporting on this program due to concerns that to do so could mislead policy makers about the
overall effects of realignment.
It is also important to note that the comparison of new felony adjudication rates for youth supported
and not supported by YOBG funds is just one of many comparisons provided in our annual reports to
the Legislature on the YOBG program. We also report on the outcomes of educational enrollment and
achievement rates (a significantly higher percentage of youth supported by YOBG funds are enrolled in
school sometime during the one year evaluation period), probation status (no differences in percentage
of youth on probation at the end of the one year evaluation period), and new adult felony conviction
rates (higher for youth not funded by YOBG in one of two years). We also look at baseline differences in the
characteristics of the two groups (a greater percentage of youth supported by YOBG funds have substance
abuse indicated in their file; a greater percentage of youth not supported by YOBG funds have a mental
health diagnosis/symptoms indicated in their file); differences in the number and types of placements
and services received by the two groups (youth supported by YOBG funds receive a significantly greater
number of placements and services); the relationships between baseline differences and outcomes (e.g., youth
with substance abuse indicated in their file are more likely to have a new felony adjudication); and the
relationships between service levels and outcomes (those who receive more direct services are more likely
to be enrolled in school, to be on probation at the end of the year, and to receive a new felony adjudication
during the year). We also report on year-to-year comparisons (new felony adjudication rates slightly lower
for both youth funded and not funded by YOBG in FY 2010-11 compared to FY 2009-10). All of these
analyses are conducted in an attempt to uncover patterns in the relationships between who receives YOBG
funding support, what it means to receive YOBG support (in terms of numbers and types of services), and
an admittedly limited number of educational and criminal justice outcomes. And as duly noted by BSA, we
acknowledge the limitations of the data we are working with and caution the reader against drawing any
firm conclusions based on the results we report, especially as they relate to outcome differences between
youth funded and not funded by YOBG.
We would also be remiss if we did not mention that our annual reports to the Legislature also give
considerable attention to county-reported expenditures of YOBG funds; in this regard, counties
have consistently reported that approximately three-fourths of the funds are used to pay staff salaries and
benefits, and the majority of funds (approximately 70%) are spent in conjunction with placements (camps,
juvenile halls, etc.) and not direct services.3


And finally, it should be mentioned that even with the acknowledged shortcomings of reporting on only
youth with felony adjudicated offenses in our annual reports on the YOBG program (and thus not including
lesser offenders and non-offenders who are also supported by YOBG), the research literature has consistently
shown that program effects are greatest for serious offenders. Thus, the youth we are reporting on in our
annual reports to the Legislature (felony adjudicated youth) are the type of youth for whom programs have
the greatest potential to impact outcomes.
BSA Observations and Recommendations Regarding Assessment of Juvenile Justice Realignment:


While fundamentally disagreeing with the notion that the Board’s responsibilities for reporting on the
expenditures and outcomes for the YOBG program are one and the same as assessing the impact of juvenile
realignment (or that the results reported for YOBG program were ever intended to be used [or should
be used] to assess the overall impact of juvenile realignment), we nevertheless take great interest in the
observations and recommendations made by the BSA with regard to assessing juvenile justice realignment,
especially in light of the Board’s recent mandate (effective July 1, 2012) to address this topic along with
criminal justice (adult offender) realignment.
In this regard, BSA presents several different types of data from a variety of sources, including county-specific
data based on on-site interviews and observations; longitudinal data extracted from the JCPSS reporting
system on first time offenders, repeat offenders, and offenders sent to adult court; and longitudinal data
on net costs to the State related to the juvenile justice system. BSA was careful to point out the limitations
of drawing conclusions based on this data, especially the crime statistics, to note that the available data are not
sufficient to assess outcomes of realignment, and to point to the need for the Legislature to clarify the intended
goals of realignment. All of this work, and the insights drawn from it, will be extremely useful to the
Board as it tackles its new mandate to assess and report on juvenile realignment. We appreciate BSA’s efforts
and concur there is a fundamental need to reach agreement on the goals of juvenile realignment.

2  It is acknowledged that YOBG expenditures reported for placements undoubtedly include expenditures for other than
   strictly custodial operations.

BSA recommends on page 8 of the report that “…To ensure that it has the information necessary to
meaningfully assess the outcomes of juvenile justice realignment, the Legislature should consider amending
state law to require counties to collect and report performance outcomes and expenditures related to
juvenile justice as a condition of receiving block grant funds. In addition, the Legislature should require the
board to collect and report these data in its annual reports, rather than outcomes and expenditures solely
for the block grant.”
In our opinion, it would be premature to adopt this recommendation for two fundamental reasons. First
and foremost, absent clear agreement on the goals and objectives of realignment, we have concerns about
putting into law the scope and method by which juvenile realignment will be evaluated. Second, we have
concerns about the SB 90 implications of such an action.4 As to this second point, BSA notes in its report
that a variety of data elements are collected as part of the Board’s administration of “10 other state and
federal grant programs.” While we do administer other programs, the funding does not go to all counties
and in some cases the funding goes to only one small program within a county. To assume the availability of
countywide data based on the existence of other programs is erroneous.


Summary
While we believe all of the above noted issues are significant, the following provides a brief summary of the
three most pervasive areas of concern:
1.	 Many of BSA’s criticisms of the Board’s YOBG reporting rely upon an assumption of responsibility that far
exceeds the legislative mandate. The Board has met all of its mandated reporting requirements. Although
the Board now has a mandate to provide leadership for statewide realignment efforts, that mandate took
effect only on July 1, 2012.
2.	 BSA’s assertion that counties should report data for their juvenile justice systems as a whole exceeds
not only the legislative mandate but also the capability of most counties. The legislation clearly requires
counties to annually report expenditure and outcome data related to YOBG and similarly requires the
Board to report that data to the Legislature. However, the notion that either counties or the Board would
have the data needed to report on juvenile justice systems as a whole is unfounded. Moreover, the
suggestion that the Board could measure the success of realignment given the limited scope of the YOBG
legislation, the lack of data at both the State and local levels, and the lack of defined goals for the
program, is misguided.
3.	 Despite repeated requests, the Board has never received any funding to support administration of the
YOBG program or expenditures. Approximately $93 million is allocated to counties every year, yet not
$1 has been provided for administration or oversight of this program. Through a limited redirection of
existing resources, the Board has been able to meet its mandate. To the extent there is interest in the
Board performing additional work related to the YOBG program, the allocation of administrative funds is
absolutely critical.


As noted above, we appreciate the work and insights provided by BSA as a result of its work on this audit. As
we move forward with both juvenile justice and criminal justice (adult offender) realignment and more fully
bring the vision of SB 92 to fruition, this work will undoubtedly prove useful.

3  Elsewhere in the report (page 23) BSA suggests that given the existence of 10 other state and federal grant programs administered by
    the Board, the workload that would be placed on counties to meet this requirement, at least as it relates to reporting certain outcomes for
    all juveniles who are served by block grant funds, might be minimal (in fact, counties might not need to collect any additional information).
    We have serious doubts that this would be the case.



Comments
CALIFORNIA STATE AUDITOR’S COMMENTS ON
THE RESPONSE FROM THE BOARD OF STATE AND
COMMUNITY CORRECTIONS
To provide clarity and perspective, we are commenting on the
Board of State and Community Corrections’ (board) response to
our audit. The numbers below correspond to the numbers we have
placed in the margin of the board’s response.
We acknowledge on page 1 of our report that the law does not
specifically require the board’s reports to include an assessment of
the outcomes of juvenile justice realignment. However, as we also
indicate on page 1, because the board is the only state administering
body referenced in the law that realigned juvenile offenders, we
would expect that its annual reports would give the Legislature
information with which to make such an assessment. Furthermore,
we are puzzled by the board’s reluctance to pursue identifying and
reporting information related to realignment considering that the
board’s mission, as of July 1, 2012, includes providing statewide
leadership, coordination, and technical assistance to promote
effective state and local efforts and partnerships in California’s adult
and juvenile criminal justice system.

1

We are aware of only one formal board-initiated request for
resources. However, the board made this request before state law
changed in 2009 to require the board to report expenditures and
performance outcomes to the Legislature and post them on its
Web site. We are unaware of any other formal board request for
funding to administer data collection and reporting for the block
grant, or for verifying the data that counties submit.

2

Although the board notes that it concurs with our observations,
it appears that the board does not intend to implement any of the
recommendations referenced in its response. This implies that
the board believes its current practices are adequate, when, as our
report concludes, they are not. Moreover, the board fails to address
other recommendations of our audit. Therefore, we look forward to
the board’s 60-day, six-month, and one-year updates on whether it
is making progress in implementing our recommendations.

3

Although the board asserts that the feedback it has received does
not indicate significant confusion on the part of the counties, we
note on page 29 of our report that three of the four counties we
visited submitted inaccurate data to the board. Because the board
compiles this inaccurate data in its annual report, users of that
report may reach incorrect conclusions related to the counties’ use
of block grant funds.

4

California State Auditor Report 2011-129

September 2012

5

We specifically discussed the intent of this recommendation with
the board’s executive director and staff during our August 2012
exit conference. To reiterate, because our review revealed that
counties had difficulty providing us documentation to support the
information they submitted to the board, one possible approach
would be for the board to ask counties to retain and, upon the
board’s request, submit supporting documents for some of
the information they provide. By exploring the viability of this
approach and other strategies, the board may be able to improve the
quality of data that counties submit without incurring significant
additional cost. Finally, because counties are required to submit
performance outcome and expenditure information, we believe that
the board should be concerned about the quality of county data
and should take steps to ensure that the information it receives and
subsequently reports is accurate.

6

The board is correct when it states that our recommendation
is based on the belief that the reporting of county level data is
important. However, we are not alone in this belief. In fact, as we
acknowledge on page 26, state law requires the board to prepare
and make available to the public on its Web site summaries of the
annual performance outcomes that counties submit. However,
currently the board only posts limited county-level data on its
Web site. Although we believe that collecting and reporting data
related to counties’ entire juvenile justice systems would be ideal,
the board can increase the amount of information available to
assess the outcomes of realignment by publishing more of the
county-level data on its Web site that it currently receives, as we
recommend on page 38.

7

While preparing our draft report for publication, page numbers
shifted. Therefore, the page numbers that the board cites
throughout its response do not correspond to the page numbers in
our final report.

8

We are dismayed by the board’s reluctance to provide potentially
valuable information regarding the effectiveness of juvenile justice
realignment to the Legislature and other stakeholders. Although
we agree that there may be legitimate reasons for the differences
in per capita costs across counties and that caution must be
taken in making inferences, the board’s approach of primarily
reporting aggregate data does not give users of its reports insight
into the different approaches that counties use to provide
juvenile justice services. Moreover, state law requires counties
to report to the board the annual per capita costs of block grant
programs, placements, strategies, or activities as well as requires
the board to prepare and post summaries of county reports on its
Web site. Thus, the board already has the information available
that would mitigate its concern about drawing inferences from
per capita information.
The board misunderstands our statement. We do not say that
conclusions can be drawn about how a county manages its
juvenile justice system through an examination of how a county
uses block grant funds alone. To the contrary, we recommend
on pages 37 and 38 that to ensure that it has the information
necessary to meaningfully assess the outcomes of juvenile justice
realignment, the Legislature should consider amending state law
to require counties to collect and report countywide performance
outcomes and expenditures related to juvenile justice, rather
than outcomes and expenditures solely for the block grant.

9

We stand by our conclusion that the board's decision to report
county performance outcomes for only a sample of juvenile
offenders who committed felonies results in misleading
information. If designed appropriately, a sample of 1,000 juvenile
offenders would likely be large enough to make meaningful
inferences about counties. However, a significant shortcoming of
the board’s sampling method is that it is not designed to gather
information about how most counties choose to spend their block
grant funds. In particular, our review revealed that counties use
block grant funds to provide services to juvenile offenders at various
risk levels, not just those who have committed felonies. Specifically,
on page 25 of our report, we indicate that 44 counties reported
spending block grant funds on programs, placements, or services
that serve juvenile offenders with misdemeanors as opposed to
serving only those offenders with felonies.

10
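The sample-size point above can be illustrated with a back-of-the-envelope calculation. The sketch below assumes a simple random sample and a proportion-type outcome measure; the board's actual sampling design is not specified here, so this is only a rough illustration of why roughly 1,000 offenders statewide can support meaningful inferences while the per-county slices of such a sample may not.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% confidence half-width for a proportion
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Statewide: 1,000 sampled offenders yields about a +/-3 point margin.
print(round(margin_of_error(1000), 3))  # 0.031

# Per county: an even split across California's 58 counties leaves
# about 17 offenders each, widening the margin to roughly +/-24 points.
print(round(margin_of_error(1000 // 58), 3))  # 0.238
```

The contrast makes the audit's point concrete: the statewide sample size is adequate, but carving it into county-level slices leaves too few cases per county to characterize how individual counties spend their block grant funds.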

Contrary to the board’s assertion, we believe that it is both practical
and—as our audit results demonstrate—very necessary to provide
basic guidance to counties on how to report performance outcome
data in a consistent and meaningful manner. Moreover, as we
indicate on page 28, although the board does provide instructions
and frequently asked questions to assist counties when reporting
on performance outcomes, it does not provide guidance for basic
issues such as defining when a juvenile has completed a service
or whether a service directly impacts juvenile offenders. Further,
the board exaggerates our position when it asserts that we recommend
guidance for all 35 service types. The board could provide
additional guidance applicable to most services and provide several
examples to show the intent of its guidance. Without additional
guidance to the counties, the board will continue to collect
inconsistent and dissimilar information, which results in reports
that can be misleading.

11

12

The board is correct that there are potential legal considerations
related to it reviewing juvenile offender records. However, the
board could overcome this potential obstacle by requesting that
counties provide it with redacted copies of the records that counties
used to report performance outcomes. Further, as we recommend
on page 38, the board could decide to verify county data by
conducting regular site visits on a rotating basis or by employing
other procedures to verify data that counties submit.

13

We stand by our recommendation. As noted in point 2, the 2009
law change required the board to make county-level information
available on its Web site. Just because the board’s sampling method
provides limited information about each county does not mean that
the board should not provide this information to the public. If the
board does not believe that its current sampling method is adequate
for this purpose, it should modify the sampling method to gather
more meaningful information from the counties.

14

We agree that the board’s sample design results in performance
outcomes that may not be useful. Thus, as noted on page 38, we
recommend that the board work with the committee that established
performance outcome measures for the block grant and the counties to
determine the data that counties should report, while keeping in mind
the data that counties already collect to satisfy the requirements of
other grants that the board administers.

15

The board misinterprets the example on pages 23 and 24 of our
report in which we caution against drawing conclusions about
the differences between performance outcomes for Los Angeles
and Sacramento counties. The example is intended to show that
incorrect conclusions can be reached on the performance outcomes
that the board reports. The weakness of the board’s current
approach is that it reports only on juvenile offenders who commit
felonies but does not take into consideration that counties use block
grant funds for juvenile offenders at various risk levels.

16

To more thoroughly explain the board’s position, we clarified our
text under Objective 9 in Table 3 to acknowledge that the board
believes it has had no reason to take enforcement action against
the counties.

17

The board incorrectly infers that our most significant concern with
its reporting is the comparison of performance outcomes between
juvenile offenders who did or did not receive block grant‑funded
services. Rather, this concern is one among many. Specifically,
we also note concerns with the board’s use of only a sample of
juvenile offenders who committed felonies on page 25, insufficient
guidance to counties on page 28, and inadequate verification of
county-reported data on page 29. As a result, we concluded that the
board’s overall reporting methodology is flawed and the results are
potentially misleading.
It is perplexing that the board continues to believe that comparing
outcomes between juveniles who receive block grant-funded
services and those who do not is useful and valid. As we note on
page 23, if the board did not intend for the Legislature to draw
conclusions from these comparisons, we question why it elected to
present the comparisons at all, especially given that the results can
be misleading. Moreover, by continuing to make these comparisons,
the board is missing an opportunity to improve the usefulness of
its reports. Finally, according to the 2012 law change, the board is
now charged with providing statewide leadership, coordination,
and technical assistance to promote effective state and local efforts
and partnerships in California’s adult and juvenile criminal justice
system. Given this law change, we firmly believe the board will have
an integral role to assist stakeholders in assessing the outcomes
of realignment.

18

Contrary to the board’s assertion, we not only recommend that the
Legislature adopt such goals, but on page 53 we also recommend
that the board assist the Legislature in this effort by working with
counties and stakeholders in proposing performance outcome goals
to measure the success of realignment.

19

We are not suggesting that counties presently capture and report
every aspect of their juvenile justice systems. Rather, by reporting
broader information that is currently available, we believe the board
can present better, more complete information about the outcomes
of realignment. Further, we disagree with the board’s assertion
that neither counties nor the board has the data needed to report
on counties’ entire juvenile justice systems. As we indicated on
page 33, counties already report several pieces of key information
for the 10 other state and federal grant programs the board
administers. One major state grant, the Juvenile Justice Crime
Prevention Act, requires counties to report certain countywide
statistics such as the total number of arrests. Moreover, we
recommend on page 38 that the board work with counties and
other relevant stakeholders to determine what data is currently
available to minimize the potential for creating a state mandate.

20

(Agency comments provided as text only.)
August 21, 2012
California Department of Corrections and Rehabilitation
P.O. Box 942883
Sacramento, CA 94283-0001
Ms. Elaine M. Howle, State Auditor
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, CA 95814
Dear Ms. Howle:
This letter serves as the California Department of Corrections and Rehabilitation’s (CDCR) response relative to
the Bureau of State Audits’ (BSA) draft report titled: Juvenile Justice Realignment: Limited Information Prevents a
Meaningful Assessment of Realignment Effectiveness.
CDCR recognizes the value in having the most complete information concerning juvenile offenders who are
sent directly to adult prisons, including specific offense dates. CDCR agrees that it should obtain complete
offense dates from the courts—if possible—as your report recommends and will continue to work with
county courts to do so.
CDCR would like to thank BSA for the opportunity to respond to this draft report. Should you have any
questions or concerns, please contact Kim Holt, Operations Manager, at (916) 255-2701 or Tami Schrock,
External Audits Coordinator at (916) 255-2644.
Sincerely,
(Signed by: Lee E. Seale)
LEE E. SEALE
Director
Division of Internal Oversight and Research


Department of Justice
P.O. Box 944255
Sacramento, CA 94244-2550
August 21, 2012			
Elaine M. Howle, CPA*
State Auditor
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, CA 95814
Re:   BSA Report 2011-129
Dear Ms. Howle,
The Department of Justice (DOJ) has reviewed the Bureau of State Audits’ (BSA) draft report titled “Juvenile
Justice Realignment: Limited Information Prevents a Meaningful Assessment of Realignment’s Effectiveness” and
appreciates the opportunity to respond to the report.
Based on the review of DOJ’s criminal history system and the Juvenile Court and Probation Statistical
System (JCPSS), BSA determined that DOJ could do more to ensure juvenile justice data are accurate
and reliable for assessing certain outcomes of realignment. In completing this review, DOJ appreciates
BSA’s recognition that DOJ did not design its JCPSS and criminal history system to track the outcomes of
realignment or to track or assess statewide trends.
JCPSS is designed to and succeeds in maintaining all of the data required by Welfare and Institutions
Code section 13012. This section requires retention of information on the amount and types of offenses,
the personal and social characteristics of criminals and delinquents, all administrative actions taken by law
enforcement, judicial, penal, and correctional agencies related to these individuals, and the number of
citizens’ complaints received by law enforcement agencies of felonies and misdemeanors.
In response to the BSA’s recommendations identified in the report, DOJ submits the following responses:
BSA Recommendation: To ensure the accuracy and completeness of the data the counties submit into the JCPSS,
Justice should follow its procedure to send annual summaries of the JCPSS data to the counties for review and
to conduct occasional field audits of counties’ records.
DOJ Response: DOJ has and will continue to send annual summaries of the JCPSS data to counties for
their review and will obtain a confirmation from each county that the summaries have been received
and accurately reflect the data submitted. DOJ will also continue to conduct semi-annual surveys of
the counties to confirm the number and completeness of cases for juveniles sent to adult court. These
confirmations will be maintained at DOJ.
The JCPSS Users Manual, which cites the field audit function, was originally prepared in 2002 in
cooperation with DOJ’s JCPSS Advisory Committee. The JCPSS Users Manual reflects how JCPSS
would be implemented based on ideas of the users group. It was not intended to be an inflexible set

*  California State Auditor’s comments appear on page 89.

1


2

of requirements. Since the field audit function is not considered to be DOJ policy, the JCPSS Users Manual
will be revised to delete the field audit language.
BSA Recommendation: To ensure that its criminal history system contains complete and accurate data related to
juvenile offenders, Justice should do the following:
•    Implement a process to ensure staff enter data accurately into the system.
•    Implement a procedure similar to the one it employs for the JCPSS to verify the accuracy of information the
counties submit.
DOJ Response: With regard to the first bullet above, DOJ will revise its written procedures and provide
follow up training to staff regarding the process of manually updating DOJ’s criminal history system
with juvenile offender information provided to DOJ in paper format. Additionally, while there are separate
juvenile and adult disposition codes already in place for the majority of criminal disposition reporting
scenarios, some codes are interchangeable between adult and juvenile actions. DOJ will address those
interchangeable codes and seek to establish additional disposition reporting codes for juveniles that will
not overlap with the adult disposition codes.

3

4

5

It is important to note that even with the updated procedures and training of DOJ staff cited above, if
criminal justice agencies submit juvenile disposition information electronically through DOJ’s Automated
Tape Disposition Reporting (ATDR) system, it will still be inaccurate. The ATDR system was created for
reporting adult dispositions to DOJ’s criminal history system and will accept only adult disposition codes.
If an agency were to submit a juvenile disposition via this electronic system, they would have only adult
disposition reporting codes available to them. Thus, the criminal history record for the juvenile would
reflect an adult disposition. This electronic system does not have the ability to compare birth dates with
adult disposition codes to ensure adult codes are not being incorrectly used.
With regard to the second bullet above, DOJ always strives to maintain accurate and timely criminal
history information. DOJ is the statutorily mandated repository of criminal history information submitted
by criminal justice agencies. However, DOJ must rely on these agencies to submit the information
from their records in an accurate and timely manner. DOJ staff will continue to contact an agency for
clarification if the forwarded information cannot be reasonably updated to the criminal history system,
but it would not be appropriate for DOJ to audit local agencies or courts as to how they arrived at the
reported disposition in a case. 	
Finally, DOJ would like to address two additional issues from the report. First, according to Chapter 1 of
the audit report, “[t]he program manager noted that the modifications to the JCPSS that would be necessary
to track statewide statistics would add significant costs and require Justice to comply with new state and
federal laws regarding the collection of such data.” This statement incorrectly implies that DOJ’s methods
do not comply with current state and federal laws regarding the collection of statewide statistics data.
Modifying JCPSS to incorporate the use of biometric information to facilitate identifying and tracking first
and repeat juvenile offenders would be costly and would trigger additional new state and federal laws
regarding the collection, storage, and dissemination of biometric data.


The second issue is the report’s repeated description of the JCPSS data as being poor quality, inaccurate,
unreliable and of limited usefulness. JCPSS contains many automatic internal checks. Significant variations
in the data provided from month to month by the counties are investigated to ensure they are accurate.
JCPSS will not accept information from counties that lacks certain fields or violates certain rules, to ensure
the quality of the data. While DOJ cannot assure that every record within the JCPSS is accurate, that does
not imply that all or even a significant number of the records in the system are inaccurate. A more accurate
statement would be that the information sought during this audit was not captured by JCPSS and therefore
the data that was available was of limited use to BSA.
Again, thank you for the opportunity to review and comment on this draft audit report. If you have any
questions or concerns regarding this matter, you may contact me at the telephone number listed above or
Tammy Lopes, Assistant Bureau Chief, Bureau of Criminal Information and Analysis, at (916) 227-4777.
						Sincerely,
						(Signed by: Andrew J. Kraus III)
						ANDREW J. KRAUS III, CPA
						Director

6

Comments
CALIFORNIA STATE AUDITOR’S COMMENTS ON THE
RESPONSE FROM THE DEPARTMENT OF JUSTICE
To provide clarity and perspective, we are commenting on the
Department of Justice’s (Justice) response to our audit. The
numbers below correspond to the numbers we have placed in
the margin of Justice’s response.
For clarity, Justice should have referred to Section 13012 of the
Penal Code rather than the Welfare and Institutions Code.

1

Justice’s response is troubling. Rather than attempting to follow
through with its procedure to conduct occasional audits of
counties’ records with the intent to improve data in the Juvenile
Court and Probation Statistical System (JCPSS), Justice indicates
it will delete the procedure. Further, since it offers no alternative
procedures, it appears that Justice does not intend to take appropriate
action to proactively address the issues we found with JCPSS data.

2

As we acknowledge on page 35, Justice was not able to demonstrate
to us that counties submitted inaccurate data. However, if Justice
believes that counties are submitting inaccurate data, as we
recommend on page 39, it should take steps to verify the accuracy
of information that counties submit.

3

Justice misunderstands our recommendation. We do not suggest
that Justice should audit local agencies or courts as to how they
arrived at a disposition. Rather, we recommended that Justice
ensure that the data counties submit are corroborated by the
counties’ underlying records.

4
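The corroboration step described above amounts to reconciling a figure a county reports against the underlying records that support it. The sketch below is a hypothetical illustration only; the record layout, field names, and totals are invented and do not reflect any actual county or JCPSS data.

```python
# Hypothetical reconciliation of a county-reported total against its
# underlying case records; the records below are invented for illustration.
underlying_records = [
    {"case_id": "A-101", "disposition": "probation"},
    {"case_id": "A-102", "disposition": "commitment"},
    {"case_id": "A-103", "disposition": "probation"},
]
reported_total = 4  # the figure the county submitted

# Corroboration: count what the records actually support and flag any gap.
actual_total = len(underlying_records)
if actual_total != reported_total:
    print(f"Discrepancy: county reported {reported_total}, "
          f"records support {actual_total}")
```

A check of this kind does not second-guess how a court or agency arrived at a disposition; it only confirms that the submitted number is backed by documentation, which is the distinction the comment draws.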

Justice is mistaken. The quoted text Justice refers to is a statement
on page 35 from its program manager and is not intended to imply
or indicate that Justice’s data collection methods fail to comply with
state or federal laws.

5

Justice exaggerates our use of certain terminology when we
describe JCPSS data and takes our statements out of context.
We use the terms because the various data systems we reviewed,
including Justice’s JCPSS and Automated Criminal History System,
cannot provide quality information related to juvenile justice
realignment. Moreover, under generally accepted government
auditing standards, which we are required to follow, we must
disclose limitations on the data we report. In this case, we want to
ensure that readers clearly understand the limitations to the data
that is currently available to assess the effectiveness of juvenile
justice realignment.

6


cc:	

Members of the Legislature
Office of the Lieutenant Governor
Little Hoover Commission
Department of Finance
Attorney General
State Controller
State Treasurer
Legislative Analyst
Senate Office of Research
California Research Bureau
Capitol Press

 

 
